I'm using opencode on an Amazon EC2 instance with Bedrock inference. I got it all set up and working, but every few seconds I get:
AI_RetryError: Failed after 3 attempts, Last error: undefined: Too many tokens, please wait before trying again.
It seems to be worse with Opus than with Sonnet (4.0 for both).