Model Request Error - Failed to generate an LLM response.
It was working fine until yesterday, then suddenly stopped. I tried updating, but I'm already on the latest version and the issue persists. Can you help me fix this?
Thank you for jumping in, @Ash McConnell, to help out @chintan_patel!
Apologies for getting back to this so late!
Sorry you’re running into this issue! The error “Model Request Error - Failed to generate an LLM response” can be caused by several things:
Backend issues or outages – Sometimes this error is due to a temporary outage or incident on our side.
Token/rate limits – If you’ve hit your daily or monthly usage limit, the CLI may stop responding. Try running /usage in the CLI to check your quota.
Local config issues – Sometimes, local configuration files can become corrupted. Renaming or deleting your ~/.rovodev folder (or just the config.yml inside) will reset your CLI to a fresh state. You’ll need to re-authenticate after this step.
If you’ve tried all the above and are still seeing the error, please send us the output of acli rovodev log so we can investigate further.
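For anyone who wants to try the config reset safely, here's a small sketch of the steps above. It renames the folder instead of deleting it, so you can restore it if resetting doesn't help. The `RV_DIR` variable is just for illustration; the standard location is `~/.rovodev` as mentioned above.

```shell
# Back up the Rovo Dev CLI's local state by renaming its config
# directory rather than deleting it (restorable if this doesn't help).
# RV_DIR defaults to the standard ~/.rovodev location.
RV_DIR="${RV_DIR:-$HOME/.rovodev}"

if [ -d "$RV_DIR" ]; then
  # The CLI recreates a fresh config on the next run; you'll need
  # to re-authenticate afterwards.
  mv "$RV_DIR" "${RV_DIR}.bak"
  echo "Backed up $RV_DIR to ${RV_DIR}.bak"
else
  echo "Nothing to reset: $RV_DIR does not exist"
fi
```

If the error still appears after re-authenticating, grab the output of `acli rovodev log` (as noted above) and attach it to your reply.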
Thank you for your patience!
Jov