The LLM proxy server is not accessible.
This application needs to connect to an LLM proxy server to access AI models.
To fix this:
- Ensure the LLM proxy is running at http://localhost:8081
- Check that API keys are configured in the proxy
- Verify there are no network or firewall issues
Once the proxy is running with configured API keys, click Retry to continue.
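As a quick way to confirm the first point, you can test whether anything is listening on the proxy's address before clicking Retry. This is a minimal sketch, not part of the application itself; the `proxy_reachable` helper name is hypothetical, and the host and port match the URL above:

```python
import socket

def proxy_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Same address the application expects the LLM proxy at.
    if proxy_reachable("localhost", 8081):
        print("LLM proxy is reachable")
    else:
        print("LLM proxy is NOT reachable; start it and retry")
```

Note that a successful TCP connection only confirms the proxy process is listening; it does not verify that API keys are configured correctly.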