$ cat /docs/how-this-works.txt

we-dont-need-no-web-dev emergency docs

The AI docs page failed to manifest, so this backup note is filling in.

How the site usually works:
- every request hits the edge function
- the URL path is sent to OpenRouter
- a free chat model generates an entire HTML page on the fly
- there are no static frontend files for normal routes
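The request-to-page step above can be sketched as follows. This is a hypothetical reconstruction, not the real server code: buildPageRequest is an assumed name, and the actual system prompt and model list may differ. It shows the shape of the OpenRouter chat-completions payload an edge function could build from an incoming URL path.

```typescript
// Hypothetical sketch: turn a URL path into an OpenRouter
// chat-completions request asking for a complete HTML page.
// buildPageRequest and the prompt wording are assumptions.
interface ChatMessage { role: "system" | "user"; content: string; }
interface ChatRequest { model: string; messages: ChatMessage[]; }

function buildPageRequest(path: string, model: string): ChatRequest {
  return {
    model,
    messages: [
      {
        role: "system",
        content: "You are a web server. Reply with a complete HTML page and nothing else.",
      },
      { role: "user", content: `Generate the page for the URL path: ${path}` },
    ],
  };
}

// The edge function would then POST this payload to
// https://openrouter.ai/api/v1/chat/completions with an
// Authorization: Bearer <key> header and return the model's
// reply body as the HTTP response.
```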

What changed:
- StepFun is no longer in the default free-model chain
- the server now tries multiple free providers in sequence
- timeout, rate limit, missing-model, and upstream 5xx failures now fall through to the next provider
- if every free model fails, the server still returns a backup page instead of a raw error
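The fall-through behaviour described above can be sketched like this. It is a minimal illustration under stated assumptions: isRetryable and generateWithFallback are invented names, and the real server's status handling may be more nuanced. The key idea is that timeouts, 429s, missing models (404), and upstream 5xx all advance to the next model, and exhausting the chain yields the backup page rather than an error.

```typescript
// Hypothetical sketch of the provider fallback loop.
// isRetryable / generateWithFallback are assumed names.
function isRetryable(status: number): boolean {
  // timeout, rate limit, missing model, upstream 5xx
  return status === 408 || status === 429 || status === 404 || status >= 500;
}

async function generateWithFallback(
  models: string[],
  tryModel: (model: string) => Promise<{ status: number; html?: string }>,
  backupPage: string,
): Promise<string> {
  for (const model of models) {
    try {
      const res = await tryModel(model);
      if (res.status === 200 && res.html) return res.html;
      if (!isRetryable(res.status)) break; // non-retryable: stop early
    } catch {
      // network error or timeout: fall through to the next model
    }
  }
  return backupPage; // every free model failed
}
```

In this sketch a non-retryable status abandons the chain early, on the theory that a bad request will fail the same way everywhere; the real server may instead try every model regardless.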

Power user knobs:
- ?long=true asks for the more expensive free-model chain
- ?model=provider/model-name forces a specific model
- ?key=sk-or-v1-... overrides the server key for testing
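The three knobs above might be resolved roughly like this. Everything here is an assumption for illustration: DEFAULT_CHAIN, LONG_CHAIN, and resolveOptions are invented names, and the model lists shown are placeholders, not the server's actual chains.

```typescript
// Hypothetical sketch of query-string knob handling.
// Chain contents and function names are assumptions.
const DEFAULT_CHAIN = ["openai/gpt-oss-120b:free", "z-ai/glm-4.5-air:free"];
const LONG_CHAIN = ["nvidia/nemotron-3-nano-30b-a3b:free", "minimax/minimax-m2.5:free"];

function resolveOptions(url: URL, serverKey: string) {
  const forced = url.searchParams.get("model");
  return {
    // ?model=provider/model-name wins over both chains
    models: forced
      ? [forced]
      : url.searchParams.get("long") === "true"
        ? LONG_CHAIN
        : DEFAULT_CHAIN,
    // ?key=... overrides the server's key for testing
    apiKey: url.searchParams.get("key") ?? serverKey,
  };
}
```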

Latest failure summary: every free model returned HTTP 429 ("Rate limit exceeded: free-models-per-min"), all under the same truncated user_id user_2urGunRG98sW in the raw log:
- nvidia/nemotron-3-nano-30b-a3b:free (X-RateLimit-Limit 20, Remaining 0, Reset 1775812800000)
- openai/gpt-oss-120b:free (X-RateLimit-Limit 16, Remaining 0, Reset 1775812800000)
- minimax/minimax-m2.5:free (X-RateLimit-Limit 16, Remaining 0, Reset 1775812800000)
- z-ai/glm-4.5-air:free (X-RateLimit-Limit 16, Remaining 0, Reset 1775812800000)
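The X-RateLimit-Reset values above appear to be epoch timestamps in milliseconds, so the wait before retrying is just the difference from the current time. A tiny illustrative helper (msUntilReset is an invented name, not part of the server):

```typescript
// Hypothetical helper: how long until a rate limit resets,
// given X-RateLimit-Reset as an epoch timestamp in milliseconds.
// Clamped at zero so a reset already in the past means "retry now".
function msUntilReset(resetEpochMs: number, nowEpochMs: number): number {
  return Math.max(0, resetEpochMs - nowEpochMs);
}
```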

The website remains committed to the idea that a browser deserves HTML even when the robots are unavailable.