LLM API Status & Debugging Toolkit
See official provider status, diagnose API errors, generate cURL test commands, and understand whether the issue is your code, your key, your network, or the provider.
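The split between "your code, your key, your network, or the provider" can be sketched as a small triage table. This is an illustration only, not the toolkit's actual data; the status codes and hints below are common API conventions:

```python
# Minimal triage sketch: map an HTTP outcome to a likely fault domain.
# Categories mirror the question above: code, key, network, or provider.
TRIAGE = {
    400: ("your code", "malformed request body or parameters"),
    401: ("your key", "missing, revoked, or mistyped API key"),
    403: ("your key", "key lacks permission for this model or endpoint"),
    404: ("your code", "wrong endpoint path or model name"),
    429: ("your code", "rate or quota limit hit; back off and retry"),
    500: ("the provider", "server-side error; retry with backoff"),
    503: ("the provider", "overloaded or down; check the status page"),
}

def triage(status_code):
    """Return (likely fault domain, hint). None means no HTTP response at all."""
    if status_code is None:
        return ("your network", "timeout or DNS failure; no response received")
    return TRIAGE.get(status_code, ("unknown", "check provider docs for this code"))
```

For example, `triage(401)` points at the key, while `triage(None)` points at the network.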
Provider status matrix
This MVP links to official status pages and ships static troubleshooting data; it does not claim live monitoring.
Each entry links to the provider's official status page; open it for the latest live status.

- OpenAI: APIs / ChatGPT / Codex
- Anthropic: API / Console / Claude.ai
- Google Gemini: Gemini API / AI Studio / Model availability
- OpenRouter: API / Routing / Credits
- API / Console / Model serving
- API / Inference / Models
- Replicate: API / Predictions / Model hosting
- API / Search grounding / Models
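Many provider status pages are hosted on Statuspage-style services that also expose a machine-readable JSON summary; whether a given provider does, and at which path (commonly `/api/v2/status.json`), is an assumption to verify per provider. A sketch of parsing such a payload, using a hard-coded sample instead of a live request:

```python
import json

# Sample payload in the Atlassian Statuspage v2 shape. Real endpoints and
# field names are assumptions to confirm against each provider's status page.
sample = '{"status": {"indicator": "minor", "description": "Partial outage"}}'

def summarize_status(payload):
    """Reduce a Statuspage-style JSON payload to a one-line summary."""
    status = json.loads(payload)["status"]
    return f'{status["indicator"]}: {status["description"]}'
```

Here `summarize_status(sample)` yields `"minor: Partial outage"`.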
Debugging tools
- Explain an error: select provider + error code and get likely causes, checks, fixes, retry strategy, and related links.
- Generate cURL: generate copyable OpenAI, Claude, Gemini, and OpenRouter test commands for terminal debugging.
- Start diagnostic: walk through provider, symptom, and environment to identify likely root causes and next checks.

Need real monitoring?
Join the early access list for server-side checks, Slack or email alerts, model availability monitoring, and latency history.
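As a sketch of what a generated test command might look like, the snippet below assembles (but does not run) a cURL command per provider. The endpoint paths and auth headers follow each provider's public API docs but should be treated as assumptions to verify; the keys are shell-variable placeholders, and Gemini is omitted because its REST request shape differs:

```python
# Build a copyable cURL test command per provider. Endpoints and headers
# follow public API docs; verify before use. Keys are env-var placeholders.
ENDPOINTS = {
    "openai": ("https://api.openai.com/v1/chat/completions",
               ["Authorization: Bearer $OPENAI_API_KEY"]),
    "anthropic": ("https://api.anthropic.com/v1/messages",
                  ["x-api-key: $ANTHROPIC_API_KEY",
                   "anthropic-version: 2023-06-01"]),
    "openrouter": ("https://openrouter.ai/api/v1/chat/completions",
                   ["Authorization: Bearer $OPENROUTER_API_KEY"]),
}

def curl_command(provider, model):
    """Return a multi-line cURL command string for a minimal 'ping' request."""
    url, headers = ENDPOINTS[provider]
    parts = [f"curl -sS {url}", "-H 'Content-Type: application/json'"]
    parts += [f"-H '{h}'" for h in headers]
    body = ('{"model": "%s", "max_tokens": 16, '
            '"messages": [{"role": "user", "content": "ping"}]}' % model)
    parts.append(f"-d '{body}'")
    return " \\\n  ".join(parts)
```

Calling `curl_command("anthropic", "MODEL_ID")` prints a command you can paste into a terminal after substituting a real model ID and exporting the key.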