Open Responses API Standard Launches
- Open Responses launches as a vendor-neutral API standard for interacting with hosted AI models.
- The specification derives from OpenAI's Responses API to support advanced reasoning traces and features.
- Major partners including Hugging Face, Ollama, and Vercel have committed to adopting the new protocol.
Simon Willison (a prominent software developer and tech blogger) has introduced "Open Responses," a major effort to standardize how software communicates with different artificial intelligence models. Currently, each AI provider typically has its own way of sending and receiving data, which creates extra work for developers. This new specification provides a universal language that allows a single piece of code to talk to many different AI services without needing custom modifications for each one.

The standard is based on the Responses API from OpenAI rather than their older chat system. This choice is strategic because the newer format is designed to handle "reasoning traces" (often referred to as Chain-of-Thought)—the step-by-step logic a model shows when solving complex problems. By using this as a foundation, the standard ensures it can support the most advanced features of modern models.

Launch partners like OpenRouter and Hugging Face mean that the protocol will immediately work with a large share of existing AI tools. To help developers adopt the standard, the project includes a compliance test suite. This tool is available as a web application that can interact with servers using CORS to ensure they follow the protocol correctly. While a web-based tool exists to test servers, Willison notes that similar tools for testing client-side software are still needed.

Standardizing these connections is a critical step toward a more open ecosystem where users can switch between different AI providers as easily as changing a web host. The involvement of major inference framework providers like vLLM and Ollama suggests that Open Responses could become the default way developers interact with both local and cloud-based models. This reduces "vendor lock-in," a situation where a company becomes trapped using one provider's services because switching would be too technically difficult.
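To make the "one client, many providers" idea concrete, here is a minimal sketch of how such a client might build a request. The `/v1/responses` path and the `model`/`input` payload fields follow the OpenAI Responses API that the spec derives from; the exact field names under the final Open Responses standard, and the provider URLs shown, are illustrative assumptions, not confirmed details.

```python
import json


def build_responses_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for a Responses-style endpoint.

    Assumes the endpoint path and payload shape of OpenAI's Responses API
    (`POST /v1/responses` with `model` and `input`); the final Open
    Responses spec may differ.
    """
    url = f"{base_url.rstrip('/')}/v1/responses"
    payload = {"model": model, "input": prompt}
    return url, json.dumps(payload).encode()


# The same client code targets different providers by swapping the base URL
# (placeholder URLs, not real endpoints):
for base in ("https://openrouter.example/api", "http://localhost:11434"):
    url, body = build_responses_request(base, "example-model", "Explain CORS briefly.")
    print(url)
```

The key point is that only the base URL changes between providers; everything else—payload construction, response parsing—stays identical, which is exactly the lock-in reduction the standard aims for.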