TechFlow News: On May 7, the B.AI API model library underwent a major expansion, with the official launch of four new models: GPT-5.5 Instant, DeepSeek-v3.2, MiniMax-M2.7, and GLM-5.1.
Notably, GPT-5.5 Instant was adapted at the infrastructure level and integrated into the API within 48 hours of its release by OpenAI, making this next-generation model available to developers with no waiting period. Compared with its predecessor, GPT-5.3 Instant, the newly launched GPT-5.5 Instant reduces hallucination rates by 52.5% in high-stakes domains such as healthcare, law, and finance, and delivers noticeably better logical coherence and more concise output, improving factual reliability on complex tasks.
All of the above models are now fully available via the B.AI API. Developers can consult the official documentation to quickly integrate the latest model interfaces and experience more stable, trustworthy AI capabilities from day one.
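For readers curious what such an integration might look like, here is a minimal sketch of assembling a chat-completion request. Everything here is an assumption modeled on common OpenAI-compatible APIs: the endpoint URL, the bearer-token auth scheme, the parameter names, and the model identifier are all hypothetical, so consult B.AI's official documentation for the real interface.

```python
import json

# Assumed endpoint -- the real B.AI base URL may differ.
BASE_URL = "https://api.b.ai/v1/chat/completions"

def build_request(model: str, prompt: str, api_key: str) -> dict:
    """Assemble headers and a JSON body for a chat-completion-style call.

    Field names follow the widely used OpenAI-compatible schema; this is
    an illustrative sketch, not B.AI's confirmed interface.
    """
    return {
        "url": BASE_URL,
        "headers": {
            "Authorization": f"Bearer {api_key}",  # assumed bearer-token auth
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,  # e.g. "gpt-5.5-instant" -- exact id is assumed
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

# Build (but do not send) a sample request for the newly launched model.
req = build_request("gpt-5.5-instant", "Summarize this contract clause.", "sk-...")
```

The payload can then be sent with any HTTP client; separating request construction from transport also makes the integration easy to unit-test.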