Microsoft integrates OpenAI-powered Deep Research into Azure AI Foundry agents: public preview launched with pricing and APIs available
- Graziano Stefanelli
- Jul 9
- 3 min read

The new Deep Research tool arrives on Azure for building AI agents capable of analysis, citation, and automated synthesis.
Microsoft has officially announced the integration of Deep Research into its Azure AI Foundry Agent Service, released in public preview on July 8, 2025. Developed in collaboration with OpenAI, this advanced capability lets developers build intelligent agents that execute structured research and generate cited reports and in-depth summaries, combining in real time GPT-4o models, Bing-based web access, and a developer-controlled pipeline exposed via SDK or API.
The “o3-deep-research” model handles complex inputs, clarifies user goals, and produces structured and cited documents.
At the heart of the initiative is the OpenAI o3-deep-research model, now accessible via Azure. This model is designed to:
• clarify the user's initial intent through a first GPT pass;
• consult real-time web sources via Bing Grounding;
• generate cited, coherent, and reusable content in a format suited for professional agents;
• provide modular outputs (text, key points, summaries, references) customizable by the developer.
The entire flow can be orchestrated through the Azure SDK, REST API, Logic Apps, or Azure Functions, making Deep Research suitable for enterprise contexts such as knowledge management, CRM systems, dynamic intranets, and training platforms.
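As a rough illustration, here is a minimal Python sketch of how such an agent might be wired up with the preview Azure AI SDKs. The class names (AIProjectClient, DeepResearchTool), parameters, and environment variables follow the preview's published agent pattern but should be treated as assumptions and checked against the current documentation.

```python
# Minimal sketch (assumptions noted): creating an agent that uses the
# Deep Research tool through the preview Azure AI SDKs. Class names,
# parameters, and environment variables may differ from the shipped API.
import os

from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient
from azure.ai.agents.models import DeepResearchTool  # preview package; name assumed

# Project endpoint and Bing Grounding connection are placeholders.
project = AIProjectClient(
    endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential(),
)

# Pair the o3-deep-research model with a Bing Grounding connection.
deep_research = DeepResearchTool(
    bing_grounding_connection_id=os.environ["BING_CONNECTION_ID"],
    deep_research_model="o3-deep-research",
)

with project:
    agent = project.agents.create_agent(
        model="gpt-4o",  # handles the initial intent-clarification pass
        name="deep-research-assistant",
        instructions="Research the topic and return a cited, structured report.",
        tools=deep_research.definitions,
    )
    print(f"Created agent: {agent.id}")
```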
Token pricing is transparent and reflects the service’s advanced capabilities.
Microsoft has also released the preview pricing details, highlighting a consumption-based model similar to other GPT endpoints, with a reduced rate for cached input and additional charges for grounding:
• $10 per 1M input tokens
• $40 per 1M output tokens
• $2.50 per 1M cached input tokens
• Additional charges apply for Bing Grounding, GPT clarification stage, and optional modules
The pricing is clearly aimed at enterprise users and professional developers, aligning with Azure AI Foundry’s positioning as an "Enterprise AI agent platform."
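For a sense of scale, the sketch below applies these published rates to a hypothetical run; the token counts are illustrative assumptions, and Bing Grounding or other add-on charges are not included.

```python
# Back-of-the-envelope cost of one Deep Research run at the preview rates.
# Token counts are illustrative; grounding and add-on charges are excluded.
INPUT_RATE = 10.00 / 1_000_000    # USD per non-cached input token
OUTPUT_RATE = 40.00 / 1_000_000   # USD per output token
CACHED_RATE = 2.50 / 1_000_000    # USD per cached input token

def run_cost(input_tokens: int, output_tokens: int, cached_tokens: int = 0) -> float:
    """Model-token cost in USD for one run (cached tokens billed at the reduced rate)."""
    return (input_tokens * INPUT_RATE
            + output_tokens * OUTPUT_RATE
            + cached_tokens * CACHED_RATE)

# Example: 120k tokens of gathered sources, an 8k-token report,
# and 30k tokens served from cache on a repeated query.
print(f"${run_cost(120_000, 8_000, 30_000):.2f}")  # ≈ $1.60
```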
The feature positions Azure AI Foundry as a platform suited for mission-critical and knowledge-intensive contexts.
With Deep Research, Microsoft expands the capabilities of Azure agents far beyond conversational completion or basic automation. These new agents can:
• manage executive briefings and board-level summaries;
• aggregate large-scale sources for technical or scientific documentation;
• generate personalized reports with verified citations and custom styles;
• be orchestrated in complex business workflows where the output must be reliable, traceable, and up-to-date.
This reflects a fundamentally different vision from general-purpose chatbots: here, AI acts as a vertical research assistant, controlled by the developer and fully integrable into existing enterprise logic.
The public preview is now active and usable via Azure subscriptions, with support for open-source tools and Python libraries.
Developers using Azure AI Studio or the Azure CLI can already access the preview and test the /deep-research API, officially documented under the AI Foundry section. They can define prompts, multi-step pipelines, grounding controls, precision thresholds, and output formatting. Some public demos have been released on GitHub, and more detailed guides are expected soon on Microsoft's Medium and DevBlog channels.
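A hedged sketch of what a direct call to that route could look like follows; the endpoint path is the one named above, while the request-body fields (prompt, grounding, output_format) and the token scope are illustrative placeholders rather than the documented schema.

```python
# Hedged sketch: direct REST call to the preview /deep-research route.
# Body fields and the token scope are placeholders, not the official schema.
import os

import requests
from azure.identity import DefaultAzureCredential

token = DefaultAzureCredential().get_token("https://ai.azure.com/.default")  # scope assumed

resp = requests.post(
    f"{os.environ['AZURE_AI_PROJECT_ENDPOINT']}/deep-research",  # hypothetical URL
    headers={"Authorization": f"Bearer {token.token}"},
    json={
        "prompt": "Survey recent peer-reviewed work on solid-state batteries.",
        "grounding": {"provider": "bing", "freshness_days": 30},          # placeholder
        "output_format": {"sections": ["summary", "key_points", "references"]},
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json())
```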
Compared to alternatives from Google and Amazon, this solution offers greater architectural flexibility and modularity.
Unlike the AI solutions offered by Google Gemini or Amazon Bedrock, Deep Research presents a model of true agent composability: each function (input clarification, grounding, output synthesis) is separate, configurable, and under direct developer control. While Google leans toward integrated but less programmable tools, and Amazon toward prebuilt hosted models, Azure enables developers to build tailor-made agents for knowledge bases, governance, or internal operations.
This integration strengthens the Microsoft–OpenAI alliance and lays the groundwork for future GPT-native functions within Azure environments.
The arrival of Deep Research shows how Microsoft is increasingly moving advanced GPT capabilities not only into Copilot, but also into programmable and composable Azure services. The ability to access a specialized, controllable GPT model for research, synthesis, and documentation marks a strategic move toward multi-agent orchestration, semantic pipelines, and knowledge-aware AI in complex enterprise environments.
_______
FOLLOW US FOR MORE.
DATA STUDIOS