Introduction
On February 2, 2026, Snowflake announced a multi-year, $200 million partnership with OpenAI — the second $200 million AI deal the data cloud company had struck in three months. The first, with Anthropic in December 2025, was already reshaping how Snowflake’s 12,600 enterprise customers access frontier models. Together, these two deals signal something more specific than a data warehouse adding an AI layer: they mark a deliberate attempt to make the data platform the operational core of enterprise AI agents, not just their storage backend.
From Azure Proxy to Direct Integration
Until this deal, Snowflake customers who wanted OpenAI models had to go through Microsoft Azure — a partnership intermediated by a competitor with its own data platform ambitions. The February 2026 agreement changes that structure entirely. As Baris Gultekin, Snowflake’s VP of AI, explained in the announcement: “This reflects Snowflake’s conviction that providing OpenAI technology to our enterprise customer base, at scale and with enterprise-grade reliability, is strategically important.”
The practical effect is that GPT-5.2 and other OpenAI models are now available natively inside Snowflake Cortex AI across all three major cloud providers — AWS, Azure, and GCP — without routing traffic or data through a Microsoft-managed endpoint. For enterprises with strict data residency requirements, this matters. The data never needs to leave the Snowflake perimeter to be processed by the model.
What Cortex AI Actually Does
Snowflake Cortex AI is not a single product — it’s a suite of AI services embedded directly into the data platform. After the OpenAI integration, it covers three distinct use patterns:
The first is Cortex AI Functions, which let data teams call GPT-5.2 (or Claude Opus 4.5 from the Anthropic deal) from SQL across structured data, text, images, and audio. A data engineer can run a sentiment analysis on millions of customer support tickets with a single SQL query that calls the model inline, without spinning up a separate inference pipeline.
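To make that pattern concrete, here is a minimal sketch of the kind of inline call being described. It uses Snowflake's documented `SNOWFLAKE.CORTEX.COMPLETE` SQL function; the `gpt-5.2` model identifier and the `support_tickets` table are illustrative assumptions from the article, not verified names.

```python
def cortex_sentiment_sql(table: str, text_col: str, model: str = "gpt-5.2") -> str:
    """Build a SQL statement that scores every row's sentiment inline.

    SNOWFLAKE.CORTEX.COMPLETE(model, prompt) is Snowflake's documented
    model-call function; the 'gpt-5.2' identifier is assumed per the article.
    """
    prompt = (
        "Classify the sentiment of the following support ticket as "
        "positive, neutral, or negative. Reply with one word only: "
    )
    return (
        f"SELECT ticket_id,\n"
        f"       SNOWFLAKE.CORTEX.COMPLETE('{model}', "
        f"CONCAT('{prompt}', {text_col})) AS sentiment\n"
        f"FROM {table};"
    )

print(cortex_sentiment_sql("support_tickets", "ticket_body"))
```

Run against a table of millions of tickets, a statement like this does the classification in place, with no separate inference service to stand up.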
The second is Snowflake Intelligence, the natural-language interface aimed at non-technical users. A finance analyst can ask “what drove the spike in customer churn last quarter?” and get an answer grounded in the company’s actual data — not a generic LLM response hallucinated from training data. Depending on which integration a customer has configured, Snowflake Intelligence routes requests to OpenAI models or to Claude Sonnet 4.5 from the Anthropic deal.
The third — and arguably most consequential — is Cortex Code, which Snowflake unveiled alongside the OpenAI partnership on February 3, 2026. Unlike GitHub Copilot or other generic coding assistants, Cortex Code understands the enterprise’s specific data context: which tables are production-critical, which transformations are expensive, which columns contain PII that triggers governance rules. Christian Kleinerman, Snowflake’s EVP of Product, described it as enabling teams to “use natural language to build, optimize, and deploy data pipelines, analytics, machine learning workloads, and AI agents.”
Cortex Code: The Governance-Aware Coding Agent
The distinction that matters for Cortex Code is not what it can generate — SQL, Python, ML pipelines — but what it refuses to generate blindly. A standard LLM-based coding assistant will happily produce a full-table scan on a 500 TB production table. Cortex Code, aware of Snowflake’s operational context, knows that query is expensive and will push back or suggest an incremental alternative.
Robert Kramer, an analyst at Moor Insights & Strategy, highlighted the CLI deployment as the key differentiator: it allows “developers to retain their enterprise’s data context, specifically data stored in Snowflake, while still being available in the code editor of their choice, locally on their machines.” Cortex Code works inside VS Code and Cursor as well as Snowflake’s own Snowsight interface.
On February 23, 2026, Snowflake extended Cortex Code further — moving beyond Snowflake-native workflows to support dbt and Apache Airflow, two of the most common orchestration tools in enterprise data stacks. That expansion, now generally available, means teams running heterogeneous pipelines across multiple systems can get context-aware assistance without migrating everything to Snowflake first.
The scale of adoption suggests this is landing. As of early 2026, over 9,100 Snowflake accounts are actively using Cortex products, with AI-related workloads growing more than 200% year-over-year.
The Model-Agnostic Bet
Spending $400 million in three months on AI model partnerships — $200 million with Anthropic in December 2025, $200 million with OpenAI in February 2026 — is a clear statement of strategy: Snowflake is not picking a model winner. It is positioning itself as the neutral platform layer that can run whichever model its customers prefer.
This is not merely a hedge. Anthropic’s business development team explicitly noted that the Snowflake deal is non-exclusive — Anthropic also has a partnership with Databricks, Snowflake’s primary competitor. Snowflake, for its part, also maintains partnerships with Mistral AI and Microsoft’s Azure OpenAI Service. The point is interoperability: give enterprise customers model choice while locking them into the data platform that governs access to their proprietary data.
It also reflects a real production constraint that has emerged as enterprises move beyond proof-of-concept deployments. Different tasks have different model requirements. A legal review pipeline might benefit from Claude’s longer context window and reasoning depth; a real-time customer-facing chatbot might prioritize GPT-5.2’s speed and tool-use reliability. An enterprise that locked into a single model provider in 2024 is now re-architecting. Snowflake’s bet is that the pain of switching data platforms is higher than the pain of switching models.
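The per-task routing this implies can be sketched in a few lines. The model identifiers come from the article; the threshold and routing rules are illustrative assumptions, not a published Snowflake policy.

```python
from dataclasses import dataclass

@dataclass
class Task:
    kind: str
    context_tokens: int
    latency_sensitive: bool

# Invented cutoff for routing long-context work; real limits vary by model.
LONG_CONTEXT_THRESHOLD = 200_000

def pick_model(task: Task) -> str:
    """Choose a model per task; the data platform, not the model, stays fixed."""
    if task.context_tokens > LONG_CONTEXT_THRESHOLD:
        return "claude-opus-4.5"  # assumed fit: long context, reasoning depth
    if task.latency_sensitive:
        return "gpt-5.2"          # assumed fit: speed, tool-use reliability
    return "gpt-5.2"              # default when neither constraint dominates

print(pick_model(Task("legal_review", 400_000, latency_sensitive=False)))
```

Swapping providers then becomes a one-line change in the routing logic rather than a re-architecture, which is exactly the switching-cost asymmetry Snowflake is betting on.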
If you’ve followed how enterprise AI agents moved from pilots to production in 2026, this partnership structure makes sense: the bottleneck was never model quality, it was data access, governance, and deployment reliability. Snowflake is addressing all three simultaneously.
Project SnowWork and What Comes Next
In March 2026, Snowflake announced Project SnowWork, described as an “autonomous enterprise AI platform” that extends the agentic capabilities beyond technical users. The pitch is direct: a business analyst can ask for a board-ready revenue forecast or a supply chain bottleneck analysis, and SnowWork will execute the multi-step data workflow — pulling data, running queries, generating visualizations, and producing a structured output — without requiring any code.
This is the logical endpoint of the Cortex AI roadmap: a platform where the model, the data, and the governance layer are co-located, and where non-technical employees can invoke complex analytical workflows through natural language. Whether SnowWork delivers on that promise in practice is still an open question — most enterprise GenAI projects still fail to reach production, and autonomous multi-step agents remain brittle outside tightly controlled environments. But the architecture Snowflake is building with OpenAI and Anthropic is at least structurally designed to address the root causes of those failures: model isolation from real data, lack of governance, and the gap between prototype and production.
Conclusion
The Snowflake-OpenAI deal is not just a distribution agreement — it is part of a deliberate repositioning of the data warehouse as the operational layer for enterprise AI. By integrating GPT-5.2 directly into Cortex AI alongside Claude, Snowflake gives 12,600 enterprise customers model optionality without fragmenting their data architecture. Cortex Code makes the coding agent governance-aware; Snowflake Intelligence makes the data layer accessible to non-engineers; Project SnowWork sketches the agentic endpoint. The critical test is whether these products hold up when deployed at scale across messy, heterogeneous enterprise data — but the structural logic behind the investment is sound.
Further Reading
- Snowflake spends $200M to bring OpenAI to customers — The Register — Covers the deal mechanics and what it means for Snowflake’s existing Azure relationship.
- What Snowflake’s deal with OpenAI tells us about the enterprise AI race — TechCrunch — Broader analysis of multi-provider AI strategies and what they signal about enterprise buying behavior.
- Snowflake debuts Cortex Code — InfoWorld — Technical breakdown of how Cortex Code integrates governance context into the coding agent workflow.
