Parameters Are a Commodity. Tool Access Isn't.

Google Deep Research Max ships on MCP. Not a proprietary connector. The real AI moat moved from model parameters to enterprise tool access.

Google Deep Research Max ships with MCP as the integration layer. Not a bespoke Google connector. Not the Vertex AI toolkit. The open protocol Anthropic released and the Linux Foundation now governs.

That's the signal. Not the launch. The choice.

Google is the last company on earth that needs to use somebody else's protocol. It has Gemini, it has Vertex, it has the ads graph, it has the largest internal data pipeline in the industry. And when the time came to wire up its flagship enterprise research product (the one that's supposed to replace the intern tier at every consultancy), Google picked the open standard.

The cheap reading is "protocol won, good for the ecosystem." The harder reading is different. Google just told the market where the moat actually is.

Here is what the moat is not: the model. Parameter counts are running into a wall most people haven't priced in yet. Open weights keep closing the gap on real tasks. Distillation shortens every lead the frontier labs think they have. Inference costs are deflating at a pace that makes "our model is smarter" a slide you can't defend anymore. Google knows this better than anyone. It built half the math that made it true.

Here is what the moat is: the tool graph. The set of clean, authenticated, rate-limited, compliance-cleared pipes into the customer's actual data. Salesforce records. Snowflake warehouses. Epic medical systems. SAP transactions. The plumbing that makes a research agent useful instead of ornamental.

The people who own that plumbing aren't the model providers. They're the tool providers, the teams shipping MCP servers for each enterprise system. The choke point moved. 110 million SDK downloads. Linux Foundation governance. Native Microsoft support. Google now signing the same treaty. No one company owns the protocol, which means every model provider has to talk to every tool builder through the same pipe. That's not a gift to the ecosystem. That's a gravity well.

For anyone building, the operational read is simpler than the narrative implies. Most of us are on the wrong side of the moat.

Training a model without shipping MCP servers is selling horsepower into a market that pays for tires. Your enterprise agent will hit the ceiling of whatever MCP servers already exist in the customer's stack. Audit that ceiling before you ship, or you are building on the wrong layer.

The contrarian play isn't "train a better model." Almost no one will win that. The contrarian play: build the MCP server for the enterprise system everyone has and no one can cleanly connect to. Pick SAP. Pick Epic. Pick Workday. Pick the boring, regulated, heavily credentialed system where the real money lives and the model providers can't help you. That's the moat. It has been hiding in plain sight because it looks like plumbing.
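To make "it looks like plumbing" concrete: MCP is JSON-RPC 2.0 under the hood, and a server's whole job is answering two methods, `tools/list` (advertise what you can do) and `tools/call` (do it). Here is a toy, dependency-free sketch of that dispatch loop. Everything specific in it is hypothetical: the `sap_lookup` tool, its schema, and the stubbed backend call all stand in for whatever real, authenticated system you'd wrap. A production server would use one of the official MCP SDKs rather than hand-rolling this.

```python
import json

# Hypothetical tool registry: one made-up "sap_lookup" tool with a
# JSON Schema describing its input, as MCP tool listings do.
TOOLS = {
    "sap_lookup": {
        "description": "Fetch a (hypothetical) SAP record by ID.",
        "inputSchema": {
            "type": "object",
            "properties": {"record_id": {"type": "string"}},
            "required": ["record_id"],
        },
    },
}

def sap_lookup(record_id: str) -> dict:
    # Stand-in for the real authenticated, rate-limited backend call.
    return {"record_id": record_id, "status": "posted"}

def handle(request: dict) -> dict:
    """Dispatch one JSON-RPC 2.0 request the way an MCP server would."""
    method = request["method"]
    if method == "tools/list":
        result = {"tools": [
            {"name": name, **meta} for name, meta in TOOLS.items()
        ]}
    elif method == "tools/call":
        args = request["params"]["arguments"]
        record = sap_lookup(**args)
        # MCP tool results come back as typed content blocks.
        result = {"content": [{"type": "text", "text": json.dumps(record)}]}
    else:
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}
```

The hard part is never this dispatch code. It's everything hidden inside `sap_lookup`: the credentials, the rate limits, the compliance sign-off. That's the plumbing the paragraph above is pointing at.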

Google just confirmed it by plugging in.