How Block Deployed AI Agents Company-Wide in 2 Months

December 11, 2025
AI
Block’s VP of Engineering, Angie Jones, reveals how a single internal experiment became Goose, one of the first MCP clients, and how Block deployed AI agents across every department in just eight weeks.
Hosted by
Ankit Jain
Co-founder at Aviator
Guest
Angie Jones
VP of Engineering, AI Tools & Enablement

About Angie Jones

Angie Jones is the Vice President of Engineering, AI Tools & Enablement at Block, Inc.

She is an award-winning teacher and international keynote speaker. As a Master Inventor, Angie is known for her innovative and out-of-the-box thinking style, which has resulted in more than 25 patented inventions in the areas of virtual worlds, collaboration software, social networking, smarter planet, and software development processes.

The Origin Story of Goose

Block’s Goose, an open-source AI agent, started with a single IC, a machine learning engineer named Bradley Axton, who was tinkering with the idea of an agent that could automate the boring engineering tasks we all have to do: environment setup, repetitive workflows, the things that get in the way of the “real” engineering work.

Bradley built Goose, and engineers loved it. They immediately wanted to connect it to all sorts of internal systems, but back then every integration was a bespoke one-off. That’s when Bradley and a few others started thinking: we need a protocol. They reached out to Anthropic, who showed them an early MCP spec before it had even launched. It was exactly what we needed. We rebuilt Goose as an MCP client and open-sourced it, making it one of the very first MCP clients available.
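As a rough illustration of what “MCP client” means here: MCP speaks JSON-RPC 2.0, and a client discovers a server’s tools with a `tools/list` call. The sketch below shows the shape of that exchange; the sample tool (`create_ticket`) is invented for illustration, and this is not Goose’s actual code.

```python
import json

def tools_list_request(request_id: int) -> str:
    """Build the JSON-RPC request an MCP client sends to enumerate a server's tools."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
    })

def parse_tools(response: str) -> list[str]:
    """Pull the tool names out of a tools/list response."""
    payload = json.loads(response)
    return [tool["name"] for tool in payload["result"]["tools"]]

# A sample response such as a server might return (tool name is made up):
sample = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"tools": [
        {"name": "create_ticket", "description": "File a ticket",
         "inputSchema": {"type": "object"}},
    ]},
})

print(parse_tools(sample))  # ['create_ticket']
```

Because the discovery handshake is standardized like this, any MCP client can enumerate and call the same servers, which is what lets Goose swap integrations in and out without bespoke glue code.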

Open-Sourcing Goose

Goose is completely open source, under an extremely permissive license. That openness has been a magnet for community innovation. Goose is usually the first agent to ship any new pattern—sub-agents, repeatable workflows, UI rendering inside the chat, you name it. The community experiments and contributes, and eventually every other agent in the ecosystem adopts the same patterns.

We think of Goose not just as a tool engineers use but as a trendsetter for what agentic AI should look like.

From Tool for Engineers to Tool for Everyone

It started as an engineering tool. But when it blew up (number one trending on GitHub, trending on Twitter), everyone across Block noticed.

And because the interface is just natural language, Goose became accessible to anyone. Suddenly marketing, finance, product, customer service, design, even executive assistants were using Goose. They began requesting MCP servers for the tools they use every day. What we quickly realized is that Goose isn’t a developer tool. It’s a general-purpose agent.

This is why I push back on the narrative that MCP is an “engineering thing.” APIs aren’t just for developers, and neither is MCP. It's an implementation detail enabling agents to connect to tools—and everyone benefits from that.

Security and Scale

Block is a financial services company, so our security team rightfully freaked out when they saw employees pulling random MCP servers off GitHub.

So we built our own. This all happened during our internal Hack Week, and engineers from across the company jumped in and built 60+ MCP servers in a week. Now we have more than twice that. Every internal app we use has an MCP server, and Goose, or any other agent, can connect to them. MCP is agent-agnostic.

To add a new MCP server to our internal allow list, it must pass two reviews:

  1. A review by engineers deeply familiar with MCP and sound engineering practices.

  2. A review by our security team.

Solving Context Window Bloat 

Internally, we ship Goose preconfigured with all our MCP servers. Every employee can toggle on what they need without touching JSON configs, scopes, API keys—none of that. Naturally, people turn everything on.

That meant huge tool descriptions being sent to the LLM every call. So we built a system where the LLM inspects the user request, determines which tools might be needed, and only enables those servers for that specific query.

This eliminated the context bloat problem and made the system far more efficient.
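The gating idea above can be sketched in a few lines. In the real system an LLM judges which servers a request needs; here a naive keyword matcher stands in for that call, and the server names and descriptions are invented for illustration.

```python
# Per-query tool gating: instead of sending every MCP server's tool
# descriptions to the model on each call, a first pass decides which servers
# are relevant, and only those schemas go into the prompt.

SERVERS = {
    "jira": "Create, search, and update Jira issues and sprints.",
    "figma": "Inspect Figma design files and export assets.",
    "payroll": "Look up payroll schedules and benefits information.",
}

def select_servers(user_request: str, servers: dict[str, str]) -> list[str]:
    """Enable only servers whose description shares a word with the request.

    A stand-in for the LLM relevance call described in the text.
    """
    request_words = set(user_request.lower().split())
    enabled = []
    for name, description in servers.items():
        description_words = {w.strip(".,") for w in description.lower().split()}
        if request_words & description_words:
            enabled.append(name)
    return enabled

def build_context(user_request: str) -> list[str]:
    """Collect just the tool descriptions the gated servers contribute."""
    return [SERVERS[name] for name in select_servers(user_request, SERVERS)]

print(select_servers("file a jira issue for the login bug", SERVERS))  # ['jira']
```

With everything toggled on, the prompt now carries one server’s descriptions instead of a hundred, which is the efficiency win described above.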

Tool Overload and Fragmentation

We don’t try to force a single tool. It's too early, and the landscape shifts daily.

Instead, we give teams a toy box of options.

When you force a tool on engineers, they resist. Give them choices, and adoption follows naturally.

Eventually, I’d love for adoption to converge to one or two tools. But today? It's far too early to place any bets.

Internal Adoption for Non-Engineers 

Fortunately, one of my teams is Developer Relations. They were already teaching external engineers how to use Goose—and the principles apply to all coding agents.

We realized we needed internal DevRel too. So DevRel engineers started teaching non-engineers how to use agents, which meant doing it in plain language, with no CLI jargon.

We also set up two Slack channels: a help channel, where anyone can ask questions and our internal power users assist them, and an inspiration channel, where people share cool workflows and automations. This created organic momentum.

Measuring the Success of Agentic Adoption

We treat everything as an experiment. Block has always had a bold innovation culture. Goose's open-source success gave us early insights from real users—engineers and non-engineers—which shaped our thinking.

We listen closely to frustrations, friction, and feedback. That informs the bigger bets we’re making now, things that go far beyond just AI for coding.

How to Drive Internal AI Adoption

You can’t uplift everyone at once. It’s too hard.

Find champions, people who won’t be discouraged by early failures. I assembled a cohort of 50 engineers whose repos collectively cover ~60% of Block’s code. They dedicate ~30% of their time to AI enablement, experimentation, and pattern creation.

One example: we let teams assign Goose to Jira or Linear tickets. A sprint team tried it: after week one, they had finished three weeks’ worth of work. They pulled more tasks in. Week two, same thing. This is real, measurable developer velocity. We’re rolling this out to the entire company next year.

Predictions for Software Engineering

I try not to make predictions anymore, things just move too fast. But one shift I’m noticing: companies are hiring junior engineers again.

With AI tools, junior developers can produce incredibly strong work. Block is launching Builder Fellowship for junior engineers soon. I don’t look at resumes anymore. I look at your portfolio and want to see what you’ve built with AI.

Chapters

00:00 Introduction to AI Transformation at Block

04:18 Building Goose: The AI Agent

08:50 Adoption Across Departments

11:59 Scaling MCP Servers

13:17 Technical Challenges with MCPs

16:33 Governance and Security of MCPs

17:35 Tool Overload and Centralization Strategies

20:09 Overcoming Cold Start Problems

23:34 Evolving Goose for Different Departments

25:58 Open-Source and Internal Development

28:05 Measuring Success in AI Initiatives

30:02 Recommendations for AI Adoption

33:19 Future Predictions and Initiatives
