Wardley mapping the LLM ecosystem. @ Irrational Exuberance

Hi folks,

This is the weekly digest for my blog, Irrational Exuberance. Reach out with thoughts on Twitter at @lethain, or reply to this email.


Posts from this week:

- Wardley mapping the LLM ecosystem.
- Wardley mapping of Gitlab Strategy.


Wardley mapping the LLM ecosystem.

In How should you adopt LLMs?, we explore how a theoretical ride sharing company, Theoretical Ride Sharing, should adopt Large Language Models (LLMs). Part of that strategy’s diagnosis depends on understanding the expected evolution of the LLM ecosystem, which we’ve built a Wardley map to explore.

This map of the LLM space is interested in how product companies should address the proliferation of model providers such as Anthropic, Google and OpenAI, as well as the proliferation of LLM product patterns like agentic workflows, Retrieval Augmented Generation (RAG), and running evals to maintain performance as models change.


This is an exploratory, draft chapter for a book on engineering strategy that I’m brainstorming in #eng-strategy-book. As such, some of the links go to other draft chapters, both published drafts and very early, unpublished drafts.

Reading this document

To quickly understand the analysis within this Wardley Map, read it from top to bottom. If you want to understand how the map was written, read section by section from the bottom up, starting with Users, then Value Chains, and so on.

More detail on this structure in Refining strategy with Wardley Mapping.

How things work today

If Retrieval Augmented Generation (RAG) was the trending LLM pattern of 2023, and agents (or agentic workflows) were arguably the pattern of 2024, it’s hard to guess what tomorrow’s patterns will be, but it’s a safe guess that more new patterns are coming our way. LLMs are a proven platform today, and are now being applied widely to discover new patterns. It’s a safe bet that validating those patterns will continue to drive product companies to support additional infrastructure components (e.g. search indexes to support RAG).
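To make concrete why a pattern like RAG drags in extra infrastructure, here is a minimal sketch of the retrieval half, using a toy in-memory index. The documents and the token-overlap scoring are illustrative assumptions; a production deployment swaps this for exactly the kind of scaled search cluster the map describes.

```python
# Minimal sketch of the retrieval step in RAG: index documents, pull
# the most relevant ones for a query, and assemble an augmented prompt.
# The in-memory index stands in for a real search cluster.

def build_index(documents):
    """Map each token to the set of document ids containing it."""
    index = {}
    for doc_id, text in enumerate(documents):
        for token in set(text.lower().split()):
            index.setdefault(token, set()).add(doc_id)
    return index

def retrieve(index, documents, query, k=2):
    """Rank documents by how many query tokens they share."""
    scores = {}
    for token in set(query.lower().split()):
        for doc_id in index.get(token, ()):
            scores[doc_id] = scores.get(doc_id, 0) + 1
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [documents[doc_id] for doc_id in ranked[:k]]

def build_prompt(question, context):
    """Assemble the augmented prompt sent to the model."""
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using this context:\n{joined}\n\nQuestion: {question}"

docs = [
    "Riders can schedule trips up to 30 days in advance.",
    "Drivers are paid weekly via direct deposit.",
    "Scheduled trips can be canceled without a fee.",
]
index = build_index(docs)
context = retrieve(index, docs, "How do drivers get paid?")
prompt = build_prompt("How do drivers get paid?", context)
```

Even this toy version hints at the operational burden: the index must be rebuilt or updated as documents change, which is where the real-time complexity discussed below comes from.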

Current state of LLM ecosystem.

This proliferation of patterns has created a significant cost for these product companies, a problem which market forces are likely to address as offerings evolve.

Transition to future state

Looking at the evolution of the LLM ecosystem, there are two questions that I believe will define the evolution of the space:

  1. Will LLM framework platforms for agents, RAG, and so on, remain bundled with model providers such as OpenAI and Anthropic? Or will they, instead, split with models and platforms being offered separately?
  2. Which elements of LLM frameworks will be productizable in the short-term? For example, running evals seems like a straightforward opportunity for bundling, as does providing some degree of agent support. Conversely, bundling RAG might seem straightforward, but most production use cases would require real-time updates, incurring the full complexity of operating scaled search clusters.
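To make the evals opportunity concrete, here is a minimal sketch of the kind of harness product companies currently maintain internally: run a fixed suite of cases against a model and report a pass rate, so regressions are visible when the underlying model changes. The `call_model` function is a hypothetical stand-in, not any real provider’s API.

```python
# Minimal eval harness sketch: each case pairs a prompt with a check
# function, and the harness reports the fraction of checks that pass.

def call_model(prompt):
    """Placeholder for a real model-provider call."""
    canned = {
        "Capital of France?": "Paris",
        "2 + 2?": "4",
    }
    return canned.get(prompt, "I don't know")

def run_evals(cases, model=call_model):
    """Each case is (prompt, check); check returns True on a pass."""
    results = [(prompt, check(model(prompt))) for prompt, check in cases]
    passed = sum(1 for _, ok in results if ok)
    return passed / len(results), results

cases = [
    ("Capital of France?", lambda out: "Paris" in out),
    ("2 + 2?", lambda out: "4" in out),
    ("Capital of Japan?", lambda out: "Tokyo" in out),
]
pass_rate, results = run_evals(cases)
```

Because the suite is independent of any one provider, swapping `call_model` for a different model and comparing pass rates is exactly the kind of commodity workflow a bundled platform could absorb.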

Depending on the answers to those questions, you might draw a very different map. This map answers the first question by imagining that LLM platforms will decouple from model providers, while also allowing you to license model access through that platform rather than individually negotiating with each model provider. It answers the second question by imagining that most non-RAG functionality will move into a bundled platform provider. Given the richness of investment in the current space, it seems safe to believe that every plausible combination will exist to some degree until the ecosystem eventually stabilizes in one dominant configuration.

Future state of LLM ecosystem.

The key drivers of this configuration are that the LLM ecosystem is inventing new patterns every year, and that companies are spinning up haphazard interim internal solutions to validate those patterns, but ultimately few product companies are able to effectively fund these sorts of internal solutions in the long run.

If this map is correct, then it means eventual headwinds for both model providers (who are inherently limited to providing their own subset of models) as well as narrow LLM platform providers (who can only service a subset of LLM patterns). The likely best bet for a product company in this future is to adopt the broadest LLM pattern platforms today, and to explicitly decouple pattern platform from model provider.

User & Value Chains

The LLM landscape is evolving rapidly, with some techniques getting introduced and reaching widespread adoption within a single calendar year. Sometimes those techniques are genuinely being adopted at scale, and other times it’s closer to “conference-talk driven development,” where folks with broad platforms inflate the maturity of industry adoption.

The three primary users attempting to navigate that dynamism are:

  1. Product Engineers are looking for faster, easier solutions for deploying LLMs across many evolving parameters: new models, support for agents, solutions to offload the search dimensions of Retrieval Augmented Generation (RAG), and so on.
  2. The Machine Learning Infrastructure team is responsible for the effective usage of these mechanisms, and for steering product developers towards effective adoption of these tools. They are also, in tandem with other infrastructure engineering teams, responsible for supporting common elements of LLM solutions, such as search indexes to power RAG implementations.
  3. Security and Compliance are responsible for ensuring that models are hosted safely and securely, that we’re only sending approved information to them, and that we stay in alignment with rapidly evolving AI risks and requirements.

To keep the map focused on evolution rather than organizational dynamics, I’ve consolidated a number of teams in slightly artificial ways, and omitted a few teams that are certainly worth considering. Finance needs to understand the cost and usage of LLMs. Security and Compliance are really different teams, with both overlapping and distinct requirements. Machine Learning Infrastructure could be split into two distinct teams with somewhat conflicting perspectives on who should own things like search infrastructure.

Depending on what you want to learn from the map, you might prefer to combine, split, or introduce a different set of teams than I’ve selected here.


Wardley mapping of Gitlab Strategy.

Gitlab is an integrated developer productivity, infrastructure operations, and security platform. This Wardley map explores the evolution of Gitlab’s users’ needs, as one component in understanding the company’s strategy. In particular, we look at how Gitlab’s strategy of a bundled, all-in-one platform anchors on the belief that build and security tooling is moving from customization to commodity.


This is an exploratory, draft chapter for a book on engineering strategy that I’m brainstorming in #eng-strategy-book. As such, some of the links go to other draft chapters, both published drafts and very early, unpublished drafts.

Reading this document

To quickly understand the analysis within this Wardley Map, read it from top to bottom. If you want to understand how the map was written, read section by section from the bottom up, starting with Users, then Value Chains, and so on.

More detail on this structure in Refining strategy with Wardley Mapping.

How things work today

Today, managing builds, deploys, and security is a somewhat custom endeavor: the kind of work to which even small technology companies dedicate teams to keep operating smoothly.

Wardley map of developer productivity space.

The value chains across users are highly coupled: there is no value chain that doesn’t overlap across users. For example, debugging a failed build is important to both the developers and to the developer experience team. Similarly, understanding attribution of costs is essential to both the developer experience team and to the finance team.

Because of that bundling, teams that buy best-in-breed solutions rather than a bundled stack spend significant time integrating them together to work properly. It’s not uncommon for teams to spend a day a month on just the finance and developer experience integration. This sort of customization is unique to each company, but is rarely the company’s special sauce. Rather, it’s the result of poor interoperability between many tools in the people systems and developer systems space.

Transition to future state

It’s fairly clear that more and more components of this map are shifting from custom to product. Gitlab has a clear point of view on this ecosystem standardizing, evolving up from custom implementations and toward products and commoditization.

Wardley map of developer productivity space.

These shifts will bring an increasingly large number of companies into Gitlab’s addressable market, including annoying but low-value problems like storing build and deploy logs for future access. Most markets vacillate between pursuing “best of breed” (you buy a number of specialized vendors) and “all-in-one” (you buy one comprehensive, highly integrated solution).

Gitlab has placed a clear bet on being an all-in-one solution by solving for both the traditional developer and developer experience users as well as the security user. This appears to reflect a belief that security tooling is quickly moving towards becoming a commodity solution, an interesting view, and one whose validity we’ll see negotiated in the market as Gitlab competes with companies like Wiz and Snyk for market share.

User & Value Chains

Gitlab describes itself as the “most comprehensive AI-powered DevSecOps platform.” This is a broad ambition, and consequently there are quite a few users for the platform. For this mapping exercise, we are going to focus on four users:

  1. Developers at the company. The product and infrastructure engineers who are using the Gitlab platform as a tool within their workflows. These are the developers responsible for creating and running the company’s product.

    The value chains they’re focused on are deploying software, debugging failed deploys, and optimizing the speed at which builds and deploys occur. Underneath those needs are a number of infrastructure components performing the actual deploy, collecting logs for debugging, and so on.

  2. Developer Experience who are responsible for selecting, onboarding and operating the deployment infrastructure in the company. More broadly, this team is responsible for the overall productivity of the company’s developers.

    They don’t have any value chain that is unique to them, but they are tightly involved in every other user’s value chains. This gives them a uniquely broad view of the map. Further, the developer experience team is generally the expert on each value chain, having the deepest view.

  3. Security & Compliance who maintain the security infrastructure and compliance postures for your company. They require vulnerability scanning to detect supply chain security attacks, as well as identifying common issues in developed software such as the OWASP Top Ten.

    The value chain they’re focused on is software vulnerability scanning, which in turn depends on a database of package vulnerabilities and a scanner for detecting those packages and other common vulnerabilities.

  4. Finance who monitor the cost and usage of your platform. They’re most focused on the projection and attribution of costs represented by the platform. For example, they would want to model the infrastructure costs of hiring an additional 50 product engineers in terms of the additional builds, deploys, and so on they would consume.

    The value chain they’re focused on is understanding attribution and usage, which in turn relies on an ownership graph mapping each piece of software (and each build, and each test run, and each security issue, etc) to a concrete team within the company.
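To illustrate what that ownership graph buys Finance, here is a minimal sketch of cost attribution over it. The artifact and team names are invented for illustration; a real graph would also cover builds, test runs, and security issues, as described above.

```python
# Minimal sketch of cost attribution over an ownership graph: map each
# artifact to an owning team, then roll platform usage up per team.
# Artifact and team names are illustrative.

OWNERS = {
    "rider-app": "mobile",
    "pricing-service": "marketplace",
    "build-pipeline": "developer-experience",
}

def attribute_costs(usage, owners=OWNERS):
    """usage: list of (artifact, cost). Returns total cost per team;
    artifacts missing from the graph land in an 'unowned' bucket."""
    totals = {}
    for artifact, cost in usage:
        team = owners.get(artifact, "unowned")
        totals[team] = totals.get(team, 0.0) + cost
    return totals

usage = [
    ("rider-app", 120.0),
    ("pricing-service", 80.0),
    ("rider-app", 30.0),
    ("legacy-job", 10.0),
]
totals = attribute_costs(usage)
```

The “unowned” bucket is the practical pain point: attribution is only as good as the graph’s coverage, which is why maintaining that mapping is itself part of the value chain.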

There are more users we could dig into, but these are the four most important customers in evaluating Gitlab’s strategic approach.


That's all for now! Hope to hear your thoughts on Twitter at @lethain!

