In this issue:
- Fertilizer and Toxins
- Margins
- A Better, Costlier Alexa
- Elasticity
- Counterparty Risk, Served Two Ways
- The Universal Denominator
Fertilizer and Toxins
"Ecosystem" is a really wonderful way to describe sectors of the economy. It's one of those metaphors that gets better the more you overthink it. An actual ecosystem describes a set of relationships that organize how matter and energy circulate in some area—the actual energy and matter required to produce or sustain an acre of rainforest or an acre of the Gobi desert. In both cases, there's rocky sediment and a constant flow of energy from the sun, but in the former case, that solar energy is doing something more interesting than warming up sand.
Thinking of a market as an ecosystem means thinking about which inputs it needs, and which entities' outputs are someone else's inputs. Nature has clearly engaged McKinsey many times, and loves complex supply chains in which every possible process is outsourced, so you have crazy chains of symbiotes designed to convert raw materials and sunlight into useful proteins. And natural ecosystems usually operate in an econ 101 environment of perfect competition, where just about everything is either at its maximum extent or fluctuating around it long-term.
But ecosystems aren't static. They're a convenient way to isolate some set of living things and pay attention to their interactions, but the global set of inputs can change. There are two great examples of this, offset by two billion years, which analogize nicely to two kinds of technology transitions. In reverse-chronological order, one of these was the discovery of the Haber-Bosch process, which converts atmospheric nitrogen, hydrogen, and energy into ammonia. It's on the short list of inventions to which you, statistically, owe your life; The Alchemy of Air estimates that without the artificial fertilizer this enables, the world's carrying capacity would be about four billion people, as long as we didn't eat meat and didn't eat much, period. This still fits within the extended-ecosystem model—the nitrogen and hydrogen exist, just not in an optimally useful form, and the energy-intensive process of making ammonia was and is fueled by solar power that's stored in the form of hydrocarbons. This process ends up being a dividend for anything calorie-intensive, and arguably bootstrapped our species from the relative poverty of the late 19th century to the modern, materially-abundant world.
There are darker instances of wholesale ecological change following the creation of novel inputs. For example, there was that time cyanobacteria developed photosynthesis, started emitting oxygen as a waste product, and possibly killed most life on earth, since oxygen was directly or indirectly toxic to so many of the species that flourished up to that point. Now, a more oxygenated atmosphere is great for us, of course, but we're the descendants of whoever tried to rebuild in the world's first known post-apocalyptic hellscape.
It's a useful framing to divide technologies into Haber-Bosch-style ones, which mostly make the rest of the economy function better, and Great Oxygenation Event technologies, which rapidly kill off any company that doesn't directly adopt or adapt to the technology in question.
Air conditioning, for example, is mostly a Haber-Bosch-style tool: it was a competitive advantage to the handful of retailers who initially adopted it, then it became a necessity. There's a wonderful study that compared transcripts of legislative debates to show that speeches start to use simpler words when it's hot out. (Yes, they control for average temperature by month, so it's not just picking up on legislators phoning it in when they'd rather be on vacation.) AC ends up being a general thought-dividend, whose upside is diffuse enough that it's mostly captured by people who bought real estate in Singapore, Dubai, Atlanta, etc.
Factory electrification was closer to a Great Oxygenation Event. It changed where factories should be located (cheap land within commuting distance mattered more than proximity to a river for power). It changed how they expanded, by reducing the minimum increment of growth. It made them more fault-tolerant at the level of individual processes, since these processes weren't all connected by belts and pulleys, but the resulting efficiency improvement led to organizational changes that made them more obsessed with controlling variance. And the change in the chunkiness of expansion also helped create the concept of a growth stock; if a company could profitably retain its earnings, and keep investing them in expanding what it already had, shareholders didn't have to insist on the highest possible dividend at all times. These changes took place over a generation, and they're invisible today because we don't have any of the mental cruft that a 19th century factory manager might have. But that's another way of framing how big a change it was: whole domains of expertise rendered obsolete!
There are smaller-scale versions of the Great Oxygenation Event worth discussing. Two with close parallels are Netflix's introduction of ad-supported plans and large airlines' use of Basic Economy. In both cases, the company that made the change was not the first, and was actually reacting to someone else—there were plenty of providers of ad-supported video and of cheap flights with separate charges for everything a flyer would need. In both cases, the market was roughly segmented between companies that offered a product to price-sensitive customers and companies that targeted everyone else. When the biggest everyone-else company starts aiming for those same price-sensitive customers, it means the industry has switched from a discontinuous pricing strategy to a continuous one. Netflix did not invent the idea of charging less money for a slightly crappier viewing experience, but they did eventually decide that they mostly wouldn't let competitors beat them on cost. And since Netflix was operating in a high fixed-cost space, that meant it was pricing some of its smaller competitors out of long-term viability. Similarly, the network carrier airlines were making it so that Spirit, Southwest, etc. did not enjoy an automatic price advantage flying the same routes, but were instead competing with United-but-worse, and dealing with the latter's larger scale.
This is an old idea; it's mentioned in the Unabomber manifesto, for example. But it's a powerful one, because it's a framework for deciding what to ignore. Some technology transitions are mostly rising tides that lift many boats, but don't have a direct impact on most of them; for industries that are generally levered to GDP growth, the main focus is on absorbing new demand rather than adopting the tools directly. In other cases, though, turnover within industries is driven by some companies asking how they'd have been designed if some new tool had been available all along, while others forget to ask this and get left behind. The contexts where AI has been adopted fastest are the ones where it's a direct substitute for existing jobs: creating a simple frontend, pulling a few specific facts from a long document, transpiling across programming languages or formats, creating forgettable stock photos. That's a massive dividend to people who have six-figure skills but spend some fraction of their day on tasks that can also be done for something close to minimum wage. Longer-term, though, there's an evolution; The Diff has already noted that LLM-based support means that it's a good idea to merge customer support teams with product teams—if a customer has a question that can't be answered with references to existing documentation or previous customer reports, it's really a feature request. There will be some organizational flattening, a higher ratio of documentation to code, and more reorganizations. (Mutable AI—disclosure, I'm an investor—sees the endpoint of all of this as creating a digital twin for every organization.) AI is an improvement in the accessibility of computation, but within companies, its main effect is an increase in their internal bandwidth: it's faster to transmit knowledge to where it needs to be, which makes companies more agile but also places a much higher premium on clarity and accuracy in internal communications. As with many other technologies, what starts in the lab or in the text editor ends up living forever in the org chart.
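To make that support-routing point concrete, here is a toy sketch of the decision rule. The DOCS corpus and the answer_from_docs helper are stand-ins for whatever retrieval-plus-LLM stack a real company would use; nothing here comes from an actual product.

```python
# Toy sketch only: DOCS and answer_from_docs are hypothetical stand-ins for a
# real documentation corpus and a retrieval-plus-LLM answering step.
from __future__ import annotations

DOCS = {
    "export": "Data can be exported as CSV from the Reports page.",
    "billing": "Invoices are generated on the 1st of each month.",
}

def answer_from_docs(question: str) -> str | None:
    """Return an answer only if existing documentation covers the question."""
    for topic, passage in DOCS.items():
        if topic in question.lower():
            return passage
    return None

def route(question: str) -> tuple[str, str]:
    answer = answer_from_docs(question)
    if answer is not None:
        # Covered by existing docs or prior reports: ordinary support.
        return "support", answer
    # Not covered anywhere: per the argument above, it's really a feature
    # request, so it goes to the product team rather than a support queue.
    return "product", f"Feature request: {question}"

print(route("How do I export my data?"))   # handled as support
print(route("Can you add a dark mode?"))   # escalated to product
```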
Diff Jobs
Companies in the Diff network are actively looking for talent. See a sampling of current open roles below:
- A mission-driven ed-tech startup is deploying AI tutors to revolutionize education. They have a strong customer base in Latin America and they're in need of a driven founding engineer. Key skills: TypeScript, React, Node.js, Postgres. (Remote, asynchronous)
- A startup building the world's most performant parallel-EVM is looking for a low-level engineer with C++/Rust/CUDA experience. (Remote, EU preferred)
- A blockchain-focused research and consulting firm is looking for an infrastructure engineer to secure their clients’ networks. Deep experience in DevOps, Linux systems, and Infrastructure-as-Code required; previous crypto experience preferred. (Remote)
- A company building the new pension of the 21st century and universal basic capital is looking for a senior frontend engineer. (NYC)
- A private credit fund denominated in Bitcoin needs a credit analyst that can negotiate derivatives pricing. Experience with low-risk crypto lending preferred (i.e. to large miners, prop-trading firms in safe jurisdictions). (Remote)
- A newsletter, recruiting service, and early-stage venture investor—The Diff, if you must know—is looking for an associate. Python experience and the ability to juggle multiple projects are ideal. Interest in finance and tech required; experience in either/both a plus. (Remote, though Austin is a plus)
Even if you don't see an exact match for your skills and interests right now, we're happy to talk early so we can let you know if a good opportunity comes up. If you're at a company that's looking for talent, we should talk! Diff Jobs works with companies across fintech, hard tech, consumer software, enterprise software, and other areas—any company where finding unusually effective people is a top priority.
Elsewhere
Margins
The Verge has a really wonderful piece on Netflix's efforts to nail video encoding. Bandwidth is in the category of products that are basically free, until you start building a business that does best with unlimited bandwidth, at which point it becomes a binding constraint. There are benefits to scale beyond the fact that saving a fraction of a penny per stream means something at Netflix scale that it doesn't for smaller companies—a vertically-integrated company like Netflix can actually create its own twelve-minute test movie, complete with lots of different shots that do best with many different kinds of compression. The article cites 50% compression for 4K streams, but it's always a good idea to take such metrics with a grain of salt: the company's incentive is to release the numbers that 1) make them look good, and 2) would be either impossible or ruinously expensive to copy.
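As a purely illustrative back-of-envelope (every input below is an assumption, not a figure from the article or from Netflix), here is why fractions of a penny per viewing hour add up at that scale:

```python
# Illustrative only: all inputs are assumptions, not Netflix or Verge figures.
subscribers = 270e6          # assumed global subscriber count
hours_per_month = 30         # assumed viewing hours per subscriber per month
gb_per_hour = 3.0            # assumed data per hour of 4K video before re-encoding
compression_gain = 0.5       # the ~50% savings cited for 4K streams
cost_per_gb = 0.002          # assumed blended delivery cost, dollars per GB

gb_saved_per_month = subscribers * hours_per_month * gb_per_hour * compression_gain
annual_savings = gb_saved_per_month * cost_per_gb * 12
print(f"~${annual_savings / 1e6:,.0f}M per year under these assumptions")
```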
A Better, Costlier Alexa
One of the mysteries of Amazon is that they deployed tens of millions of smart speakers and did not turn this into a monopoly on the AI assistant business. One pretty decent reason is that an installed base that big, coupled with the marginal cost of inference, could make Alexa a giant liability if it were reimagined as the frontend for an AI tool. Amazon can advance that model in increments, and it will, by launching an enhanced Alexa with a $5-10 monthly subscription. Alexa is an annoying edge case where monetizing through purchases is incredibly tempting, but easy to get wrong given the non-deterministic nature of LLMs and the difficulty of getting voice recognition exactly right. So a safer model is to ship an improvement, put a price tag on it to gate usage, and then iterate on both the product and the business model from there.
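A hedged back-of-envelope on why the installed base cuts both ways (every number below is an assumption, not an Amazon disclosure): serving an LLM-backed Alexa to every device for free is a large recurring cost, while gating it behind even the low end of that subscription range keeps costs proportional to the users who pay.

```python
# Illustrative only: every input is an assumption, not an Amazon figure.
devices = 100e6            # assumed installed base of Alexa devices
queries_per_day = 10       # assumed voice queries per device per day
cost_per_query = 0.002     # assumed inference + speech cost per query, in dollars

# (a) Ship the better model to the whole base for free: pure cost.
free_cost = devices * queries_per_day * cost_per_query * 365
print(f"free to everyone: ~${free_cost / 1e9:.1f}B/year of inference cost")

# (b) Gate it behind a subscription: cost scales only with paying users.
paid_share = 0.05          # assumed uptake of the paid tier
monthly_price = 5          # low end of the reported $5-10 range
revenue = devices * paid_share * monthly_price * 12
paid_cost = devices * paid_share * queries_per_day * cost_per_query * 365
print(f"gated: ~${revenue / 1e6:.0f}M/year of revenue vs ~${paid_cost / 1e6:.0f}M/year of cost")
```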
Disclosure: Long AMZN.
Elasticity
Cities that set order fees or high minimum wages for delivery services have seen a drop in order volume—Uber orders in Seattle dropped 45% year-over-year ($, WSJ). What tends to happen is that the law mandates something about the company's cost structure, and the company has some flexibility to figure out where and how its pricing will reflect this. There are two competing incentives:
- Higher costs are an absolute disadvantage, but they're a relative advantage for the biggest companies, because those companies can get the highest utilization. There's potentially some optimal-for-Uber tax that prices smaller operators out entirely and means that a given city is a duopoly or monopoly.
- When one city passes such a law, it starts a negotiation between the delivery platforms and every other city they operate in. This gives them a short-term incentive to respond as clumsily as possible. There is still some revenue in providing ludicrously expensive food to office workers who have subsidized meal allowances, even while eliminating delivery as a slightly premium-priced occasional indulgence for everyone else. And doing that vastly improves the platforms' negotiating position: they can honestly tell other cities that a handful of well-off people will shrug at paying an extra $20 for sushi, and meanwhile a few hundred gig workers will have to find another job.
One limit on the second factor is that it's a competitive industry, and in a two-sided market you don't want to give your competitors a chance to get critical mass. But it's also really hundreds of mostly-separate two-sided markets, and losing share in one city might be a reasonable price for reducing the odds of a worse cost structure in every other city.
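For readers who want the textbook arithmetic behind the section title: price elasticity of demand is the percentage change in quantity divided by the percentage change in price. The volume drop below is the WSJ figure; the delivered-price increase is a made-up assumption, since the article doesn't report one, so the implied elasticity is illustrative rather than a real estimate:

```python
# Illustrative only: the price change is assumed, and the year-over-year drop
# bundles together more than just the fee.
volume_change = -0.45   # Seattle delivery order volume, year over year (WSJ)
price_change = 0.20     # assumed increase in the all-in delivered price

elasticity = volume_change / price_change
print(f"implied elasticity: {elasticity:.2f}")   # -2.25 under these assumptions
```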
Counterparty Risk, Served Two Ways
In late 2022, a number of crypto market participants discovered that some of the assets they thought they owned—various crypto tokens, stablecoins, and the like—were, in fact, credit instruments. They really owned a 0%-interest note, denominated in these assets, and issued by a crypto exchange that turned out to be insolvent. When FTX went under, a lively market in bankruptcy claims sprang up, allowing some customers to get out with a small amount of their money left, and speculators to bet that FTX would recover. And, between a rally in crypto and appreciation in the value of FTX's stake in Anthropic, that happened, which led to another problem: sellers are backing out or trying to rearrange deals now that claims are trading at over 100 cents on the dollar ($, WSJ). Usually, discussions of market plumbing are pretty boring, because the vast majority of the time, what happens when you make a trade is exactly what you expect would happen. But the markets where this is true are markets that learned the hard way that it's important for deals to close in a timely fashion, and very important for both sides to know where they stand and when they're out of options to change their minds.
The Universal Denominator
The dollar is outperforming emerging market currencies this year ($, FT), because the US economy refuses to slow down enough to justify rate cuts, while emerging-market economies have slowed enough for their central banks to start cutting. The US's macro role is usually to stabilize global demand while being a conduit for magnifying financial instability elsewhere. That stabilizing works both ways: when the US economy is outperforming, it keeps US rates relatively high, and that causes the dollar to appreciate, which makes exporting to the US a better deal.