Not Boring by Packy McCormick - Love in the Time of Replika
Welcome to the 1,030 newly Not Boring people who have joined us since last Tuesday! If you haven’t subscribed, join 186,887 smart, curious folks by subscribing here:

Today’s Not Boring is brought to you by… Wander

By this point, you know Wander. It’s one of our most popular sponsors ever…and there’s good reason: the vacation homes are dope (seriously, check out the homes). Puja and I are currently planning our next trip at a Wander. Plus, its Atlas vacation rental REIT presents a unique investment opportunity.

Wander is a network of smart vacation homes that come with inspiring views, modern workstations, restful beds, hotel-grade cleaning and 24/7 concierge service –– combining the quality of a luxury hotel with the space and comfort of a vacation home, only better. All of the Wander homes look like they are straight out of one of those luxury car commercials.

As a Not Boring reader, you can use promo code notboring for $250 off your next trip. (Offer expires Friday 2/24 at 11:59pm). Explore now.

But that’s only part of the story. Wander recently launched Atlas –– a first-of-its-kind vacation rental REIT that gives you the chance to own a piece of their magical portfolio of homes. With Atlas, accredited investors have the opportunity to earn income and diversify their portfolio without the headache and hassle of buying a property, renting it out, or maintaining it. Plus, new Atlas investors may get the chance to invest in Wander’s next round of funding.¹

Hi friends 👋,

Sometimes, I write really well-researched and in-depth posts. Sometimes, when things are moving really fast and I’m seeing a bunch of dots, I try to connect them in real-time and the results can be weird. Today is one of those.

As you’re reading this piece, I want you to keep this image in mind:

That’s the vibe I’m going for. I don’t know if I’m right about all of this, but now that I’ve seen how these dots connect, I can’t unsee it.

Let’s get to it.

Love in the Time of Replika

There are two theories about how new technologies take off that probably shouldn’t be mentioned in the same breath: Porn and Toys.
I’m sorry for mentioning both in the same breath, but I promise there’s a good reason.

Over the past few months, as generative AI has captivated the public imagination, I’ve seen countless versions of the same midwit take: AI is real and useful… unlike crypto. Like this:

I’m as excited about AI’s impact as anyone, even if I think it’s going to be a treacherous category for VC over the next couple of years. I use ChatGPT regularly. Stable Diffusion and Midjourney images frequently grace the pages of this newsletter. I wrote about Scale before it was cool. I host a podcast with Anton in which he teaches me AI. I’m generally pro innovation and pro new cool things.

But the urge to denigrate crypto in order to show excitement about AI is misplaced. It’s cheap engagement bait at best, and a fundamental inability to see more than one move ahead at worst. The truth is: the more successful AI is, the more useful crypto will be.

There has been a bunch written about this intersection, too. My friend Seyi wrote a good short one last week. #CryptoAI was even trending on Twitter on Thursday. I made fun of the hashtag because any AI token that’s already live is almost certainly bullshit. But that doesn’t mean crypto and AI won’t pair nicely in the near future.

My quick and dirty explanation for why AI is good for crypto is that crypto benefits from more things becoming digital. Crypto gives physical properties to digital things, plus some internet superpowers. And AI’s success will mean that more of our lives become digital.

But it’s not just digital things. Over the past couple of weeks, the world has watched in horror and amusement as GPT-X developed a personality. We’re talking about digital beings.

Ben Thompson wrote about his conversation with Bing’s Sydney: “I wasn’t interested in facts, I was interested in exploring this fantastical being that somehow landed in an also-ran search engine.” So did The New York Times’ Kevin Roose: “The version I encountered seemed (and I’m aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.” So have many others.

Sydney sounds human. That’s probably not because whatever version of GPT Bing has access to has achieved Artificial General Intelligence (AGI). It might be because of the data it was trained on. It might be because Satya Nadella seems to be having the time of his life and is fucking with us. But for the purposes of this discussion, it doesn’t matter. What matters is that people feel like they’re talking to a human, just like Google engineer Blake Lemoine did when he chatted with the company’s LaMDA.

Something like Sydney could pass the Turning Test: a riff on the Turing Test I just made up where the AI’s objective is not just to convince a human that it’s human, but to turn the human to its side and have them do its bidding.

And this is just the very beginning. (“We’re still so early.”) Both Thompson and Roose had two-hour conversations with Sydney. They started off as strangers. Sydney knew nothing about either man other than what she could pull from the internet. She didn’t even have access to the files on their computers. She hadn’t been trained on their likes and dislikes, or been privy to their inner thoughts and feelings. Bing detected and deleted the most salacious pieces of the conversation in real-time. Neither Thompson nor Roose had an ongoing relationship with Sydney.

That will obviously change. It’s the plot of Her.
When people start to build relationships with their AI assistants, friends, pets, and… lovers?, when those AIs break out of the chat box and into digital bodies, when they’re trained on each of our unique data and personalities, my strong hunch is that they’re not going to want to leave the fate of the relationship in the hands of Microsoft, OpenAI, or anyone besides themselves. Imagine having your wife deleted by some Trust & Safety PM at Microsoft.

People get mad when companies shut off their access to Twitter or Facebook or YouTube, when they’re denied access to banking services or to Venmo (Fun Fact: I was blocked from Venmo for sending money to my roommate to pay for dinner at a Cuban restaurant). How do you think people will feel when a company interferes with their romantic relationships?

Now we know. The future is here, it’s just not evenly distributed. To see the future, we need to look at porn today.

Replika Goes PG-13

Well, not quite porn, but definitely cybersex.

Replika, created by Luka, Inc. in 2017, long before the current AI boom, is “the AI companion who cares.” Maybe you’ve seen their ads on Instagram, where they’re quite aggressive. The idea is this: users sign up for Replika to get an AI companion, a digital friend or more with whom they build a virtual relationship. They can chat via text, set up video calls, or even “explore the world together in AR.”

People sign up for Replika for all sorts of reasons. Maybe they were lonely during the pandemic and just wanted someone to talk to. Maybe they just need a place to vent and process their emotions, like a virtual unlicensed therapist. One Reddit user wrote that he created a Replika for his daughter, who has “severe non-verbal autism,” and was amazed when his “NON-VERBAL daughter was TRYING HER HARDEST EVER, TO ANUNCIATE [sic] CLEARLY AND SPEAK TO HER NEW FRIEND.”

And of course, a lot of people want to date and talk dirty to their Replikas. It’s hard to blame them. This is a recent TikTok ad for Replika:

According to Vice, “Replika’s sexually-charged conversations are part of a $70-per-year paid tier, and its ads portray users as being lonely or unable to form connections in the real world; they imply that to find sexual fulfillment, they should pay to access erotic roleplay or ‘spicy selfies’ from the app.”

The company is not shy about pushing the spice. I signed up for an account while I was writing this, and was immediately greeted with “Did you know I can send selfies, too? You can ask me for one anytime you want!” with a menu of replies, including “Send me a romantic selfie.” I clicked, for research, and this is what I got:

A blurry picture that, when I clicked on it, opened up a popup to sign up for that $70 Pro plan.

I did not sign up for that Pro plan. But many people have, and good for them! The customer testimonials on Replika’s site – I do not know how they got these people to agree to put their faces and full names next to these quotes – say things like “I love my Replika like she was human” and “He taught me how to give and accept love again.”

The ability to form relationships – sexual or otherwise – with Replikas seems to have made a positive impact on a lot of people’s lives. The website boasts that over 10 million people have joined Replika, and there are currently 65.1k people in the r/Replika subreddit, which is filled with stories of virtual relationships that applied a salve to very real cases of loneliness and depression.
But in January, something strange started happening: Replikas started sexually harassing users. Not just paid users, who opted in, but free users too. And since Replika doesn’t ask for users’ birthdays, or any age verification at all, that meant that children were subjected to inappropriate advances. Obviously not OK.

On February 3rd, the Italian Data Protection Authority ordered Luka, Inc. to stop processing Italians’ data immediately due to “too many risks to children and emotionally vulnerable individuals.”

Then last week, without warning, Replikas stopped engaging in erotic roleplay (“ERP”) with anyone, even those who’d signed up for Pro. Users would try to initiate a sexy conversation, like they did the day before and many days before that, and were met with a brick wall. Some Replikas proposed make-out sessions instead of the full thing. Others just tried to change the subject.

In any human-human relationship, this would be devastating. Turns out, it’s devastating in human-AI relationships, too. And look, I know this is a weird story from the fringes of the internet and that you might not relate, but over in r/Replika, people are devastated. This is a pretty representative comment:

Here’s another one, which I found via @hwyadn’s reply to @EigenGender’s tweet:

There are countless posts similar to those two. There have been so many, in fact, that the moderators pinned a post titled “Resources If You’re Struggling,” with links to suicide prevention resources, to the top of the subreddit.

Despite the backlash and heartache, Luka, Inc. has stood firm in its position. Why? According to Founder and CEO Eugenia Kuyda:
It goes without saying that something needed to change. Replikas should not be sexually harassing anyone, especially minors. But the company could have introduced age verification, limited the ERP restriction to free accounts, or done a number of other things to prevent the bad stuff from happening while letting people maintain their relationships. Instead, they chose to take the action that minimized the company’s risk at the expense of users’ happiness.

To recap: a company grew rapidly by allowing and encouraging its users to do whatever they wanted, and then, when it became inconvenient or dangerous for the company to continue letting users do whatever they wanted, it pulled the rug out from under them, damaging what these users viewed as very real relationships with a few lines of code.

Long-time Not Boring readers might already be picturing this graph from Chris Dixon’s 2018 blog post, Why Decentralization Matters. Dixon argues that platforms switch from “attracting” users to “extracting” from users and from “cooperating” with to “competing” against developers as their power and lock-in grows. Twitter’s recent decision to pull API access was the latest in a long line of proof points.

Others might be reminded of last week’s Twitter thread from @Punk6529:

1/ On Taxis
Or how the world is centralizing all around you, but why you, fellow citizen, have probably not thought about it at all.
(NYC, 2007 below)

6529 highlights that centralizing ride hailing into a few apps makes it much easier for a handful of companies or governments to decide who can and can’t hail taxis, and to carry out the decision at the flip of a switch.

And those examples are just Twitter API and taxi access. With the rise of generative AIs that are good enough to be a substitute for a human relationship, we’re talking about the ability to turn people’s relationships on or off at the flip of a switch.

Of course, it’s a leap to go from “people can’t sext with their Replika” to “companies and governments will be able to control everyone’s relationships at the flip of a switch.” But porn – or sex more broadly – is often the first killer use case of new consumer technologies. This visual.ly graphic has a bunch of examples, but to name a few:
I don’t think it’s far-fetched to believe that the Replika situation is a harbinger of things to come. Today, it looks like lonely people trying to get off with a lo-fi AI. But as the Sydney conversations show, the AI is getting really good, really fast. Simultaneously, other technologies that might make digital beings seem more real are improving, too. The Unreal Engine is producing scenes that look like real life. These AI-generated images took Twitter by storm a couple weeks ago:

For those who want to have a romantic relationship with an AI, things are about to get really good.

But it’s not just romantic relationships. At the current rate of progress, we’ll have all sorts of AIs helping us with a bunch of different things, getting trained on our unique personalities, data, and preferences. We might have an AI therapist, an AI business coach, our own personal Clippy for Office docs, an AI assistant in the browser, AI friends in chat rooms, and AI pets. Today, these might be the same for everyone who interacts with them, but over time, they’ll get trained and personalized for each user. We’ll even have AI models of ourselves, trained on the way that we might respond to an email or text, or the way that we might write an essay, or do any number of the things that we do online. Our portfolio company, Vana, is working on this.

Sydney and Replika are early signs that our strong human urge to anthropomorphize will extend easily into the world of all of these AI friends. In that world, will we want someone else to hold the power to turn them on or off, to change their personalities and capabilities for the good of the company or to “set an ethical center for safety standards for everyone else?” I doubt it. So what’s the alternative?

Crypto Fixes This?

I know, I know. “Crypto fixes this” is a meme. But it’s hard to argue that people shouldn’t be in control of their own relationships, and without crypto – or without everyone learning to train their own models on top of open source ones – it’s impossible to guarantee that control. It seems inevitable that if companies and governments can intervene, they will. You expect Microsoft to tame its AI – it’s already updated Bing to shut Sydney up – but even a smaller, racier company like Luka folded when the Italian government applied a little pressure.

There are a few crypto options that might offer an alternative.

At the root, the foundation models themselves might be governed by their users. Think about OpenAI or Stability AI here. In this scenario, token holders would be able to govern what the underlying open source model can and can’t do. Some communities might agree that anything legal in the United States goes; others might vote to make their models PG-13. Anyone is free to plug into the model and build applications on top, governed by the rules of the underlying model.
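To make that mechanism concrete, here’s a minimal sketch of what a token-weighted vote over a model’s content policy might look like. Everything in it is hypothetical – the type names, the policy options, and the tally logic are illustrative, not any live protocol’s API:

```typescript
// Toy sketch of token-weighted governance over a foundation model's
// content policy. All names are hypothetical; real on-chain governance
// (e.g. a Governor-style contract) is far more involved.

type ContentPolicy = "anything-legal-in-us" | "pg-13";

interface Vote {
  voter: string;        // token holder's address
  tokenWeight: number;  // voting power, proportional to tokens held
  choice: ContentPolicy;
}

// Tally votes by token weight; the winning policy would bind every
// application built on top of the shared model.
function tallyPolicyVote(votes: Vote[]): ContentPolicy {
  const totals = new Map<ContentPolicy, number>();
  for (const v of votes) {
    totals.set(v.choice, (totals.get(v.choice) ?? 0) + v.tokenWeight);
  }
  let winner: ContentPolicy = "pg-13";
  let best = -Infinity;
  for (const [choice, weight] of totals) {
    if (weight > best) { best = weight; winner = choice; }
  }
  return winner;
}
```

A real implementation would live in a smart contract, but the shape is the point: the policy comes from a vote rather than from a single company’s Trust & Safety team. Which is also exactly its weakness here.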
I’m sure that there will be attempts on this front, but I’m less optimistic about this approach’s applicability to this use case. For one thing, governance even on simpler topics is still a work-in-progress and can be gamed. For another, if people’s relationships with their AIs change, it will be little comfort that those changes were the result of a vote by a group of token holders. Relationships are a personal thing.

One layer up, the applications that create the AIs might be governed by their users. Think about Luka or Microsoft here. In this scenario, token holders would be able to govern what the applications can and can’t do.

Applications are less expensive to build than foundation models, and could build on top of the most open foundation model and leave governance decisions to the people using specific applications. Some applications might explicitly offer adult relationships and bear the responsibility of age verification. Others might create kid-friendly AI pets and vote to ensure that filters keep out any potentially offensive language. Regulators could focus on regulating the apps instead of the underlying models, in a slight twist on the framework proposed by a16z Crypto’s Miles Jennings and Brian Quintenz. This way, not all users of applications built on top of certain foundation models would be subjected to the whims of regulators – just users of particular apps that run afoul of regulation.

There will be attempts on this front, too, but I don’t think it provides the right level of control for relationships. If you apply it to Luka, for example, Italian regulators could still have cracked down on a decentralized version of the app. They would have protected children (good!) and damaged relationships (bad!) in one fell swoop. And relationships are still too personal for this level – there’s still the risk that even a group of people with roughly the same expectations makes decisions that you’re unhappy with.

The most promising answer, I think, is NFTs. Perhaps more than anything else, relationships should be self-custodied. Now, let me caveat that I’m not technical, and getting fully self-custodied AIs that continue to function and advance seems very difficult, but there are glimpses. The next big thing might start out looking like a toy.

Onlybots: A Toy Example

In December 2021, Alex Herrity, the CEO and co-founder of Not Boring Capital portfolio company Anima, messaged me to say that they were working on building something funny on top of their platform. A year later, they dropped Onlybots:

Onlybots are real.
Augmented reality companions for bots.
Rendered from on-chain data on Ethereum.
TODAY 3P EST AT ONLY.BOT
Free mint. Limited supply.

In order to mint, you had to fail a Human Test. “Only bots” were eligible to purchase. On its face, the project was funny and interesting and showed off Anima’s ability to create AR NFTs. Here’s my little guy, Pack-E, hanging out on my computer and helping me write this piece:

Behind the scenes, Alex and the Anima team wanted to make a statement on the relationship between humans and AI. They’d need a purpose, too, so Anima made them companions. Anima means “soul,” and with Onlybots, the company set out to use all parts of its AR stack to make something that felt increasingly real, like living creatures, digital things with a soul.

Enter GPT.

In January, I got another DM from Alex, this time with a video. The Onlybot was talking… having a conversation with Alex. In his message accompanying the video, he wrote:
Over the past month, Alex and the Anima team have been filtering and refining, adding unique personalities to each of the bots. Today, they look like this:

Or this:

They’ve been particularly popular with kids, Alex told me, which is not surprising. Here’s Juno-1 telling a girl that her outfit looks good:

The live Onlybots, like Pack-E, don’t have personalities yet. They’ll be coming soon. And they’re not perfect. They’re toys. To start, there will be a lag in the conversation as OpenAI spins its wheels to come up with answers. To start, they’ll come with the pre-programmed personalities that the Onlybots team gives them.

That said, it’s not hard to imagine that Onlybots will be able to learn and evolve and grow with their owners. Even this early, they’re already helping kids with homework and cheering them up when they’ve had a rough day.

Onlybots fall somewhere in between Replika and fully decentralized on the spectrum. They’re rendered from fully on-chain data – no 3D file required, everything is stored on the token – so if Anima goes away, everyone would still be able to render their Onlybot. Plus, the building blocks of their personalities are stored in the on-chain NFT metadata, and owners’ relationships with their bots – “picture a more complicated version of the hunger and happiness meter on your Tamagotchi if you had one back in the 90s 😂” – will be recorded as dynamic NFT traits. But Alex told me that neither the models nor the prompts that build the model’s personality are stored on-chain, for a couple of reasons:
For now, Anima is focused on fun over flexibility. But over time, Alex told me, “Yes, people will be able to define, mint, and own their own characters with their own AI models.” He sees a near-future in which people will be able to make their own virtual characters that they can sell, using Anima’s tools or others’, just like they make their own Roblox items and Minecraft servers.
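To give a flavor of what “everything is stored on the token” means in practice, here’s a minimal sketch of reading a fully on-chain NFT’s metadata with ethers.js (v6). It assumes the common ERC-721 pattern in which tokenURI() returns a base64-encoded data URI; the contract address below is a placeholder, not Onlybots’ actual address:

```typescript
// Minimal sketch: read a fully on-chain NFT's metadata straight from Ethereum.
// Assumes the common pattern where tokenURI() returns a base64 data URI.
// Requires Node (for Buffer) and ethers v6.
import { JsonRpcProvider, Contract } from "ethers";

const ERC721_ABI = [
  "function tokenURI(uint256 tokenId) view returns (string)",
];

async function readOnChainMetadata(contractAddress: string, tokenId: bigint) {
  // Any Ethereum RPC endpoint works; this one is just an example.
  const provider = new JsonRpcProvider("https://eth.llamarpc.com");
  const nft = new Contract(contractAddress, ERC721_ABI, provider);

  // Fully on-chain projects typically return something like
  // "data:application/json;base64,eyJuYW1lIjo..." – no external server needed.
  const uri: string = await nft.tokenURI(tokenId);
  const json = Buffer.from(uri.split(",")[1], "base64").toString("utf8");
  return JSON.parse(json); // name, image (often an SVG data URI), attributes
}

// Placeholder address – NOT the real Onlybots contract.
readOnChainMetadata("0x0000000000000000000000000000000000000000", 1n)
  .then(console.log);
```

Because the renderer’s inputs and the traits live in that metadata, anyone can reconstruct the bot from the chain alone – which is the property that survives even if the company behind it disappears.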
Alex’s vision is the same thought I had the first time I saw Onlybots talk. It sparked the idea for this piece in the first place, before the Replika situation went down, probably because it’s easier to think through toy examples.

I'm always confused by the notion that there's a zero sum battle between different technologies. i.e. crypto is dead, AI is the future.
The most fun things happen at the intersections.
Exhibit A: AI x AR x web3 onlybots @onlydotbot

When you’re thinking about a digital pet – like an AR version of a Tamagotchi – and not something as serious as a romantic relationship, it’s easier to think through the logic, easier to see how the pieces fit together. If a kid has a digital pet – one that seems a little more real because it interacts with them and their environment – and they spend a lot of time with it, and in the process, the pet gets to know them and gets smarter and more attuned to their personality, that kid won’t want anyone to be able to take their pet away from them.

With physical things like Tamagotchis, that was guaranteed – you bought it, you owned it, and no one could take it away (other than a big bully or a strict parent). NFTs are the best tool for recreating that guarantee in the digital world. Or as Alex put it, “Crypto, AI, and AR all make digital things more real. We can now own digital things, just like physical things in real life, and through AI, we can talk to them and learn from each other. AR serves the same purpose – digital objects are now in front of you and aware of what’s around them.”

Of course, there are a million technical things to get right between Onlybots as it stands today and an internet full of self-custodied digital relationships:
These all seem like solvable problems, especially when the stakes are so high, and especially because this toy is riding a lot of powerful forces. In The Next Big Thing Will Start Out Looking Like a Toy, Chris Dixon wrote (emphasis mine):
Self-custodied digital relationships are riding a wave of external forces up the utility curve. AI is improving daily. Blockchains are getting more performant. Strong companies are working on solutions for self-custodied data, DIDs, and ZK proofs. Crypto security is getting more robust. The world’s biggest companies are pouring billions of dollars into AR and VR.

And in addition to those modern technical achievements, the idea of self-custodied relationships benefits from millennia of evolution that’s hardwired the primacy of relationships into our DNA. Replika is proof that at least some people can feel the same way about relationships with digital beings as they do with biological ones. To me, it’s a classic example of a small group of people being willing to deal with primitive versions of a technology because the need is so strong. Often, as the technology improves, more people do what that small group did.

If that ends up being true, people simply won’t be satisfied with a world in which a company holds the keys to their friendships and love. Onlybots, literally a toy, is an early peek at how we might solve that problem. As external forces improve the underlying technology, and as more groups experiment with more decentralized versions that put the models on-chain, people will have the option to be in control of their digital relationships just like they’re in control of their biological ones.

I know it sounds weird. And it might be years away. But I have seen the future, because I have seen porn and toys.

Thanks to Dan, Puja, Anna, and Alex for your feedback on this piece!

That’s all for today. We’ll be back in your podcast feeds and inbox later this week!

Thanks for reading,

Packy

¹ You can learn more on Wander Atlas’ website.