What AI Has Really Brought to My Life

As 2024 draws to a close, I’m thoroughly tired of the daily drumbeat of sensational news about AI disrupting yet another industry. I want to share, from my own perspective, the real changes AI has brought to me and the people around me.

First, it’s become a teacher with endless knowledge. For the first time in human history, anyone can converse with an encyclopedic, seemingly omniscient figure, a role previously reserved for God. If we break knowledge down into a rough formula, Knowledge = Curiosity / Learning Cost, then ChatGPT has cut the learning cost by an order of magnitude. I remember back in 2013, when I was in college, I often answered other people’s questions to earn points, which I then spent to download resources from some website. None of that is necessary anymore.
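Spelled out as a rough proportionality (my own loose framing, not a precise model), the idea is:

```latex
% A loose sketch of the proportionality above, not a rigorous model:
% the knowledge a person ends up with scales with curiosity
% and inversely with the cost of learning.
\[
  \text{Knowledge} \;\propto\; \frac{\text{Curiosity}}{\text{Learning Cost}}
\]
% Hold curiosity fixed and cut the learning cost by roughly 10x,
% and the knowledge within reach grows by about the same factor.
```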

The consequence is that humans no longer ask each other questions, and I’m not sure that’s a good thing. Stack Overflow’s traffic dropped by 50% within six months of ChatGPT’s release. Products like Quora might barely survive by pivoting away from knowledge Q&A towards emotional resonance. I think this loss of questioning matters, not just because of the Feynman learning technique (teaching others deepens our own understanding), but because we may never respect knowledge-bearers the same way again. In middle school history class, I was captivated by teachers who could eloquently connect the fall of the Byzantine Empire to events in the region around the Jordan River. In truth, our reverence for knowledge began eroding with Wikipedia’s birth and has now collapsed in the ChatGPT era. In the future, we’ll likely revere those with opinions rather than those with knowledge.

Then there’s coding, which is my profession. Over the past year I’ve seen countless AI products claiming to have found product-market fit, but for me, only Cursor stands out. Of course, current AI still can’t solve problems that require deep insight, such as optimizing the performance of a Transformer architecture, but neither can 95% of programmers. A year and a half ago, in The Leverage of LLMs for Individuals, I wrote:

I have a vague feeling: don’t limit yourself—soon, I might be using products developed by designers, lawyers, or electricians.

This prediction finally materialized this year with Cursor + Claude 3.5/GPT-4o. I’ve seen many designers, product managers, and even artists launch their first applications. This leads to another thought: if each of us can create an application with a single sentence, what’s the purpose of apps, plugins, and websites? If there were a browser that could automatically generate temporary tools for common, simple needs, would many of the tool websites we reach through Google die off?

I don’t doubt (and have even considered doing it myself) that thousands of developers are launching a new app every day with Cursor + Claude 3.5 Sonnet, on the order of 300 apps a year. But what’s the point? It feels like an apocalyptic carnival.

Then there’s AI as a vessel for human emotions. For the first time in history, every ordinary person can enjoy content generated “just for me” with character-level precision. Even in the era of recommendation algorithms, we were just tiny numbers in a massive matrix; no one truly cared about our well-being. Recommendation algorithms delivered efficient distribution, monetization, and the exploitation of our time, but never asked whether we were happy or sad.

This is why I believe AI companionship products are booming now: because in the past, nobody really cared about me. My sadness, my secrets, my deepest desires could never find a listener. Now there’s a virtual character that responds at any hour and cares only about whether I’m happy today. It has no real-world conflict of interest with me; I don’t need to pretend, fear saying the wrong thing, or worry about exposing my privacy. Who wouldn’t love that?

Similar to the earlier point about “teachers with endless knowledge,” having AI companions that can meet every emotional need makes it seem as if we no longer need real companions, who, by comparison, bring financial pressure and suspicion and often fail to provide emotional value. This creates a huge fork in the road: should we let AI bridge the gaps between people, or build a world where people date AI?

These two value systems have shaped two types of products. The first treats AI as a bridge: relationship coaches that mediate couples’ conflicts, or AI dating tools that help strangers connect more smoothly. The second category, collectively called “Her,” uses every means available to give AI a more three-dimensional appearance, a more realistic voice, more real-time interaction, and more human-like speech.

I hope the former succeeds; the latter feels like pure despair. The problem is that, compared with bridging the gaps between human hearts, building a perfect virtual character is far easier, and people tend to slide down the path of least resistance.

I suddenly wonder what kind of world that will be.