Artificial General Intelligence

Artificial general intelligence (AGI)—sometimes called human‑level intelligence AI—is a type of artificial intelligence that would match or surpass human capabilities across virtually all cognitive tasks.
— Wikipedia
Most researchers don’t believe we’ve reached AGI yet, while some do.. I’m not sure I quite care what they say. The AGI definition is so wishy-washy anyway.
But *something* has changed with LLMs. We’re moving from a time when humans were doing the work while using tools to, well, a time when the tools are doing the work.
No AIs were harmed in the making of this video game
I’m playing Hades 2, a masterpiece video game from Supergiant Games. At the end of their credits, they have the following interesting note:
This game was intentionally made by human beings. No generative AI was used in the creation of its artwork, voiceover, or any other content.
After doing some more research, I found there was a whole controversy around the actors’ union and the use of AI (authorized or not) for voice acting at the root of this statement, but I’ll leave that aside.
I want to instead talk about the use of “by human beings” in opposition to “generative AI.”
You see, there’s a lot that goes into creating a video game that’s not exactly a human being:
- For starters, you need computers!
- Musical instruments, microphones, mixers
- Software for drawing, painting, animating
- IDEs, engines, compilers, etc. for the game’s code
- Software for writing, revising, and editing the game’s script
The tools above aren’t just natural extensions of human beings: they’re real tools, powerful ones that let us do things human beings couldn’t do otherwise.
But with generative AI, the function of tools did change. We feel like we are doing the work when writing a script with a computer, even if it has auto-correct and auto-complete. But no matter how involved you were in building the LLM, it doesn’t feel like you wrote the sentence it generated.
The truth is, no amount of involvement in creating the LLM will make the work the AI produces feel more like yours. It feels like someone else’s.
That writing isn’t yours ..
.. it’s the AI’s.
Artificial GENERATIVE Intelligence
We talk a lot about agents, and how agents are AI because they “act” in the world.
But I’m starting to think the shift to AGI (and AI) wasn’t really through the word “agent” but rather through the word “generative.”
You see, one odd characteristic of LLMs is that they create stuff — pretty much whatever we want them to create. And we don’t need them to do stuff to prove their intelligence.. they just need to demonstrate that they can.
And this demonstration of “intelligence” is mostly happening through generation: the autonomous creation of “stuff.”
Yes, without AI, the computer still runs code, or plays music, or draws art.. but it still feels like we’re the ones generating the signals, be it the keystrokes on the keyboard, the guitar pick striking the strings, or the Apple Pencil moving on the iPad.
But when we add the level of indirection of the LLM, we’re no longer generating those signals. Playing a song on the guitar just feels different from telling an LLM to “play a song with a guitar sound”.. even if the song played were ultimately identical.
In the end, while the original definition of AGI is about how capable it is, I think the real question that matters is: who’s doing the work?
And we finally created tools that do the work *for* us, instead of *with* us.
Your success is my success
The prevalence of generated intellectual work seems as inevitable as the generated mechanical work ushered in by the industrial revolution. The economics are similarly skewed toward the machines by orders of magnitude.
While the process for generating content can differ, in most market economies it is outcomes, not processes, that get evaluated. In the market, people generally don’t care as much about the how as they do about the what.
Now, of course there are exceptions. Companies like Patagonia have carved out a niche by being deliberate about their values and processes, and I suspect such niches will exist in the AGI era too, just as some artisans kept their jobs despite the textile machines.
But despite the niches, market forces are such that it’s very likely all intellectual work will end up generated by AI rather than by humans.
In such a world, despite what copyright and other intellectual property rights say, will it feel like *your* work?
If ChatGPT wrote this article based on my prompts, and I posted it on my blog, you might still think this is my post.
.. but would I?
What to do in a world of AI-generated content
I don’t know. One thing I did do was purchase a 2017 set of the Encyclopaedia Britannica.. just to have a reference for what human-generated content looked like before AI.

Dui’s pre-AI Britannica collection
Not that human content is necessarily better (or worse) than AI content.. it’s just different. But I don’t think there’s any way to find content on the Internet that hasn’t been touched by AI anymore, for example. So that’s my alternative.
Like these encyclopedias, books, paintings, songs, and writing can be completely free of AI, assuming they’re guaranteed to be from before 2020 or so. But remasters and other improvements to digital versions will surely involve AI, given its superior “restoration” capabilities, and verifying that those weren’t touched will only get harder in a world of digital assets we don’t actually own, like streamed music, videos, and ebooks.
That’s really all there is to it.
Is there some way, though, to ensure new content is “made by human beings,” as Supergiant says.. free of generative AI?
.. No.
The age of things made by human beings is over.