
If you've ever seen Elvis' face on a piece of toast, it was probably just pareidolia, the tendency to see meaningful patterns in random stimuli. Lately, I've been experiencing acute pareidolia. Three passions of mine—physics, machine learning, and Minecraft—have collided in an undeniable way.

Not long ago, I found myself at a Warriors game with a founder, talking intently about physics (in this case, the fine-structure constant). It’s not the first time this has happened. The thinking that governs physics—a curiosity about how and why things work—underpins all innovation, and it goes without saying that a founder with a good idea has probably assessed how things work and decided on a better way.

Many have noticed, myself included, that there's a robust and invisible path from Physicsland to MLworld. The endeavor of AGI requires deep, first-principles thinking with an equal dose of appreciation for emergent phenomena. Moreover, machine learning represents a greenfield of complex problems to solve, and the math and logic of physics port easily.

A few examples and coincidences:

  1. DeepMind optimized nuclear fusion reactions by using reinforcement learning to stabilize plasma, a task my physics professor once described as “holding jello together with rubber bands.”
  2. Alexander Alemi, PhD physicist and Sr. Research Scientist at Google, co-authored a seminal AI paper, Inception-v4 (which is now cited over 16K times). He had been an active contributor to Physics Stack Exchange, where he once wrote about computing the mass of an arbitrary coin based on the sound it made upon landing on his table (a fantastic read). 
  3. Following in the footsteps of his grandfather, Paul Dirac, who won a Nobel Prize for his work on quantum mechanics, Leo Dirac studied physics and became an AWS engineering lead for what would become Sagemaker. Today, he is the CEO of Groundlight, an ML startup.

Companies and great thinkers are using the constructs of the physical world and applying them to intelligence. Yet learning and thinking have no physical limit, artificial or not. Enter: Minecraft.

It turns out that Minecraft, a sandbox game I've played for more than a decade, is the perfect test bed for AI agents, something I wrote about last April. Last fall, while watching machine learning expert Károly Zsolnai-Fehér break down Nvidia's Voyager paper, my jaw dropped. The video explained how researchers created an agent in Minecraft that "continuously explores the world, acquires diverse skills, and makes novel discoveries without human intervention." The machine was able to grasp the tasks you'd expect, like collecting resources or avoiding dangers, but Nvidia's model was also able to learn how to complete more complicated and coordination-dependent aspects of the game, such as building a base, hunting, or fighting. It even learned to defeat one of the deadliest mobs in the game, the Enderman. Simply put, Nvidia's automated agent became very good, very quickly.

You might say: "So what? Computers master video games all the time." The catch was that Nvidia's agent was based on an LLM. The same framework that enables OpenAI's ChatGPT to answer questions in a text world spawned a highly skilled player in a three-dimensional "real life" world. The agent learned how many of us learn: not by picking up a book and reading the rules, but by building associations and reactions through many experiences—just like learning a language. The agent looked at certain aspects of the code—the name of a resource or a threat—and quickly learned the "most likely best action" the same way a person learns the "most likely best word" when speaking. Recently, Altera took this to the next level by building a Minecraft bot that acts and chats like a human player.

At the risk of over-pareidolia, I'll leave this parting thought. The best ideas—theories, technologies, businesses—tend to sprout at the intersection of disciplines. I'm excited to watch how these worlds will continue to create new ways of thinking and new sets of rules. If you've got one, get in touch, even if we get to talk about the fine-structure constant for a minute or two.