Over the last six months, we've seen LLMs permeate the enterprise. It started with employees using ChatGPT to write emails, answer questions, generate copy, and summarize articles. Then, out of nowhere, customer support teams started using LLMs to triage tickets and developers were hooked on GitHub Copilot. Early-stage startup founders got the memo: "LLMs are core to what we do," "we make tools that make building LLM apps easier," "we make tools that help those making tools that make building LLM apps easier," and so on.
Over the next six months, I expect we'll hear a lot more about "agents." This is the next level-up for AI, perhaps the one we'll look back on as having the most profound impact on our lives.
Matt Schlicht wrote a great piece that gets at the heart of what agents are: "Autonomous agents are programs, powered by AI, that when given an objective are able to create tasks for themselves, complete tasks, create new tasks, reprioritize their task list, complete the new top task, and loop until their objective is reached."
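To make that loop concrete, here is a minimal sketch in Python. The `llm()` helper is a hypothetical placeholder for any language-model call, and the prompts and step limit are illustrative, not drawn from any particular agent framework.

```python
# A minimal sketch of the agent loop described above. The llm() helper is
# hypothetical: it stands in for any call to a language model that returns text.

from collections import deque

def llm(prompt: str) -> str:
    """Placeholder for a real language-model call (e.g., an API request)."""
    raise NotImplementedError

def run_agent(objective: str, max_steps: int = 25) -> list[str]:
    tasks = deque([f"Devise a plan to achieve: {objective}"])
    results = []
    for _ in range(max_steps):
        if not tasks:
            break
        task = tasks.popleft()
        # 1. Complete the current top task.
        result = llm(f"Objective: {objective}\nTask: {task}\nComplete the task.")
        results.append(result)
        # 2. Create new tasks based on the result.
        new_tasks = llm(
            f"Objective: {objective}\nLast result: {result}\n"
            "List any new tasks needed, one per line (or reply 'DONE')."
        )
        if new_tasks.strip() == "DONE":
            break
        tasks.extend(t for t in new_tasks.splitlines() if t.strip())
        # 3. Reprioritize the remaining task list.
        reordered = llm(
            f"Objective: {objective}\nTasks:\n" + "\n".join(tasks) +
            "\nReturn these tasks reordered by priority, one per line."
        )
        tasks = deque(t for t in reordered.splitlines() if t.strip())
    return results
```

The essential point is the control flow, not the prompts: the model both executes the top task and rewrites its own task list each iteration, which is what distinguishes an agent from a single prompt-and-response call.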
Workflow automation isn't new, but the recent Generative AI boom is breathing new life into the idea. At a recent fireside chat, Inflection AI's co-founder, Mustafa Suleyman, remarked, "In the next few years, hundreds of millions of people will have intelligent companions that are with them 24/7. [They] will be your coach, therapist, negotiator, travel planner, teacher… We'll get used to the idea of saying, 'Hold on, let me ask my AI.'" Inflection AI has publicly announced $265M in funding, presumably to pursue this idea. Adept, a newly minted unicorn, recently closed a $350M financing round to build "AI teammates for everyone." LangChain, a popular open-source framework for building with LLMs, has already built integrations with agentic projects like BabyAGI, CAMEL, Generative Agents, AgentGPT, and AutoGPT. By the time you're reading this article, AutoGPT will have surpassed 100K stars on GitHub, making it the fastest-growing open-source repository ever. On top of this, according to TechCrunch, DeepMind, now Google DeepMind, "has explored an approach for teaching AI to…[complete] 'instruction-following' computer tasks, such as booking a flight." The agent race is on.
The winners of this race will put a fleet of agents in your pocket and on your desktop. Five prerequisites for getting this right include privacy preservation, on-device inference, agent-to-agent interaction, persistent memory, and quality controls.
- Privacy preservation is important because agents will likely have access to vast amounts of personal data.
- On-device inference is essential for scaling agents cost-effectively and keeping latency low. Also important here is ensuring agents retain a low memory footprint.
- Agent-to-agent interaction is crucial for creating a cohesive and efficient network of agents. If someone builds an agent for calendar actions, for example, it should interact seamlessly with someone else's calendar agent, even when that agent is operated by a different party.
- Persistent memory is critical for agents to be useful: they should improve over time and take increasingly personalized, sometimes proactive, actions.
- Quality controls, like hallucination guardrails (see Guardrails AI), ensure that agents don't generate false or misleading information; a toy sketch of such a check follows this list.
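As a rough illustration of how the last two prerequisites might fit together, here is a hypothetical sketch of a single agent step that reads from and writes to a persistent memory store and applies a simple output check before acting. The names (`MemoryStore`, `validate_output`, `llm`) are placeholders of ours, not the API of Guardrails AI or any other library.

```python
# Hypothetical sketch of a single guardrailed agent step with persistent memory.
# All names here are illustrative placeholders, not any real library's API.

import json
from pathlib import Path

class MemoryStore:
    """Toy persistent memory: appends interactions to a local JSON-lines file."""
    def __init__(self, path: str = "agent_memory.jsonl"):
        self.path = Path(path)

    def remember(self, record: dict) -> None:
        with self.path.open("a") as f:
            f.write(json.dumps(record) + "\n")

    def recall(self, limit: int = 5) -> list[dict]:
        if not self.path.exists():
            return []
        lines = self.path.read_text().splitlines()
        return [json.loads(line) for line in lines[-limit:]]

def validate_output(text: str) -> bool:
    """Stand-in quality control: reject empty or placeholder-looking answers."""
    return bool(text.strip())

def agent_step(task: str, llm, memory: MemoryStore) -> str:
    # Ground the model in recent history, then check its output before acting.
    context = memory.recall()
    answer = llm(f"Recent history: {context}\nTask: {task}")
    if not validate_output(answer):
        answer = "Unable to produce a verified answer; deferring to the user."
    memory.remember({"task": task, "answer": answer})
    return answer
```

In a real product, the validator would be far stricter (fact-checking against sources, schema validation, policy filters) and the memory store would live on-device or behind the user's own keys, which is where the privacy and on-device prerequisites come back in.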
Unless the above prerequisites are met, claims of agents changing your life should be taken with a fistful of salt.
We believe agents represent the next great paradigm in AI. Language and diffusion models made us rethink everything from advertising and medical research to software design and even what it means to be creative. Agents will extend this impact, redefining how we interact with technology and each other. A recent tweet by programming legend Kent Beck captures the sentiment well: "The value of 90% of my skills just dropped to $0. The leverage for the remaining 10% went up 1000x."
At RunwayML's inaugural AI film festival, we heard a quote that stuck with us: "Every abundance creates a new scarcity." The task for builders is to figure out which resources AI makes abundant, which are rendered scarce, and where an edge can be formed. So far, for those who have found an edge, the business results are incredible. We've seen multiple startups achieve top-decile growth while remaining profitable. Publicly available data shows that ChatGPT is the fastest-growing consumer product of all time. We've seen other non-public examples of similar growth rates in sectors like legaltech, customer support, and healthcare. Demand for this technology is unprecedented. Every type of organization, from SMBs to enterprises, is clamoring to inject AI into their products and workflows.
We're approaching a "ChatGPT moment" for agents. Whether from one or a few companies, we expect personal digital assistants to reach millions of users in short order. If you're building for this future, please reach out.