Alphabet Inc.’s Google DeepMind unit today detailed AlphaEvolve, an artificial intelligence agent that can tackle complex programming and math challenges.
The company says that it has used AlphaEvolve to make its data centers more efficient. Additionally, the AI agent is showing promise as a tool for mathematical research and chip development.
AlphaEvolve carries out processing in multiple steps. When it’s given a programming task, the agent uses Google LLC’s lightweight Gemini 2.0 Flash language model to generate multiple pieces of code. An automated evaluation mechanism then ranks those code snippets by quality. From there, AlphaEvolve takes the best code snippets and asks Gemini 2.0 Flash to improve them.
The agent makes optimizations to the AI-generated code over multiple rounds. When Gemini 2.0 Flash can no longer suggest improvements, AlphaEvolve switches to Gemini 2.0 Pro, a more capable model that trades off some speed for increased output quality.
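The sketch below is a minimal, hypothetical illustration of that kind of generate-evaluate-refine loop. The function names, scoring logic and model-switching rule are placeholders rather than DeepMind's actual implementation, which pairs Gemini models with task-specific automated evaluators.

```python
import random

def generate_candidates(task: str, model: str, n: int = 4) -> list[str]:
    """Placeholder: sample n candidate programs from an LLM such as Gemini 2.0 Flash."""
    return [f"# candidate {i} from {model} for: {task}" for i in range(n)]

def refine(candidate: str, score: float, model: str) -> str:
    """Placeholder: ask the model to improve a candidate given evaluator feedback."""
    return candidate + f"\n# refined by {model} (previous score {score:.2f})"

def evaluate(candidate: str) -> float:
    """Placeholder automated evaluator; in practice this compiles, runs and benchmarks code."""
    return random.random()

def evolve(task: str, rounds: int = 5) -> str:
    model = "gemini-2.0-flash"              # start with the fast, lightweight model
    population = generate_candidates(task, model)
    best, best_score, stalled = population[0], float("-inf"), 0
    for _ in range(rounds):
        scored = sorted(((evaluate(c), c) for c in population), reverse=True)
        top_score, top_candidate = scored[0]
        if top_score > best_score:
            best, best_score, stalled = top_candidate, top_score, 0
        else:
            stalled += 1
            if stalled >= 2:                # progress has stalled: escalate to the stronger model
                model = "gemini-2.0-pro"
        # keep the higher-scoring half and ask the model to improve those candidates
        keep = scored[: max(1, len(scored) // 2)]
        population = [refine(c, s, model) for s, c in keep]
    return best

print(evolve("write a faster sorting routine"))
```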
“The evolutionary process in AlphaEvolve leverages modern LLMs’ ability to respond to feedback, enabling the discovery of candidates that are substantially different from the initial candidate pool in syntax and function,” DeepMind researchers detailed in a research paper.
Google has already put AlphaEvolve to use in multiple internal projects. Several of these initiatives focused on matrix multiplications, the mathematical operations that AI models use to process data. A matrix is a collection of numbers organized into spreadsheet-like rows and columns.
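For readers unfamiliar with the operation, the short example below works through a two-by-two case: each entry of the result is the sum of row-times-column products. It is included purely for illustration and is not part of Google's work.

```python
# A 2-by-2 example of matrix multiplication: each entry of the result C is the
# sum of products of a row of A with a column of B.
A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
C = [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
print(C)  # [[19, 22], [43, 50]]
```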
Chip designers don’t draw processor blueprints by hand but rather describe them in Verilog, a hardware description language. In one project, AlphaEvolve helped Google engineers enhance the Verilog code for a circuit optimized to perform matrix multiplications. The company has incorporated the circuit into an upcoming addition to its TPU line of AI processors.
In another internal project, AlphaEvolve developed methods that allow Google’s Gemini models to break down large matrix multiplications into smaller, more manageable calculations. The search giant says that those improvements sped up a key kernel in Gemini’s architecture by 23%.
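The article doesn’t spell out the exact decomposition, but the general technique of splitting a large matrix multiplication into smaller block products is well established. The NumPy sketch below, with an arbitrary block size, illustrates the idea; it is not the kernel AlphaEvolve produced.

```python
import numpy as np

def blocked_matmul(A: np.ndarray, B: np.ndarray, block: int = 2) -> np.ndarray:
    """Multiply A and B by accumulating products of smaller sub-blocks,
    so each partial product fits in fast on-chip memory."""
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((n, m))
    for i in range(0, n, block):
        for j in range(0, m, block):
            for p in range(0, k, block):
                C[i:i + block, j:j + block] += A[i:i + block, p:p + block] @ B[p:p + block, j:j + block]
    return C

A = np.random.rand(4, 6)
B = np.random.rand(6, 4)
assert np.allclose(blocked_matmul(A, B), A @ B)   # same result as the direct product
```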
AlphaEvolve has also helped the company make its data centers more efficient. Google manages its infrastructure resources using a software platform called Borg. AlphaEvolve suggested an improvement to the platform that currently “recovers on average 0.7% of Google’s fleet-wide compute resources,” DeepMind’s researchers detailed.
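The article doesn’t describe the heuristic itself. As a purely hypothetical illustration of the kind of scheduling rule a cluster manager might use, the toy function below scores candidate machines so that a job avoids stranding one resource while exhausting another; a real scheduler would also normalize units and weigh many more factors.

```python
def placement_score(free_cpu: float, free_mem: float,
                    req_cpu: float, req_mem: float) -> float:
    """Toy heuristic: prefer machines where the job leaves CPU and memory in
    balance, so neither resource is left stranded and unusable."""
    if req_cpu > free_cpu or req_mem > free_mem:
        return float("-inf")                    # job does not fit on this machine
    return -abs((free_cpu - req_cpu) - (free_mem - req_mem))

# (free CPU cores, free RAM in GB) for three hypothetical machines
machines = [(8.0, 4.0), (6.0, 6.0), (16.0, 2.0)]
job = (4.0, 4.0)
best = max(machines, key=lambda m: placement_score(m[0], m[1], *job))
print("Place the job on the machine with", best, "free")
```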
According to the search giant, the reasoning capabilities that enable AlphaEvolve to optimize data centers and chip designs make it useful for mathematical research. “To investigate AlphaEvolve’s breadth, we applied the system to over 50 open problems in mathematical analysis, geometry, combinatorics and number theory,” the researchers wrote in a blog post that accompanied the paper. “The system’s flexibility enabled us to set up most experiments in a matter of hours. In roughly 75% of cases, it rediscovered state-of-the-art solutions, to the best of our knowledge.”
Google plans to make the AI agent available to academics through an early access program. Additionally, the company is exploring the possibility of broadening access down the line.
“While AlphaEvolve is currently being applied across math and computing, its general nature means it can be applied to any problem whose solution can be described as an algorithm, and automatically verified,” DeepMind’s researchers wrote. “We believe AlphaEvolve could be transformative across many more areas such as material science, drug discovery, sustainability and wider technological and business applications.”
Image: Google