Researchers at Duke University have developed Text2Robot, an innovative platform that uses generative AI to design and produce functional robots from simple text prompts. This advancement aims to democratize robot creation, making it accessible to individuals without extensive engineering backgrounds.

Example of a 3D-printed quadruped robot generated by the Text2Robot platform using a natural language design prompt. (Photo courtesy of Duke University Pratt School of Engineering)

Text2Robot operates by converting user-provided descriptions into physical 3D designs. A text-to-3D generative model interprets the input to create a basic robot body, which is then refined into a functional design that accounts for real-world manufacturing constraints, such as electronic component placement and joint functionality. The system employs evolutionary algorithms and reinforcement learning to optimize the robot’s shape, movement, and control software, ensuring efficiency and effectiveness in task performance.
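The evolutionary optimization step described above can be pictured as a mutate-and-select loop over morphology parameters. The sketch below is illustrative only: the parameter names (`leg_len`, `body_w`), their numeric ranges, and the stand-in fitness function are assumptions for demonstration, not the platform's actual implementation, which scores candidates in a physics simulator with a learned controller.

```python
import random

def simulate_fitness(morphology):
    # Stand-in for a physics simulation scoring walking speed and
    # energy use; the real pipeline would roll out a learned
    # controller in simulation. Here the (hypothetical) optimum is
    # leg_len = 0.12 m, body_w = 0.30 m.
    leg, body = morphology["leg_len"], morphology["body_w"]
    return -(leg - 0.12) ** 2 - (body - 0.30) ** 2

def mutate(parent, rng, scale=0.01):
    # Perturb each morphology parameter with small Gaussian noise.
    return {k: v + rng.gauss(0, scale) for k, v in parent.items()}

def evolve(generations=50, pop_size=20, seed=0):
    rng = random.Random(seed)
    # Random initial population of candidate morphologies.
    pop = [{"leg_len": rng.uniform(0.05, 0.20),
            "body_w": rng.uniform(0.20, 0.40)} for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=simulate_fitness, reverse=True)
        parents = pop[: pop_size // 4]  # keep the fittest quarter
        # Refill the population with mutated copies of survivors.
        pop = parents + [mutate(rng.choice(parents), rng)
                         for _ in range(pop_size - len(parents))]
    return max(pop, key=simulate_fitness)

best = evolve()
```

In this toy version, repeated selection and mutation drive the population toward the fitness peak; the actual system additionally co-optimizes the control policy via reinforcement learning while the morphology evolves.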

“Building a functional robot has traditionally been a slow and expensive process requiring deep expertise in engineering, AI, and manufacturing,” said Boyuan Chen, the Dickinson Faculty Assistant Professor of Mechanical Engineering and Materials Science, Electrical and Computer Engineering, and Computer Science at Duke University. “Text2Robot is taking the initial steps toward drastically improving this process by allowing users to create functional robots using nothing but natural language.”

The platform has demonstrated its capabilities by designing and 3D printing a variety of animal-inspired mobile robots based solely on user requests. For instance, prompts like “a frog robot that tracks my speed on command” or “an energy-efficient walking robot that looks like a dog” result in manufacturable designs that can be simulated within an hour and physically assembled within a day.

Co-first authors Ryan Ringel and Zachary Charlick, undergraduate students in Chen’s laboratory, emphasized the platform’s practicality. “This isn’t just about generating cool-looking robots,” said Ringel. “The AI understands physics and biomechanics, producing designs that are actually functional and efficient.”

Text2Robot’s rapid prototyping capability opens new possibilities for robot design and manufacturing, making it accessible to anyone with a computer, a 3D printer, and an idea. Potential applications range from children designing their own robot pets to artists creating interactive sculptures or custom-designed home assistants.

Currently, the framework focuses on quadrupedal robots, but future research aims to expand its capabilities to a broader range of robotic forms and integrate automated assembly processes to further streamline the design-to-reality pipeline.

“By harnessing the power of generative AI, this work brings us closer to a future where robots are not just tools but partners in creativity and innovation,” added Chen.

Text2Robot will be showcased at the upcoming IEEE International Conference on Robotics and Automation (ICRA 2025) in Atlanta, Georgia, from May 19 to 23. The project has received support from the DARPA FoundSci program and the Army Research Laboratory STRONG program.

About Text2Robot:
Text2Robot is a generative AI-based robotics platform developed by researchers at the Duke University Pratt School of Engineering. It allows users to design functional 3D-printable robots using natural language prompts. By combining text-to-3D modeling, evolutionary optimization, and simulation, the platform streamlines the design-to-manufacturing process—making robot prototyping faster and more accessible. Text2Robot is supported by the DARPA FoundSci program and the Army Research Laboratory STRONG program.


Source/Photo Credit: Duke University Pratt School of Engineering General Robotics Lab

Citation: “Text2Robot: Evolutionary Robot Design from Text Descriptions.” Ryan P. Ringel, Zachary S. Charlick, Jiaxun Liu, Boxi Xia, and Boyuan Chen. IEEE International Conference on Robotics and Automation (ICRA 2025).


(Editor’s Note: All trademarks mentioned in this article, including company names, product names, and logos, are the property of their respective owners. Use of these trademarks is for informational purposes only and does not imply any endorsement.)

Molly Bakewell Chamberlin