Breeding robots

Chip-maker NVIDIA has developed a deep-learning system that lets a robot learn to carry out a task simply by watching a person do it. Developers trained a series of neural networks, arrays of simple computing units loosely modeled on the way human brain cells work, to handle perception, to generate a program of instructions, and to carry that program out. A camera captures live video of a demonstration, and neural networks estimate the locations of the objects in the scene. Another network reconstructs the spatial relationships among those objects, and yet another creates and executes a plan that guides the robot's actions. The system also produces a human-readable list of the steps it intends to take, so people can review the plan and correct it before the robot acts.
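
To make the division of labor concrete, the sketch below shows what such a perception-to-plan pipeline might look like in Python. All class and function names (DetectedObject, detect_objects, infer_relationships, generate_plan) are hypothetical stand-ins, not NVIDIA's published code; each stage represents a trained neural network in the real system.

```python
# Minimal sketch of the described pipeline, under assumed names.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class DetectedObject:
    name: str
    position: Tuple[float, float, float]  # x, y, z in the camera frame


@dataclass
class Relationship:
    subject: str
    relation: str   # e.g. "left_of", "on_top_of"
    target: str


def detect_objects(frame) -> List[DetectedObject]:
    """Perception stage: estimate object identities and positions from a video frame."""
    # Placeholder output; a real system would run a trained detection network here.
    return [
        DetectedObject("red_cube", (0.10, 0.00, 0.02)),
        DetectedObject("blue_cube", (0.25, 0.05, 0.02)),
    ]


def infer_relationships(objects: List[DetectedObject]) -> List[Relationship]:
    """Second stage: recover the spatial relationships the demonstrator created."""
    # Simple heuristic standing in for a learned model.
    a, b = objects[0], objects[1]
    return [Relationship(a.name, "left_of", b.name)]


def generate_plan(relationships: List[Relationship]) -> List[str]:
    """Third stage: turn the target relationships into an ordered list of steps."""
    steps = []
    for rel in relationships:
        steps.append(f"pick up {rel.subject}")
        steps.append(f"place {rel.subject} {rel.relation} {rel.target}")
    return steps


if __name__ == "__main__":
    frame = None  # stands in for a live camera frame
    objects = detect_objects(frame)
    plan = generate_plan(infer_relationships(objects))

    # The human-readable plan is presented for review before the robot executes it.
    print("Proposed steps:")
    for i, step in enumerate(plan, 1):
        print(f"  {i}. {step}")
```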
