Using an LLM to control our robot Digit. #AI #GPT

In this demonstration, Digit starts by knowing that there is trash on the floor and that bins are used for recycling and waste. We give the voice command "clean up this mess" to get Digit to help us. Digit hears the command and uses an LLM to interpret how best to achieve the stated goal with its existing physical capabilities.

At no point is Digit given instructions on how to clean or what constitutes a mess. This is an example of bridging the conversational nature of ChatGPT and other LLMs to generate physical action in the real world.
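The general pattern here is to constrain the LLM to plan only with the robot's existing action primitives. As an illustration only (the primitive names, prompt format, and stubbed LLM reply below are all hypothetical and do not reflect Agility's actual software), that pattern might be sketched as:

```python
import json

# Hypothetical set of physical primitives a robot like Digit might expose.
PRIMITIVES = {"walk_to", "pick_up", "place_in"}

def build_prompt(goal: str, objects: list[str]) -> str:
    """Ask the LLM to plan using only the allowed primitives."""
    return (
        f"Goal: {goal}\n"
        f"Visible objects: {', '.join(objects)}\n"
        f'Respond with a JSON list of steps, each of the form '
        f'{{"action": <one of {sorted(PRIMITIVES)}>, "target": <object>}}.'
    )

def parse_plan(llm_reply: str) -> list[dict]:
    """Parse the LLM's JSON plan and reject any unknown primitive."""
    steps = json.loads(llm_reply)
    for step in steps:
        if step["action"] not in PRIMITIVES:
            raise ValueError(f"unknown primitive: {step['action']}")
    return steps

# Stubbed reply standing in for a real LLM API call.
stub_reply = json.dumps([
    {"action": "walk_to", "target": "trash"},
    {"action": "pick_up", "target": "trash"},
    {"action": "place_in", "target": "waste_bin"},
])

plan = parse_plan(stub_reply)
for step in plan:
    print(step["action"], step["target"])
```

Validating the reply against a fixed primitive set is what keeps the conversational model grounded in what the robot can physically do: the LLM decides *what* to do, but can only ever request actions the robot already knows how to execute.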

——————————————

At Agility, we make robots that are made for work. Our robot Digit works alongside us in spaces designed for people. Digit takes care of the boring and repetitive tasks meant for a machine, allowing companies and their people to focus on the work that requires the human element.

Subscribe (press the bell for notifications)
https://www.youtube.com/c/AgilityRobotics

Join our team
https://www.agilityrobotics.com/careers

Follow our journey
https://twitter.com/agilityrobotics
https://www.linkedin.com/company/agilityrobotics
https://www.instagram.com/agility_robotics
https://www.tiktok.com/@agility_robotics

#robotics #machinelearning #AI #GPT

If you found this video useful, please share it with your friends and family.