
Anki Releases Beta SDK for Its Cozmo Robot


Anki is announcing what we have to look forward to in the SDK that’ll come with Cozmo. At first, this didn’t seem like that big of a deal—lots of companies release SDKs with their robots in the (usually futile) hope that developers will latch onto it and imbue their robots with all kinds of new and exciting features continually and for free. However, after speaking with Anki co-founder and president Hanns Tappeiner, we’re a bit more optimistic that Cozmo’s SDK might actually motivate you (and other people) to do some really cool stuff with this robot. [via: IEEE Spectrum]

Anki Cozmo Robot:

When playing with Cozmo, Anki’s palm-sized artificial intelligence robot, it’s easy to forget all of the engineering and software running behind the scenes. Every action, from Cozmo’s audible chirps of victory when it wins a game to its childlike mannerisms when it recognizes your face, conceals tens of thousands of lines of code.


When the product launches this October, Anki hopes consumers won’t think of its AI robot as undecipherable technology. Instead, the company wants people to wonder what’s going on under the hood — and, eventually, to alter it themselves.

Hanns Tappeiner shows off the tools consumers can use to accomplish that feat. With an iPad mini plugged into his laptop at the company’s San Francisco office, Tappeiner boots up the Cozmo software development kit and starts sending instructions to the robot. The SDK lets him order Cozmo back and forth in a straight line, at a set speed and distance. He closes out the demo with a cheerful animation, one of dozens Anki recorded and will make available to consumers. All of this is performed with simple code, which calls more complex operations Anki engineers have already constructed.
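
To give a sense of what that demo boils down to, here is a minimal sketch in Python, assuming the released SDK’s call names (cozmo.run_program, drive_straight, play_anim_trigger); the exact names and the animation trigger used here are assumptions and may differ from the beta build Tappeiner was running:

    import cozmo
    from cozmo.util import distance_mm, speed_mmps

    def back_and_forth(robot: cozmo.robot.Robot):
        # Drive forward 100 mm at 50 mm/s, then reverse back to the start.
        robot.drive_straight(distance_mm(100), speed_mmps(50)).wait_for_completed()
        robot.drive_straight(distance_mm(-100), speed_mmps(50)).wait_for_completed()
        # Close out the demo with one of Anki's pre-recorded animations
        # (this particular trigger name is illustrative).
        robot.play_anim_trigger(cozmo.anim.Triggers.MajorWin).wait_for_completed()

    cozmo.run_program(back_and_forth)

A handful of lines like these stand in for all of the motor control, trajectory tracking, and animation playback the engine handles underneath.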


“As robotics geeks, we’re interested in trying to figure out how we can advance the field of robotics overall,” Tappeiner told us. “One of the big missing things is that while there is robotics software out there, it’s mostly accessible to people who are in the field of robotics. It’s not as accessible to people who might be good developers, but who don’t know enough about robotics to really use this kind of stuff. No matter how complex and how high-level the functionality is, we want to make it available to people with single lines of code. And that’s what the Cozmo SDK does.”

Hanns isn’t kidding. He showed us a series of simple demos from the beta version of the SDK along with the code (written in Python) behind them. Below are four video clips, followed by a gallery of screenshots of the code. It’s pretty slick how much capability you can leverage with these commands: for example, one demo has Cozmo autonomously driving around a trio of cubes and picking up the farthest one. All you have to do is tell the robot “go pick up that block” using the “cozmo.PickupObject” command, and it does all of the path planning, motion, orientation, and manipulator control for you:
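
As a rough sketch of what that demo looks like in code, assuming the shipping Python SDK spells the command as robot.pickup_object (the “cozmo.PickupObject” name above is from the beta) and exposes a look-around behavior and cube observations the way the public documentation describes:

    import cozmo

    def pickup_farthest_cube(robot: cozmo.robot.Robot):
        # Look around until three light cubes have been observed (or we time out).
        lookaround = robot.start_behavior(cozmo.behavior.BehaviorTypes.LookAroundInPlace)
        cubes = robot.world.wait_until_observe_num_objects(
            num=3, object_type=cozmo.objects.LightCube, timeout=30)
        lookaround.stop()
        if not cubes:
            return

        # Pick the cube farthest from the robot's current pose...
        def dist_sq(cube):
            dx = cube.pose.position.x - robot.pose.position.x
            dy = cube.pose.position.y - robot.pose.position.y
            return dx * dx + dy * dy

        target = max(cubes, key=dist_sq)

        # ...and let the SDK handle the path planning, driving, and lift control.
        robot.pickup_object(target, num_retries=2).wait_for_completed()

    cozmo.run_program(pickup_farthest_cube)

The single pickup_object call at the end is the “go pick up that block” part; everything above it is just deciding which block to ask for.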

The press release includes a bullet-pointified list of things that you can do with Cozmo’s SDK; here they are along with some extras that we added based on our conversation with Hanns:

  • Use the computer vision system to track and recognize faces and facial expressions and estimate their position and orientation in 3D space.
  • Tap into the localization system with access to the robot’s internal map and all objects in it.
  • Utilize path and motion planners with obstacle avoidance, etc.
  • Explore Cozmo’s behavior system and execute high-level behaviors such as look around, findFaces, findCubes, etc. (see the code sketch after this list).
  • Use the entire animation system of the robot with access to all animations and sounds our character team has created.
  • Cozmo’s personality engine can be turned on and off. When it’s on, Cozmo interjects its cute little behaviors into whatever you program it to do.
  • The SDK gives you access to all of the raw sensor data if you want it.
  • Cozmo can recognize (and tell the difference between) cats and dogs, although it can’t identify them individually.
  • Cozmo can also recognize other Cozmos. This isn’t a feature that will be enabled as part of the default behaviors on launch, but it’ll be in the SDK.
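
As a rough illustration of the behavior system and raw sensor access from the list above, here is a short sketch; the names used (BehaviorTypes.FindFaces, wait_for_observed_face, latest_image) are assumed from the released SDK’s documentation, and the beta may expose them differently:

    import cozmo

    def find_a_face(robot: cozmo.robot.Robot):
        # Enable the camera stream so raw frames show up in robot.world.latest_image.
        robot.camera.image_stream_enabled = True

        # Run the built-in face-finding behavior until a face is observed.
        find_faces = robot.start_behavior(cozmo.behavior.BehaviorTypes.FindFaces)
        face = robot.world.wait_for_observed_face(timeout=30)
        find_faces.stop()

        # The observed face carries an ID and an estimated 3D pose.
        print("Face %s at %s" % (face.face_id, face.pose.position))

        # Save the most recent raw camera frame as an image file.
        latest = robot.world.latest_image
        if latest is not None:
            latest.raw_image.save("cozmo_view.png")

    cozmo.run_program(find_a_face)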

Tappeiner explains how Cozmo keeps track of where it is: “The robot has one camera in his face, which is the only camera we’re using. The camera is the main sensor, but we have a very high-quality IMU, so even when we don’t see landmarks, we’re still keeping track of how the robot is oriented. We’re using the wheels to estimate how far we’re driving, which is fairly approximate. The robot might drive around blind for a minute or so, and the longer he does that, the more his actual location will be off, but the moment he sees a landmark like any of the cubes, he’ll know exactly where he is again.”
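
In SDK terms, that estimated location is just the robot’s pose, which you can read back and watch drift on odometry alone and then snap back when a cube landmark comes into view; a minimal sketch, assuming the released SDK’s robot.pose accessor:

    import time
    import cozmo

    def watch_pose(robot: cozmo.robot.Robot):
        # Poll the robot's estimated pose once a second. On wheel odometry alone
        # the estimate drifts; it is corrected whenever a cube landmark is seen.
        for _ in range(10):
            pos = robot.pose.position
            heading = robot.pose.rotation.angle_z.degrees
            print("Cozmo thinks it is at x=%.1f mm, y=%.1f mm, heading %.1f deg"
                  % (pos.x, pos.y, heading))
            time.sleep(1)

    cozmo.run_program(watch_pose)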


Anki hopes its tiny robot will have an effect similar to that of Microsoft’s Kinect motion camera. The Xbox accessory gave researchers and tinkerers a low-cost computer vision solution for all sorts of real-world robotics applications. Anki hopes Cozmo’s advanced software will prove just as versatile. “When you’re in grad school and you study robotics, you have access to a lot of really amazing tech,” Tappeiner says. “But it’s really only accessible if you’re studying robotics.”

Cozmo, along with the beta SDK, should be on sale in October for $180, or $160 if you pre-order.

[ Anki Cozmo SDK ]

