Beam should have a hardware API

We’ve got a few Beam telepresence robots at USV, and use them all the time. Fred has written about them here. At today’s team meeting we had two Beams going at once — Fred and I were the first to arrive, and we were chatting beam-to-beam — he in LAUtah, me in Boston, both of us in NYC by robot:

It works amazingly well.  It has now become somewhat normal for robots to be roving around the office, having conversations w people, USV team folks and visitors alike.

One idea that keeps coming up is an extensible peripherals API — the Beam robots already come w a USB port (used for initial setup), and it should be possible to use that to extend it with hardware.  We joke about jousting (and have done some), but I could seriously imagine bolting on devices such as additional displays / LCDs, sensors of various kinds, devices that can perform human-like gestures (the way the Kubi can nod, shake and bow), etc.
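To make the idea concrete, here is a minimal sketch of what a host-side peripherals protocol could look like. Everything in it is hypothetical — Beam publishes no such API today — and the device names, actions, and wire format are all invented for illustration:

```python
import json

def encode_command(device_id, action, params=None):
    """Frame a peripheral command as a length-prefixed JSON message.

    This is an imagined wire format for talking to a USB peripheral
    (e.g. a gesture head or an extra display); the real Beam exposes
    no such interface.
    """
    payload = json.dumps({
        "device": device_id,
        "action": action,
        "params": params or {},
    }).encode("utf-8")
    # Two-byte big-endian length prefix, then the JSON payload.
    return len(payload).to_bytes(2, "big") + payload

def decode_command(frame):
    """Inverse of encode_command: strip the length prefix and parse."""
    length = int.from_bytes(frame[:2], "big")
    return json.loads(frame[2:2 + length].decode("utf-8"))

# Example: ask a (hypothetical) gesture peripheral to nod twice.
msg = encode_command("gesture-head", "nod", {"count": 2})
print(decode_command(msg))
```

The point isn’t this particular format — it’s that even a trivially simple command layer over the existing USB port would let third parties build the kinds of add-ons described above.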

Thinking of Beam as a platform in this way would certainly extend its capabilities (in particular for industry), and would also put Beam in a much stronger position at the center of an ecosystem.  Would love to see that happen.

2 comments on “Beam should have a hardware API”

Recently I was wandering in downtown Palo Alto and came by a store/showroom staffed solely by Beam robots. It was really fun. A Beam controlled from Penn gave us the pitch. Later he asked the other Beam controller in Brooklyn to join in and answer some questions. So there we were, standing in the middle of a storefront on University Ave. talking to two robots.

You’ve hinted at it, but I suspect no chairs or conference tables will be necessary for the VC partners meeting or Board Meeting of the future.

Very cool! It is surprising how quickly the awkwardness goes away. Not entirely, but you really do start to feel normal after a while, both as the person in the robot, and as someone in the room with one.
