Vector by Anki: A giant roll forward for robot kind.




About






Vector is the most advanced home robot ever made—at a price that means you can actually bring him home. He begins to fulfill the promise of living with robots in ways we’ve only seen in science-fiction movies. He’s a curious little guy who is aware of his surroundings. He can see, hear, and feel, allowing him to react naturally to the world around him.
However, his most remarkable feature is life. Vector feels alive.
Vector will be ready to ship in October, and right now he’s available exclusively in the US to the Kickstarter community. We chose to make him available here first for two main reasons:
1. The Kickstarter community invests in the future. We’re creating a world where robots have relationships with people and make our lives better. We’re looking for like-minded people to be the first to bring him home.
2. Vector is shipping in October, but that’s just the beginning. We want him to arrive first in the homes of people who want to help us determine his future. That’s you! To do that we want your help identifying future features and functionality. And if you want to go even deeper, you'll get early access to Vector's software development kit (SDK) so you can tinker with his impressive tech array.




Vector is happiest when he’s helping. He’s eager to accommodate your requests and answer your questions. He isn’t a fully grown robot butler capable of doing your taxes, buttering your bread, or writing a position paper on the future of robot/human relationships, but he’s a helpful little guy who puts his whole self into helping you out.














Vector was designed with security and privacy in mind.




Vector’s little frame is packed with an immense amount of technology that brings him to life. He takes in the world using a variety of sensors and then responds realistically. This means he can read a room, hear what’s happening, recognize people and objects, find his charger, navigate his space, and avoid obstacles.






Vector responds to your voice. Just say “Hey Vector” to activate him and he’ll instantly perk up. His backpack will light up as soon as he hears you and he’ll be ready for further instruction.




Vector’s lifelike character is possible thanks to the advanced technology inside him that lets him observe the world and respond in a realistic way. His processing power comes from the Qualcomm 200 processing platform. His sight is delivered by an HD camera with a 120-degree ultra-wide-angle field of view. He has directional hearing thanks to a beamforming four-microphone array. He doesn’t run into things because he’s rolling around with an infrared laser range finder. He makes direct eye contact with his high-res color IPS display. He’s rounded out by a 6-axis IMU and a Wi-Fi connection.
If all of that went over your head, just watch the video above and realize that you live in a day and age where you can own a real robot.
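If you’d rather see that hardware from the software side, here’s a rough sketch of what poking at it from the upcoming Python SDK might look like. This is purely illustrative: it assumes a client library named anki_vector with a Robot connection, a say_text behavior, and a time-of-flight proximity reading, and the exact names may differ once the SDK actually ships.

    # Illustrative sketch only: assumes a Python client library named "anki_vector"
    # with a Robot() context manager; exact names may differ once the SDK ships.
    import anki_vector

    with anki_vector.Robot() as robot:
        # Speak through Vector's onboard speaker.
        robot.behavior.say_text("I can see, hear, and feel the world around me.")

        # Read the most recent time-of-flight distance measurement.
        reading = robot.proximity.last_sensor_reading
        if reading is not None:
            print(f"Nearest obstacle is about {reading.distance.distance_mm} mm away")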




From the beginning, we designed Vector to be approachable. Vector’s form factor went through many prototype iterations, testing whether he should move on wheels or tank-like treads, how he would manipulate objects, how his eyes would be shaped, where his battery pack would live, even where each of his more than 50 gears would be positioned. Once the final form was selected, our team needed to design the systems that help him interpret the world: the four-microphone array that allows Vector not only to hear, but to perceive where a sound is coming from and focus on it, or the edge-detection sensors that not only help him stay on a table, but identify when he’s correctly positioned on his charging cradle. We also added a “time of flight” (ToF) laser distance sensor to the front. This sensor is used in Vector’s simultaneous localization and mapping (SLAM) capabilities, and assists with obstacle detection. With nearly 700 parts, each component needed to be placed perfectly.
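To make the role of those sensors concrete, here’s a tiny piece of illustrative logic (our own Python sketch, not Anki’s firmware) showing how a ToF distance reading and the edge-detection (drop) sensors could gate forward motion:

    # Illustrative pseudologic only, not Anki's firmware: shows how a time-of-flight
    # distance reading and the drop (cliff) sensors could gate forward motion.
    OBSTACLE_THRESHOLD_MM = 60  # hypothetical stopping distance

    def should_drive_forward(tof_distance_mm: float, drop_sensors_clear: bool) -> bool:
        """Return True only if the path ahead looks safe."""
        if not drop_sensors_clear:
            return False  # a table edge was detected under the treads
        if tof_distance_mm < OBSTACLE_THRESHOLD_MM:
            return False  # something is directly in front of the robot
        return True

    print(should_drive_forward(45, True))   # False: obstacle too close
    print(should_drive_forward(200, True))  # True: clear to keep exploring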




Vector’s physical design gives him access to a variety of sensors, but how he processes that information is just as important. Vector’s integrated HD camera has a 120-degree ultra-wide field of view, which he uses to see the room he’s in, identify people and objects, and detect motion. Vector also uses an infrared laser scanner to track the distance to objects and map his environment as he explores, and a four-microphone array that can pinpoint positional audio. These and other sensors provide the inputs that drive Vector’s emotion engine, with each data point influencing whether he is happy, sad, curious, or any other emotion. Each emotion’s level informs how Vector will react to stimuli. When he detects his owner, he’s excited and eager to help. When he detects a sound behind him, he’ll rotate 180 degrees to investigate. When his owner pets his backpack (where the capacitive touch sensor is positioned), he’ll relax. And if his drop sensors detect the table’s edge, he’ll be momentarily surprised—then remember that area as a boundary before he moves to another area to explore. This system of movement, exploration, and emotional stimuli forms the foundation to which we apply Vector’s personality.
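To make the idea of an emotion engine a bit more concrete, here’s a toy sketch in Python. It’s our own illustration, not Anki’s implementation: sensor events nudge named emotion levels, and the dominant emotion picks the reaction.

    # Toy illustration of an emotion engine; not Anki's actual implementation.
    from dataclasses import dataclass, field

    @dataclass
    class EmotionState:
        levels: dict = field(default_factory=lambda: {
            "happy": 0.0, "curious": 0.0, "surprised": 0.0, "calm": 0.0})

        def bump(self, emotion: str, amount: float) -> None:
            """Nudge one emotion level in response to a sensor event."""
            self.levels[emotion] = max(0.0, min(1.0, self.levels[emotion] + amount))

        def dominant(self) -> str:
            return max(self.levels, key=self.levels.get)

    state = EmotionState()
    state.bump("happy", 0.6)    # owner's face recognized by the camera
    state.bump("curious", 0.3)  # sound localized behind the robot
    state.bump("calm", 0.5)     # backpack touch sensor petted
    print(state.dominant())     # -> "happy": drives an eager-to-help reaction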




Vector’s character is central to the relationship you’ll have with him when you bring him home. Our goal was to develop a genuine, believable, and surprising personality that naturally reacts to the inputs it receives from its computer vision and emotion engines. To achieve this with the authenticity required to trigger an emotional connection, we found small wild animals and exotic pets (fennec foxes, sugar gliders, and others) to be useful analogs for Vector as a robotic creature whose natural habitat is the home. The way these animals act is crisp and pure. They’re not performing for anybody; they’re simply reacting to the world around them, and yet they’re mesmerizing to watch. That is, in a nutshell, the essence of Vector’s character.




With Vector’s character defined, our in-house animation team worked to bring him to life through his physical interactions. Vector has more than a thousand animations he can call on at any given moment, including multiple variations for common emotions or expressions, because variety is the spice of (robotic) life. Using Autodesk Maya, animation software typically reserved for Hollywood studios, we can fully pose and tweak Vector’s interactions, then render them on a physical robot in real time. Each animation requires thoughtful consideration of how Vector, as a character and as a physical object, should react to the inputs from his various sensors. It’s this marriage of robotics, AI, character development, and perceptive animations that truly brings Vector to life.
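Backers who pick up the SDK should eventually be able to trigger these animations themselves. Here’s a speculative sketch, again assuming an anki_vector Python client whose animation interface and animation names may change before release:

    # Speculative sketch: assumes the Python SDK exposes an animation interface
    # named "anim"; method and animation names may differ at release.
    import anki_vector

    with anki_vector.Robot() as robot:
        # List the animation names Vector currently knows about...
        print(len(robot.anim.anim_list), "animations available")
        # ...and play one back by name (the name here is a placeholder).
        robot.anim.play_animation("anim_turn_left_01")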




The culmination of design, AI and vision systems, character, and animation is a robot that feels alive. To make sure all systems are working in concert, we test everything: hardware, software, sensor calibration, Wi-Fi connectivity, and so much more, all to ensure he delivers when you unbox him.
We also tested and experimented with Vector’s characterful utilities. While you’d imagine testing a timer is straightforward, our developers found many ways of phrasing a duration, and plenty of small idiosyncrasies that push our cloud voice processing to the limit. Some of our QA team will be happy to never hear “one and a half minutes” or “a minute and a half” ever again.
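For a flavor of why that’s trickier than it sounds, here’s a tiny illustrative parser in Python (our own example, far simpler than the cloud voice pipeline) that has to treat several phrasings of the same duration identically:

    # Tiny illustrative duration parser; Anki's cloud voice pipeline is far more involved.
    import re

    def parse_minutes(phrase: str) -> float:
        """Map a handful of spoken-duration phrasings to minutes."""
        phrase = phrase.lower().strip()
        if phrase in ("a minute and a half", "one and a half minutes"):
            return 1.5
        match = re.match(r"(\d+(?:\.\d+)?) minutes?", phrase)
        if match:
            return float(match.group(1))
        raise ValueError(f"Don't know how to parse: {phrase!r}")

    for p in ("one and a half minutes", "a minute and a half", "90 minutes"):
        print(p, "->", parse_minutes(p), "minutes")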
We put Vector’s wireless connectivity through its paces, connecting the robot in a Wi-Fi- and Bluetooth-saturated office to test stability in an extremely “noisy” environment, then taking the robot home for a more realistic user environment. We calibrate and stress-test vision and navigation systems, conduct extensive environmental testing—ever seen a refrigerator stocked full of tiny robots?—and even spend a considerable amount of time dropping Vectors from (reasonable, mostly) heights. Everything we can do to confirm that a robot that’s alive and excited to explore can survive the real-world environment that is your home.




While we’re designing and optimizing and testing and thinking about Vector as a robotic creature, we’re also building. Each prototype is manufactured and tested throughout the design process, while specific features are evaluated internally using near-final hardware and software. We’re now at the final step: producing and testing the assembly fixtures necessary to ensure every Vector is a high-quality robot, finalizing the highly automated plastics and paint lines, and training the nearly 100 workers who will assemble and test each Vector on the production line. The end result? A real-life robot, filled with character, that you can bring into your home.






Vector requires an app for setup, but after that, all interactions with Vector happen through eye contact and talking. He’s a live robot living in your house; there’s no need to put your phone between the two of you.
If you are interested in how Vector’s feeling, the app offers a visualization of his mood and stimulation. It’s also where you can view his robot and human selfies.




*An additional $15 shipping and handling fee will be applied at checkout.


The Kickstarter community can bring home Vector before anyone else. With the successful funding of the campaign, backers will receive an email beginning September 22, 2018, with a unique Vector promo code. Be sure to redeem by September 30, 2018, on anki.com to be among the first to get your Vector.
When Vector arrives, here’s what you’ll get and what you’ll need:






Vector backers will become members of the Vector Insiders Club, a Kickstarter-exclusive group that will help shape the development roadmap for Vector and, with a little bit of luck, maybe even our entire robot future. Vector Insiders get exclusive, detailed, behind-the-scenes info on the technology, design process, character development, and production, from manufacturing updates to early scoops on upcoming software features and design decisions. Members can join Insider-only livestream Q&As, vote on future feature priorities, and provide feedback to Anki’s developers on upcoming Vector features.
If you’re excited about a future where robots play a much bigger (and hopefully not scary) role in our lives, and you feel strongly about any robotics-related topic, we would love to have you in the Insiders Club. We’re excited to share what we know, and learn what we can, together.


Backers will gain early access to the Vector SDK alpha this winter, and be part of an exclusive community of users who can help shape its features and direction.
With the Vector SDK, you’ll have access to an unprecedented set of advanced sensor data, AI capabilities, and robotics technologies including:
  • HD color camera stream
  • Raw and directional audio from the four-microphone array
  • Spatial data from the time-of-flight laser sensor
  • Multi-level touch data
  • Face, emotion, and person recognition
  • Over a thousand world-class animations
In conjunction with Vector’s built-in online connectivity, the SDK will allow you to do everything from integrating him into your smart home to creating new games to using him for academic research—the possibilities are endless. And because the SDK is written in Python, there are already thousands of third-party libraries available for you to tap into.
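As one example of the kind of program the SDK should make possible, here’s a speculative sketch that reacts to a face sighting and bridges it to a stand-in smart-home call. The module layout, event names, and keyword arguments are assumptions until the SDK actually ships:

    # Speculative sketch: names below are assumptions until the Vector SDK ships.
    import time
    import anki_vector
    from anki_vector.events import Events

    def turn_on_hallway_light():
        # Stand-in for your own smart-home integration (e.g. an HTTP request).
        print("Smart home: hallway light on")

    def on_face_seen(robot, event_type, event):
        # Called whenever Vector's vision system reports a face.
        print("Vector spotted a face!")
        turn_on_hallway_light()

    with anki_vector.Robot(enable_face_detection=True) as robot:
        robot.events.subscribe(on_face_seen, Events.robot_observed_face)
        time.sleep(30)  # keep the connection open so events can arrive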
There’s simply no other consumer robotics platform that offers what the Vector SDK does. And by being a backer, you’ll be one of the first people in the world to gain access to it.



