Intel this week introduced its new RealSense Tracking Camera T265, which features six-degrees-of-freedom (6DoF) inside-out tracking. The device is aimed primarily at autonomous applications such as small robots and drones, but it can be used by any device that benefits from inside-out positional tracking.

The Intel RealSense Tracking Camera T265 is equipped with two fisheye cameras, each with an approximately 163-degree field of view. Data from the cameras is processed onboard by an Intel Movidius Myriad 2 vision processing unit (VPU). The camera runs Intel's visual-inertial odometry simultaneous localization and mapping (V-SLAM) technology, which builds a map of an unfamiliar environment and then continuously updates the device's position and orientation within it.

Together, the hardware and the V-SLAM algorithm enable machines like robots and drones to navigate accurately through both known and unknown surroundings, intelligently avoiding obstacles without any need for external sensors (courtesy of 6DoF tracking). Because all of the processing is handled by the onboard Movidius Myriad 2 VPU, the host device can get by with a very cheap SoC that offers little compute power, as long as it has a USB 2.0/3.0 interface.
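
Since the mapping and tracking run entirely on the camera, the host-side software reduces to opening a pose stream and reading 6DoF samples over USB. Below is a minimal sketch using the librealsense Python bindings (pyrealsense2); it assumes the SDK is installed and a T265 is connected, and is meant as an illustration rather than production code.

```python
# Minimal host-side sketch: stream 6DoF poses from a RealSense T265.
# Assumes the librealsense SDK with Python bindings (pyrealsense2) and a connected T265.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.pose)  # 6DoF pose stream produced by the onboard V-SLAM

pipeline.start(config)
try:
    for _ in range(200):  # read a couple hundred samples
        frames = pipeline.wait_for_frames()
        pose_frame = frames.get_pose_frame()
        if not pose_frame:
            continue
        pose = pose_frame.get_pose_data()
        # Translation in meters, rotation as a quaternion, plus a 0-3 tracker confidence level
        print(f"t=({pose.translation.x:.3f}, {pose.translation.y:.3f}, {pose.translation.z:.3f}) "
              f"q=({pose.rotation.w:.3f}, {pose.rotation.x:.3f}, {pose.rotation.y:.3f}, {pose.rotation.z:.3f}) "
              f"confidence={pose.tracker_confidence}")
finally:
    pipeline.stop()
```

The reported pose is expressed relative to the point where tracking started, which is the usual convention for inside-out tracking.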

For more advanced applications that require higher precision or depth sensing, the RealSense Tracking Camera T265 can be paired with a RealSense Depth Camera from the D400 series.
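
One way such a pairing can look on the host side is to run the two cameras on separate pipelines, again using pyrealsense2. This is only a sketch; the serial numbers below are placeholders, not real device IDs.

```python
# Illustrative sketch: run a T265 (pose) and a D400-series camera (depth) side by side.
# The serial numbers are placeholders; enumerate real devices with rs.context().query_devices().
import pyrealsense2 as rs

T265_SERIAL = "0000000000"  # hypothetical serial number
D400_SERIAL = "1111111111"  # hypothetical serial number

pose_cfg = rs.config()
pose_cfg.enable_device(T265_SERIAL)
pose_cfg.enable_stream(rs.stream.pose)

depth_cfg = rs.config()
depth_cfg.enable_device(D400_SERIAL)
depth_cfg.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)

pose_pipe, depth_pipe = rs.pipeline(), rs.pipeline()
pose_pipe.start(pose_cfg)
depth_pipe.start(depth_cfg)
try:
    pose = pose_pipe.wait_for_frames().get_pose_frame().get_pose_data()
    depth = depth_pipe.wait_for_frames().get_depth_frame()
    # The pose says where the rig is; the depth frame says what is in front of it.
    print(pose.translation.x, pose.translation.y, pose.translation.z,
          depth.get_distance(320, 240))  # distance in meters at the image center
finally:
    pose_pipe.stop()
    depth_pipe.stop()
```

In practice you also need the fixed transform between the two mounted cameras to express depth measurements in the T265's pose frame.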

Intel will begin shipments of the RealSense Tracking Camera T265 on February 28. Each device will cost $199.

Source: Intel

14 Comments

  • spikebike - Thursday, January 24, 2019 - link

    Your ads are serving malware. Google chrome just said "Attackers currently on cm.mgid.com might attempt to install dangerous programs on your computer that steal or delete your information (for example, photos, passwords, messages, and credit cards)."

    Anyone know the range of the new realsense camera?
  • mode_13h - Thursday, January 24, 2019 - link

    As this seems to use only passive, visible light sensing, it doesn't have a finite range like you'd have with ToF or structured light.
  • Yojimbo - Thursday, January 24, 2019 - link

    While it can see anything big enough and bright enough no matter how far away it is, I'd imagine it has no way of telling how far away something is outside a certain range, that range being given by the precision of the parallax calculations of the two cameras. If that's correct then there should be an actual limit for usable range regardless of the brightness or size of the object. It's similar to how we can't use parallax methods to tell the distance to distant galaxies. We need to come up with other methods (such as analyzing phenomena with known physical properties that are affected by distance in known ways like supernovas).
  • Yojimbo - Thursday, January 24, 2019 - link

    Of course its effectiveness won't hit a sudden wall of unusability. The effectiveness will drop off with distance. If you are looking at two drones flying around close to you, you can judge whether they are going to run into each other. But if you are looking at the same drones much further away, still the same distance apart from each other, then you will think, "wait, are they going to collide? I don't think so, but I can't tell."
  • mode_13h - Friday, January 25, 2019 - link

    Yeah, I made this point below. Unfortunately, as a new post, rather than a follow-up reply.

    With stereo, range really becomes a question of accuracy. For a given accuracy level, there will be a finite range.
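
    To put rough numbers on that, here's a quick stereo-triangulation sketch. The focal length, baseline, and disparity error below are assumed values for illustration, not the T265's actual calibration:

    ```python
    # Rough stereo depth-error model: Z = f * B / d, so for a disparity error dd
    # the depth error grows roughly as dZ ~ Z**2 * dd / (f * B).
    # All three parameters are assumed for illustration (not T265 specs).
    f_px = 285.0        # focal length in pixels (assumed)
    baseline_m = 0.064  # stereo baseline in meters (assumed)
    disp_err_px = 0.25  # disparity estimation error in pixels (assumed)

    for z_m in (1.0, 5.0, 10.0, 20.0):
        dz_m = z_m ** 2 * disp_err_px / (f_px * baseline_m)
        print(f"at {z_m:5.1f} m, depth error ~ {dz_m:6.2f} m")
    ```

    Pick the depth error you can live with and that gives you the finite usable range, even though the cameras can "see" much further.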
  • PeachNCream - Friday, January 25, 2019 - link

    It's funny that a Google product is accusing something else of stealing your information when that's basically Alphabet's entire business model. Talk about hypocrisy!
  • close - Monday, January 28, 2019 - link

    Can't wait for someone from AT to come and explain how they have no control over which flavor of malware they are serving on their own website. Can't wait for Google to disable adblockers and then come to visit "reputable" malware distribution depots like AT (last time I accidentally disabled the adblocker on the AT page, I almost got cancer...).
  • PeachNCream - Tuesday, January 29, 2019 - link

    There is only so much that good writing and in-depth reporting can do to save a site like Anandtech from jumping onto its own advertising sword.
  • mode_13h - Thursday, January 24, 2019 - link

    To be clear, what happens is that your depth accuracy should be inversely proportional to distance.

    The main downsides of being passive are that you need texture and illumination. The specs say it needs at least 15 lux. Now, I'm not actually sure it gives you a depth image, in which case it doesn't need texture as long as there are some features it can find in its 163-degree FoV. In fact, I'm not even sure you can get any image out of it. They don't even say whether the sensors are color.

    One cool feature it has is the ability to re-localize to a place it's previously seen.

    > Intel® RealSense™ T265 can re-localize after kidnapping, providing there are some features in view which are in its internal map. In the case of a completely new environment, it will continue to provide relative pose data until absolute position can be re-established.
  • mode_13h - Thursday, January 24, 2019 - link

    Oops, meant to post that as a reply to @spikebike.
