Intel Announces New Class of RealSense Stand-Alone Inside-Out Tracking Camera
Published: Jan 24, 2019

Intel introduced the Intel RealSense Tracking Camera T265, a new class of stand-alone inside-out tracking device that will provide developers with a powerful building block for autonomous devices, delivering high-performance guidance and navigation. The T265 uses proprietary visual inertial odometry simultaneous localization and mapping (V-SLAM) technology with computing at the edge, and is key for applications that require a highly accurate and low-latency tracking solution, including robotics, drones, augmented reality (AR) and virtual reality (VR).
“Understanding your environment is a critical component for many devices. The T265 was designed to complement our existing Intel RealSense Depth Cameras and provide a quick path to product development with our next-generation integrated V-SLAM technology,” said Sagi Ben Moshe, vice president and general manager of the Intel RealSense Group.
The Intel RealSense Tracking Camera T265 is powered by the Intel Movidius Myriad 2 vision processing unit (VPU), which handles all the data processing necessary for tracking directly on the device. This makes the T265 a small-footprint, low-power solution that developers can easily integrate into existing designs or use as the foundation for their own intellectual property requiring rich visual intelligence.
The Intel RealSense Tracking Camera T265 is well suited to applications where tracking the location of a device is important, especially in places without GPS service, such as warehouses or remote outdoor areas; there, the camera uses a combination of known and unknown data to navigate accurately to its destination. The T265 is also designed for flexible implementation and can easily be added to small-footprint mobile devices such as lightweight robots and drones, or connected to mobile phones and AR headsets.
For example, integrating the T265 into a robot designed for agriculture allows the device to navigate fields in a precise lawn-mower-style pattern and intelligently adapt to avoid obstacles in its environment, including structures or people. The T265 can also be used for drone or robotic deliveries, whether bringing medical supplies to remote, off-the-grid areas or to a lab inside a hospital ward, thanks to its wide field of view and its optimization for tracking use cases.
The Intel RealSense Tracking Camera T265 uses inside-out tracking, which means the device does not rely on any external sensors to understand the environment. Unlike other inside-out tracking solutions, the T265 delivers 6-degrees-of-freedom (6DoF) inside-out tracking by gathering inputs from two onboard fish-eye cameras, each with an approximately 170-degree field of view. The V-SLAM system constructs and continually updates a map of an unknown environment along with the location of the device within that environment. Since all position calculations are performed directly on the device, tracking with the T265 is platform independent and allows the T265 to run on very low-compute host devices.
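For developers who want to experiment, the librealsense SDK exposes the T265 pose stream through its standard pipeline API. The minimal sketch below, using the Python bindings (pyrealsense2), polls the pose stream and prints the reported translation; it assumes an SDK version with T265 support and a camera connected over USB.

```python
import pyrealsense2 as rs

# Configure a pipeline that streams 6DoF pose data from the T265.
pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)

pipe.start(cfg)
try:
    for _ in range(100):                     # read a short burst of frames
        frames = pipe.wait_for_frames()
        pose = frames.get_pose_frame()
        if pose:
            data = pose.get_pose_data()      # translation, rotation, velocity, ...
            print("Frame #{}: position x={:.3f} y={:.3f} z={:.3f} m".format(
                pose.frame_number,
                data.translation.x, data.translation.y, data.translation.z))
finally:
    pipe.stop()
```

Because the pose is computed entirely on the camera, the host only has to read out a lightweight stream of position and orientation values, which is what allows very low-compute devices to use it.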
The T265 complements Intel’s RealSense D400 series cameras, and the data from both devices can be combined for advanced applications like occupancy mapping, improved 3D scanning and advanced navigation and collision avoidance in GPS-restricted environments. The only hardware requirements are sufficient non-volatile memory to boot the device and a USB 2.0 or 3.0 connection that provides 1.5 watts of power.
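As a rough illustration of how pose and depth data might be combined, the sketch below transforms a 3D point measured in a D400 depth camera's frame into the T265's world frame using the reported 6DoF pose. The extrinsics `R_t265_depth` and `t_t265_depth` between the two rigidly mounted cameras are hypothetical placeholders that would have to be calibrated for a real rig; the quaternion-to-matrix conversion itself is standard.

```python
import numpy as np

def quat_to_matrix(x, y, z, w):
    """Convert a unit quaternion (x, y, z, w) into a 3x3 rotation matrix."""
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])

def depth_point_to_world(point_depth, pose_translation, pose_rotation,
                         R_t265_depth, t_t265_depth):
    """Map a 3D point from the depth camera frame into the T265 world frame.

    point_depth      -- (3,) point from the D400, in metres, in the depth camera frame
    pose_translation -- (3,) T265 translation in the world frame
    pose_rotation    -- (x, y, z, w) T265 orientation quaternion
    R_t265_depth, t_t265_depth -- hypothetical, pre-calibrated extrinsics
                                  from the depth camera frame to the T265 body frame
    """
    # Depth camera frame -> T265 body frame (rig-specific calibration).
    p_body = R_t265_depth @ np.asarray(point_depth) + np.asarray(t_t265_depth)
    # T265 body frame -> world frame, using the reported pose.
    R_world_body = quat_to_matrix(*pose_rotation)
    return R_world_body @ p_body + np.asarray(pose_translation)
```

Registering depth points into a common world frame in this way is the basic step behind the occupancy mapping and collision-avoidance use cases mentioned above.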