According to the new research report published by The Insight Partners, titled “3D Time-of-Flight (ToF) Sensor Market - Global Analysis and Forecast to 2027”, the global 3D time-of-flight sensor market is expected to reach US$ 569.0 Mn by 2027, registering a CAGR of 6.3% during the forecast period 2019-2027.
In 2019, Europe is estimated to hold the largest market share, growing at a CAGR of 5.9%.
In recent years, the 3D ToF sensor has emerged as a promising three-dimensional (3D) sensing technology that can be manufactured economically in a compact form factor. However, current state-of-the-art ToF sensors suffer from low spatial resolution due to physical limitations in the fabrication process.
A depth map provides a 3D model of the visible surfaces in a scene, which makes it very useful in many areas of interest, such as robotics, video gaming, or biomedical imaging. These applications have helped propel companies like Oculus (virtual reality), Snapchat (augmented reality), and Tesla (autonomous driving) to prominence. Many approaches have been proposed in the past to acquire a depth map, for instance, light detection and ranging (LIDAR) devices, structured light, or stereo cameras. One of the most modern techniques is the ToF sensor, which captures both an intensity image and a depth map of the scene at the same time. Three-dimensional (3D) imaging using ToF sensors has rapidly gained widespread adoption in many applications due to their cost-effectiveness, simplicity, and compact size. However, the current generation of ToF cameras (e.g., the Microsoft Kinect sensor, 640 × 480 pixels) suffers from low spatial resolution compared to conventional CCD/CMOS sensors (easily larger than 1000 × 1000 pixels) due to physical limitations in the fabrication process.
ToF is an active imaging technique that encodes both the intensity and the depth information of a scene into each captured image. Both the light source and the shutter of the ToF camera are amplitude-modulated, typically at the same frequency. The modulated light travels through space, is reflected by an object, and then reaches the sensor pixel. The received light retains the modulation frequency but arrives phase-delayed and attenuated; the phase delay encodes the round-trip distance to the object.
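As an illustration only (not taken from the report), the sketch below shows how a continuous-wave ToF pixel is commonly demodulated into a depth value. It assumes the widely used four-sample scheme (correlation samples taken at 0°, 90°, 180°, and 270° phase offsets) and a hypothetical 20 MHz modulation frequency; real sensors differ in sign conventions and calibration.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s


def tof_depth(a0, a1, a2, a3, f_mod=20e6):
    """Recover depth from four phase-stepped correlation samples of a
    continuous-wave ToF pixel (samples at 0, 90, 180, 270 degrees).

    f_mod is the modulation frequency in Hz (20 MHz is an assumption
    for illustration). Inputs may be scalars or NumPy arrays (one
    value per pixel).
    """
    # Phase delay of the returning modulated light, wrapped to [0, 2*pi)
    phase = np.arctan2(a3 - a1, a0 - a2) % (2 * np.pi)
    # Amplitude of the reflected modulation (signal strength)
    amplitude = 0.5 * np.sqrt((a3 - a1) ** 2 + (a0 - a2) ** 2)
    # Background intensity (ambient light plus DC offset)
    intensity = 0.25 * (a0 + a1 + a2 + a3)
    # Phase maps to distance: d = c * phase / (4 * pi * f_mod)
    # (the factor 4*pi accounts for the round trip of the light)
    depth = C * phase / (4 * np.pi * f_mod)
    return depth, amplitude, intensity
```

Because the phase wraps every 2π, such a scheme has an unambiguous range of c / (2 · f_mod), i.e., about 7.5 m at the assumed 20 MHz; this is one reason commercial cameras trade modulation frequency against range and precision.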
Key findings of the study:
In 2018, Asia Pacific accounted for the second-largest share of the global 3D time-of-flight sensor market and is projected to register the highest CAGR from 2019 to 2027. The primary reason is that, as autonomous driving technology grows, demand for ToF sensors is also set to increase. ToF sensors are expected to play a significant role in Level 4 autonomous vehicles, where the need for in-cabin passenger and driver status sensing is at its peak; at this level of autonomy, ToF sensors are expected to be used to determine whether the driver's attention is on the road. The presence of a strong automotive manufacturing industry in the Asia Pacific region provides a further platform for the growth of 3D ToF sensors.
European brands such as Audi, BMW, and Mercedes are pioneers in adopting gesture recognition for infotainment controls. The advanced driver assistance systems (ADAS) offered in these brands' vehicles also feature ToF sensors for in-cabin sensing of passengers and drivers. Germany is also home to Kuka AG, one of the largest producers of industrial robots, which has adopted ToF sensors as well, providing further growth opportunities in the European market.
In 2018, the VGA resolution sensor segment accounted for the largest share of the 3D time-of-flight sensor market. VGA resolution, i.e., 640 × 480 pixels, is the most preferred sensor type for automotive as well as consumer applications (e.g., the Microsoft Kinect sensor, 640 × 480 pixels). Automotive applications include in-cabin passenger and driver sensing, which determines whether the driver is paying attention to the road ahead.
Contact Us
Contact Person: Sameer Joshi
Phone: +1-646-491-9876
Email Id: sales@reportsweb.com