One of the key objectives of automation is the ability to sense the surrounding environment. Among the various senses, vision is paramount: capturing and analysing images, calculating the distances of objects, recognising shapes and motion, and more.
To analyse the acquired data and communicate it to other systems effectively, these sensing functions must be integrated into a robust electronic system. One of the most popular results of such integration is the optical sensor.
Optical sensors are electronic devices that convert light, including infrared, into electronic signals. They are generally lightweight and offer a host of benefits, including flexibility, compactness and versatility. The optical sensor market has seen widespread adoption across myriad end-use industries owing to this broad application potential.
Optical sensors can be found in all manner of devices from smartphones to motion detectors and are used for a number of applications including smart heating, occupancy sensing, gesture recognition and more. Furthermore, the rise of automation in prominent industries such as healthcare, defence and automotive, to name a few, is encouraging key industry players to integrate optical sensors into their products.
Many optical sensors are built around optical fibres, which work on the principle of total internal reflection of light; low-loss optical fibre was developed by Corning Glass Works in 1970. GaAs semiconductor lasers for transmitting light through fibre-optic cables were developed during the same period.
Optical fibre cables receive input in the form of light beams, which represent the quantity to be measured at the sensor's input. The input is usually supplied by a light source such as an LED or a laser diode. The transmitted beam travels through the optical fibre with minimal loss and is dispersed at a 60-degree angle when emitted towards the target, forming the sensor's output.
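The total internal reflection mentioned above can be illustrated with a short calculation: light launched into the fibre core stays trapped whenever it strikes the core-cladding boundary beyond the critical angle, given by arcsin of the ratio of the two refractive indices. The indices below are typical illustrative values for silica fibre, not the specification of any particular cable:

```python
import math

# Critical angle for total internal reflection in an optical fibre:
#   theta_c = arcsin(n_cladding / n_core)
# Light meeting the core-cladding boundary at an angle steeper than
# theta_c is reflected back into the core rather than refracted out.

n_core = 1.475      # fibre core refractive index (illustrative value)
n_cladding = 1.460  # cladding, slightly lower index (illustrative value)

theta_c = math.degrees(math.asin(n_cladding / n_core))
print(f"critical angle: {theta_c:.1f} degrees")
```

The small index difference between core and cladding is what produces a critical angle close to 90 degrees, so only near-axial rays are guided along the fibre.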
Sensor systems in automotive applications are more pervasive than ever, with the autonomous car industry growing at breakneck speed. In fact, several Level 3 autonomous vehicles have already entered production, with Level 4 autonomous cars in the pipeline.
Furthermore, since 2018, automakers have been mandated to fit all new cars in the US with rear-view cameras. This indicates that the demand for sensors in the auto industry is robust and is expected to expand even further in the coming years.
As the demand for sophisticated sensors proliferates, so does the need for advanced driver-assistance systems (ADAS), which pair sensors with processors and central sensor-fusion units to analyse the vast amount of collected sensor data.
Driver assistance systems show a remarkable ability to enhance the productivity and safety of vehicles and are considered the future of mobility. It comes as no surprise, therefore, that accurate and resilient cameras, optical sensors and LiDAR systems are crucial elements of ADAS for a vast array of applications such as blind spot detection, lane departure warning, adaptive cruise control (ACC), reversing aid, traffic sign recognition and a wide-ranging 360° view.
Optical sensors used in automotive ADAS usually take the form of camera, LiDAR and radar systems. Using multiple sensor technologies reinforces the car's safety and offsets the limitations of each individual sensor.
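As a simple illustration of why a fusion unit combines readings from several sensors, the sketch below merges independent distance estimates from a camera, a LiDAR and a radar using inverse-variance weighting. The readings and variances are hypothetical numbers chosen for the example, not output from any real ADAS stack:

```python
# Toy sensor-fusion sketch: fuse independent distance estimates,
# weighting each by the inverse of its variance so that more precise
# sensors (here, the LiDAR) dominate the combined estimate.

def fuse(estimates):
    """estimates: list of (distance_m, variance) pairs from each sensor."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * d for (d, _), w in zip(estimates, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)  # fused variance is below any single sensor's
    return fused, fused_var

readings = [
    (25.4, 1.00),  # camera: noisier monocular depth estimate (hypothetical)
    (25.1, 0.04),  # LiDAR: very precise ranging (hypothetical)
    (24.9, 0.25),  # radar: good range, coarser resolution (hypothetical)
]
distance, variance = fuse(readings)
print(f"fused distance: {distance:.2f} m (variance {variance:.3f})")
```

Production fusion units use far richer models (Kalman filters, object-level tracking), but the core idea is the same: combining sensors yields an estimate more reliable than any one of them alone.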
For instance, OmniVision Technologies and Fullhan Microelectronics have jointly produced a sensor solution for capturing and analysing infrared (IR) and RGB automotive interior images using a single camera. This innovative solution brings together OmniVision's proprietary OV2778 2MP RGB-IR image sensor and Fullhan's exclusive FH8310 image signal processor (ISP).
The two components work together to create a sensor system that provides high-quality video for viewing and in-cabin monitoring applications, as well as an integrated platform for facial recognition, detection of unattended children and objects, remote supervision, and recording for robo-taxi and ride-hailing services.
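To give a flavour of what a single-camera RGB-IR pipeline does conceptually, the toy sketch below splits a stream of four-sample pixels into a colour frame and an infrared frame. This is purely illustrative, assuming a simple (R, G, B, IR) per-pixel layout; it is not the actual OV2778/FH8310 processing chain:

```python
# Toy RGB-IR separation: one sensor frame carries both visible-light
# and infrared samples; an ISP-style step splits them into two frames,
# one for colour viewing and one for IR-based in-cabin monitoring.

def split_rgb_ir(frame):
    """frame: list of rows, each row a list of (r, g, b, ir) pixels."""
    rgb = [[(r, g, b) for r, g, b, ir in row] for row in frame]
    ir = [[ir for r, g, b, ir in row] for row in frame]
    return rgb, ir

# A tiny 2x2 example frame with made-up sample values.
frame = [
    [(200, 180, 170, 90), (10, 12, 11, 250)],
    [(55, 60, 58, 120), (230, 225, 220, 30)],
]
rgb_frame, ir_frame = split_rgb_ir(frame)
print(rgb_frame[0][1])  # colour pixel without the IR sample
print(ir_frame[0][1])   # the IR sample alone
```

Because both frames come from the same sensor, the colour and IR views are inherently aligned, which is what lets one camera serve both daytime viewing and low-light monitoring duties.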