Saturday, September 14, 2024

A Machine Vision Sensing Solution For Smart Applications


The multi-camera sensing chip boosts AI by merging sensor data, benefiting smart homes, automation, medical devices, and automotive industries.


eYs3D Microelectronics, a subsidiary of Etron Tech specializing in 3D sensing and computer vision, has introduced SenseLink, a multi-camera sensing system chip. The chip uses sensor fusion to combine data from multiple sensors, enhancing AI computing power and delivering visual sensing solutions for smart applications.
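To make the fusion idea concrete, here is a minimal, hypothetical sketch of confidence-weighted fusion of per-camera depth maps. It is a generic baseline shown for illustration only, not eYs3D's actual algorithm, and all names in it are assumptions.

```python
# Minimal sensor-fusion sketch (hypothetical, not eYs3D's actual algorithm):
# combine per-pixel depth estimates from several cameras into one map by
# weighting each estimate with its confidence, a common fusion baseline.
import numpy as np

def fuse_depth_maps(depth_maps, confidences):
    """Confidence-weighted average of per-camera depth maps.

    depth_maps:  list of HxW arrays, one per camera (metres)
    confidences: list of HxW arrays in [0, 1], one per camera
    """
    depth = np.stack(depth_maps)          # (N, H, W)
    conf = np.stack(confidences)          # (N, H, W)
    weights = conf / np.clip(conf.sum(axis=0), 1e-6, None)
    return (depth * weights).sum(axis=0)  # (H, W) fused depth

# Example: two cameras, 480x640 synthetic depth maps
d1, d2 = np.full((480, 640), 1.2), np.full((480, 640), 1.4)
c1, c2 = np.full((480, 640), 0.9), np.full((480, 640), 0.3)
fused = fuse_depth_maps([d1, d2], [c1, c2])
print(fused[0, 0])  # ~1.25, weighted toward the higher-confidence camera
```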

Smart home developers can enhance automation systems with it, industrial automation firms can integrate it to optimize processes and improve robotic vision, and medical device manufacturers can use it to upgrade diagnostic and surgical equipment. It is also applicable in the automotive industry for driver-assistance systems and autonomous vehicles, and in security and surveillance for monitoring and object tracking. AI and robotics research organizations, along with consumer electronics manufacturers, can apply the technology to develop new products and improve existing ones, making SenseLink useful across fields where visual AI sensing adds value.


SenseLink can receive inputs from up to seven cameras through its eCV5546 and eSP930 processors, allowing various camera configurations and orientations to suit different applications. The system includes three image signal processors (ISPs) for handling images of different resolutions and color specifications, and it supports extensive metadata to improve the accuracy of data processing and transmission across various sensor settings. SenseLink is expected to perform well in areas such as smart homes, industrial automation, and medical assistance.
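As an illustration of how such a seven-camera, three-ISP arrangement might be described in software, the sketch below models camera inputs and their ISP assignment. The class and field names are assumptions made for this example; the article does not describe SenseLink's actual SDK or register interface.

```python
# Hypothetical configuration sketch for a seven-camera array feeding three ISPs.
# The class and field names are illustrative assumptions, not SenseLink's real API.
from dataclasses import dataclass

@dataclass
class CameraInput:
    cam_id: int            # 0..6, up to seven physical inputs
    resolution: tuple      # (width, height)
    color_format: str      # e.g. "RAW10", "YUV422", "MONO8"
    isp_index: int         # which of the three ISPs processes this stream

def assign_isps(cameras):
    """Group camera inputs by the ISP that will process them."""
    pipelines = {0: [], 1: [], 2: []}
    for cam in cameras:
        if cam.isp_index not in pipelines:
            raise ValueError(f"camera {cam.cam_id}: only ISPs 0-2 exist")
        pipelines[cam.isp_index].append(cam)
    return pipelines

# Example: a stereo pair on ISP 0, a wide RGB camera on ISP 1, a mono IR camera on ISP 2
array = [
    CameraInput(0, (1280, 720), "RAW10", 0),
    CameraInput(1, (1280, 720), "RAW10", 0),
    CameraInput(2, (1920, 1080), "YUV422", 1),
    CameraInput(3, (640, 480), "MONO8", 2),
]
for isp, cams in assign_isps(array).items():
    print(f"ISP {isp}: {[c.cam_id for c in cams]}")
```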

In machine vision applications, SenseLink captures multiple images in quick succession, making it suitable for tasks that are sensitive to movement and fine detail. Its camera array can capture images at varying depths and distances, which is useful in optical inspection and automation, and its ability to provide a panoramic view without moving parts suits wide-area surveillance and object tracking.
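The panoramic capability can be illustrated with a generic stitching example using OpenCV's Stitcher API. This is a host-side sketch of combining frames from a static, outward-facing camera array, not SenseLink's on-chip pipeline, and the file names are hypothetical.

```python
# Generic multi-camera panorama sketch using OpenCV's Stitcher; it illustrates
# how a fixed camera array can produce a wide view without moving parts.
import cv2

def build_panorama(image_paths):
    """Stitch frames captured simultaneously by an outward-facing camera array."""
    frames = [cv2.imread(p) for p in image_paths]
    if any(f is None for f in frames):
        raise FileNotFoundError("one or more input frames could not be read")
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama

# Example usage with hypothetical file names:
# pano = build_panorama(["cam0.jpg", "cam1.jpg", "cam2.jpg"])
# cv2.imwrite("panorama.jpg", pano)
```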

The company plans to showcase the Sense and React Human-Machine Interaction Developer Interface, which combines large language models (LLMs) with convolutional neural network (CNN) sensing technology. The interface uses advanced sensing to anticipate and respond to environmental changes, letting devices interact naturally with users and making intelligent control more accessible across applications such as smart homes, industrial automation, and medical services.
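The sketch below illustrates the general "sense and react" pattern: a CNN-style detector produces structured observations, and an LLM turns them into a device action. The event format and the query_llm() placeholder are assumptions made for illustration; the article does not detail the actual developer interface.

```python
# Illustrative "sense and react" sketch: detector observations are summarised
# as text and handed to an LLM, which replies with a structured action.
# query_llm() and the event format are placeholders, not eYs3D's interface.

def query_llm(prompt: str) -> str:
    """Placeholder for a call to any large language model endpoint."""
    # In a real system this would call a hosted or on-device LLM.
    return '{"action": "turn_on_lights", "room": "living_room"}'

def describe_detections(detections):
    """Convert detector output into a compact textual scene description."""
    return "; ".join(
        f"{d['label']} at depth {d['depth_m']:.1f} m ({d['confidence']:.0%})"
        for d in detections
    )

def sense_and_react(detections):
    scene = describe_detections(detections)
    prompt = (
        "You control a smart-home hub. Given the scene below, reply with a "
        f"JSON action.\nScene: {scene}"
    )
    return query_llm(prompt)

# Example: detector reports a person entering a dark living room
events = [{"label": "person", "depth_m": 2.3, "confidence": 0.94}]
print(sense_and_react(events))
```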


Nidhi Agarwal
Nidhi Agarwal is a journalist at EFY. She is an electronics and communication engineer with over five years of academic experience. Her expertise lies in working with development boards and IoT cloud platforms. She enjoys writing because it lets her share her knowledge and insights on electronics with like-minded techies.
