Friday, November 22, 2024

World’s First Brain-Inspired Complementary Vision Chip


This innovation overcomes traditional performance bottlenecks, ensuring system stability and safety in extreme scenarios, and marks a significant breakthrough in visual sensing for autonomous systems.

With the rapid advancement of artificial intelligence, unmanned systems such as autonomous driving and embodied intelligence are revolutionizing technology and industry. Visual perception, essential for information acquisition, is crucial in these intelligent systems. However, achieving efficient, precise, and robust visual perception in dynamic, diverse, and unpredictable environments is challenging. In open-world scenarios, intelligent systems must process vast amounts of data and handle extreme events, such as sudden dangers, drastic light changes at tunnel entrances, and strong flash interference at night. Traditional visual sensing chips, constrained by the “power wall” and the “bandwidth wall,” often suffer distortion, failure, or high latency, compromising system stability and safety.

To address these challenges, the Center for Brain-Inspired Computing Research (CBICR) at Tsinghua University has focused on brain-inspired vision sensing technologies and proposed an innovative complementary sensing paradigm comprising a primitive-based representation and two complementary visual pathways. Inspired by the fundamental principles of the human visual system, this approach decomposes visual information into primitive-based visual representations. By combining these primitives, it mimics the features of the human visual system, forming two complementary and information-complete visual perception pathways.


New Standards in Visual Sensing Technology

Based on this new paradigm, CBICR has developed the world’s first brain-inspired complementary vision chip, “Tianmouc”. This chip achieves high-speed visual information acquisition at 10,000 frames per second, 10-bit precision, and a high dynamic range of 130 dB, while reducing bandwidth by 90% and maintaining low power consumption. It overcomes the performance bottlenecks of traditional visual sensing paradigms and efficiently handles various extreme scenarios, ensuring system stability and safety.
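To put the 130 dB figure in context, dynamic range in image sensing is conventionally expressed as 20·log₁₀ of the ratio between the largest and smallest detectable signal. The short sketch below (illustrative arithmetic only, not code from the Tianmouc project) shows what that ratio implies, and why a single 10-bit linear readout alone cannot cover it:

```python
import math

def dynamic_range_db(max_signal: float, min_signal: float) -> float:
    """Dynamic range in decibels, using the 20*log10 amplitude convention."""
    return 20 * math.log10(max_signal / min_signal)

# A 130 dB dynamic range corresponds to a max/min signal ratio of
# 10**(130/20), roughly 3.16 million to one.
ratio = 10 ** (130 / 20)
print(f"130 dB ratio: {ratio:.3g}")

# An ideal 10-bit linear quantizer by itself spans only
# 20*log10(2**10) ~ 60 dB, far short of 130 dB -- one reason scene
# coverage must come from the sensing scheme, not bit depth alone.
print(f"10-bit span: {dynamic_range_db(2**10, 1):.1f} dB")
```

This is generic sensor arithmetic; how Tianmouc’s two complementary pathways actually combine to reach 130 dB is detailed in the team’s publication, not here.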

Leveraging the Tianmouc chip, the team has developed high-performance software and algorithms and validated their performance on a vehicle-mounted perception platform running in open environments. In various extreme scenarios, the system demonstrated low-latency, high-performance real-time perception, showcasing its potential for applications in the field of intelligent unmanned systems. The successful development of Tianmouc is a significant breakthrough in visual sensing chips. It provides strong technological support for the advancement of the intelligent revolution and opens new avenues for applications such as autonomous driving and embodied intelligence. Combined with CBICR’s established technological foundation in brain-inspired computing chips like “Tianjic”, toolchains, and brain-inspired robotics, the addition of Tianmouc will further enhance the brain-inspired intelligence ecosystem, driving the progress of artificial general intelligence.

This research was supported by the STI2030-Major Projects, the National Natural Science Foundation of China, and the IDG/McGovern Institute for Brain Research at Tsinghua University.

Akanksha Gaur
Akanksha Sondhi Gaur is a journalist at EFY. She holds a German patent and brings seven years of industrial and academic experience to the table. Passionate about electronics, she has authored numerous research papers showcasing her expertise and keen insight.
