Researchers have developed a new memory device that uses the positions of atoms, rather than the number of electrons, to represent information.
Artificial intelligence (AI), machine learning (ML), and neural networks are all software, and they are limited by the hardware on which they run. Hardware remains a bottleneck no matter how good the software becomes. With the introduction of new materials and advanced device technologies, however, this can change.
A team of researchers from the University of Southern California and the University of Massachusetts has developed a protocol that reduces device-level "noise" and demonstrated its practicality in integrated chips. The demonstration was carried out at TetraMem, a startup co-founded by Yang and his co-authors (Miao Hu, Qiangfei Xia, and Glenn Ge) to commercialize AI acceleration technology.
The new memory chip has the highest reported information density per device, at 11 bits, of any known memory technology to date. Such small but powerful devices could play a critical role in bringing enormous computing power to the devices in our pockets. The chips serve not only as memory but also as processors: millions of these devices in a single small chip, working in parallel, could rapidly run AI tasks while drawing only the power of a small battery.
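As a rough illustration of what 11 bits per device means, each device must hold one of 2^11 = 2048 distinguishable states. The Python sketch below shows how an 11-bit code could map onto a set of evenly spaced analog conductance levels; the conductance window used here is a purely hypothetical placeholder, not a value reported in the paper.

```python
# Illustrative sketch only: 11 bits per device implies 2**11 = 2048
# distinguishable states. The conductance window below is an assumed
# placeholder, not a figure from the Nature paper.

BITS_PER_DEVICE = 11
levels = 2 ** BITS_PER_DEVICE          # 2048 distinguishable states
g_min, g_max = 1e-6, 1e-4              # assumed conductance window (siemens)
step = (g_max - g_min) / (levels - 1)  # spacing between adjacent levels

def encode(code: int) -> float:
    """Map an 11-bit code (0..2047) to a target conductance in siemens."""
    assert 0 <= code < levels
    return g_min + code * step

print(f"{levels} levels, step = {step:.3e} S")
print(f"code 1234 -> target conductance {encode(1234):.3e} S")
```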
The new device combines silicon with metal-oxide memristors to create powerful but energy-efficient chips. The technique uses the positions of atoms to represent information, rather than the number of electrons, which is how today's chips perform computation. The positions of the atoms offer a compact and stable way to store more information in an analog, rather than digital, fashion. Moreover, the information can be processed where it is stored instead of being sent to one of a few dedicated processors, eliminating the so-called "von Neumann bottleneck" of current computing systems.
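One common way this kind of analog, in-memory processing is explained for memristor arrays, though not necessarily the exact scheme of this particular chip, is analog matrix-vector multiplication: stored conductances act as the weight matrix, applied voltages act as the input vector, and the currents summed on each output line give the result in place. The Python sketch below models that idea numerically, with assumed conductance and voltage ranges.

```python
import numpy as np

# Sketch of crossbar-style in-memory computation (assumed parameters).
# Stored conductances G (siemens) encode a weight matrix. Applying a
# voltage vector V to the rows yields column currents I = G.T @ V
# (Ohm's law plus Kirchhoff's current law), i.e. a matrix-vector
# product computed where the data is stored.

rng = np.random.default_rng(0)
rows, cols = 4, 3
G = rng.uniform(1e-6, 1e-4, size=(rows, cols))  # stored conductances (assumed range)
V = rng.uniform(0.0, 0.2, size=rows)            # input voltages (assumed range)

I = G.T @ V  # one output current per column: the analog dot products

print("column currents (A):", I)
```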
Because the method relies on the positions of atoms rather than electrons, it does not require battery power to maintain stored information. The same property matters for AI computations, where a stable memory with high information density is crucial. With further development, the innovation could put the power of a mini version of ChatGPT in everyone's personal device, making such high-powered technology more affordable and accessible for all sorts of applications.
Reference: Mingyi Rao et al., "Thousands of conductance levels in memristors integrated on CMOS," Nature (2023). DOI: 10.1038/s41586-023-05759-5