IBM says it has developed the world’s first 2nm chipmaking technology, which should bring power and speed improvements to future computer chips.
The industry has been developing ever smaller chip designs. The most modern chipsets used by the likes of Samsung, Apple and Qualcomm are now built on 5nm technology.
IBM’s breakthrough represents another step on the quest for miniaturization. It is projected to achieve 45 percent higher performance, or 75 percent lower energy use, than today’s most advanced 7nm node chips.
The technology firm said the potential benefits of the chips could include a quadrupling of cell phone battery life and a sharp reduction in the carbon footprint of data centers, which account for one percent of global energy use, a figure predicted to rise dramatically over the coming years. Switching all of their servers to 2nm-based processors could reduce that number significantly, IBM said.
“The IBM innovation reflected in this new 2nm chip is essential to the entire semiconductor and IT industry,” said Darío Gil, director of IBM Research. “It is the product of IBM’s approach of taking on hard tech challenges and a demonstration of how breakthroughs can result from sustained investments and a collaborative R&D ecosystem approach.”
Fitting more transistors on a chip also means processor designers have more options to add innovative, specialized components to improve capabilities for specific applications like AI and cloud computing, as well as new pathways for hardware-enforced security and encryption.
IBM has achieved a number of semiconductor breakthroughs over the years, including the first implementations of 7nm and 5nm process technologies and single-cell DRAM.
The technology will likely take several years to come to market. IBM used to be a major manufacturer of chips, but it now outsources its high-volume chip production to Samsung while maintaining a chip manufacturing research center in Albany, New York that produces test runs of chips.
The advance helps secure the continuation of Moore’s law, an observation first made by Intel co-founder Gordon Moore in 1965 that the number of transistors in a dense integrated circuit (IC) would double about every two years. While the law has largely held up to now, innovation in this area is slowing as transistor sizes approach the near-atomic scale. In 2017, the director of future silicon technology for ARM Research told E&T that time was running out for Moore’s law and predicted that advancements in chip technologies would soon begin to slow.