IBM and Samsung unveil breakthrough microchip design

Want your smartphone battery to last a week? This could make it happen.

This article is an installment of Future Explored, a weekly guide to world-changing technology. You can get stories like this one straight to your inbox every Thursday morning by subscribing here.

If you didn’t know semiconductor chips were important before the pandemic, you do now.

Also known as “microchips,” these little bits of tech form the brains behind new cars, phones, computers, gaming consoles, and countless other products we rely on every day (which are now all abruptly in short supply).

Now, IBM and Samsung have unveiled a new microchip design that promises radically faster, more efficient chips — and a big upgrade to all of our tech that relies on them.

The challenge: Semiconductor chips contain tiny gates called “transistors” that control the flow of electricity. This is what allows a traditional computer to process information, using a binary on/off code. (If a gate’s open, that’s a “1.” If it’s closed, that’s a “0.”)
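The gate analogy can be sketched in a few lines of Python. This is a toy illustration of the encoding idea only, not how chips are actually programmed: each transistor acts as a switch, and a row of switches reads out as a binary number.

```python
# Toy illustration: a row of transistor "gates" read as a binary number.
# Each gate is either open (1, conducting) or closed (0).
def gates_to_number(gates):
    """Interpret a list of open/closed gates as an unsigned binary integer."""
    value = 0
    for gate_is_open in gates:
        value = (value << 1) | (1 if gate_is_open else 0)
    return value

# Eight gates -- one byte. The pattern open/closed/open/... is 0b10100110.
print(gates_to_number([True, False, True, False, False, True, True, False]))
```

Billions of such switches, flipping billions of times per second, are what turn that simple on/off code into useful computation.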

To make faster, more powerful computers, we need microchips with more transistors, but to keep the chips, well, micro, we’ve had to keep packing more and more transistors into the same area.


The end of the line: In 1975, semiconductor pioneer Gordon Moore famously predicted that we’d be able to double the number of transistors on microchips every two years, and that projection — “Moore’s Law” — proved accurate for decades, thanks to heroic efforts by engineers.
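Moore's prediction is simply exponential growth: transistor counts double every two years. A quick back-of-the-envelope calculation shows how fast that compounds (the starting figure, the roughly 2,300 transistors on Intel's 1971 chip, the 4004, is illustrative):

```python
# Moore's Law as compounding: double the transistor count every 2 years.
def transistors_after(start_count, years, doubling_period=2):
    """Project a transistor count forward under Moore's Law."""
    return start_count * 2 ** (years // doubling_period)

# ~2,300 transistors in 1971, compounded over 50 years of doubling:
print(f"{transistors_after(2300, 50):,}")  # tens of billions
```

Fifty years of doubling turns thousands of transistors into tens of billions, which is roughly where today's flagship chips sit.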

We’re now hitting a wall in the number of transistors we can pack onto a microchip, though. Making them smaller is becoming physically impossible — the ones we have today are so small that 1,000 of them laid end-to-end are only the width of a human hair.

“The fact is, when they’re talking about building physical entities like semiconductors and transistors, you can only go so small,” Robert Sutor, chief quantum exponent at IBM, told Electronic Products.

“Atoms have a certain size, and you’re approaching the almost atomic molecular distances at this point,” he continued. “You’re limited by that.”


The idea: IBM and Samsung’s new design stacks transistor components vertically on a microchip, with electricity flowing up and down, rather than the traditional horizontal design, with electricity flowing from side to side.

They call their approach “Vertical-Transport Nanosheet Field Effect Transistor” (VTFET), and they say it could not only allow for more transistors on a microchip, but also make the flow of electricity through them more efficient.

https://youtu.be/OF3Zwfu6Ngc

Compared to today’s standard transistor design (called FinFET), devices with VTFET chips would have a “two times improvement in performance or an 85 percent reduction in energy use,” according to IBM.

That means batteries could last way longer (or be much smaller), and computers could produce much less waste heat, which impacts performance and requires loud fans to keep the system cool. 

In other words: smartphone batteries that last a week, rather than a day.
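That “week instead of a day” figure follows directly from IBM’s number: if the chip uses 85% less energy for the same work, the same battery lasts about 1/(1 − 0.85) ≈ 6.7 times longer. This is a rough sketch that ignores the screen, radios, and everything else that drains a phone:

```python
# Rough sketch: how an 85% cut in chip energy use stretches battery life.
# Ignores everything else that drains a phone (screen, radios, sensors).
def battery_life_multiplier(energy_reduction):
    """energy_reduction = 0.85 means the chip uses 85% less energy."""
    return 1 / (1 - energy_reduction)

multiplier = battery_life_multiplier(0.85)
print(f"{multiplier:.1f}x")  # a one-day battery stretches toward a week
```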


The bigger picture: IBM and Samsung haven’t said when they plan to start producing chips with the stacked transistor design, so we don’t know when the chips might actually start showing up in your phone or watch. It typically takes a few years to get from the drawing board to mass production, and for manufacturers to build devices capable of using the new chips.

However, the stacked design and more densely packed transistors are not the only ways we’re going to keep computer power growing in the decades to come.

“Scaling silicon is not the only way to get lower power and better performance,” Valeria Bertacco, director of the University of Michigan’s Applications Driving Architectures research center, told EE Times in 2019. “It was just the easiest way until 10 years ago.”


Another way is to specialize microchips for specific tasks and divide the computer’s labor.

“Hardware customized for particular domains can be much more efficient and use far fewer transistors, enabling applications to run tens to hundreds of times faster,” MIT researcher Tao Schardl told MIT News in 2020.

More efficient software and better algorithms could also allow progress to continue even if transistor counts stop climbing. Replacing silicon in microchips with other materials, such as graphene, might unleash big performance upgrades, too.

Then there’s the ultimate game-changer: quantum computers, which aim to do away with transistor-laden microchips entirely. Instead, they encode information in entangled particles that can represent 1s, 0s, or both simultaneously, which dramatically increases the computer’s potential processing power.

But today’s silicon microchip electronics aren’t going to be replaced by quantum versions or even graphene any time soon, so in the near-term, we’ll have to rely on clever new designs to grind out another round of Moore’s Law improvements.

