IBM Has Figured Out How to Store Data on a Single Atom


Big things really can come in small packages.

IBM announced it has managed to successfully store data on a single atom for the first time. The research, carried out at the computing giant’s Almaden lab in Silicon Valley, was published in the scientific journal Nature March 8, and could have massive implications for the way we’ll store digital information in the future.

Computers process bits, pieces of information that have two states, on or off, which the machine interprets as 1s and 0s. Every computer program, tweet, email, and Facebook or Quartz post is made up of some long series of 1s and 0s. When information is stored on a computer, it's generally saved on a hard drive that encodes that same series of 1s and 0s in magnetic regions on a disk or in electrical cells. As IBM states in its release, the average hard drive uses about 100,000 atoms to store a single bit of information with traditional methods.
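
As a purely illustrative aside (not part of IBM's research or release), the short Python sketch below shows how a piece of text reduces to a series of 1s and 0s, and how many atoms a conventional drive would need to hold it at roughly 100,000 atoms per bit versus one atom per bit.

```python
# Illustrative only: a short piece of text is ultimately just a series of bits,
# and a conventional drive spends roughly 100,000 atoms on each one of them.
message = "Hello, Quartz"
bits = "".join(f"{byte:08b}" for byte in message.encode("utf-8"))

print(bits)                        # a run of 1s and 0s, eight per character
print(len(bits), "bits")
print(len(bits) * 100_000, "atoms on a conventional drive (~100,000 atoms per bit)")
print(len(bits), "atoms if each bit could live on a single atom")
```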

IBM’s researchers found a way to magnetize individual atoms of the rare-earth element holmium and use the two poles of magnetism (north and south, as on a compass) as stand-ins for the 1s and 0s. The holmium atoms sit on a surface of another material, magnesium oxide, which holds them in place at a chilly 5 kelvin (about -450°F).

Using what is essentially a very small, sharp, and precise needle, the researchers can pass an electrical current through the holmium atoms, which causes their north and south poles to flip, replicating the process of writing information to a traditional magnetic hard drive.

The atoms stay in whatever state they’ve been flipped into, and by measuring the magnetism of the atoms at a later point, the scientists can see what state each atom is in, mirroring the way a computer reads information it has stored on a hard drive.
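
To make the write/read cycle concrete, here is a toy model in Python. The class name and its behavior are assumptions for illustration only, not IBM's actual apparatus: each "atom" simply remembers a magnetic orientation, a current pulse flips it, and reading returns the stored orientation as a 1 or 0.

```python
# Toy model of the write/read cycle described above (illustrative, not IBM's experiment).
class HolmiumAtomModel:
    def __init__(self, orientation="north"):
        self.orientation = orientation        # stands in for the atom's magnetic pole

    def apply_current_pulse(self):
        """Writing: a current pulse from the microscope tip flips the pole."""
        self.orientation = "south" if self.orientation == "north" else "north"

    def read(self):
        """Reading: sense the magnetic state without changing it."""
        return 1 if self.orientation == "north" else 0

# Store the bit pattern 1, 0, 1 on three model atoms.
atoms = [HolmiumAtomModel("south") for _ in range(3)]
target_bits = [1, 0, 1]
for atom, bit in zip(atoms, target_bits):
    if atom.read() != bit:
        atom.apply_current_pulse()            # flip only where the stored state differs

print([atom.read() for atom in atoms])        # [1, 0, 1]
```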

IBM says the researchers used a single iron atom, acting like a tiny compass, to measure the magnetic field of the holmium atoms and read which state they were in, along with a scanning tunneling microscope, a powerful instrument developed by IBM (an invention that won Gerd Binnig and Heinrich Rohrer the Nobel Prize in Physics in 1986) that can image surfaces at the scale of individual atoms. The microscope’s needle tip is what the researchers used to pass current through the atoms.

“Magnetic bits lie at the heart of hard-disk drives, tape and next-generation magnetic memory,” Christopher Lutz, nanoscience researcher at IBM’s Almaden lab, said in a release. “We conducted this research to understand what happens when you shrink technology down to the most fundamental extreme—the atomic scale.”

While the feat is exceedingly impressive, much like IBM’s 2015 announcement that it had created a minuscule semiconductor likely to eventually form the backbone of the smallest, fastest computer processors in the world, it’s only the beginning of the work. This is just the first step in proving what might be possible with atomic-level computing; researchers, and later chip manufacturers, still need to show the technology can be scaled.

A future where vastly more capacious hard drives are commonplace (IBM envisions the entirety of iTunes’ 35 million-song music library fitting on a drive the size of a credit card) would make computers, phones, drones, and just about anything else that needs to store information considerably thinner and lighter. Now all that remains is to see whether that’s feasible, and affordable.
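
A rough back-of-envelope calculation gives a sense of the scale of that claim. The per-song size below is an assumption for illustration, not a figure from IBM's release.

```python
# Back-of-envelope only: the 4 MB-per-song figure is an assumed average for
# compressed audio. It shows roughly how many bits 35 million songs represent,
# and how many atoms that needs at one atom per bit versus ~100,000 atoms per bit.
songs = 35_000_000
bytes_per_song = 4 * 1024**2             # assume ~4 MB per song
total_bits = songs * bytes_per_song * 8

print(f"{total_bits:.2e} bits total (~147 TB)")
print(f"{total_bits:.2e} atoms at one atom per bit")
print(f"{total_bits * 100_000:.2e} atoms on a conventional drive")
```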

But given IBM’s 19 consecutive quarters of declining revenue, anything that could reinvigorate the company and return it to its former heights may prove existentially important. Earlier this week, IBM turned another of its long-term research projects, quantum computing, into its own business unit, in the hopes of finding clients willing to dream with Big Blue about how powerful quantum computers could one day be, even if they aren’t yet.