IBM is having a go at Moore’s Law
Andy Ruffell / 13 years ago
On Thursday, IBM announced that its scientists have managed to cut the physical requirements for a bit of data, reducing the number of atoms needed to store a single bit from about one million to just 12.
Such a reduction would mean far higher densities in the same physical space, making today’s 1TB drives look obsolete while capacities of 100TB or even 150TB become mainstream.
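To line the headline figures up (a back-of-the-envelope sketch using only the numbers quoted here, not anything from IBM’s paper):

\[
\frac{10^{6}\ \text{atoms per bit}}{12\ \text{atoms per bit}} \approx 83{,}000\,,
\qquad
\frac{100\ \text{TB}}{1\ \text{TB}} = 100\,.
\]

The per-bit atom count falls by a factor of tens of thousands, while the projected capacity gain is a more modest factor of 100; presumably spacing, read-out and stability constraints keep bits further apart than the raw atom count alone would allow.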
In the research, IBM used antiferromagnetism to achieve memory roughly 100 times denser than today’s. Antiferromagnetism is an ordering of the magnetic moments of atoms or molecules in which neighbouring spins point in opposite directions; current devices, by contrast, use ferromagnetic materials, in which neighbouring spins align in parallel.
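For anyone who wants the distinction in symbols (a textbook sketch, not taken from IBM’s announcement): in the nearest-neighbour Heisenberg model the exchange energy is

\[
E = J \sum_{\langle i,j \rangle} \mathbf{S}_i \cdot \mathbf{S}_j\,,
\]

where, in one common sign convention, $J > 0$ makes antiparallel neighbouring spins energetically favourable (antiferromagnetic order), while $J < 0$ favours the parallel alignment of a ferromagnet. The appeal for storage is that antiparallel neighbours largely cancel each other’s stray magnetic field, which is what lets antiferromagnetic bits be packed so much more tightly without disturbing one another.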
The use of antiferromagnetism does come with a problem: above a certain temperature, known as the Néel temperature, a bit needs considerably more than 12 atoms. This is a fairly minor issue, though, as the result would still be well ahead of current technology.
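For a sense of where that temperature comes from (a standard mean-field estimate, not a figure from the article), the Néel temperature scales with the strength of the exchange coupling:

\[
k_B T_N \approx \frac{z\,|J|\,S(S+1)}{3}\,,
\]

with $z$ the number of nearest neighbours and $S$ the atomic spin. Above $T_N$, thermal fluctuations destroy the antiferromagnetic order, which is why a stable bit at higher temperatures needs more, or more strongly coupled, atoms.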
For the experiments, IBM used iron atoms on a copper nitride surface, but it is said that other materials could theoretically do even better, in other words, use fewer atoms per bit.
IBM researcher Andreas Heinrich said:
“Moore’s Law is basically the drive of the industry to shrink components down little by little and then solve the engineering challenges that go along with that but keeping the basic concepts the same. The basic concepts of magnetic data storage or even transistors haven’t really changed over the past 20 years (…) The ultimate end of Moore’s Law is a single atom. That’s where we come in.”
For more information, please refer to this article.
Source: Fudzilla