We have a data problem.
Humanity is now generating more data than it can handle; more sensors, smartphones, and devices of all types are coming online every day and contributing to the ever-growing global dataset.
In fact, estimates for the amount of data we will generate this year alone are hovering around 40 zettabytes (or about 2.5 billion times more data than is contained in the Library of Congress). Compare that to the roughly one zettabyte produced in 2010 and it’s not hard to see that we are drowning in data.
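As a back-of-envelope check, those two figures are consistent if one assumes a Library of Congress collection of roughly 16 terabytes — an assumed value chosen from commonly cited estimates, not a number from the article:

```python
# Sanity check of the comparison: 40 zettabytes vs. ~2.5 billion
# Libraries of Congress. The ~16 TB collection size is an assumption.

ZB = 10**21                      # bytes in a zettabyte
TB = 10**12                      # bytes in a terabyte

global_data = 40 * ZB            # article's estimate for this year
library_of_congress = 16 * TB    # assumed size of the collection

ratio = global_data / library_of_congress
print(f"{ratio:.2e}")            # 2.50e+09, i.e. about 2.5 billion
```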
It’s valuable information to be sure, but it’s simply too much for our current computing and bandwidth capabilities to process. And it’s only going to get worse with the Internet of Things and other large networks such as 5G, which will require real-time smart data processing in addition to capable connectivity and communication.
But fear not: a promising solution known as “edge computing” is emerging.
The idea is that storing and analyzing data closer to the device or instrument, rather than sending it further away to the cloud, enables faster and more efficient data analysis. Such a capability would allow us to analyze this information effectively and, in turn, discover solutions to some of our most pressing problems, from traffic congestion to the spread of disease to clean energy alternatives.
But to truly be effective, some significant technological advances are necessary. Thankfully, ORNL researchers Ali Passian and Neena Imam have surveyed the edge computing landscape, as well as novel nanoscale technologies, to better understand how to simultaneously advance both edge computing and nanoscience to benefit scientific progress. Their work was published in the journal Sensors.
The answer, they conclude, lies in the development of next-generation materials at the nanoscale and beyond.
Researchers are manipulating materials at increasingly smaller scales to create unique behaviors, both quantum and classical in nature, that could lead to interconnects, processors, and transistors exponentially more powerful than those available today.
For example, computations performed at the molecular and atomic scales have been demonstrated, but they need to be drastically scaled up to be practical. And novel information carriers such as skyrmions (particles with unusual magnetic properties) could revolutionize the way in which data is transferred.
“All of the hype around edge computing presents an excellent opportunity for nanosystem R&D, which is necessary for a full, secure network of countless edge devices,” said Passian, a research scientist in ORNL’s Quantum Information Science group. “For edge computing to succeed, next-generation nanosystems will have to first be developed.”
The pursuit of low-power sensors, signal-generating devices and arrays, energy-efficient and secure computing, storage, and fast communication processes could lead to technological progress rarely, if ever, seen in modern history.
Same tech, different scale
The idea of edge computing was born out of the limitations of cloud computing and was largely a result of telecom and IT needs. But as the data have grown, so has edge computing’s potential to transform scientific inquiry.
The explosion of sensors across society, however, has presented edge computing with bandwidth, latency, and storage issues.
One solution to these challenges lies in the burgeoning field of artificial intelligence, which will be critical to managing edge devices and to controlling traffic across the various networks. By incorporating a high-performance processor with built-in AI, edge computing can perform local decision-making and send only relevant data to the cloud, thus increasing the performance of various networks. AI in the cloud could also control the functions of edge devices.
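A minimal sketch of what that local decision-making might look like, with a simple statistical threshold standing in for a trained on-device AI model (the function, threshold, and data here are illustrative, not from the paper):

```python
# Hypothetical edge node: score sensor readings on-device and forward
# only the anomalous ones to the cloud. A z-score threshold stands in
# for the built-in AI model described in the article.
from statistics import mean, stdev

def edge_filter(readings, threshold=3.0):
    """Return only the readings an edge node would forward upstream.

    A reading is forwarded when it deviates from the batch mean by more
    than `threshold` standard deviations (a stand-in for a learned
    anomaly detector).
    """
    mu, sigma = mean(readings), stdev(readings)
    if sigma == 0:
        return []
    return [r for r in readings if abs(r - mu) / sigma > threshold]

# Simulated sensor stream: mostly routine values plus one spike.
stream = [20.1, 19.8, 20.3, 20.0, 19.9, 20.2, 45.7, 20.1, 19.7, 20.0]
to_cloud = edge_filter(stream, threshold=2.0)
print(to_cloud)   # [45.7]: only the spike crosses the wire
print(f"{len(to_cloud)} of {len(stream)} readings sent upstream")
```

In this toy example only one of ten readings leaves the device, which is the bandwidth-saving behavior the paragraph above describes.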
And just as high-performance computing has guided the development of AI, so too will it be instrumental in guiding the development of edge computing. For instance, the modeling and simulation of edge devices will be critical, and HPC technology shows great promise for being used at the edge as well.
It’s a good thing, too, because the energy consumption of data centers and supercomputers is rising fast, increasing the need for novel architectures and technologies. Energy-efficient microprocessors are critical to the evolution of HPC, as well as for future edge devices. And just as supercomputers are expected to be fast, secure, and use as little power as possible, edge computing devices are expected to do the same, albeit at much different scales.
But both require significant advances in nanotechnology to realize their potential. “Edge computing and nanosystems may become one entity, where device and function come to interact dynamically,” Passian said.
Living in a material world
Approximately 27 percent of all materials in nature are estimated to be topological; as such, they allow electricity or light to move without resistance or backscattering. These materials exhibit unique quantum properties of great interest to nanoscientists and engineers due to their potential in advancing capabilities across the computing and data landscapes.
Quantum effects also show promise in the fields of networking and sensing; for instance, write Passian and Imam, quantum effects have been demonstrated to carry information up to approximately 1,400 kilometers in free-space channels, a phenomenon that could greatly benefit edge computing and sensing.
But perhaps most importantly, edge devices must be secure, and one of quantum communication’s greatest strengths is its ability to securely and rapidly transmit information across great distances.
Since quantum technologies may still be impractical or difficult to apply to the edge’s challenges, however, other potential technologies are being explored to usher in the edge revolution. But new materials are needed to design the necessary processors, circuits, and transistors.
Some of the most promising candidates include carbon nanotubes (CNTs), graphene, and molybdenum disulfide. Owing to their nanometer scale, CNTs are currently the most promising alternative to conventional silicon transistors, and CNT-based field-effect transistors are leading to faster, more efficient processors and sensors.
There’s also a massive research effort around photonic systems. It is now possible to integrate photonic components on a single chip, and photonic technology can be married with other systems to create innovative computing and networking platforms.
Plasmonic and optical interconnects show potential for making these systems more efficient; for instance, “an information-carrying photon may be converted into an information-carrying plasmon that can propagate through a quantum plasmonic circuit in an optical computer or processor,” the authors write. However, the challenge of confining and controlling photons, which is necessary for the shrinking and integration of potential devices, remains.
Finally, neuromorphic computing, which mimics the processes of the human brain, is also emerging as a potential edge platform.
In the end, the authors conclude that quantum and topological materials offer exciting and promising areas for the evolution of both nanotechnology and edge computing. But whatever the outcome, there is little doubt that edge computing will have a significant impact on numerous scientific fields as it matures.
Although challenges such as security and the need for improved software remain, nanoscience is providing a range of robust and promising solutions. And the intersection of these two burgeoning fields will likely unlock technologies that were unimagined just a few years ago.
But the edge computing community must collaborate with the materials and computing hardware communities.
“We need communications across disciplines,” said Passian. “Just as math is transforming biology and vice versa, edge computing and nanoscience are transforming each other.”
“Edge computing is a growing trend but a lot of research remains to be done to move computing to the edge,” said Imam, a distinguished research scientist and deputy director of research collaboration for ORNL’s computing and computational sciences directorate. “Significant reduction in data latency, compared to centralized processing, needs to be demonstrated to justify the investment, as does resiliency at the edge compute nodes.”
This work was supported by the United States Department of Defense.