Edge computing is an approach in which data is processed and analyzed at its point of origin – the place where the data is generated. This is done to make data more accessible to endpoint devices and users, and to reduce the response time for data requests. HPC-class computing and networking technologies are critical to many edge use cases, and the intersection of HPC and ‘edge’ promises to be a hot topic in 2022. In this Q&A, Hyperion Research Senior Adviser Steve Conway describes the characteristics of edge computing and its relationship with HPC, including the edge-to-exascale paradigm.
Technovanguard: There seems to be a growing buzz about edge computing in HPC circles. In fact, you mentioned it in Technovanguard’s “See what we see at SC” video.
Steve Conway: Over time, the rise of edge computing could have a major impact on the global HPC community, both on premises and in the cloud. Edge computing is creating an important opportunity for HPC that users and vendors are starting to build into their plans. These are sometimes labeled “edge-to-exascale” strategies, or for the more excitable, “metaverse” strategies.
Technovanguard: What is edge computing? How do you define it?
Conway: Edge computing is a relatively new form of distributed computing in which much or all of the computation is done directly on or near the data sources. This contrasts with the historical practice for distributed systems of sending all source data to distant centralized datacenters or cloud computing platforms. In many cases, the limited computing power available at or near the data sources is adequate; and in cases where deeper analysis is required, typically only a small subset of edge computing results needs to be sent to datacenters, clouds, or nearby containerized resources for processing on more powerful computers.
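To make that division of labor concrete, here is a minimal sketch of the pattern Conway describes: an edge node reduces raw sensor readings to a compact summary and forwards only the exceptional cases upstream. Every name in it (Reading, process_at_edge, send_upstream, the threshold value) is a hypothetical illustration, not any specific platform's API.

```python
# Minimal sketch of edge-side processing: analyze readings locally and
# forward only a small, meaningful subset to a datacenter or cloud.
# All names and values are hypothetical illustrations.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Reading:
    sensor_id: str
    value: float

ANOMALY_THRESHOLD = 100.0  # hypothetical cutoff for "needs deeper analysis"

def process_at_edge(readings: list[Reading]) -> dict:
    """Reduce raw readings to a compact summary plus any anomalies."""
    anomalies = [r for r in readings if r.value > ANOMALY_THRESHOLD]
    return {
        "count": len(readings),
        "mean_value": mean(r.value for r in readings) if readings else 0.0,
        "anomalies": anomalies,  # only these justify datacenter-scale analysis
    }

def send_upstream(summary: dict) -> None:
    # Stand-in for a network call to a central datacenter or cloud endpoint.
    print(f"forwarding {len(summary['anomalies'])} anomalies "
          f"out of {summary['count']} readings")

if __name__ == "__main__":
    raw = [Reading("s1", 42.0), Reading("s2", 117.5), Reading("s3", 58.3)]
    send_upstream(process_at_edge(raw))
```

The point of the sketch is the ratio: three readings are processed locally, and only the single anomalous one would travel over the network for more powerful analysis.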
Edge data sources roughly correspond to Internet of Things (IoT) devices and include vehicles and traffic sensors, medical devices, product manufacturing lines, military sites, and many other data-generating sources. The IoT is no longer just an “Internet of stupid things” such as home appliances. Today’s IoT also includes sophisticated devices that generate much larger, more complex data. And the edge isn’t always earthbound; satellites and spacecraft also generate a lot of complex data that needs processing.
Technovanguard: What are the main advantages and disadvantages of edge computing?
Conway: Most attributes of edge computing are advantages, which is why Hyperion Research and others expect robust growth for edge computing. The main advantages are faster results, lower costs, higher autonomy and reliability, greater privacy protection and security, and, most important of all, scalability.
Edge computing’s low latency enables faster responses to events in the field, such as identifying traffic violators, shoplifters, and cybercriminals in time for apprehension, giving weather forecasters extra minutes to alert local communities to severe storms, or allowing cities and towns to re-route traffic before serious congestion develops. Processing all or most data at the edge can also substantially reduce the cost of network services and storage in clouds and datacenters. Because edge systems are not typically shared resources, they can offer greater privacy than long-distance networks. Also, since less data is transmitted from edge systems, the security and regulatory policies most appropriate for a user’s organization or line of business can be implemented at the edge.
But the single greatest benefit of edge computing is scalability—the ability to efficiently handle growth in the volume of source data. Without edge computing, centralized facilities might need to become prohibitively large and expensive as the number of edge devices and the volume of source data grow. Edge computing’s reliance on small, modular data processing units, close to data sources, means that edge computing initiatives can often expand cost-effectively to nearly unlimited numbers of these units.
Technovanguard: What about edge data security?
Conway: Where data security is concerned, there are advantages and disadvantages today. On the plus side, large amounts of data are more difficult to steal from many edge locations than from one central server; the small amounts of data processed at the edge are usually less mission-critical than data sent to central servers; and keeping most data at the edge makes central servers less likely to be attacked. On the minus side, edge devices may not be designed or tested with cybersecurity in mind; loopholes and vulnerabilities in edge security may provide network access to central servers; and edge devices may be physically small enough to steal or manipulate.
Technovanguard: What do you see as HPC’s role in edge computing? How do HPC and edge intersect and what is enabled by that intersection?
Conway: There’s an understandable tendency in the HPC and larger IT communities to see edge computing as an extension of what happens in datacenters and clouds. But it’s also important to view things from the perspective of the edge, where a large majority of the data may be transient, frequently overwritten, and never need more powerful resources in datacenters and clouds.
From this perspective, HPC has a crucial role to play in the important subset of edge computing applications that need wide-area analysis and control, as opposed to just local responsiveness. A large portion of the onetime Top500-leading Tianhe-1A supercomputer, for example, was dedicated to urban traffic management in Tianjin. Hyperion Research and other experts who follow this closely believe HPC may be the glue that unifies the emerging global IT infrastructure, from edge to exascale.
Technovanguard: Is this a new role for HPC?
Conway: HPC is no stranger to processing “big data” aggregated from many local sources for wide-area analysis, a forerunner of edge computing. Prominent examples include numerical weather forecasting based on high-volume data supplied by local human observers and sensor-bearing weather balloons, monitoring of telecommunications by government agencies around the world, and mosaicking of satellite images by space agencies to produce virtual flyovers of Earth and other planets, to name a few.
Technovanguard: What are some emerging edge applications that will need HPC support?
Conway: Many existing HPC-supported science and engineering applications will benefit from increasing use of edge data. On the commercial and dual-use sides, besides urban traffic management and related automated driving systems, other prominent edge use cases for HPC include precision medicine, fraud and anomaly detection, business intelligence, smart cities development and affinity marketing. Hyperion Research’s recently completed in-depth study of the worldwide HPC market found that 80 percent of the surveyed HPC sites run or plan to run one or more of these applications, which often combine simulation with AI methodologies. So, the HPC edge trend is well under way.
Technovanguard: A lot of what happens at the edge involves data. How does AI figure into edge computing and HPC’s role in it?
Conway: HPC is at the forefront of AI R&D today, and AI methods will be crucially important for HPC’s role in edge computing. But both data-intensive simulation and data-intensive analytics will be needed, sometimes in combination to support the same workload. HPC’s role in edge computing is based mainly on ultrafast computing, ultrafast data movement, and ultralarge, capable memory and storage systems.
Technovanguard: Will edge computing affect the roles of HPC vendors? In what ways?
Conway: Edge computing promises to reorient the global IT infrastructure and affect the roles of HPC system vendors, cloud services providers (CSPs), networking and storage suppliers, and others. HPC has been a self-contained niche market, but the edge computing opportunity will pull HPC into the larger IT mainstream for an important but limited role at the top of the edge-to-exascale food chain. Leading HPC vendors are already starting to exploit this new opportunity, not only in datacenters and clouds, but also with HPC containers close to edge locations. Distances from the edge and latencies are going to be important for determining the roles of HPC and other edge computing resources. One challenge for HPC vendors is achieving greater integration with the mainstream IT market and supporting open standards that can apply from edge to exascale. For leading HPC vendors, this might mean working more closely with business units within their own companies that serve the mainstream market.
Technovanguard: What other companies should we be paying attention to?
Conway: Technovanguard already has most of the bases covered, in my opinion, by tracking today’s important HPC vendors. I think companies involved in networking, including 5G and 6G, will be useful to watch. So will organizations pursuing open standards, cost- and energy-efficient processors, cyber security and distributed computing benchmarks.
Technovanguard: Major HPC vendors – and more broadly a number of tech giants – are using terms like “metaverse” and “omniverse.” To what extent are those concepts related to edge computing and HPC?
Conway: As I’ve seen these terms used, they usually refer to an immersive environment that enables people to experience the world as a virtual or augmented reality. To be available to many people, these experiences will need to happen mostly at the edge, using edge and near-edge resources that might sometimes include HPC containers. I see HPC having an important R&D role in developing this environment and the more challenging experiences. Companies including SGI and Cray created some impressive HPC-enabled synthetic reality experiences in the 1990s, including flyovers, 3D virtual tours, and training modules for ship captains at shipping companies.
Technovanguard: What is the market opportunity for “edge computing” and – more relevant to our readers – what impact will the advance of edge computing have on the HPC market?
Conway: It’s safe to say that the edge computing opportunity will add revenue to the HPC market, but it’s too early to quantify that. The whole edge ecosystem needs time to come together.
Technovanguard: Another topic with a lot of buzz right now is composable computing. Is there a link between composable and edge computing?
Conway: Sure. As the requirements for HPC systems become more heterogeneous, it becomes more difficult, technically and economically, to satisfy them effectively with a single, monolithic architecture. With a monolithic design, you risk frequently wasting HPC resources by over- or under-provisioning specific capabilities for the workloads in use. Composable architectures, which assemble compute, memory, and storage from shared pools on demand for each workload, are a natural answer to that heterogeneity, and they map onto edge-to-exascale environments, as sketched below. HPC vendors are wrestling with this issue, which isn’t an easy one to resolve quickly.
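As a purely illustrative sketch of the composable idea (hypothetical names and capacities throughout, not any vendor's interface): capacity is drawn from shared pools to fit each workload, rather than being fixed into one monolithic system that is over-provisioned for some jobs and under-provisioned for others.

```python
# Illustrative sketch of composable provisioning: draw capacity from shared
# pools to fit each workload, instead of sizing one monolithic system for
# all of them. All names and numbers are hypothetical.
POOLS = {"cpu_cores": 1024, "gpus": 64, "memory_gb": 8192}

def compose(workload: str, needs: dict) -> dict:
    """Reserve exactly the capacity a workload asks for, if available."""
    if any(needs[k] > POOLS[k] for k in needs):
        raise RuntimeError(f"{workload}: pool exhausted")
    for k, amount in needs.items():
        POOLS[k] -= amount          # capacity leaves the shared pool...
    return dict(needs)              # ...and becomes this workload's system

def release(grant: dict) -> None:
    for k, amount in grant.items():
        POOLS[k] += amount          # returned for the next workload

# A CPU-heavy simulation and a GPU-heavy AI job each get a different mix.
sim = compose("cfd_simulation", {"cpu_cores": 512, "gpus": 8, "memory_gb": 4096})
ai = compose("model_training", {"cpu_cores": 64, "gpus": 32, "memory_gb": 1024})
release(sim)
```

The design choice the sketch highlights is that neither workload pays for the other's resource mix, which is the over-/under-provisioning problem Conway describes.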
Bio: Steve Conway is Senior Adviser of HPC Market Dynamics at Hyperion Research. Conway directs research related to the worldwide market for high performance computing. He also leads Hyperion Research’s practice in high performance data analysis (big data needing HPC).