The 3 Biggest Opportunities in HPC

“Big data” has changed the world forever. Admittedly, some companies are still trying to figure out what “big data” is — both how to harness it and how to store it. It’s also unclear how, exactly, machine learning fits into this big new “big data” paradigm. While this may seem unclear to some, it’s this ambiguity that creates significant opportunity in the world of high performance computing (HPC).

“In our mind, that is probably one of the biggest opportunities that not enough people are currently involved with,” says Rodrigo Aramburú, CEO at technology company Blazing DB. “Most HPC applications are applied to bioinformatics, genome analysis, and protein or biological simulations, but we feel there is a huge opportunity to give many people the ability to ask questions of exorbitantly large data sets.”

Here are three key areas to watch: 

1. Speed Improvements

As the amount of data grows, engineering has become core to every aspect of an organization. It’s mission-critical to have systems with continuous uptime, and real-time response is a necessity. Sunder Singh, global head of consulting at Tata Consultancy Services, explains that the opportunities to increase agility lie in in-memory computing.

“This is where the industry is moving: to crunch information very quickly, right inside computer memory, at speeds never heard before,” says Singh. “With an engineered system, just plug it in, switch it on, and you have the capacity and capability to crunch large volumes of data. You've moved from purchase to implementation in a fraction of the time required if you did it all yourself, promoting speed to market.”
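
To make the idea concrete, here is a minimal sketch of the in-memory pattern Singh describes: load the working set into RAM once, then serve repeated analytical queries without touching disk. It uses Python’s built-in sqlite3 module, and the table, columns, and sample values are purely illustrative, not taken from any vendor’s engineered system.

    import sqlite3

    # Illustrative only: a tiny stand-in for an existing on-disk store.
    disk_db = sqlite3.connect("transactions.db")
    disk_db.execute(
        "CREATE TABLE IF NOT EXISTS transactions (account_id INTEGER, amount REAL)"
    )
    disk_db.executemany(
        "INSERT INTO transactions VALUES (?, ?)",
        [(1, 120.00), (2, 75.50), (1, 310.25)],
    )

    # In-memory copy: load the hot data once, then query it at memory speed.
    mem_db = sqlite3.connect(":memory:")
    mem_db.execute("CREATE TABLE transactions (account_id INTEGER, amount REAL)")
    mem_db.executemany(
        "INSERT INTO transactions VALUES (?, ?)",
        disk_db.execute("SELECT account_id, amount FROM transactions"),
    )

    # Repeated ad-hoc questions like this one no longer pay for disk I/O.
    totals = mem_db.execute(
        "SELECT account_id, SUM(amount) FROM transactions GROUP BY account_id"
    ).fetchall()
    print(totals)  # e.g. [(1, 430.25), (2, 75.5)]

SQLite is just a stand-in here; the point is the pattern of separating the slow, durable store from the fast, memory-resident working set that answers the queries.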

Testing and iteration will yield gradual improvements in speed that, over time, appear drastic.

“The huge promise here is adoption by critical business applications like banking and financial services, where downtime needs to be zero,” says Singh. “Taking this technology to those business areas will create a sea of transformation across all those legacy systems; their disaster/recovery ecosystems will be uprooted because of this leap in technology.”

2. Information Management

More data is being collected now than ever before. Imagine every data point on the Internet, and then imagine a limitless archive that preserves all of it. One day, this information will be accessible, and HPC will be necessary to retrieve it.

“First, social media evolved from platforms used by people to track vacation photos from friends to a core marketing technology used by businesses to interact with customers, track competitors, and test product ideas,” says Singh. “Processing that much data in as near to real time as possible requires incredible amounts of data to be acquired, stored, processed, and protected.”

HPC will simplify the process of managing these archives and crafting analyses from them. The technology will help ensure that every data point is part of a larger story.

3. Immediate Access to Data

There is always a lag time between information collection and analysis. HPC, however, is shortening this delay.

“HPC will move to real-time data analysis through FPGA acceleration,” says Pat McGarry, VP of engineering at high performance data analysis firm Ryft. “Is real-time analysis of data really that much more valuable than the wait times that we’ve grown accustomed to accepting with data preparation and indexing? The answer is unequivocally yes, and this answer is true across many industries.”

In addition to increasing delivery speeds, HPC will help organizations process new data in real time.

“Being able to take a new genome sequence and immediately tell if it matches something that is already known provides massive process improvements,” says McGarry. “And if researchers can get faster answers, that leads to improvements to the bottom line.”
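
Here is a toy sketch of that kind of lookup, assuming a simple k-mer index rather than whatever Ryft actually uses: known sequences are indexed once by fixed-length substrings, and a new read is checked with set-membership tests instead of rescanning the whole archive. The sequences, the value of k, and the match threshold are all made up for illustration.

    # Toy k-mer index: not McGarry's system, just an illustration of
    # "does this new sequence match something we already know?"
    K = 8  # k-mer length, chosen arbitrarily for this sketch

    def kmers(sequence, k=K):
        """Return the set of all length-k substrings of a sequence."""
        return {sequence[i:i + k] for i in range(len(sequence) - k + 1)}

    # Build the index once from the known reference sequences.
    known_sequences = [
        "ACGTACGTGGCCTTAA",
        "TTGACCGGTAACGTAC",
    ]
    index = set()
    for seq in known_sequences:
        index |= kmers(seq)

    def matches_known(new_read, threshold=0.5):
        """A read 'matches' if at least `threshold` of its k-mers are indexed."""
        read_kmers = kmers(new_read)
        if not read_kmers:
            return False
        hits = sum(1 for km in read_kmers if km in index)
        return hits / len(read_kmers) >= threshold

    print(matches_known("ACGTACGTGGCC"))  # True: overlaps a known sequence
    print(matches_known("GGGGGGGGGGGG"))  # False: nothing like it is indexed

Real pipelines work against far larger references and often add hardware acceleration, but the underlying idea is the same: pay an indexing cost once so that each new query gets a near-immediate answer.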


Ritika Puri is a contributing author.
