Exploiting the crypto mining bust



While it’s bad form to gloat over the rapid fall of cryptocurrencies, it does open up some serious opportunities. For those who don’t know, crypto miners have bought up virtually every high-capacity GPU on the market in recent years, driving up prices and reducing availability to the point where even major cloud providers couldn’t get their hands on current models.

Combined with the pace of Moore’s Law, this has created a situation where the average GPU used for anything other than crypto is several years old and roughly a quarter as powerful as normal market conditions would support. It has also meant that many software companies never bothered optimizing their products for GPUs. So, on average, the software you’re using is probably ten times slower than it should be.

That’s probably the biggest market opportunity in a generation, and savvy companies should now look at how to exploit it. Making your word processor or spreadsheet ten times faster probably won’t give you great business value. But there are several key areas that will.

Data analytics and database systems

The most obvious area is database systems, especially those that operate on big data. The digitization of the world has not slowed down, and as a result, systems built on legacy databases struggle to keep up. End users don’t always recognize this as a database problem; it usually shows up as painfully slow screen refreshes or stuck “busy” cursors.


This has been mitigated somewhat by the move to cloud computing with automatic horizontal scaling (adding more CPUs). However, as data volumes become very large, moving data between systems and between CPU boxes becomes the bottleneck. The result is sublinear scaling, where doubling the applied computing power, for example, yields only 50% more speed.
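To see why, consider a back-of-the-envelope model (a toy sketch with invented numbers, not drawn from any particular database): the parallel part of a query gets faster as you add nodes, but the data-shuffle overhead does not, so speedups flatten out.

```python
# Toy Amdahl-style model of horizontal scaling (illustrative numbers only):
# parallel work shrinks with node count, but the data-shuffle cost does not.

def query_runtime(nodes: int, parallel_work: float = 100.0, shuffle_cost: float = 12.0) -> float:
    """Hypothetical runtime (seconds) of a distributed query on `nodes` machines."""
    return parallel_work / nodes + shuffle_cost

four = query_runtime(4)    # 37.0 s
eight = query_runtime(8)   # 24.5 s
print(f"Doubling from 4 to 8 nodes: {four / eight:.2f}x faster")  # ~1.5x, not 2x
```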

The implicit response of most companies in this situation is simply not to look at all of the data. For example, hourly data gets aggregated to daily, or daily to monthly. Under normal operating conditions with well-understood data, that can be fine. But it carries risk, because modern data science techniques need access to the granular, primary data to deliver a fundamental kind of insight: anomaly detection.
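As a minimal sketch of that risk (assuming pandas; the values are invented for illustration), a one-hour spike that is obvious in the raw data can all but disappear once it is rolled up to daily averages:

```python
import pandas as pd

# 48 hours of a steady metric with a single one-hour anomaly
hours = pd.date_range("2023-01-01", periods=48, freq="h")
load = pd.Series(100.0, index=hours)
load.iloc[30] = 900.0          # one-hour spike, 9x the baseline

daily = load.resample("D").mean()
print(daily)                   # the affected day averages ~133 -- easy to miss
print(load.max())              # the hourly data still shows the 900.0 outlier
```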

Don’t ignore outliers

Deviations can be good or bad, but they are rarely neutral. They represent your best and worst customers, and your company’s best and worst responses to them. They carry both high business risk and high reward. So solving a technology limitation by ignoring outliers is penny-wise and pound-foolish.

A classic example is utility companies, which until recently used – and sometimes still use – data at 1 km resolution to monitor tree-strike and wildfire risk. A single pixel in such a system can contain 1,000 healthy trees and one dead one. But it only takes one tree hitting a power line to start a wildfire big enough to put a major utility out of business.
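As a hypothetical illustration of that arithmetic (the vegetation-index values below are invented), averaging 1,000 trees into one pixel makes the single dead tree statistically invisible:

```python
# One 1 km pixel containing 999 healthy trees and 1 dead one
healthy, dead = 0.80, 0.10            # assumed per-tree vegetation-index values
pixel = [healthy] * 999 + [dead]

print(f"pixel mean: {sum(pixel) / len(pixel):.4f}")  # 0.7993 vs 0.8000 for a fully healthy pixel
print(f"tree-level minimum: {min(pixel):.2f}")       # only granular data reveals the dead tree
```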

The business risk, in that case, is buried in decades-old data-collection decisions running on even older database technology, but it is nevertheless very real. And today would be a very good time to tackle it, since data sources and methods have evolved rapidly over the past five years, while most deployments have yet to take advantage of GPU analytics or newer hardware.

Uncover hidden market opportunities

A similar situation occurs with prospect and customer data inside many companies. An accounting mindset and older technology can lead to data being routinely rolled up into monthly and quarterly reports, ad nauseam. But never forget that your customers are individuals whose cumulative experience across multiple touchpoints drives their likelihood to buy or recommend (or not). As with the risk above, market opportunities are hidden by default in common aggregations such as sums and averages.

This raises another very important issue in business analytics: who within an enterprise is in a position to find such risks or opportunities. Perhaps the most important reason to upgrade older systems with GPU analytics is the availability of interactive, no-code visual analysis. As the name implies, this lets a much larger number of people in an organization notice a risk or opportunity and interactively dig in to confirm or rule it out. That could well be a salesperson or frontline worker who isn’t traditionally thought of as a “data analyst” or “data scientist.”

Next steps for data and current systems

All business situations are unique, so a company’s next move here will vary. But as a simple next step, managers should consider which of the business functions they’re responsible for rely on datasets or software tools that are more than five years old. Then look more specifically at the “big” data that is available, how well current systems handle it, and what value it could deliver.

If they see an opportunity, they should consider what kind of rapid pilot they can organize to validate it. Paradoxically, such an opportunity can be hard to evaluate without access to interactive GPU analytics in the first place, so companies should talk to vendors and consider a trial in a cloud environment. The crypto miners’ pain may well turn out to be other companies’ gain.

Mike Flaxman is product manager at Heavy AI.
