The Compute Stack

Vivek Raman
Key Takeaway #1

Demand for Compute is here to stay, and the rise of AI is an inflection point catalyzing exponential growth in Compute.

Key Takeaway #2

Compute is a two-sided marketplace. Demand for Compute is driven by applications such as AI, computational biology, institutional finance, graphics rendering, and autonomous vehicles.

Key Takeaway #3

Supply for Compute is provided by what we refer to as the “Compute Stack” – consisting of data centers, hardware players, cloud service providers, and power producers.

Key Takeaway #4

Ultimately, we could see the Compute Stack decentralize into a diversified ecosystem.

The tectonic plates within the technology sector are shifting, catalyzed by the rapid rise of AI over the past six months. AI is causing a Cambrian explosion in infrastructure as demand for “Compute” – the underlying fuel behind AI, Bitcoin mining, trading, graphics rendering, and more – trends in only one direction: higher. Demand for Compute vastly exceeds supply, putting strain on the “Compute Supply Chain.” In BitOoda’s “High Performance Compute Primer,” published on 5/12/23, we defined Compute and provided an overview of the broader Compute ecosystem. In this report, we focus on the underlying infrastructure forming the supply side of Compute, which we refer to as the “Compute Stack.”

Data centers form the foundation of the Compute Stack. Long regarded as stable-growth infrastructure plays with parallels to the real estate industry, data centers have suddenly become coveted growth engines. The “digital real estate” they provide is now massively in demand, positioning data centers at the nexus of AI’s ascent.

Hardware players are also thriving in the Compute ecosystem. The most prominent example is NVIDIA, which made its bet on the AI boom years (if not decades) ago and is therefore dominating the market for next-generation GPUs (A100s, H100s), which are used to train Large Language Models (LLMs) in the still-nascent AI boom. NVIDIA thus has (temporarily) uncontested pricing power, which, coupled with a surge in demand, sets it up as a winner. However, as Jeff Bezos has said, “your margin is my opportunity,” and it is likely that other hardware players will develop competing hardware acceleration solutions.

Cloud Service Providers, namely “hyperscalers,” comprise the front lines of the Compute Stack. The AI revolution and the ensuing surge in Compute demand are unfolding from the top down, with the largest hyperscalers (AWS, Google Cloud, Microsoft Azure) capturing the initial value from AI model infrastructure (for example, Microsoft’s recent deal with OpenAI). However, as with hardware players, we expect increased competition for hyperscalers over the long run.

Power producers are the engine room of the Compute space. The least talked-about (yet still significant) component of the Compute Stack is the underlying power producer. It is ironic that the power consumption of Bitcoin mining was an immediate attack vector against BTC, while the surging demand for power to train and operate AI models has not yet come into the spotlight. Nevertheless, the AI industry could be orders of magnitude larger than BTC (since AI permeates all industries), and its power requirements will be significant. Ultimately, we expect AI data centers and other players in the Compute Stack to adopt renewable energy mandates.

The Compute Stack will rapidly evolve. In technology, “change is the only constant,” and we believe this applies to the Compute Stack as well. AI’s mainstream breakthrough is still new, and emerging demand drivers for Compute (e.g., zero-knowledge proofs) are rapidly approaching. As the Compute industry scales, we could see disruption across all layers of the Compute Stack, with decentralization of data centers, new competitors in the hardware acceleration arms race, and disintermediation of the hyperscalers.

The Compute Stack - Data Centers

  • The data center was historically viewed as a somewhat sleepy, stable play with predictable cash flows and low-but-steady growth. However, the surge in demand for Compute has broadened the scope of a data center from a play on physical real estate to a play on digital real estate as well. Data centers can now optimize on cost (power price), location, efficiency, uptime, and hardware profile (GPUs, FPGAs, ASICs, etc.) to compete for contracts from hyperscalers and new AI players; a stylized cost sketch follows this list.
  • Data centers, while foundational infrastructure players that are generally abstracted away in the Compute Stack, are complex operations. Location matters (proximity to power sources, proximity to exchanges for high-frequency trading and minimal latency, etc.). Data centers also need to be increasingly energy efficient. Lastly, security is paramount, with the need for sacrosanct data privacy and computational integrity.
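
To make the cost and efficiency levers above concrete, the minimal Python sketch below estimates the electricity cost of running a single accelerator for one hour from a power price, a per-device power draw, and a facility PUE. All inputs (the $0.05/kWh price, the 700 W draw, the 1.2 PUE) are illustrative assumptions, not figures from this report.

    # Illustrative sketch: how power price, efficiency (PUE), and hardware profile
    # interact in data center economics. All inputs are hypothetical assumptions.

    def power_cost_per_gpu_hour(power_price_usd_per_kwh: float,
                                gpu_draw_watts: float,
                                pue: float) -> float:
        """Electricity cost to run one accelerator for one hour.

        PUE (power usage effectiveness) scales IT load up to total facility load;
        a PUE of 1.2 means roughly 20% overhead for cooling, power conversion, etc.
        """
        facility_kw = (gpu_draw_watts / 1000.0) * pue
        return facility_kw * power_price_usd_per_kwh

    # Hypothetical example: a 700 W accelerator at $0.05/kWh in a PUE-1.2 facility.
    cost = power_cost_per_gpu_hour(0.05, 700.0, 1.2)
    print(f"Power cost per GPU-hour: ${cost:.3f}")  # ~$0.042 under these assumptions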

Figure: The Future of Data Centers
Source: https://research-assets.cbinsights.com/2019/01/24160814/Data-Centers-of-the-Future1.png/

The Compute Stack - Hardware Players

  • Although the hardware space has been dominated by the meteoric rise of NVIDIA in 2023, the hardware landscape will almost certainly diversify over time (whether driven by regulators or by the market), as we could see an arms race in hardware solutions tailored for AI.
  • Currently, GPUs are used to train LLMs, with NVIDIA’s A100s and H100s as the most popular and highest-profile solutions today. It is unlikely that NVIDIA’s A100s and H100s will remain the only solutions for AI training and inference going forward, and other hardware players and solutions are likely to enter the fray.
  • It remains to be seen whether more specialized hardware in the form of ASICs or FPGAs can run AI workloads more efficiently, just as the Bitcoin mining industry evolved from CPUs to GPUs to specialized ASICs. For now, however, NVIDIA’s GPUs are the gold standard; a stylized efficiency comparison follows this list.
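
One rough way to frame the GPU-versus-ASIC/FPGA question is throughput per watt, the same efficiency lens the Bitcoin mining industry applies via joules per terahash. The sketch below compares two entirely hypothetical accelerator profiles on that metric; the names and numbers are placeholders, not vendor specifications.

    # Illustrative sketch: ranking accelerators by throughput per watt.
    # The entries below are hypothetical placeholders, not real product specs.

    from dataclasses import dataclass

    @dataclass
    class Accelerator:
        name: str
        tflops: float       # sustained training throughput (hypothetical)
        draw_watts: float   # board power (hypothetical)

        @property
        def tflops_per_watt(self) -> float:
            return self.tflops / self.draw_watts

    candidates = [
        Accelerator("general-purpose GPU (hypothetical)", 1000.0, 700.0),
        Accelerator("AI-tailored ASIC (hypothetical)", 900.0, 300.0),
    ]

    # A more specialized part can win on efficiency even with lower raw throughput.
    for acc in sorted(candidates, key=lambda a: a.tflops_per_watt, reverse=True):
        print(f"{acc.name}: {acc.tflops_per_watt:.2f} TFLOPS/W")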

Figure: NVIDIA H100
Source: https://www.nvidia.com/en-us/data-center/h100/

The Compute Stack - Cloud Service Providers / Hyperscalers

  • The Compute and AI revolution is unfolding in a top-down fashion. Cloud Service Providers (CSPs) are not only the “front ends” for AI and other Compute applications; the concentration of CSP market share among the major hyperscalers has also created vertical integration potential across the Compute Stack.
  • The largest hyperscalers – namely AWS, Google Cloud, and Microsoft Azure – have built massive economies of scale, brand recognition, regulatory and trust infrastructure, and adoption ahead of the surge in AI. As a result, the Cloud Service Provider space is top-heavy, and the major AI front ends are powered by the hyperscalers (ChatGPT via Microsoft/OpenAI, Google Bard).​
  • The overarching theme of this piece is that incumbents will see increasing competition from new entrants across the Compute Stack, and we expect the same for the hyperscalers as the CSP landscape decentralizes.

Figure: Hyperscalers
Source: Various Company Logos

The Compute Stack - Power Producers

  • Lastly, the underlying power producers supply every other part of the Compute Stack, chiefly data centers and hyperscalers.
  • While the power consumption of the AI and Compute industry has not yet entered mainstream discussion (in contrast to the BTC mining industry, which came under scrutiny for its power consumption and pivoted to add renewable, “green” mining solutions), it is inevitable that the resource consumption of the exponentially growing AI sector will enter the spotlight; a rough estimate follows this list.
  • Therefore, power players will be of increasing importance in the Compute Stack as the emphasis shifts toward clean, renewable power. Power producers that proactively onboard Compute users and seek to capture this growing market could emerge as winners.
  • Ultimately, the Compute Stack will be democratized as it matures over time.
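
For intuition on why AI power demand is likely to draw scrutiny, the sketch below estimates the facility-level energy of a single large training run from cluster size, per-accelerator draw, PUE, and run length. Every input (10,000 accelerators, 700 W each, PUE of 1.2, a 30-day run, $0.05/kWh) is a hypothetical assumption, not an estimate from this report.

    # Illustrative sketch of cluster-scale power demand for AI training.
    # All inputs are hypothetical assumptions chosen for intuition only.

    def training_run_energy_mwh(num_gpus: int,
                                gpu_draw_watts: float,
                                pue: float,
                                hours: float) -> float:
        """Total facility energy for a training run, in MWh."""
        it_load_mw = num_gpus * gpu_draw_watts / 1_000_000.0
        return it_load_mw * pue * hours

    # Hypothetical example: 10,000 accelerators at 700 W, PUE 1.2, 30-day run.
    energy_mwh = training_run_energy_mwh(10_000, 700.0, 1.2, 30 * 24)
    cost_usd = energy_mwh * 1000 * 0.05  # at an assumed $0.05/kWh power price
    print(f"Energy: {energy_mwh:,.0f} MWh, power cost: ${cost_usd:,.0f}")
    # ~6,048 MWh and ~$302,400 under these assumptions.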

Figure: Power in the Compute Stack
Source: https://www.google.com/about/datacenters/efficiency/

Disclosures

Purpose

This research is only for the clients of BitOoda. This research is not intended to constitute an offer, solicitation, or invitation for any securities and may not be distributed into jurisdictions where it is unlawful to do so. For additional disclosures and information, please contact a BitOoda representative at info@bitooda.io.​

Analyst Certification

Vivek Raman, the primary author of this report, hereby certifies that all of the views expressed in this report accurately reflect his personal views, which have not been influenced by considerations of the firm’s business or client relationships.​

Conflicts of Interest

This research contains the views, opinions, and recommendations of BitOoda. This report is intended for research and educational purposes only. We are not compensated in any way based upon any specific view or recommendation.​​

General Disclosures

Any information (“Information”) provided by BitOoda Holdings, Inc., BitOoda Digital, LLC, BitOoda Technologies, LLC or Ooda Commodities, LLC and its affiliated or related companies (collectively, “BitOoda”), either in this publication or document, in any other communication, or on or through http://www.bitooda.io/, including any information regarding proposed transactions or trading strategies, is for informational purposes only and is provided without charge.  BitOoda is not and does not act as a fiduciary or adviser, or in any similar capacity, in providing the Information, and the Information may not be relied upon as investment, financial, legal, tax, regulatory, or any other type of advice. The Information is being distributed as part of BitOoda’s sales and marketing efforts as an introducing broker and is incidental to its business as such. BitOoda seeks to earn execution fees when its clients execute transactions using its brokerage services.  BitOoda makes no representations or warranties (express or implied) regarding, nor shall it have any responsibility or liability for the accuracy, adequacy, timeliness or completeness of, the Information, and no representation is made or is to be implied that the Information will remain unchanged. BitOoda undertakes no duty to amend, correct, update, or otherwise supplement the Information.​

The Information has not been prepared or tailored to address, and may not be suitable or appropriate for the particular financial needs, circumstances or requirements of any person, and it should not be the basis for making any investment or transaction decision.  The Information is not a recommendation to engage in any transaction.  The digital asset industry is subject to a range of inherent risks, including but not limited to: price volatility, limited liquidity, limited and incomplete information regarding certain instruments, products, or digital assets, and a still emerging and evolving regulatory environment.  The past performance of any instruments, products or digital assets addressed in the Information is not a guide to future performance, nor is it a reliable indicator of future results or performance. ​

Ooda Commodities, LLC is a member of NFA and is subject to NFA’s regulatory oversight and examinations. However, you should be aware that NFA does not have regulatory oversight authority over underlying or spot virtual currency products or transactions or virtual currency exchanges, custodians or markets.​

BitOoda Technologies, LLC is a member of FINRA.​

“BitOoda”, “BitOoda Difficulty”, “BitOoda Hash”, “BitOoda Compute”, and the BitOoda logo are trademarks of BitOoda Holdings, Inc.​

Copyright 2022 BitOoda Holdings, Inc. All rights reserved. No part of this material may be reprinted, redistributed, or sold without prior written consent of BitOoda.​
