Is “More Clouds” the Future We Want? A Dispatch from the FTC AI Tech Summit

Corinne Cath / Jan 30, 2024

Sign on a doorway at the Federal Trade Commission in Washington D.C. Shutterstock

On January 25 the Federal Trade Commission (FTC), the US regulatory body that enforces federal consumer protection and antitrust laws, convened an online tech summit on artificial intelligence (AI). The tech summit is part of the FTC’s ongoing work on AI, which includes its cloud computing RFI exploring the role of that industry in shaping the AI ecosystem. In her introductory remarks, FTC Chair Lina Khan emphasized the commission’s focus on how AI business models drive potentially anti-competitive incentives. She underscored the point by citing various FTC actions and inquiries, such as a set of orders it issued last week to Alphabet, Amazon, Anthropic, Microsoft, and OpenAI requiring them to “provide information regarding recent investments and partnerships involving generative AI companies and major cloud service providers.”

The AI tech stack: models, cloud, and chips

The summit’s first panel was on AI, chips, and cloud computing, or what the panelists called “AI’s tech stack.” Subsequent panels focused on the role of data in AI technologies and models, and on AI at the consumer application level. This dispatch documents the first discussion and unpacks some of the concerns about, and limits of, current antitrust efforts aimed at regulating AI and its computational backbone.

The first panel brought together speakers from academia, regulators, start-ups, and industry at large, with experts including an economist from Ofcom and an entrepreneur with expertise in open-source technologies offering insights into tricky questions about competition and innovation in the AI space. The one-hour conversation covered a broad set of topics and concerns. The first half addressed power concentration in the AI ‘tech stack.’ The experts raised the dangers of having key AI infrastructure in the hands of a small number of dominant companies in cloud services (AWS and Microsoft) and chip design (NVIDIA). They noted that startups face many challenges in competing, particularly in the semiconductor business.

The conversation also highlighted the role of NVIDIA in supplying the technology for training large AI models, with questions raised about transparency in chip allocation and potential constraints on innovation. Ofcom’s Tania Van den Brande highlighted concerns about concentration in the UK cloud computing market, particularly with Amazon Web Services (AWS) and Microsoft. It was the barriers to switching between cloud computing companies that led Ofcom to request an investigation by the UK Competition and Markets Authority. The panel expressed worries about centralization risks and the dominance of a limited number of companies, emphasizing the need for more competition in the cloud and chip industries in particular. The overall focus was on exploring ways to enhance competition and resilience in the evolving landscape of AI, cloud computing, and microchips, in the US and beyond.

It’s market concentration, all the way down

Professor Ganesh Sitaraman, a legal scholar at Vanderbilt University, summarized these various concerns when he said, “What I think is very striking is that as you work down the layers of the AI tech stack, from applications to models to cloud and chips, there's more and more concentration.” This comment was the entry point for discussion of the various market concerns as seen by the panelists. Those included a set of issues that the panelists conceded are common to other industries: entry for non-incumbents is near impossible, and there is little venture capital (VC) for companies interested in building expensive infrastructure like that required for cloud computing or chips, as it takes years to catch up. Even model development is hard to get funded, as it is expensive—and you would have to do it on one of the main cloud computing platforms, which is awkward. Cloud providers are developing internal foundation models, as well as offering access to models by leaders in the field like OpenAI. Multiple panelists noted that, for many enterprises, this makes cloud computing companies both a computing provider and a potential competitor.

This entanglement between the big players offering models, cloud services, and AI can lead to anti-competitive behavior. The panelists echoed the opening statement of Chair Khan and the FTC’s launch of a market inquiry into the investments and partnerships being formed between AI developers and major cloud service providers. As panelist Corey Quinn, Chief Cloud Economist at the Duckbill Group, said, “All roads lead to one company and that is NVIDIA.” Yet, he continued, there is no transparency in how NVIDIA distributes the key assets it holds, i.e., who ends up getting its prized chips.

Likewise, it is unclear what major cloud players offer NVIDIA in exchange for priority access to its supply-constrained chips. Quinn, for instance, speculated that because Amazon operates across a wide number of industries, it can offer NVIDIA compelling terms that no one else can. “Okay, NVIDIA, give us a bunch more chips and we will give you the preferred placement for your other retail line on Amazon.com. People search for this list of terms. Is it happening? How would I know? There's no transparency here.” Other concerns raised during the panel included the costs of egress fees, the related dangers to innovation when customers are locked into one cloud provider, and the lack of a ‘national US champion’ in the chips space, among others.

Regulatory paths forward

Luckily, the panelists also offered various pathways forward. As this tech summit was hosted by the FTC, most were, logically, rooted in antitrust: regulatory options aimed at opening up markets. For example, Ofcom’s Van den Brande mentioned the need to make it easier for companies to switch between cloud providers. She focused on reducing egress fees and the costs and effort associated with re-engineering apps when moving from one cloud provider to another. She also suggested more efforts toward undoing the discounting structures that currently discourage large customers from splitting their cloud usage between major and minor players.
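As a rough illustration of why egress fees alone raise the cost of switching, consider a back-of-envelope estimate. The per-gigabyte rate below is an assumption for illustration only, not a quoted price from any provider:

```python
# Back-of-envelope estimate of the one-time egress cost of migrating
# stored data off a cloud provider. The per-GB rate is illustrative,
# not any provider's actual pricing.
def egress_cost_usd(data_tb: float, rate_per_gb_usd: float = 0.09) -> float:
    """Cost of transferring `data_tb` terabytes out at a flat per-GB rate."""
    gigabytes = data_tb * 1000  # decimal TB -> GB
    return gigabytes * rate_per_gb_usd

# A mid-sized enterprise holding 500 TB would face a five-figure
# transfer bill before any re-engineering work even begins:
print(f"${egress_cost_usd(500):,.0f}")
```

And this is only the data-transfer line item; the re-engineering costs Van den Brande mentioned typically dwarf it.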

Professor Sitaraman went into detail on numerous options for intervention. He mentioned the possibility of structural separation, splitting business lines within companies to hamper the vertical integration that can lead to anti-competitive behavior. He suggested non-discrimination rules requiring equal treatment of cloud clients, including those that might compete with their computing providers, as another avenue. Finally, he mentioned interoperability rules, ensuring that clients can move their data from one cloud to another with ease. On this point, Ofcom’s Van den Brande conveyed to the audience that steps along these lines were being pursued to open up the UK market.

Recapping the conversation, Alex Gaynor, the panel’s moderator and the deputy chief technology officer at the FTC, said a few companies dominate the AI tech stack, which is made worse by the difficulties around switching cloud providers. “This may in turn allow them to charge excessive prices or impose coercive terms, and as a result, they may be able to exercise market power in ways that favors their own incumbency or impacts competition,” he said. This overall situation results in a potentially anti-competitive, closed-market environment, given how hard it is for new entrants. The discussion covered a lot of ground and made the infrastructural aspects of AI clear.

Clouds on the horizon of anti-competition efforts

I have some doubts, however, about some of the underlying assumptions that seem key to substantiating these antitrust efforts. For example, the panelists talked about the need to switch between cloud computing providers with ease. This is a hot topic in policy debates, focused in particular on egress fees. It is also a topic on which Google (currently the smallest player in the cloud market) is trying to differentiate itself. It has, entirely unironically, complained to regulators that its competitors are being anti-competitive. Treating ease of switching as a tool for easing competition concerns seems to assume that these cloud companies and their services are interchangeable.

For my post-doctoral research at Delft University of Technology, I study the political economy of cloud computing with the Programmable Infrastructures Project (PIP). I have interviewed numerous professionals working in the AI, software, networking, and cloud computing industries. Many would disagree with this assumption, arguing that each cloud has its specific niche. This means that even if switching were easy, many cloud customers still wouldn’t move providers. “Yes, you could probably bake a pizza on the stove, but there are good reasons we prefer an oven,” as one of my interviewees quipped.

The barriers to switching are often high because the offerings of each cloud company are slightly different, which in turn means that if you use one company’s APIs or AI offerings to build your software, you’ve now committed to that cloud’s ecosystem. While moving to another cloud provider is technically possible, it would likely break many things, and ultimately the competitor might not be able to provide you with the same tools that made your software a success.
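A minimal sketch of this lock-in dynamic, and the standard engineering hedge against it: code written directly against one provider's SDK is hard to move, while code written against a thin interface can swap backends by writing a new adapter. All names here are illustrative; the in-memory store stands in for a real provider-specific adapter.

```python
from typing import Protocol


class BlobStore(Protocol):
    """Thin storage interface the application codes against."""
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...


class InMemoryStore:
    """Stand-in backend; a real adapter would wrap one provider's SDK."""
    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]


def save_report(store: BlobStore, name: str, body: bytes) -> None:
    # Application logic depends only on the interface, so switching
    # providers means writing a new adapter, not rewriting the app.
    store.put(f"reports/{name}", body)


store = InMemoryStore()
save_report(store, "q1.pdf", b"quarterly figures")
```

In practice, of course, the interface rarely stays this thin: the proprietary features that made a given cloud attractive in the first place (managed AI services, provider-specific databases) are exactly the ones that resist abstraction, which is the lock-in the panel described.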

Some of these dynamics were explicitly acknowledged by the panelists, yet the mismatch between the goal of undoing these issues and the tool of competition law was not. The underlying assumption, that making switching easier will open up the market, must thus be critically interrogated for what it can and cannot achieve. Also missing from the discussion was how to incentivize these companies to make these moves.

On a slightly lighter note, as a European audience member, I found some of the language around ‘championing a US leader in the chip space’ somewhat contradictory (if not cringy). I thought we were aiming to open up the market and thus not show favoritism based on where the headquarters of a company is located, but maybe I misunderstood. This type of rhetoric does reflect the broader global political winds of chip industry protectionism, and was perhaps to be expected in this particular setting.

More concerning, however, was the future the panelists were sketching in their calls for opening the markets. This came to the fore quite clearly in the discussion about the cloud computing industry. Panelist Daven Rauchwerk, a technologist and start-up entrepreneur in the chips industry, said: “If I had a magic wand, we would have 10,000 cloud companies in the US, and there’d be 10,000 regionalized clouds.” It is this type of thinking that worries me. Are we better off in a world with 10,000 cloud computing providers, rather than the three that currently hold the largest market share, if nothing else about their functioning, or their function in society, changes?

Implicitly, the panelists seemed to accept that the current AI stack, and the overall structure of the cloud computing industry in particular, is here to stay. But fiddling around the edges of the status quo seems misdirected as an overall goal. More competition does not automatically lead to fewer problems. Not to mention that creating 10,000 new cloud companies is pie-in-the-sky thinking, especially in our current economic climate. This is not to say that the current status quo of monopolies across the AI tech stack is preferable. However, even if more competition were feasible, it only solves a narrow set of problems surrounding AI without touching its wider societal implications. More competition in the cloud computing space does not, by itself, reduce harm to society if nothing else changes. It just spreads the harm across a larger number of perpetrators.

In other words, more is not always better. For example, consider the environmental impact of this approach. It’s akin to saying we would be better off if there were more competition in the petrochemical and oil sector, when we know that would likely do nothing to reduce the sector’s overall production of fossil fuels and its subsequent impact on climate change. Is 10,000 Shells better than one, if the business model of extraction and exploitation stays the same? I doubt it, and it might even be worse.

These same concerns hold for AI, cloud computing, and chips. Generative AI, for example, has a measurable impact on the environment, fueled by its need for computing power and its reliance on vast cloud infrastructure. Take Microsoft, for instance: the insatiable demand to cool its data centers, magnified by the increasingly ubiquitous ChatGPT running on the Microsoft Azure platform, has led to a staggering surge in water consumption, which reached 6.4 million cubic meters in 2022. Again, there is no guarantee that if we open up the market and have 10,000 Microsoft Azures instead of one, the industry's environmental footprint will somehow magically shrink. And even if it were a better solution, creating 10,000 new cloud computing companies is hardly feasible, given rising interest rates and the costs associated with building out cloud infrastructure, like data centers.

A similar argument against opening up markets as a holistic solution can be made for the chips industry, which relies on the extraction and use of natural resources and materials like silicon, casting a shadow on the delicate balance of our environment. The high stakes regarding so-called technological innovation and environmental sustainability, to name but one societal issue impacted by the AI tech stack, demand a much more radical vision for the future of computing than competition authorities alone can provide.


Is this a fair criticism to level at the FTC, or Ofcom for that matter? Perhaps not. I realize that they cannot wield tools they do not have. Yet I think it is important that critical scholars continue to point out the limitations of popular regulatory approaches, such as those rooted in opening markets. This is not to discourage these efforts, as encouraging more competition may provide a partial solution to consolidation in the AI tech stack. We must, however, be clear-eyed about all the concerns these approaches will not address. Only then will we be able to articulate the bigger economic and geopolitical forces at play here. That, in turn, will help us formulate answers on how to curb power in the AI tech stack, or at least ask the right questions about what novel harms and solutions exist, well beyond those related to market consolidation and opening up markets.


Corinne Cath
Dr. Corinne Cath is an anthropologist of technology who studies the politics of Internet infrastructure. She is a recent graduate of the Oxford Internet Institute's PhD program (University of Oxford) and the Alan Turing Institute for data science. Previously, she worked on technology policy for huma...