AI to revolutionize technological infrastructure

Mon Sep 09 2024
Jim Andrews

Artificial intelligence is poised to place far greater demands on cloud infrastructure. For years, cloud services and private networks handled relatively modest volumes of data. With the advent of AI and deep learning, the influx of photos, video, audio, and natural language has changed the scale of the problem: data once measured in gigabytes and terabytes is now measured in far larger units, such as petabytes and exabytes.

Information systems, including the cloud, must scale to accommodate all that data. Less obvious, and more intriguing, is the need to move that information much faster and, crucially, at lower operating cost. A number of firms have begun building the next generation of infrastructure. CoreWeave, a cloud computing provider, has drawn attention by offering customers access to cutting-edge AI chips from Nvidia, spotlighting the burgeoning market.

In May, CoreWeave disclosed a $1.1 billion equity funding round that valued the seven-year-old startup at $19 billion. The company also secured $7.5 billion in debt financing from investors including Blackstone, Carlyle Group, and BlackRock; Nvidia is an investor as well. CoreWeave, in turn, works with a startup called VAST Data, which takes a software-centric approach to modernizing cloud and private networks. VAST has built an operating system that it says is faster, cheaper, and more scalable across various types of distributed networks.

“Our vision was to construct the necessary infrastructure for these emerging AI workloads,” said Chief Executive Renen Hallak, who founded the company in Israel in 2016. In December, VAST announced it had raised $118 million in a Series E funding round, led by Fidelity Management & Research, which nearly tripled its valuation to $9.1 billion. The firm has surpassed $200 million in annual recurring revenue and says its gross margin approaches 90%.

Historically, data storage has been tiered, with recent, high-priority information kept in easily accessible locations and older data relegated to deeper, less accessible layers, Hallak noted. “That situation has changed with the advent of these new AI workloads,” he said during a conversation at the company’s office in New York. “Having developed a robust AI model, the next logical step is to apply it to your entire historical dataset, as this is where the true value is realized. As additional information becomes available, the objective is to retrain and enhance the model. One examines data repeatedly across vast quantities, sometimes reaching petabytes and even exabytes. That presents a distinctly different challenge.”
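To make the tiering problem concrete, here is a minimal sketch, using a hypothetical two-tier model and made-up per-terabyte read costs (not figures from VAST or its customers), of why repeatedly rescanning an entire historical dataset breaks the economics of tiered storage:

```python
# Illustrative sketch only: a hypothetical two-tier storage model with
# assumed per-terabyte read costs, not vendor figures.

HOT_COST_PER_TB = 1.0    # fast flash tier: cheap to re-read
COLD_COST_PER_TB = 20.0  # deep archival tier: slow and costly to recall

def scan_cost(total_tb: float, hot_fraction: float, epochs: int) -> float:
    """Cost of reading the full dataset `epochs` times when only
    `hot_fraction` of it lives on the fast tier."""
    hot_tb = total_tb * hot_fraction
    cold_tb = total_tb - hot_tb
    per_epoch = hot_tb * HOT_COST_PER_TB + cold_tb * COLD_COST_PER_TB
    return per_epoch * epochs

# Classic workload: only recent data is touched, so tiering stays cheap.
# AI retraining: every epoch re-reads everything, archive included.
print(scan_cost(total_tb=1000, hot_fraction=0.1, epochs=10))  # 181000.0
print(scan_cost(total_tb=1000, hot_fraction=1.0, epochs=10))  # 10000.0
```

Under these assumed numbers, a training run that touches the archive on every epoch costs an order of magnitude more on the tiered layout, which is the access pattern Hallak describes.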

Conventional systems scale by adding nodes, each holding a portion of the overall data set. That architecture requires every node to spend resources communicating with its peers, and it is vulnerable to failure if any single node runs into trouble. As a result, many enterprise systems top out at a few dozen nodes, which falls short of what AI demands, according to Hallak. The VAST approach gives every node simultaneous access to all of the data, improving scalability, speed, and resilience, he said. VAST also disaggregates the costs of data storage and computing, an approach it says produces savings.
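As a rough illustration of that architectural contrast (a toy model, not VAST’s actual design), the sketch below compares how data reachability and coordination overhead behave when each node owns a shard of the data versus when stateless nodes share one storage pool:

```python
# Toy model of the two cluster designs described above; node counts and
# failure scenarios are hypothetical, not drawn from VAST's implementation.
from itertools import combinations

def shared_nothing(num_nodes: int, failed_nodes: int) -> dict:
    """Each node owns 1/num_nodes of the data and talks to every peer."""
    data_reachable = 1.0 - failed_nodes / num_nodes   # shards die with nodes
    peer_links = sum(1 for _ in combinations(range(num_nodes), 2))  # O(n^2)
    return {"data_reachable": data_reachable, "peer_links": peer_links}

def disaggregated(num_nodes: int, failed_nodes: int) -> dict:
    """Stateless compute nodes all see a single shared storage pool."""
    data_reachable = 1.0 if failed_nodes < num_nodes else 0.0
    return {"data_reachable": data_reachable, "peer_links": 0}

print(shared_nothing(36, failed_nodes=1))  # ~97% of data reachable, 630 links
print(disaggregated(36, failed_nodes=1))   # all data reachable, no peer links
```

In the shared-nothing model, pairwise coordination grows quadratically with cluster size and one dead node hides its shard; in the disaggregated model, losing a compute node costs throughput rather than access to data, which is the scalability and resilience claim Hallak makes.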

The call for new technological infrastructure may bring the dominant tech corporations to mind first, but its influence will reach deeper into the broader economy. The transition is already under way at data-intensive firms, including Pixar, the Disney studio behind this summer’s hit “Inside Out 2,” which has worked with VAST since 2018. Beginning with its 2020 film “Soul,” Pixar has used a technique called volumetric animation, which produces more detailed characters and movement. In its 2023 release “Elemental,” the studio applied volumetric animation more extensively, using AI to refine the flames of protagonist Ember Lumen. “Inside Out 2,” released in June, required twice the data capacity of “Soul” and roughly 75% more computing power.

According to Eric Bermender, head of data infrastructure and platforms at Pixar, the studio’s old practice of moving data from high-performance drives to lower-performance drives during idle periods proved ineffective for rendering volumetric characters. Pixar typically runs AI workloads on on-premises networks rather than in the cloud. AI, Bermender noted, resists fitting neatly into conventional frameworks. “These workflows typically necessitate the processing of vast quantities of heterogeneous data that is neither cacheable nor sequential, and which would conventionally have been relegated to lower-performance archival tiers,” he said.
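A small simulation, with assumed dataset and cache sizes rather than Pixar’s real parameters, shows why “neither cacheable nor sequential” matters: an LRU cache that performs well on a small, repeatedly read working set does almost nothing for random reads across a huge dataset:

```python
# Hypothetical parameters for illustration; not Pixar's actual workload.
import random
from collections import OrderedDict

def lru_hit_rate(accesses, cache_size: int) -> float:
    """Fraction of block reads served from an LRU cache of `cache_size` blocks."""
    cache, hits = OrderedDict(), 0
    for block in accesses:
        if block in cache:
            hits += 1
            cache.move_to_end(block)        # refresh recency on a hit
        else:
            cache[block] = True
            if len(cache) > cache_size:
                cache.popitem(last=False)   # evict the least recently used
    return hits / len(accesses)

DATASET_BLOCKS, CACHE_BLOCKS = 1_000_000, 10_000   # cache holds 1% of data
repeated_small = list(range(5_000)) * 40           # small, re-read working set
random_scan = [random.randrange(DATASET_BLOCKS) for _ in range(200_000)]

print(lru_hit_rate(repeated_small, CACHE_BLOCKS))  # ~0.98: caching works
print(lru_hit_rate(random_scan, CACHE_BLOCKS))     # ~0.01: caching barely helps
```

The second access pattern is the one Bermender describes, and it is also why demoting data to slow drives during idle periods backfires: the “cold” blocks keep getting read.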

The implication for businesses is that adopting AI requires a technological foundation capable of handling its extraordinary data demands. The shift is comparable to the electric vehicle, which forces a rethinking of nearly every component of a gasoline-powered car, down to the tires. For AI to gain momentum, it will need an entirely new framework.

Jim Andrews

Jim Andrews is a desk correspondent covering the global stock, currency, commodity, and bond markets. He has reported on global markets for more than five years and is based in New York.