The Need for Modernized, AI-Ready Server and Compute Infrastructure

Aberdeen looked into issues associated with organizations’ AI deployments and how they benefit from a modernized infrastructure.

November 23, 2023

Why AI-Ready Compute Infrastructure Is Better
  • Organizations may struggle to keep up with the heavy-duty demands of AI-driven business operations.
  • However, top organizations have managed to overcome challenges by focusing on infrastructure modernization, delivering visible benefits.
  • Here’s what Aberdeen Strategy & Research discovered.

The OpenAI meltdown, in which co-founder and CEO Sam Altman was ousted and then made a messiah-like return four days later, has died down. Before it did, the proceedings sparked a passionate exchange of views among employees and stakeholders, investors and venture capitalists, and the general public about developing artificial intelligence (AI) morally and responsibly, i.e., for the good of humanity.

This piece does not have anything to add to that debate.

However, the entire episode is a reminder that OpenAI has arguably been the frontrunner in AI research for a while now, especially since it revolutionized the space with its generative AI tool ChatGPT a year ago, in November 2022, and the large language model GPT-4 in March 2023.

By August 2023, as many as 22% of respondents in McKinsey’s The State of AI 2023 report said they regularly use generative AI. Additionally, 40% of organizations using generative AI plan to invest more in the technology for regular operations.

A month later, in September 2023, Aberdeen Strategy & Research discovered that 43% of organizations are using AI-enabled technology solutions (not just generative AI), while another 26% have a strategic investment in place, with roles dedicated to AI and efforts underway to develop AI solutions.

Such a rapid adoption rate, even if experimental, needs a solid foundation to run on. However, even OpenAI has repeatedly, if temporarily, run into troubled waters, leading to downtime for its generative AI services. The latest was a 3 hr 16 min ChatGPT outage the company faced two days ago, on November 21.

A few days earlier, on November 16, OpenAI’s API services faced a 44-minute outage due to a “degraded performance for TTS, Whisper, and Fine-tuned babbage-002 and davinci-002 models.”

OpenAI Services Uptime and Services Outage

Source: OpenAI

Aberdeen looked into issues associated with organizations’ AI deployments and how they benefit from a modernized infrastructure. Half of the surveyed companies said their AI deployments are leveraged for real-time edge capabilities. Predictive analytics is another use case (40%), followed closely by application and server deployment automation (39%).

“To achieve these outcomes, leading businesses are turning to high-performance and scalable compute solutions that take advantage of cutting-edge GPUs for high throughput and low latency wherever AI inference is needed,” noted Jim Rapoza, VP & principal analyst, IT, Aberdeen Strategy & Research.

See More: The State of AI in the Enterprise 2023: How is AI Actually Affecting Jobs?

Top Challenges in AI Deployment

A successful and sustainable AI deployment may remain a pipe dream unless organizations recognize and overcome the relevant challenges. Aberdeen found that 46% of organizations find it difficult to integrate AI deployments with existing, often legacy, infrastructure.

Meanwhile, 38% of organizations said they struggle to fill skill and knowledge gaps in AI competency.

The fallout from inconsistent deployments and the resulting downtime may have consumers looking at other avenues. “One of the biggest problems businesses encounter in AI inference is latency. If it takes the AI too long to infer a response, a customer might leave, a problem can become worse, or a sale might not take place.”

It’s no coincidence that Best-in-Class businesses, i.e., those in the top 20% on key IT success metrics, were 50% more likely than their competitors to be using modernized server infrastructure in their AI environments.

“With a compute server infrastructure designed to simplify and optimize AI inference and deliver powerful processing capabilities in the data center, hybrid cloud, and edge environments, businesses can more easily take advantage of key AI solutions in visual generation, smart edge devices, and natural language processing,” Rapoza added.

Benefits of AI-Ready Server and Compute Infrastructure

A modernized, high-performance, AI-ready compute infrastructure is a means to achieving agility, innovation, and competitiveness through optimized AI automation. And that’s exactly what top organizations have managed to do.

The impact on process efficiency with a modernized infrastructure is more than twice that with legacy infrastructure. Modernization also leads to higher revenue generation, greater customer satisfaction, reduced cybersecurity risk, and enhanced application development and deployment.

Top Benefits of an AI-Ready Server and Compute Infrastructure

Source: Aberdeen Strategy & Research

Rapoza added, “As the need to effectively leverage and take advantage of AI runs into demands for high-performing, scalable, and efficient compute server capabilities, leading businesses can’t sit on their hands and try to make do with legacy systems or expensive on-demand cloud options if they want to innovate and be more competitive.”

Do you agree or disagree with Aberdeen’s findings? Share with us on LinkedIn, X, or Facebook. We’d love to hear from you!

Image source: Shutterstock


Sumeet Wadhwani

Asst. Editor, Spiceworks Ziff Davis

An earnest copywriter at heart, Sumeet is what you'd call a jack of all trades, rather techs. A self-proclaimed 'half-engineer', he dropped out of Computer Engineering to answer his creative calling pertaining to all things digital. He now writes what techies engineer. As a technology editor and writer for News and Feature articles on Spiceworks (formerly Toolbox), Sumeet covers a broad range of topics from cybersecurity, cloud, AI, emerging tech innovation, hardware, semiconductors, et al. Sumeet compounds his geopolitical interests with cartophilia and antiquarianism, not to mention the economics of current world affairs. He bleeds Blue for Chelsea and Team India! To share quotes or your inputs for stories, please get in touch on sumeet_wadhwani@swzd.com