New brain-on-a-chip platform to deliver 460x efficiency boost for AI tasks

Network World

The Indian Institute of Science (IISc) has announced a breakthrough in artificial intelligence hardware by developing a brain-inspired neuromorphic computing platform. The platform is designed to address two of the biggest challenges facing AI hardware today: energy consumption and computational inefficiency.

Nvidia points to the future of AI hardware

CIO Business Intelligence

For CIOs deploying a simple AI chatbot or an AI that provides summaries of Zoom meetings, for example, Blackwell and NIM may not be groundbreaking developments, because lower-powered GPUs, as well as CPUs, are already available to run small AI workloads. The case for Blackwell is clear, adds Shane Rau, research VP for semiconductors at IDC.

Federal Government Signals Interest In Several Key Leading Edge Technologies

CTOvision

As technologists, we found the questions informative, insightful, and even inspiring (it was great to see the government proving it is tracking developments in the tech world). Agile Software Development. Today's top trend among software development leaders is continuous development.

Key #BigData Partnership: @Cloudera and @Koverse Team to Deliver on the Enterprise Data Hub

CTOvision

The joint development work focuses on Apache Accumulo, the scalable, high-performance distributed key/value store that is part of the Apache Software Foundation. Cloudera also offers software for business-critical data challenges, including storage, access, management, analysis, security, and search.

Why Purpose-Built Infrastructure is the Best Option for Scaling AI Model Development

CIO Business Intelligence

“In an early phase, you might submit a job to the cloud where a training run would execute and the AI model would converge quickly,” says Tony Paikeday, senior director of AI systems at NVIDIA. Developers find that a training job now takes many hours or even days, and in the case of some language models, it could take many weeks.

Getting infrastructure right for generative AI

CIO Business Intelligence

Facts, it has been said, are stubborn things. For generative AI, a stubborn fact is that it consumes very large quantities of compute cycles, data storage, network bandwidth, electrical power, and air conditioning. In storage, the curve is similar, with growth from 5.7% of AI storage in 2022 to 30.5%.

Nvidia speeds AI, climate modeling

CIO Business Intelligence

Nvidia is hoping to make it easier for CIOs building digital twins and machine learning models to secure enterprise computing, and even to speed the adoption of quantum computing with a range of new hardware and software.
