Scaling Down: How Solidigm SSDs Help Keep Data Center Costs More Efficient

Content provided by Endeavor Business Media. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Endeavor Business Media or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://podcastplayer.com/legal.

Charting the Future of AI Storage Infrastructure

In this episode, Solidigm Director of Strategic Planning Brian Jacobosky guides listeners through a tech-forward conversation on how storage infrastructure is helping redefine the AI-era data center. The discussion frames storage as more than just a cost factor; it's also a strategic building block for performance, efficiency, and savings.

Storage Moves to the Center of AI Data Infrastructure

Jacobosky explains how, in the AI-driven era, storage is being elevated from an afterthought judged mainly on dollars per gigabyte to a core priority: maximizing GPU utilization, managing soaring power draw, and unlocking space savings. He illustrates how every watt and every square inch counts: as GPU compute scales dramatically, storage is being engineered for maximum density and throughput.

High-Capacity SSDs as a Game-Changer

Jacobosky spotlights the Solidigm D5-P5336 122TB SSD as emblematic of the shift. Rather than a simple technical refresh, these drives represent a tectonic realignment in how data centers are being designed for huge capacity and optimized performance. With all-flash deployments offering up to nine times the space savings of hybrid architectures, Jacobosky underscores how SSD density can enable more GPU scale within fixed power and space budgets. That trajectory could even lead to a 1-petabyte SSD by the end of the decade.
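
To put the density argument in concrete terms, the back-of-envelope sketch below compares rack space for an all-flash build against a hybrid HDD-plus-flash build at the same usable capacity. All drive capacities and drives-per-enclosure figures are illustrative assumptions for the sketch, not numbers quoted in the episode.

```python
# Back-of-envelope rack-space comparison: all-flash (122 TB SSDs) versus a
# hybrid HDD tier. Every figure below is an illustrative assumption, not a
# number quoted in the episode.

TARGET_CAPACITY_TB = 100_000  # 100 PB usable target (assumed)

# All-flash tier: assume 122 TB SSDs, 24 drives per 2U server
SSD_TB = 122
SSD_DRIVES_PER_2U = 24
ssd_servers = -(-TARGET_CAPACITY_TB // (SSD_TB * SSD_DRIVES_PER_2U))  # ceiling division
ssd_rack_units = ssd_servers * 2

# Hybrid tier: assume 24 TB HDDs, 36 drives per 4U enclosure (flash cache ignored)
HDD_TB = 24
HDD_DRIVES_PER_4U = 36
hdd_enclosures = -(-TARGET_CAPACITY_TB // (HDD_TB * HDD_DRIVES_PER_4U))
hdd_rack_units = hdd_enclosures * 4

print(f"All-flash: {ssd_servers} x 2U servers     = {ssd_rack_units} rack units")
print(f"Hybrid   : {hdd_enclosures} x 4U enclosures = {hdd_rack_units} rack units")
print(f"Space ratio (hybrid / all-flash): {hdd_rack_units / ssd_rack_units:.1f}x")
```

With these assumed inputs the ratio lands in the mid-single digits; the exact multiple depends entirely on the drive capacities and enclosure densities chosen.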

Embedded Efficiency

The episode brings environmental considerations to the forefront. Jacobosky shares how an “all‑SSD” strategy can dramatically slash physical footprint as well as energy consumption. From data center buildout through end-of-life drive retirement, efficiency drives both operational cost savings and ESG benefits: less concrete and steel, lower power draw, and less e‑waste.

Pioneering Storage Architectures and Cooling Innovation

Listeners learn how AI-first innovators such as neocloud-style providers and sovereign AI operators lead the charge in deploying next-generation storage. Jacobosky also previews the Solidigm PS-1010 in the E1.S form factor, a drive built for fanless NVIDIA server designs that brings direct-to-chip, cold-plate-cooled SSDs into GPU servers. He predicts that this systems-level integration will become a standard for high-density AI infrastructure.

Storage as a Strategic Investment

Solidigm challenges the notion that high-capacity storage is cost prohibitive. Within the framework of the AI token economy, Jacobosky explains that the true measures become cost per token and time to first token; when storage is optimized for performance, capacity, and efficiency, the total cost of ownership (TCO) often proves favorable once it is evaluated in full.
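
As a rough illustration of the cost-per-token framing, the sketch below amortizes cluster cost over tokens served and shows how higher GPU utilization (for example, when a faster storage tier keeps accelerators fed) can offset a higher storage outlay. Every input is an illustrative assumption, not a figure from the episode.

```python
# Rough cost-per-token sketch: amortize infrastructure cost over tokens served.
# All inputs are illustrative assumptions, not figures from the episode.

def cost_per_million_tokens(capex_usd, opex_usd_per_year, years,
                            tokens_per_second, utilization):
    """Amortized cost per one million output tokens."""
    total_cost = capex_usd + opex_usd_per_year * years
    seconds = years * 365 * 24 * 3600
    tokens_served = tokens_per_second * utilization * seconds
    return total_cost / tokens_served * 1_000_000

# Scenario A: GPUs frequently stall on a slower storage tier (lower utilization)
baseline = cost_per_million_tokens(
    capex_usd=10_000_000, opex_usd_per_year=1_500_000, years=4,
    tokens_per_second=50_000, utilization=0.55)

# Scenario B: faster flash tier keeps GPUs busier; slightly higher capex
all_flash = cost_per_million_tokens(
    capex_usd=10_500_000, opex_usd_per_year=1_400_000, years=4,
    tokens_per_second=50_000, utilization=0.75)

print(f"Baseline : ${baseline:.2f} per 1M tokens")
print(f"All-flash: ${all_flash:.2f} per 1M tokens")
```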

Looking Ahead: Memory Wall, Inference Workloads, Liquid Cooling

Jacobosky ends with a look ahead to where storage innovation will lead in the next five years. As AI models grow in size and complexity, he argues, storage is increasingly acting as an extension of memory, breaking through the “memory wall” for large inference workloads. Companies will design infrastructure from the ground up around liquid cooling and scalable storage that supports massive model deployments without compromising latency.
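
To see why inference workloads push storage into the memory hierarchy, the sizing sketch below estimates the KV-cache working set of a long-context request and compares it with GPU memory. The model shape, context length, and HBM capacity are illustrative assumptions, not numbers discussed in the episode.

```python
# KV-cache sizing sketch: when working sets exceed GPU memory, caches spill to
# host memory and then to fast SSD tiers -- the sense in which storage extends
# memory past the wall. All model and hardware figures are assumptions.

def kv_cache_gib(layers, kv_heads, head_dim, context_tokens,
                 bytes_per_value=2, batch=1):
    """Approximate key+value cache size for a transformer decoder, in GiB."""
    values = 2 * layers * kv_heads * head_dim * context_tokens * batch  # K and V
    return values * bytes_per_value / 2**30

# Hypothetical large model: 80 layers, 8 KV heads of dim 128, FP16 cache
per_request = kv_cache_gib(layers=80, kv_heads=8, head_dim=128,
                           context_tokens=128_000)
hbm_per_gpu_gib = 80          # assumed HBM per GPU
concurrent_requests = 64

total_cache = per_request * concurrent_requests
print(f"KV cache per 128k-token request  : {per_request:.1f} GiB")
print(f"Cache for {concurrent_requests} concurrent requests  : {total_cache:.1f} GiB")
print(f"HBM on one GPU                   : {hbm_per_gpu_gib} GiB")
```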

This episode is essential listening for data center architects, AI infrastructure strategists, and sustainability leaders looking to understand how storage is fast becoming a defining factor in the AI-ready data centers of the future.
