Nvidia announces BlueField-4 STX storage architecture at GTC

Reuters
March 16, 2026

Nvidia announced the BlueField-4 STX storage reference architecture at its GTC conference. The company said STX is designed to support long-context reasoning for agentic AI systems and includes a rack-scale implementation built on the CMX context memory storage platform. Nvidia reported up to 5x higher token throughput and up to 4x higher energy efficiency compared with traditional storage approaches. Early adopters named for context memory storage included CoreWeave, Oracle Cloud Infrastructure, and Mistral AI.

Disclaimer: This news brief was created by Public Technologies (PUBT) using generative artificial intelligence. While PUBT strives to provide accurate and timely information, this AI-generated content is for informational purposes only and should not be interpreted as financial, investment, or legal advice. Nvidia Corporation published the original content used to generate this news brief via GlobeNewswire (Ref. ID: 202603161529PRIMZONEFULLFEED9672905) on March 16, 2026, and is solely responsible for the information contained therein.


