Companies Say the Risks of 'Open' Artificial Intelligence Models Are Worth It -- WSJ

Dow Jones

By Steven Rosenbush and Isabelle Bousquette

SAN JOSE, Calif. -- Nvidia is mobilizing an effort to create "open" artificial intelligence models that make their code publicly available and provide an alternative to proprietary giants such as OpenAI and Anthropic.

The Nemotron Coalition will advance development of frontier-level foundation models through shared expertise, data and compute, Nvidia Chief Executive Jensen Huang said Monday. Inaugural members include model developers and AI labs Cursor, Black Forest Labs, LangChain, Mistral AI, Perplexity, Reflection AI, Sarvam and Thinking Machines Lab.

Nvidia said the first model built by the coalition will underpin its coming Nemotron 4 family of open models. The coalition will support transparency, collaboration, sovereignty and broader access to intelligence, "ensuring the future of AI is shaped with the world and built for the world," Huang said.

To date, proprietary developers have largely been in the vanguard of creating frontier models. But open model developers have been closing the gap, led by companies such as Mistral. Open models are generally free for users to download and modify. True open-source models allow full access to training data and code, while open-weight models share the numerical parameters, or "weights," that underlie them.

Perplexity has gained attention with its AI agent Computer. And early last year, China's DeepSeek impressed fellow engineers and computer scientists, including Huang. Other open models from China, such as Alibaba Cloud's Qwen family, have impressed as well, although the models from China have stirred security and governance concerns.

Mistral said Tuesday it was introducing Forge, a system that enterprises can use to build frontier-grade AI models grounded in their proprietary knowledge.

Roles for open and closed

Companies at Nvidia's annual GTC event this week said open models played crucial roles in their AI initiatives. Leaders from Capital One, ServiceNow and CrowdStrike said they appreciated open model strengths such as customizability and lower cost, even as they grappled with the challenges of making them secure.

So far, proprietary developers have set the standard for frontier models. Now, the market's emphasis is moving to inference, or the use of trained models by businesses and other users. When it comes to business applications, large and expensive cutting-edge frontier models aren't always the best tool for the job.

As inference takes off, business demand is growing for smaller, customized and lower-cost models, and that is creating a new opening for the open approach.

Most AI architectures have room for both open and closed models and often assign them different functions.

"At the end of the day," CrowdStrike Chief Technology Officer Elia Zaitsev said, closed models are "general purpose and very effective, but they are not customizable to specific use cases or niche domains."

Closed models are generally trained on huge amounts of publicly available data from the internet, but struggle when confronted with tasks where training data might be less readily available, Zaitsev said. That includes CrowdStrike's field of cybersecurity, where data for identifying threats is constantly changing as the threats and attackers themselves rapidly shift.

Managing supply chain risk

Zaitsev said that, thus far, much of the open model innovation has come from Chinese companies. Their models are innovative, but using them poses something of a supply chain risk, he said: a developer could, for example, create a "back door" that an external adversary could take advantage of.

At Capital One, Milind Naphade, head of AI foundations, said using open models like Nvidia's Nemotron and OpenAI's GPT-OSS gives the company more control and better performance thanks to the ability to customize on its own data.

"It's a level of customization that's simply impossible with...fine-tuning that can be done on closed models," Naphade said.

Capital One still uses closed models for use cases such as employee productivity, but for customer-facing tools, it opts for open models, according to Naphade.

"A material amount of the AI that's ultimately going to be consumed is going to be done" with such models, said CoreWeave co-founder and Chief Executive Michael Intrator, who added that the company delivers a sizable amount of compute to French AI lab Mistral.

By ramping up support for open models, Nvidia will promote better security and governance, higher performance standards and the growth of applications that run on its systems, according to Heath Terry, global head of technology and communications research at Citi. Nvidia will support open models in the ways enterprise customers need, Terry said, raising the floor for what a company must deliver to charge a premium.

"It's Nvidia's way of continuing to advance the technology and drive more adoption, which drives more demand for chips," he said.

Write to Steven Rosenbush at steven.rosenbush@wsj.com and Isabelle Bousquette at isabelle.bousquette@wsj.com


(END) Dow Jones Newswires

March 18, 2026 07:00 ET (11:00 GMT)

Copyright (c) 2026 Dow Jones & Company, Inc.
