Why Artificial Superintelligence Could Arrive Sooner Than Wall Street Thinks

Motley Fool
Yesterday
  • Tech leaders are shifting from artificial general intelligence (AGI) to artificial superintelligence (ASI), signaling rapid progress.
  • Over $1 trillion in AI infrastructure spending is already under way, dwarfing previous technology buildouts.
  • The semiconductor supply chain is scaling for computational demands far beyond current commercial AI.

Wall Street sees artificial superintelligence (ASI) as a 2030s story. The evidence points to something far more imminent.

OpenAI's Sam Altman recently said his team "knows how to build AGI (artificial general intelligence)" and is "turning our aim beyond that -- to superintelligence." The language is deliberate: we know, not we hope to know. For reference, AGI refers to machines that can match human-level cognition. ASI refers to intelligence operating at speeds, scales, and domains that are orders of magnitude beyond human capacity.

This quiet shift suggests a milestone has already been crossed, and investors would be wise to position themselves accordingly.


The infrastructure tells the story

The numbers reveal unprecedented mobilization. On Jan. 21, OpenAI, Oracle, and SoftBank announced the $500 billion Stargate project, a private initiative coordinated with federal policy.

Meta Platforms (META) launched "Superintelligence Labs" almost overnight, with Mark Zuckerberg's internal memo confirming the new division's aggressive talent acquisition of 11 prominent AI researchers from competitors with nine-figure pay packages. Major data center operators are scrambling for power, with some regions seeing demand spikes that defy all previous forecasts.

Nvidia's (NVDA) financials tell their own story. In Q1 fiscal 2026, four unidentified customers accounted for 54% of the company's $44.1 billion in revenue. One customer alone spent $7 billion in a single quarter. Companies typically disclose major customers unless those relationships involve highly sensitive projects.
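As a rough sanity check on that concentration figure, the cited numbers imply those four customers together bought roughly $23.8 billion of chips in a single quarter. The sketch below uses only the figures stated above; the split among the three undisclosed customers is hypothetical.

```python
# Rough arithmetic on Nvidia's Q1 FY2026 revenue concentration,
# using only the figures cited in the article.
total_revenue_b = 44.1   # quarterly revenue, in $ billions
concentration = 0.54     # share attributed to four unidentified customers

four_customer_revenue_b = total_revenue_b * concentration
print(f"Four customers combined: ~${four_customer_revenue_b:.1f}B")

# One customer alone spent $7B; the remaining three therefore averaged
# roughly this much each (illustrative -- the actual split is undisclosed):
remaining_avg_b = (four_customer_revenue_b - 7.0) / 3
print(f"Implied average for the other three: ~${remaining_avg_b:.1f}B")
```

Even the implied average for the undisclosed buyers would rank among the largest hardware budgets in corporate history.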

A market opportunity beyond comprehension

Government legislation laid the groundwork. The CHIPS and Science Act allocated $52.7 billion for semiconductor manufacturing, with additional billions for AI research. The Infrastructure Investment and Jobs Act added $62 billion for energy modernization.

Recent semiconductor legislation raised tax credits from 25% to 35% for R&D, with the Department of Energy receiving $150 million to prepare scientific data for AI models and "seed efforts for self-improving artificial intelligence models for science and engineering."

Combined with private projects like Stargate, more than $1 trillion is being deployed for AI infrastructure within the next five years, more than double the entire global cloud computing market in 2024.

The competitive landscape is already decided

While investors debate whether Microsoft or Alphabet wins the AI race, governments are treating AI supremacy as a national security imperative. Export controls on Nvidia's H20 chips bound for China resulted in $5.5 billion in charges. Moreover, the FY2025 National Defense Authorization Act specifically earmarks $143.8 billion for science and technology research and development (R&D) and calls for AI pilot programs in national security.

In the supply chain, the winners are clear: ASML Holding (ASML) holds a monopoly on extreme ultraviolet lithography. Lam Research (LRCX) and Applied Materials (AMAT) dominate critical semiconductor equipment markets.

There are no viable alternatives to these key strategic players in the AI value chain. Moreover, it's simply too late in the game for meaningful competitors to emerge. ASML, Lam Research, and Applied Materials look like durable buy-and-hold candidates in this environment.

The acceleration is real, even if the timeline isn't certain

Nvidia's data center revenue soared 409% to $18.4 billion in Q4 fiscal 2024 -- growth far beyond what incremental AI upgrades would justify.

History offers precedent: Military technology often runs five to 10 years ahead of public release. GPS served military needs in the 1970s; it reached consumers decades later. The internet itself emerged from ARPANET years before the public had email.

Consider the transformer architecture that underpins nearly all modern AI systems. Google publicly introduced transformers in 2017, revolutionizing machine learning. But if the military-civilian lag holds, similar breakthroughs could have emerged in classified programs as early as 2007 to 2012, roughly when DARPA began funding radical computing efficiency programs.

In 2012, DARPA awarded Nvidia contracts under the PERFECT program -- up to $20 million over 5.5 years -- to explore radical computing efficiencies. The 2015 launch of DARPA's XAI (Explainable AI) program further suggests that complex AI systems were already in development within classified programs. Whether those programs succeeded fully is beside the point. The pattern is clear: Today's public AI capabilities may reflect technologies that originated in classified or military programs years earlier.

Meta's pivot to "Superintelligence Labs" and Altman's shift from AGI to ASI development suggest that insiders believe the first threshold, AGI, has been crossed. The transition from AGI to ASI will require breakthroughs, but the scale of current investment indicates confidence that those breakthroughs are within reach. Altman's "event horizon" language implies an irreversible process already in motion.

Valuation requires a new framework

Valuing Nvidia or ASML by traditional price-to-earnings ratios misses the broader transformation under way. Nvidia trades at 51.4 times trailing earnings -- a multiple that may look conservative if ASI materializes before 2030.
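To make that "conservative multiple" claim concrete: a high trailing P/E compresses quickly when earnings compound at high rates and the share price is held constant. The sketch below takes the 51.4x figure from above; the growth scenarios are purely hypothetical illustrations, not forecasts.

```python
# How a trailing P/E compresses under sustained earnings growth,
# holding the share price constant. Growth rates are hypothetical.
def forward_pe(trailing_pe: float, annual_growth: float, years: int) -> float:
    """P/E after `years` of earnings compounding at `annual_growth`."""
    return trailing_pe / (1 + annual_growth) ** years

trailing = 51.4  # Nvidia's trailing multiple cited above
for growth in (0.20, 0.40, 0.60):
    pe = forward_pe(trailing, growth, years=3)
    print(f"{growth:.0%} annual earnings growth -> ~{pe:.1f}x in 3 years")
```

Under the 60% scenario, three years of growth would shrink the multiple to roughly 12.6x -- which is the arithmetic behind the argument that today's headline valuation can mislead if the growth assumption holds.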

The infrastructure spending alone ensures hundreds of billions in semiconductor and AI revenue. These investments are already in motion, with no historical parallel for the speed or scale.

Investment implications

For conservative exposure, diversify across the AI supply chain. Exchange-traded funds such as Invesco QQQ Trust and Vanguard Information Technology ETF provide broad coverage of the sector's growth.

For more direct exposure, focus on irreplaceable players: Nvidia for AI chips, ASML for lithography systems, and Lam Research for deposition and etch equipment.

For higher-risk, asymmetric upside, consider emerging AI chip designers or data center REITs positioned to benefit from the infrastructure boom.

The timeline has collapsed

Infrastructure spending exceeds $1 trillion. Some prominent tech leaders now speak of AGI almost in the past tense, implying that this critical threshold has already been crossed. Supply chains are scaling for demands that don't yet exist publicly.

The conclusion is clear: ASI is highly likely to come faster than the publicly stated timelines suggest. As a result, the seemingly high valuations for AI's key architects may be extremely misleading. If ASI is indeed imminent, the whole game is about to change in unimaginable ways.

Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial products; any associated discussions, comments, or posts by the author or other users should not be considered as such either. It is for general information purposes only and does not consider your own investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information; investors should do their own research and may seek professional advice before investing.
