THE STANFORD EMERGING TECHNOLOGY REVIEW 2025

A Preview of the 2025 Report on Ten Key Technologies and Their Policy Implications.

In every era, technological discoveries bring both promise and risk. Rarely,
however, has the world experienced technological change at the speed and
scale we see today. From nanomaterials one fifty-thousandth the width of a
human hair to commercial satellites and other private-sector
technologies deployed in outer space, breakthroughs are rapidly reshaping
markets, societies, and geopolitics. What’s more, US technology policy is no longer
the exclusive province of government. Instead, inventors and investors
are making decisions with enormous policy consequences, even if they may not
always realize it. Artificial intelligence (AI) algorithms are imbued with policy
choices about which outcomes are desired and which are not. Nearly every new
technology, from bioengineering new medicines to building underwater research
drones, has both commercial and military applications. Private-sector investment,
too, simultaneously generates both national advantages and vulnerabilities by
developing new capabilities, supply chains, and dependencies, and by pursuing
commercial opportunities that may not serve long-term national interests.

While engineers and executives need to better understand the policy world,
government leaders need to better understand the engineering and business
worlds. Otherwise, public policies intended to protect against societal harms
may end up accelerating them, and efforts to align innovation with the national
interest could end up harming that interest by dampening America’s innovation
leadership and the geopolitical advantages that come with it.

In these complex times, the only certainties are that uncertainty is rampant and
the stakes are high: Decisions made today in boardrooms, labs, and government
offices are likely to set trajectories for the United States and the world for years
to come.

Now more than ever, understanding the landscape of discovery and how to
harness technology to forge a better future requires working across sectors, fields,
and generations. Universities like Stanford have a vital role to play in this effort.

In 2023, we launched the Stanford Emerging Technology Review (SETR), the first-ever
collaboration between Stanford University’s School of Engineering and the
Hoover Institution. Our goal is ambitious: transforming technology education for
decision makers in both the public and private sectors so that the United States
can seize opportunities, mitigate risks, and ensure that the American innovation
ecosystem continues to thrive.

This is our latest report surveying the state of ten key emerging technologies
and their implications. It harnesses the expertise of leading faculty in science and
engineering fields, economics, international relations, and history to identify key
technological developments, assess potential implications, and highlight what
policymakers should know.

This report is our flagship product, but it is just one element of our continuous
technology education campaign for policymakers that now involves nearly one
hundred Stanford scholars across forty departments and research institutes. In the
past year, SETR experts have briefed senior leaders across the US government—in
Congress and in the White House, Commerce Department, Defense Department,
and US intelligence community. We have organized and participated in fifteen
Stanford programs, including multiday AI and biotechnology boot camps for
congressional staff; SETR roundtables for national media and officials from
European partners and allies; and workshops convening leaders across sectors in
semiconductors, space technology, and bioengineering. And we are just getting
started.

Our efforts are guided by three observations:

  1. America’s global innovation leadership matters.
    American innovation leadership is not just important for the nation’s economy and
    security. It is the linchpin for maintaining a dynamic global technology innovation
    ecosystem and securing its benefits.

    International scientific collaboration has long been pivotal to fostering global
    peace, progress, and prosperity, even in times of intense geopolitical competition.
    During the Cold War, American and Soviet nuclear scientists and policymakers
    worked together to reduce the risk of accidental nuclear war through arms
    control agreements and safety measures. Today, China’s rise poses many new
    challenges. Yet maintaining a robust global ecosystem of scientific cooperation
    remains essential—and it does not happen by magic. It takes work, leadership,
    and a fundamental commitment to freedom to sustain the openness essential
    for scientific discovery. Freedom is the fertile soil of innovation, and it takes
    many forms: the freedom to criticize a government; to admit failure in a research
    program as a step toward future progress; to share findings openly with others;
    to collaborate across geographical and technical borders with reciprocal access
    to talent, knowledge, and resources; and to work without fear of repression or
    persecution. In short, it matters whether the innovation ecosystem is led by
    democracies or autocracies. The United States has its flaws and challenges, but
    this country remains the best guarantor of scientific freedom in the world.
  2. Academia’s role in American innovation is essential—and at risk.
    The US innovation ecosystem rests on three pillars: the government, the private
    sector, and the academy. Success requires robust research and development
    (R&D) in all three. But they are not the same, and evidence increasingly suggests
    that universities’ role as engines of innovation is at growing risk.

    Universities, along with the US National Laboratories, are the only institutions
    that conduct research on the frontiers of knowledge without regard for potential
    profit or foreseeable commercial application. This kind of research is called basic
    or fundamental research. It takes years, sometimes decades, to bear fruit. But
    without it, future commercial innovations would not be possible. Radar, global
    positioning systems (GPS), and the internet all stemmed from basic research done
    in universities. So did the recent “overnight success” of the COVID-19 mRNA
    vaccines, which relied on decades of university research that discovered mRNA
    could activate and block proteins and figured out how to deliver mRNA
    to human cells to provoke an immune response. Similarly, the cryptographic
    algorithms protecting data on the internet today would not have been possible
    without decades of academic research in pure math. And many of the advances
    in AI, from ChatGPT to image recognition, build on pioneering work done in
    university computer science departments that also trained legions of students
    who have gone on to found, fund, and lead many of today’s most important tech
    companies. In many ways and in nearly every field, America’s innovation supply
    chain starts with research universities.

    Yet evidence suggests that the engine of innovation in US research universities
    is not running as well as it could, posing long-term risks to the nation. In 2024,
    for the first time, Chinese contributions surpassed those from the United States
    in the closely watched Nature Index, which tracks eighty-two of the
    world’s premier science journals. Funding trends are also heading in the wrong
    direction. The US government is the only funder capable of making large and risky
    investments in the basic science conducted at universities (as well as at national
    laboratories) that is essential for future applications. Yet federal R&D funding has
    plummeted as a share of GDP since the 1960s, from 1.86 percent in
    1964 to just 0.66 percent in 2016. The Creating Helpful Incentives to
    Produce Semiconductors (CHIPS) and Science Act of 2022 was supposed to turn
    the tide by dramatically raising funding for basic research, but major increases
    were subsequently scrapped in budget negotiations. The United States still funds
    more basic research than China does, but Chinese investment is rising six times
    faster—and is expected to overtake US spending within a decade.

    Although private-sector investment in technology companies and associated
    university research has increased substantially, it is not a substitute for federal
    funding, which supports university R&D directed at national and public issues, not
    commercial viability.

    To be sure, the rising dominance of private industry in innovation brings significant
    benefits. But it is also generating serious, less visible risks to the health of
    the entire American innovation ecosystem. Technology and talent are migrating
    from academia to the private sector, accelerating the development of commercial
    products while eroding the foundation for the future. We are already reaching a
    tipping point in AI. In 2022, more than 70 percent of students who received PhDs
    in artificial intelligence at US universities took industry jobs, leaving fewer faculty
    to teach the next generation. As the bipartisan National Security Commission on
    Artificial Intelligence put it, “Talent follows talent.”

    Today, only a handful of the world’s largest companies have both the talent and
    the enormous computing power necessary for developing sophisticated large
    language models (LLMs) like ChatGPT. No university comes close. In 2024, for
    example, Princeton University announced that it would use endowment funds to
    purchase 300 advanced Nvidia chips for research, costing about $9 million,
    while Meta announced plans to purchase 350,000 of the same chips by year’s end,
    at an estimated cost of $10 billion.
