Extropic Develops Probabilistic Computing Chip for AI Efficiency

Image Credit: Jacky Lee

Extropic, a startup founded in 2022 by Guillaume Verdon and Trevor McCourt, is working on a computer chip that uses thermodynamic fluctuations in electronic circuits to perform probabilistic computations. This approach seeks to address the increasing energy demands of artificial intelligence and high-performance computing. The company recently shared technical details with WIRED in a report published on March 26, 2025, positioning its technology as a possible alternative to conventional silicon processors in datacenters.

[Read More: Databricks Secures $9.5B Funding, Soars to $98B Valuation Amid Rapid AI Growth]

Probabilistic Bits and Their Mechanism

The core of Extropic’s design is the probabilistic bit, or "p-bit", which operates differently from the fixed 1 or 0 states of standard binary bits. A technical document shared with WIRED includes an oscilloscope measurement showing a p-bit switching between states, with its likelihood of being a 1 or 0 controlled by the company’s engineers. This functionality relies on random thermodynamic fluctuations—variations in electric charge within circuits—that are typically minimized in traditional designs. By connecting multiple p-bits, Extropic states it can perform complex probabilistic calculations. Verdon, the CEO, described this in WIRED, saying,

“This signal on the oscilloscope may seem simple at first glance, but it demonstrates a key building block for our platform, representing the birth of the world’s first scalable, mass-manufacturable, and energy-efficient probabilistic computing platform”.
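As an illustration only, the switching behaviour described above can be modeled in software as a biased random bit. The `p_bit` function below is a hypothetical sketch based on the standard p-bit model from the research literature, in which a sigmoid maps a control input to the probability of reading a 1; it does not represent Extropic's actual circuit design, which derives its randomness from thermal noise rather than a software generator.

```python
import math
import random

def p_bit(bias: float) -> int:
    """Sample one probabilistic bit.

    In the standard p-bit model, the probability of reading a 1
    is a sigmoid of the control input: p(1) = 1 / (1 + exp(-bias)).
    bias = 0 gives a fair coin; a large |bias| pins the bit near
    a fixed 0 or 1, recovering ordinary deterministic behaviour.
    """
    p_one = 1.0 / (1.0 + math.exp(-bias))
    return 1 if random.random() < p_one else 0

# An unbiased p-bit flips randomly between 0 and 1, like the
# oscilloscope trace described in the article.
samples = [p_bit(0.0) for _ in range(10_000)]
print(sum(samples) / len(samples))  # ≈ 0.5

# A positive bias skews the distribution toward 1.
biased = [p_bit(2.0) for _ in range(10_000)]
print(sum(biased) / len(biased))    # ≈ 0.88, since sigmoid(2) ≈ 0.881
```

Coupling many such bits, so that each bit's bias depends on its neighbours' states, is what turns this primitive into a machine for sampling from complex probability distributions.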

Previous efforts in thermodynamic computing often used superconducting circuits requiring cryogenic cooling, at temperatures near 1 kelvin. Extropic, however, uses standard silicon to manage these fluctuations at room temperature, as outlined in its March 2024 Litepaper from extropic.ai. This change eliminates the need for specialized cooling systems, which should simplify practical deployment. The Litepaper notes that early prototypes used superconducting aluminum with Josephson junctions, but the company has since shifted to silicon-based designs.

[Read More: AI Chips Are Evolving to Mimic the Human Brain with New Spintronic Tech]

Focus on Monte Carlo Simulations

The chip targets Monte Carlo simulations, a method that samples probabilities to model uncertainty, used in areas like finance, biology, and AI reasoning models such as OpenAI’s o3 and Google’s Gemini 2.0 Flash Thinking. Verdon told WIRED, “The reality is that the most computationally-hungry workloads are Monte Carlo simulations”, indicating the chip’s intended use extends to probabilistic tasks in high-performance computing. Extropic claims its hardware can perform these tasks more efficiently by leveraging natural randomness, unlike digital processors that rely on energy-intensive pseudo-random algorithms.
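As a concrete illustration of the method, the classic Monte Carlo estimate of pi samples random points in the unit square and counts how many fall inside a quarter circle. The sketch below uses Python's software pseudo-random generator, which is exactly the energy-intensive step Extropic proposes to replace with hardware noise; `monte_carlo_pi` is an illustrative name, not part of Extropic's platform.

```python
import random

def monte_carlo_pi(n: int) -> float:
    """Estimate pi by uniform random sampling.

    A point (x, y) drawn uniformly from the unit square lands
    inside the quarter circle with probability pi/4, so the
    observed fraction, times 4, converges to pi as n grows.
    """
    inside = 0
    for _ in range(n):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n

print(monte_carlo_pi(1_000_000))  # ≈ 3.14
```

Every iteration consumes fresh random numbers, which is why the quality and cost of randomness dominates large Monte Carlo workloads and why hardware that produces randomness natively could, in principle, be far cheaper per sample.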

According to the Litepaper, their “Extropic accelerators” offer “many orders of magnitude” improvement in runtime and energy use for probabilistic algorithms.

[Read More: AI Transforms Data Management: Boosting Efficiency & Security Across Industries]

Efficiency Goals and Industry Comparison

Extropic states its chip could be three to four orders of magnitude (1,000 to 10,000 times) more energy-efficient than existing hardware. This level of efficiency, if achieved, could lower the power needs of AI datacenters; data centres worldwide consumed an estimated 460 terawatt-hours in 2022, according to the International Energy Agency (IEA). The Litepaper cites challenges like the slowing of Moore’s Law and thermal noise in transistors as driving factors, though it provides no specific performance data beyond oscilloscope signals and theoretical projections.

Nvidia’s H100 GPU, a dominant chip in AI training, delivers up to 3,958 teraflops of FP8 compute (with sparsity) while drawing up to 700 watts per chip. Extropic’s efficiency claims remain untested against such standards, and its focus on probabilistic workloads differs from Nvidia’s strength in matrix-based computations.

[Read More: Chainlink’s AI & Blockchain Initiative: Transforming Financial Data Processing for FMI Leaders]

Current Progress and Obstacles

Extropic has built a superconducting prototype using aluminum-based Josephson junctions, backed by $14.1 million in seed funding from investors including Kindred Ventures and Valor Equity Partners. This prototype, operating at cryogenic temperatures, served as an initial demonstration of p-bit technology. The company has since moved to room-temperature silicon designs, with plans for GPU-like expansion cards.

No chips are commercially available, and independent validation is lacking. A 2021 Purdue University study confirms the theoretical basis of p-bits but notes difficulties in scaling and noise management. Extropic’s use of silicon suggests manufacturability within existing infrastructure, though production timelines and costs are not public.

[Read More: DeepSeek AI Banned in Multiple Countries Over Data Privacy & Security Concerns]

Position in the Market

Nvidia commands over 90% of the AI training chip market, specializing in matrix computations. Extropic’s probabilistic focus suggests a different application area. WIRED notes that Extropic’s founders view taking on Nvidia as a formidable challenge, given that switching to a new architecture involves substantial adjustments for users. Industry trends, including datacenters near nuclear plants, highlight energy concerns that could favour efficiency-driven designs.

[Read More: Tokyo University of Science Pioneers High-Efficiency Sodium-Ion Batteries Using Machine Learning]

Source: Wired
