🚨 Data Centers Are Going to Space!
Space data centers, efficient chips and a new generation of intelligent systems.

Hey AI Enthusiasts!
This week’s roundup is packed with massive breakthroughs, from data centers in space to AI chips that harness thermodynamics to slash energy use.
Let’s dive in!
In today’s insights:
🛰 Starcloud launches first space-based AI data center powered by NVIDIA GPUs
💻 Cursor 2.0 debuts with AI-agent coding and a 4× faster model
💰 OpenAI readies $1 trillion IPO, one of tech’s biggest ever
⚙️ Extropic reveals chip that runs AI 10,000× more efficiently than GPUs
Read time: 5 minutes.
🗞️ Recent Updates
Nvidia
🛰️ Data Centers Are Going to Space
Space isn’t just for stars anymore. 🌠
Starcloud’s H100-powered satellite brings sustainable, high-performance computing beyond Earth.
Learn more: nvda.ws/47eYZvC
— NVIDIA (@nvidia)
10:06 PM • Oct 21, 2025
The AI Field: Starcloud, an NVIDIA-backed startup, is launching the first space-based data center satellite in November—featuring an NVIDIA H100 GPU orbiting Earth. The move could revolutionize how we power AI by tapping into unlimited solar energy and using the vacuum of space for cooling.
Details:
The Starcloud-1 satellite will deliver 100x more GPU compute power than any previous space operation, marking the first time a data center-class GPU operates in orbit.
Space-based data centers eliminate the need for water cooling and cut energy costs by roughly 10x by drawing on constant solar power and radiating heat directly into space, reducing lifetime CO2 emissions by 10x.
Early applications include real-time Earth observation analysis, wildfire detection, and synthetic-aperture radar imaging that generates 10GB of data per second, enabling insights in minutes instead of hours.
Starcloud's CEO predicts that within 10 years, nearly all new data centers will be built in outer space rather than on Earth.
Why This Matters: As AI demands skyrocket, traditional data centers face mounting energy and water consumption challenges. Moving computation to space could be a game-changer for sustainability—offering virtually unlimited renewable energy with no environmental impact beyond the initial launch. If successful, this isn't just a novel experiment; it's a potential blueprint for the future infrastructure of AI, where our most powerful computing happens above the clouds rather than beneath them.
How can AI power your income?
Ready to transform artificial intelligence from a buzzword into your personal revenue generator?
HubSpot’s groundbreaking guide "200+ AI-Powered Income Ideas" is your gateway to financial innovation in the digital age.
Inside you'll discover:
A curated collection of 200+ profitable opportunities spanning content creation, e-commerce, gaming, and emerging digital markets—each vetted for real-world potential
Step-by-step implementation guides designed for beginners, making AI accessible regardless of your technical background
Cutting-edge strategies aligned with current market trends, ensuring your ventures stay ahead of the curve
Download your guide today and unlock a future where artificial intelligence powers your success. Your next income stream is waiting.
Cursor/Coding Tool
🚨 Cursor 2.0 Brings Multi-Agent AI Coding
Introducing Cursor 2.0.
Our first coding model and the best way to code with agents.
— Cursor (@cursor_ai)
4:12 PM • Oct 29, 2025
The AI Field: Cursor has launched version 2.0 with a completely redesigned interface built around AI agents rather than files, plus its own custom coding model called Composer that's 4x faster than comparable models. The shift marks a major pivot toward autonomous, multi-agent software development.
Details:
Composer is Cursor's new "frontier model" optimized for low-latency coding, completing most tasks in under 30 seconds and trained specifically to navigate large, complex codebases with semantic search capabilities.
The platform can now run multiple AI agents in parallel without interference, and the team found that assigning the same problem to different models and picking the best solution significantly improves output quality (a minimal sketch of this pattern follows the list below).
Cursor 2.0 includes a native browser tool that lets AI agents test their own code automatically, iterating until they produce correct results—a step toward truly autonomous development.
The redesigned interface focuses on outcomes over files, though developers can still access code directly or switch back to a "classic IDE" view when needed.
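To make the "same task, many models, keep the best" idea concrete, here is a minimal Python sketch of that dispatch pattern. It is not Cursor's actual API; run_agent and score_candidate are hypothetical stand-ins (in practice the score might be the number of tests a candidate patch passes).

```python
# Illustrative sketch of best-of-N agent dispatch; run_agent and
# score_candidate are hypothetical stand-ins, not Cursor internals.
import asyncio
import random

async def run_agent(model: str, task: str) -> str:
    """Pretend each model produces a candidate patch for the task."""
    await asyncio.sleep(random.uniform(0.1, 0.3))  # simulate model latency
    return f"[patch from {model} for: {task}]"

def score_candidate(patch: str) -> float:
    """Stand-in for a real evaluator, e.g. fraction of tests passed."""
    return random.random()

async def best_of_n(task: str, models: list[str]) -> str:
    # Run every agent in parallel on the same task, keep the top-scoring result.
    candidates = await asyncio.gather(*(run_agent(m, task) for m in models))
    return max(candidates, key=score_candidate)

best = asyncio.run(best_of_n("fix the flaky login test", ["model-a", "model-b", "model-c"]))
print(best)
```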
Why This Matters: The bottleneck in AI coding is shifting from writing code to reviewing and testing it. Cursor's approach tackles both problems by making agent-generated changes easier to review and enabling agents to validate their own work. If AI agents can not only write but also test and refine code autonomously, we're moving closer to a future where developers act more as architects and overseers than line-by-line coders. This isn't just an incremental update—it's a fundamental rethinking of what software development looks like when AI becomes a true collaborative partner.

OpenAI
💰 OpenAI Readies a $1 Trillion IPO
The AI Field: OpenAI is preparing for an initial public offering that could value the company at up to $1 trillion, with potential regulatory filings as early as the second half of 2026 and a possible 2027 listing. The move would mark one of the largest IPOs in history, despite the company currently losing billions annually.
Details:
OpenAI is looking to raise at least $60 billion, with an annualized revenue run rate expected to reach about $20 billion by year-end, though losses continue to mount at the company, which is currently valued at around $500 billion.
The IPO preparations follow a complex restructuring that reduces OpenAI's reliance on Microsoft, with the nonprofit OpenAI Foundation now holding a 26% stake in the for-profit OpenAI Group.
An IPO would enable more efficient capital raising and larger acquisitions using public stock, helping to finance CEO Sam Altman's plans to pour trillions of dollars into AI infrastructure.
In the first half of the year, OpenAI lost $13.5 billion on revenue of $4.3 billion and is on track to lose $27 billion for the year, with estimates showing the company may burn $115 billion by 2029.
Why This Matters: A $1 trillion valuation for a company losing billions annually raises serious questions about AI market frothiness and echoes dot-com bubble comparisons. OpenAI's public debut could set the tone for the entire AI sector—either validating sky-high valuations or triggering a reality check. With massive infrastructure needs and mounting losses, this IPO will test whether investors are willing to bet trillions on AI's future or if we're witnessing unsustainable hype. The outcome could reshape how we think about AI company valuations for years to come.
Extropic
⚙️ Extropic Unveils a Thermodynamic AI Chip
Hello Thermo World.
— Extropic (@extropic)
5:00 PM • Oct 29, 2025
The AI Field: Extropic has unveiled a radical new "thermodynamic sampling unit" (TSU) chip that uses probabilistic computing instead of traditional matrix math, claiming simulations show it can run generative AI workloads using 10,000 times less energy than GPUs. The startup has already shipped its first development hardware to AI labs and weather companies.
Details:
Instead of performing deterministic computations the way CPUs and GPUs do, TSUs produce samples from programmable probability distributions using "p-bits": circuits that perform millions of coin flips per second, each consuming roughly 10,000x less energy than a single floating-point operation.
Founded by former Google quantum researchers and backed by $14.1 million in seed funding, Extropic has released its XTR-0 development platform and open-sourced a Python library called "thrml" to let developers simulate TSU behavior on existing GPUs.
The key innovation is minimizing energy by keeping all communication strictly local: circuits only interact with nearby neighbors, avoiding the expensive data movement that makes GPUs power-hungry (a toy software sketch of this idea follows the list below).
Extropic will ship its Z-1 chip next year, designed to run a new Denoising Thermodynamic Model that creates images and videos, with generative AI as the primary target since diffusion models naturally align with probability sampling.
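For intuition, here is a toy Python sketch of sampling with strictly local interactions: a grid of software "p-bits" where each bit flips probabilistically based only on its four nearest neighbors (a simple Ising-style Gibbs sampler). This is an assumption-laden illustration, not Extropic's thrml library or hardware design.

```python
# Toy software analogue of p-bit sampling with strictly local interactions.
# NOT Extropic's thrml API or hardware; the nearest-neighbor Ising energy
# model and all parameter values here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sweep(spins: np.ndarray, coupling: float, temperature: float) -> None:
    """Update every p-bit using only its four nearest neighbors (wrap-around grid)."""
    n = spins.shape[0]
    for i in range(n):
        for j in range(n):
            # Local field: sum of the four neighboring spins.
            local = (spins[(i - 1) % n, j] + spins[(i + 1) % n, j] +
                     spins[i, (j - 1) % n] + spins[i, (j + 1) % n])
            # Probability the p-bit settles to +1 given its neighborhood.
            p_up = 1.0 / (1.0 + np.exp(-2.0 * coupling * local / temperature))
            spins[i, j] = 1 if rng.random() < p_up else -1

spins = rng.choice([-1, 1], size=(32, 32))  # 32x32 grid of p-bits in random states
for _ in range(100):                        # repeated sweeps draw samples from the distribution
    gibbs_sweep(spins, coupling=1.0, temperature=2.0)
print("mean magnetization:", spins.mean())
```

Each update touches only adjacent cells, which is the locality property the hardware exploits to avoid costly data movement.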
Why This Matters: By 2030, AI data centers could consume 10% of all U.S. electricity, an unsustainable trajectory that threatens to become AI's greatest bottleneck. Extropic's approach fundamentally challenges the industry consensus that scaling AI simply requires bigger data centers and more power. If thermodynamic computing delivers on its promises at scale, it could remove the energy ceiling that currently limits AI development. But this is still early-stage hardware with simulated benchmarks, so the real test will be whether these radical efficiency gains translate to production systems running real-world AI workloads.
🗞️ More AI Hits
OpenAI signs a US$38 billion multi-year cloud partnership with Amazon Web Services (AWS)
OpenAI will use AWS infrastructure (including GPUs) to scale its AI workloads under this major deal.
Experts find major flaws in over 440 AI benchmark tests
A study revealed that almost all commonly used AI performance and safety benchmarks have serious issues, calling into question some claims about model abilities and safety.
South Korea proposes massively increased AI spending in next year’s budget
The South Korean president called for tripling the national AI-related budget to 10.1 trillion won (~US$6.9 billion) for 2026, emphasizing semiconductors, robotics, and shipbuilding.
AI tools could save employees nearly a full working day per week — but only with training
According to a study, employees using AI tools effectively (with training) could reclaim about 7.5 hours per week — but without training the benefits drop sharply.
Golden Nuggets
🛰 Starcloud launches first space-based AI data center powered by NVIDIA GPUs
💻 Cursor 2.0 debuts with AI-agent coding and a 4× faster model
💰 OpenAI readies $1 trillion IPO, one of tech’s biggest ever
⚙️ Extropic reveals chip that runs AI 10,000× more efficiently than GPUs
What did you think of today’s edition?
Until next time!
Olle | Founder of The AI Field

