The idea is deceptively simple: mount a high-performance compute node on the exterior of newly built homes, tap into the electrical capacity most households don’t use, and aggregate that power into a distributed AI data center. Instead of waiting years to build centralized facilities, XFRA spreads infrastructure across thousands of residences—turning neighborhoods into compute clusters.
The Numbers That Make It Real
Traditional data center expansion is slow and expensive. Building a 100-megawatt facility typically takes three to five years and costs upwards of $15 million per megawatt. According to Span CEO Arch Rao, XFRA can match that capacity by deploying nodes across 8,000 homes in roughly six months—at about $3 million per megawatt.
That’s not a marginal improvement. It’s a structural shift: six times faster, five times cheaper.
The economics hinge on a simple fact about residential electricity usage. Most U.S. homes are provisioned with 200-amp service at 240 volts, which works out to 48 kilowatts of capacity, yet they typically draw only 40–45% of it. More than half of that capacity sits idle for large portions of the day, which is how XFRA can tap roughly 19.2 kilowatts per home without crowding out the household's own usage.
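The arithmetic behind those claims can be checked in a few lines. A rough sketch, using only the figures stated above (panel rating from Ohm's-law multiplication; the utilization and deployment numbers are the article's):

```python
# Back-of-the-envelope check of the stated figures (illustrative only).

PANEL_AMPS = 200      # typical residential service
PANEL_VOLTS = 240
UTILIZATION = 0.45    # upper end of the typical 40-45% household draw

panel_kw = PANEL_AMPS * PANEL_VOLTS / 1000  # 48.0 kW of provisioned capacity
idle_kw = panel_kw * (1 - UTILIZATION)      # >= 26.4 kW of slack even at peak usage

# Aggregate comparison: 8,000 homes standing in for a 100 MW facility.
HOMES = 8_000
TARGET_MW = 100
per_home_kw = TARGET_MW * 1000 / HOMES      # 12.5 kW needed per home on average

print(f"panel capacity: {panel_kw:.1f} kW")
print(f"idle headroom:  {idle_kw:.1f} kW at {UTILIZATION:.0%} utilization")
print(f"per-home share: {per_home_kw:.1f} kW, inside the ~19.2 kW XFRA taps")
```

The per-home share needed to replace a 100 MW facility (12.5 kW) sits comfortably below the ~19.2 kW each node can draw, which is why the aggregation works on paper.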
XFRA’s orchestration software captures that slack, distributing AI workloads across homes based on latency requirements, availability, and grid conditions. To customers—hyperscalers and AI cloud providers—it appears as a unified compute resource, no different from a traditional data center.
What’s Actually Inside the Box
Each XFRA node is effectively a compact enterprise data center:
- 16× Nvidia RTX PRO 6000 Blackwell Server Edition GPUs
- 96GB ECC GDDR7 memory per GPU
- 4× AMD EPYC CPUs
- 3TB of system RAM
- 24-port gigabit networking
These aren’t consumer-grade components; they’re high-end inference machines. The GPUs themselves are fanless, with a liquid cooling system carrying heat out of the enclosure instead. That detail matters: noise complaints are one of the biggest friction points for traditional data centers, and XFRA sidesteps them by design.
Marc Spieler, Nvidia’s Senior Managing Director of Global Energy Industry, put it bluntly: the bottleneck in scaling AI infrastructure isn’t chips—it’s power access. Homes already have it.
More Than Compute: A Virtual Power Plant
XFRA isn’t just about compute distribution—it’s also an energy play.
Each node integrates with a home battery system, allowing it to:
- Buffer peak demand
- Respond to utility demand-response signals
- Shift workloads away from stressed or offline nodes
In effect, the network behaves like a virtual power plant (VPP)—a distributed system that can stabilize the grid while delivering compute.
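Span hasn’t detailed the control logic, but the battery-plus-compute behavior described above can be sketched as a node-level response to a demand-response event. Everything here is a hypothetical policy: cover excess draw from the battery first, then shift whatever workload remains to other nodes:

```python
def respond_to_dr_event(grid_limit_kw: float, compute_load_kw: float,
                        battery_kw: float) -> dict:
    """Hypothetical node policy during a utility demand-response window."""
    excess = max(0.0, compute_load_kw - grid_limit_kw)  # draw above the cap
    from_battery = min(excess, battery_kw)              # battery buffers first
    shifted = excess - from_battery                     # rest migrates elsewhere
    return {"draw_from_grid": compute_load_kw - excess,
            "draw_from_battery": from_battery,
            "workload_shifted_kw": shifted}

# A node running 12 kW of compute, capped at 5 kW of grid draw,
# with a battery able to discharge 5 kW:
print(respond_to_dr_event(grid_limit_kw=5.0, compute_load_kw=12.0, battery_kw=5.0))
# 7 kW of excess -> 5 kW from the battery, 2 kW of workload shifted to other nodes
```

Aggregated across thousands of nodes, this is exactly the behavior a VPP operator sells to utilities: predictable load reduction on demand.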
The U.S. Department of Energy estimates that scaling VPP capacity to 80–160 gigawatts by 2030 could reduce peak load by 10–20% and save roughly $10 billion annually in infrastructure costs. Span is aiming for gigawatt-scale XFRA deployment by 2027, positioning itself squarely within that opportunity.
The Desert Stress Test
The first real-world test arrives in Q3 2026, with a 100-home pilot planned in either Arizona or Nevada.
The location isn’t arbitrary. Desert summers regularly exceed 110°F, making the region about the harshest environment available for fanless, liquid-cooled systems. If XFRA maintains thermal stability there, its case for national expansion becomes significantly stronger.
The pilot will deliver around 1.25 megawatts of compute capacity, powered by approximately 1,600 GPUs. Small by hyperscale standards—but enough to validate performance, reliability, and economics.
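The pilot’s numbers line up with the per-node spec sheet above, as a quick check shows (the per-GPU wattage is a derived budget, not a published spec):

```python
# Consistency check of the pilot figures against the per-node hardware spec.
HOMES = 100
GPUS_PER_NODE = 16
PILOT_MW = 1.25

total_gpus = HOMES * GPUS_PER_NODE             # 1,600 GPUs, matching the article
kw_per_home = PILOT_MW * 1000 / HOMES          # 12.5 kW per node
w_per_gpu = PILOT_MW * 1_000_000 / total_gpus  # ~781 W budget per GPU slot,
                                               # including CPU/cooling overhead

print(total_gpus, kw_per_home, round(w_per_gpu))
```

Note that 12.5 kW per home is the same figure implied by the 100 MW / 8,000-home comparison earlier, so the pilot is a faithful miniature of the full deployment model.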
What Homeowners Get—and What They Risk
For homeowners, the value proposition is straightforward:
- Flat ~$150/month fee covering electricity and internet
- Potential for free utilities in high-demand zones
- No installation cost
Span retains ownership of the hardware, and homeowners carry no direct responsibility for the workloads.
But the trade-offs are real.
Cybersecurity firm Huntress has pointed out that distributing compute across residential environments significantly expands the attack surface compared to centralized data centers. Connectivity varies. Physical security is weaker. Regulatory frameworks—especially around insurance and liability—are still underdeveloped.
There’s also the social factor. Residential crypto mining booms have shown how quickly sentiment can turn when infrastructure moves into neighborhoods.
A Different Model for AI Infrastructure
XFRA challenges a core assumption: that AI infrastructure must be centralized, massive, and slow to build.
Instead, it proposes a model that is:
- Distributed
- Energy-aware
- Rapidly deployable
Backers and ecosystem partners—including Google, Tesla, and Carrier through the Utilize coalition—suggest that the idea is gaining serious traction.
The Real Test Isn’t Technical
Technically, XFRA is already viable. Early deployments with paying customers suggest the system works.
The real question is behavioral.
- Will homeowners accept AI infrastructure on their walls in exchange for cheaper utilities?
- Will regulators and insurers adapt fast enough to support the model?
- Will communities embrace—or reject—the idea of turning homes into nodes in a global compute network?
Because if the answer is yes, the next generation of AI infrastructure won’t live in remote, billion-dollar campuses.
It’ll live next door.