Passive two-phase cooling that cuts data center cooling energy by 50% with zero moving parts.
CALYOS builds passive two-phase cooling systems derived from satellite thermal management technology (Euro Heat Pipes heritage). Their systems use the latent heat of vaporization to transfer heat without pumps, compressors, or fans.
A working fluid evaporates at the heat source (chip/server), travels as vapor to a condenser (exterior), releases heat, and returns as liquid by gravity/capillary action. No electricity required for the cooling loop itself. The system is entirely sealed -- no water consumption, no refrigerant leaks, no maintenance.
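The physics above reduces to one relation, Q = m_dot * h_fg: heat moved equals mass flow times latent heat of vaporization. A minimal sketch with illustrative fluid properties (not CALYOS data) shows how little fluid circulation a large heat load actually needs:

```python
# Rough sketch (illustrative numbers, not CALYOS specifications): mass flow
# of working fluid needed to carry a heat load purely via latent heat.

def required_mass_flow(heat_load_w: float, h_fg_j_per_kg: float) -> float:
    """Mass flow rate in kg/s such that Q = m_dot * h_fg."""
    return heat_load_w / h_fg_j_per_kg

# Example: a 700 W accelerator, working fluid with h_fg ~ 180 kJ/kg (assumed,
# typical order of magnitude for a low-GWP refrigerant).
m_dot = required_mass_flow(700, 180e3)
print(f"{m_dot * 1000:.2f} g/s")  # ~3.89 g/s of vapor carries 700 W
```

A few grams per second of vapor is enough, which is why a sealed capillary loop with no pump can handle chip-scale heat loads.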
Zero moving parts (no pumps, fans, or compressors) means silent, vibration-free, maintenance-free operation. Up to 50% reduction in cooling energy vs. traditional CRAC/CRAH systems. Works in extreme conditions: dust, vibration, temperature extremes. Space heritage gives unmatched reliability data. Unlike immersion cooling, requires no special fluids or tank infrastructure.
TRL 7-8 -- Demonstrated at BEDEX 2026 for ruggedized/tactical military electronics. Commercial data center deployments in progress.
Cooling consumes 30-40% of total data center energy. With AI workloads pushing rack densities to 40-100 kW/rack, traditional air cooling is hitting physical limits. CALYOS eliminates the energy penalty of the heat-transport loop entirely, and their passive systems scale linearly with rack count, without the compounding inefficiencies of chilled-water plants.
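The PUE impact of halving cooling energy can be sketched with the brief's own percentages; the IT load and "other overhead" figures below are assumed for illustration:

```python
# PUE = total facility power / IT power. Illustrative scenario (assumed
# numbers): cooling starts at roughly a third of total facility energy,
# then the passive loop halves it.

def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
    return (it_kw + cooling_kw + other_kw) / it_kw

it, other = 1000.0, 80.0        # 1 MW IT load, 80 kW non-cooling overhead
cooling_before = 500.0          # cooling ~32% of total facility energy
cooling_after = cooling_before * 0.5

print(round(pue(it, cooling_before, other), 2))  # 1.58
print(round(pue(it, cooling_after, other), 2))   # 1.33
```

Under these assumptions the facility moves from a middling PUE of 1.58 to 1.33, which is the kind of step change the EU Energy Efficiency Directive rewards.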
Edge data centers (especially military/tactical, where silence and reliability are critical). High-density AI compute racks (40 kW+) where air cooling fails. Modular/containerized data centers for rapid deployment. Retrofit of existing facilities to increase rack density without expanding the cooling plant. Disaster recovery sites requiring maintenance-free operation.
Direct chip-to-outdoor heat rejection (no intermediate chilled water loop). Compatible with existing server form factors -- replaces heat sinks, not servers. Can integrate with dry coolers or natural convection radiators outside. No raised floor or plenum required. Pairs with any power source (grid, on-site generation, renewables).
50% reduction in cooling OPEX (electricity). Elimination of water consumption (critical in water-stressed regions -- saves ~1.8 L per kWh of IT load). Zero maintenance cost for the cooling loop. Higher rack density in the same footprint = better $/kW of deployed compute. Payback typically 2-3 years vs. traditional cooling CAPEX.
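A worked example of the payback claim. The electricity price, 40% cooling overhead, and CAPEX premium below are placeholder assumptions; the 50% saving and ~1.8 L/kWh water figure come from this brief:

```python
# Back-of-envelope economics for a 500 kW IT edge facility. Assumed:
# electricity at $0.12/kWh, baseline cooling overhead of 40% of IT energy,
# and a $250K incremental CAPEX vs. traditional cooling.

it_kw = 500
hours = 8760
it_kwh = it_kw * hours                    # annual IT energy

cooling_kwh_baseline = it_kwh * 0.40      # chilled-water plant overhead (assumed)
cooling_kwh_saved = cooling_kwh_baseline * 0.50   # 50% reduction (from brief)

elec_price = 0.12                         # $/kWh (assumed)
annual_savings = cooling_kwh_saved * elec_price

water_saved_l = it_kwh * 1.8              # evaporative water avoided (from brief)

capex_premium = 250_000                   # incremental CAPEX (assumed)
payback_years = capex_premium / annual_savings

print(f"${annual_savings:,.0f}/yr saved, {water_saved_l / 1e6:.1f} ML water, "
      f"payback {payback_years:.1f} yr")
```

With these placeholders the payback lands at roughly 2.4 years, consistent with the 2-3 year claim; at higher electricity prices it shortens further.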
Global data center cooling market: $23B by 2028 (growing at 13% CAGR). Liquid cooling specifically: $8.5B by 2028. Military/edge cooling: $2.1B segment growing at 18% CAGR driven by tactical compute requirements.
Traditional CRAC/CRAH air cooling (dominant but hitting limits above 15 kW/rack). Direct liquid cooling / cold plates (CoolIT, Asetek -- requires pumps and plumbing). Single-phase immersion (GRC, LiquidCool -- requires tank infrastructure). Two-phase immersion (Chemours Opteon, 3M Novec -- fluid costs $50-200/liter). Rear-door heat exchangers (Vertiv, Schneider -- incremental improvement).
CALYOS uniquely combines two-phase efficiency with zero moving parts. Immersion cooling competitors require complete server redesign and expensive fluids. Direct liquid cooling competitors (CoolIT, Asetek) still need pumps and CDUs. CALYOS is the only solution with space heritage reliability data. Their DIANA selection validates defense/military readiness that competitors lack.
AI compute CAPEX expected to exceed $200B annually by 2027. Every 1 MW of AI compute generates ~0.4 MW of cooling load. Grid queue delays (3-7 years) make cooling efficiency a capacity multiplier. EU Energy Efficiency Directive mandates PUE improvements. DoD modernization budget includes $2.8B for IT infrastructure upgrades.
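The "capacity multiplier" argument can be made concrete: under a fixed grid allocation, every watt not spent on cooling becomes deployable IT capacity. The 0.4 ratio is from this brief; the residual fan load for passive heat rejection (condenser-side dry coolers) is an assumption:

```python
# Fixed grid allocation: split between IT load and cooling load.
# IT capacity = grid / (1 + cooling-per-IT ratio).

grid_mw = 10.0
cooling_per_it = 0.40        # ~0.4 MW cooling per 1 MW compute (from brief)
passive_residual = 0.05      # assumed residual dry-cooler/fan load with CALYOS

it_traditional = grid_mw / (1 + cooling_per_it)
it_passive = grid_mw / (1 + passive_residual)

print(round(it_traditional, 2), round(it_passive, 2))        # 7.14 9.52
print(round(it_passive / it_traditional, 2))                 # 1.33
```

Under these assumptions the same 10 MW grid connection supports ~33% more compute, which is why cooling efficiency acts as a capacity multiplier in a 3-7 year interconnection queue.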
VP of Data Center Operations (cares about uptime, rack density, PUE). Chief Sustainability Officer (water and energy reduction metrics). VP of Engineering/Infrastructure (integration complexity, compatibility). Director of Edge/Tactical Computing (military buyers -- ruggedness, silence, field maintainability). Energy Procurement (reducing cooling as % of total power draw).
Equinix (largest colo operator, PUE reduction targets). Digital Realty (sustainability commitments, water reduction). QTS/Blackstone (aggressive build-out, AI-ready facilities). CoreWeave (GPU-dense AI compute, extreme cooling needs). Microsoft Azure / Google Cloud (hyperscale, custom cooling R&D). Military: DISA, Army Futures Command, SOCOM.
DCD-NY sessions on liquid cooling and sustainability are prime targets. Any panel featuring hyperscaler infrastructure leaders. Edge computing and modular data center tracks. Sustainability and ESG reporting sessions.
SOLARSTEAM (complementary -- CALYOS handles chip-level heat, SOLARSTEAM provides facility-level thermal management). Grengine (cyber-secure power + passive cooling = hardened edge compute package). Flatlight (optical interconnects + passive cooling = next-gen DC rack design). Novac (supercapacitor UPS + passive cooling = maintenance-free edge node).
Schneider Electric (building management integration). Vertiv (existing DC cooling vendor -- potential OEM channel). Equinix / Digital Realty (pilot deployment partners). Chemours (competitor in two-phase, but different approach -- could be ecosystem partner on fluid standards).
Bundle passive cooling with Airloom on-site wind power for a fully off-grid, maintenance-free edge data center. Bundle with Grengine cyber-secure batteries for military hardened compute nodes. Pair with Flatlight optical interconnects for a next-gen rack-level solution.
Autonomous edge data centers in unmaintained environments -- submarine cable landing stations, cell tower edge nodes, offshore platforms, military forward operating bases, and remote mining/oil sites. The zero-maintenance aspect is the killer feature: most edge DC cooling fails because there's nobody to service it. CALYOS systems have no moving parts to fail, no pumps to replace, no coolant to top off. You deploy and forget for 15+ years. The second creative angle: gravity-independent cooling for shipboard and submarine data centers. Traditional liquid cooling relies on gravity for flow; CALYOS capillary pumping works in any orientation, including inverted -- critical for naval vessels that roll and pitch. No other cooling company can make this claim with the same reliability heritage: their systems descend from spaceflight hardware that operates in microgravity.
Edge DC operators spend $15-40K/year per site on cooling maintenance (truck rolls, pump replacements, coolant changes). A 1,000-site edge network saves $15-40M/year in maintenance alone. For naval/shipboard DCs, no competing technology works reliably in all orientations without active pumping. Insurance costs drop substantially when both water and pump failure modes are eliminated from the cooling system.
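The fleet savings are worth spelling out, using the brief's own per-site range:

```python
# Fleet-level maintenance arithmetic: per-site range from the brief,
# scaled to a 1,000-site edge network.

sites = 1000
low, high = 15_000, 40_000   # $/year per site (from brief)

print(f"${sites * low / 1e6:.0f}M-${sites * high / 1e6:.0f}M per year avoided")
```

At these figures, avoided maintenance alone can exceed the cooling CAPEX of the fleet within a few years, independent of any electricity savings.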
The loop heat pipe capillary wick creates a pressure differential that drives fluid circulation without any mechanical pump. The evaporator can be mounted directly on the chip (junction-to-case), and the condenser can be located meters away connected by simple tubing. The dielectric working fluid (likely R1234yf or similar low-GWP refrigerant) means a leak doesn't destroy electronics — it just evaporates harmlessly. The system is self-regulating: as heat load increases, evaporation rate increases, driving faster circulation automatically. This passive feedback loop means the cooling system inherently tracks GPU utilization without any control system.
Partner with Vapor IO (edge DC infrastructure), Flexential (edge colocation), or Schneider Electric's micro-DC division. At DCD-NY, look for edge infrastructure exhibitors and naval/defense DC builders like Curtiss-Wright Defense Solutions.
Zero-maintenance cooling that lets you deploy high-density compute anywhere there's no HVAC technician — and forget about it for 15 years.