Joey Eamigh deb1cdc527
spec: add Phase 6 post-review enhancements
Data coverage expansion (80-120 DCs, power plants, SE grid regions,
2yr backfill), map UX overhaul (dark mode, legend, floating labels,
Chapel Hill center), chart fixes, GPU calculator update, granular
Suspense boundaries for navigation performance.
2026-02-11 14:37:43 -05:00

Energy & AI: The Datacenter Power Crunch

An interactive dashboard visualizing how AI datacenter buildout is driving regional electricity demand and energy prices across the United States.

Why This Matters

The AI boom isn't just a software story — it's an energy story. Training a single frontier model can consume as much electricity as a small town uses in a year. And we're not talking about one model: every major tech company is racing to build out GPU clusters at unprecedented scale.

This is creating a tectonic shift in US energy markets:

  • Dominion Energy (Virginia/PJM) has seen datacenter load applications surge 10x since 2020. Northern Virginia alone hosts ~70% of the world's internet traffic and is adding gigawatts of new AI load.
  • ERCOT (Texas) is fielding datacenter interconnection requests totaling more than the entire current grid capacity of some US states.
  • Natural gas prices are being pushed up because gas-fired power plants are the marginal generator in most US regions — when demand spikes, gas sets the price.
  • Grid reliability is under threat: regions that were comfortably supplied five years ago are now facing capacity shortfalls, triggering emergency demand response events.

The people making billion-dollar decisions about this — energy investors, utility planners, datacenter operators, commodity traders — need real-time visibility into where demand is concentrating, how prices are responding, and which regions are approaching their limits. That's what this dashboard provides.

Business Case

The problem: Energy market data is scattered across dozens of sources (EIA, ISOs, FRED, commodity exchanges) with inconsistent formats, no geospatial context, and no connection to the AI infrastructure buildout driving the changes. Analysts spend hours stitching together spreadsheets to answer basic questions like "how have electricity prices changed in regions with heavy datacenter buildout?"

The solution: A single, real-time dashboard that overlays datacenter locations on energy market data, making the AI-energy nexus immediately visible and explorable.

Target audience: Energy investors evaluating utility stocks and commodity positions. Utility analysts planning generation and transmission investments. Datacenter site selectors choosing where to build next. Business strategists assessing the infrastructure costs underlying AI.

Monetization angle: Freemium model. Free tier provides the dashboard with real-time data. Premium tier adds predictive analytics (price forecasting, capacity constraint alerts), custom region comparisons, CSV/API data export, and email alerting for price spike events. Enterprise tier provides embeddable widgets for trading desks and analyst reports.

Tech Stack

Layer Technology Version Why
Framework Next.js (App Router) 16 Turbopack default, "use cache" directive, React 19.2 — the most capable React framework available
Styling Tailwind CSS 4 CSS-first config, zero-runtime, pairs perfectly with shadcn/ui
Components shadcn/ui latest Copy-paste components, not a dependency — full control, great defaults, built-in chart components (Recharts wrappers)
Maps @vis.gl/react-google-maps latest Google's own recommended React library for Maps. Declarative, hooks-based, AdvancedMarker support
Database PostgreSQL + PostGIS 18 + 3.5 PostGIS is the gold standard for geospatial queries — ST_DWithin, ST_Distance, polygon containment, all in SQL
ORM Prisma (with TypedSQL) 7.x TypedSQL lets us write .sql files for PostGIS queries and get fully typed TypeScript wrappers. Best of both worlds: Prisma for CRUD, raw SQL for geo
Runtime / PM Bun latest Fastest JS runtime, built-in TS support, great package manager
Charts Recharts (via shadcn/ui) latest shadcn/ui's chart components wrap Recharts with consistent theming — no extra config
Serialization superjson latest Preserves Date, BigInt, Map, Set across server→client boundary. Without it, every timestamp silently becomes a string while TypeScript still calls it Date
Containerization Docker Compose - One command to spin up PostGIS. Reproducible dev environment

Type Safety Strategy (E2E)

External APIs like EIA and FRED return untyped JSON. A single bad response shape can cascade into runtime crashes or — worse — silently wrong data on a dashboard people might make investment decisions from. Every data boundary is typed and validated:

External APIs → Zod schemas → Server (validated, typed)
Database → Prisma generated types → Server
PostGIS queries → TypedSQL (.sql files) → Typed wrappers
Server → Client: superjson serialize/deserialize (preserves Date, BigInt, etc.)
              → Next.js Server Actions + Server Components (typed props)
Forms → Zod + react-hook-form (validated inputs)

No any. No unvalidated API responses. If the EIA changes their response format, we get a Zod parse error at ingestion time, not a broken chart at render time.
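The project uses Zod for this, but the fail-fast principle can be illustrated with a dependency-free sketch. The response shape and field names below are assumptions for illustration, not the actual EIA schema:

```typescript
// Hypothetical shape of one EIA-style hourly price row -- illustrative only;
// the real project would define this with Zod in src/lib/api/eia.ts.
interface EiaPriceRow {
  period: string;      // e.g. "2026-01-15T14"
  respondent: string;  // balancing authority code
  value: number;       // price in $/MWh
}

// Fail fast at the ingestion boundary: reject the whole payload if any
// row is malformed, instead of letting bad data reach the charts.
function parseEiaRows(json: unknown): EiaPriceRow[] {
  if (typeof json !== "object" || json === null || !Array.isArray((json as { data?: unknown }).data)) {
    throw new Error("EIA response: expected { data: [...] }");
  }
  return (json as { data: unknown[] }).data.map((row, i) => {
    const r = row as Record<string, unknown>;
    if (typeof r.period !== "string" || typeof r.respondent !== "string" || typeof r.value !== "number") {
      throw new Error(`EIA response: bad row shape at index ${i}`);
    }
    return { period: r.period, respondent: r.respondent, value: r.value };
  });
}
```

Zod replaces this boilerplate with a declarative schema and gives the same guarantee: a format change surfaces as a parse error at ingestion, not a broken chart.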

Why superjson?

React Server Components and Server Actions serialize data using JSON.stringify under the hood. This silently destroys rich types:

  • Date → string (but TypeScript still says Date)
  • Map → {}
  • Set → {}
  • BigInt → throws an error
  • undefined → omitted

For a dashboard where almost every data point has a TIMESTAMPTZ, this is catastrophic. You think you have a Date you can call .getTime() on, but you actually have "2026-01-15T14:30:00.000Z" and your code silently breaks or produces wrong calculations.

superjson wraps serialization to preserve these types. Every Server Action and every Server Component that passes temporal or complex data to client components must use superjson at the boundary. Create a shared utility (src/lib/superjson.ts) that standardizes this across the codebase.
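The problem and the fix can be demonstrated without any dependency. This sketch shows what a plain JSON round-trip does to a Date, and a minimal replacer/reviver pair illustrating the idea superjson generalizes (superjson actually annotates values with type metadata rather than pattern-matching strings, and also handles BigInt, Map, Set, and undefined):

```typescript
// What JSON.stringify does to a Date (the default serialization behavior):
const row = { region: "ERCOT", timestamp: new Date("2026-01-15T14:30:00.000Z") };
const lost = JSON.parse(JSON.stringify(row));
// lost.timestamp is now the STRING "2026-01-15T14:30:00.000Z" -- calling
// .getTime() on it would throw, while TypeScript still believes it is a Date.

// Minimal illustration of the fix. Real superjson tags types explicitly
// instead of guessing from string shape, which is more robust than this.
const ISO = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}Z$/;
function serialize(value: unknown): string {
  return JSON.stringify(value);
}
function deserialize<T>(text: string): T {
  return JSON.parse(text, (_key, v) =>
    typeof v === "string" && ISO.test(v) ? new Date(v) : v
  ) as T;
}

const restored = deserialize<typeof row>(serialize(row));
// restored.timestamp is a real Date again
```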

Data Sources

We use two categories of data: real-time feeds from government APIs (free, reliable, well-documented) and curated seed data for the geospatial layer.

Real-Time (API-driven)

Source Data Granularity Update Frequency Why This Source
EIA API Regional electricity prices, demand, generation mix Hourly, by balancing authority Hourly The definitive US energy data source. Free, 9k req/hr, decades of history. Covers all ISO/RTO regions.
EIA API Natural gas (Henry Hub), WTI crude, coal spot prices Daily/weekly Daily Gas prices directly drive electricity prices (gas is the marginal fuel). Oil/coal provide macro context.
FRED API Historical commodity price time series Daily Daily Clean, reliable time series going back to the 1940s. Perfect for long-run trend analysis.

Static / Seed Data

This data requires upfront research and curation. A Researcher agent should compile these datasets during Phase 1, producing clean JSON/GeoJSON files that the seed script can ingest.

Source Data Format Why
DataCenterMap / manual curation Datacenter locations (lat/lng, operator, capacity MW, year opened) GeoJSON FeatureCollection of Points The core geospatial layer. PostGIS Point geometries enable spatial queries (nearby DCs, DCs in region, clustering). Needs real research — scrape DataCenterMap.com, cross-reference with public announcements for major hyperscaler facilities.
EIA / ISO boundaries Grid region polygons (PJM, ERCOT, CAISO, NYISO, ISONE, MISO, SPP) GeoJSON FeatureCollection of MultiPolygons PostGIS MultiPolygon geometries enable the price heatmap overlay and spatial joins between DCs and regions. EIA and HIFLD provide these boundary files.
AI milestones Timeline of major AI announcements with dates and descriptions JSON array Chart annotations that tell the story — "here's when ChatGPT launched, here's when prices started climbing." Turns data into narrative.

Database Schema (PostgreSQL + PostGIS)

┌──────────────────────────┐     ┌──────────────────────────┐
│ datacenters              │     │ grid_regions             │
├──────────────────────────┤     ├──────────────────────────┤
│ id           UUID PK     │     │ id           UUID PK     │
│ name         TEXT        │     │ name         TEXT        │
│ operator     TEXT        │     │ code         TEXT        │  (e.g. "PJM", "ERCOT")
│ location     GEOGRAPHY   │     │ boundary     GEOGRAPHY   │  (MultiPolygon)
│   (Point, 4326)          │     │ iso          TEXT        │
│ capacity_mw  FLOAT       │     │ created_at   TIMESTAMPTZ │
│ status       TEXT        │     └──────────────────────────┘
│ year_opened  INT         │                  ▲
│ region_id    UUID FK     │──────────────────┘
│ created_at   TIMESTAMPTZ │
└──────────────────────────┘

┌──────────────────────────┐     ┌──────────────────────────┐
│ electricity_prices       │     │ commodity_prices         │
├──────────────────────────┤     ├──────────────────────────┤
│ id           UUID PK     │     │ id           UUID PK     │
│ region_id    UUID FK     │     │ commodity    TEXT        │  (natural_gas, wti_crude, coal)
│ price_mwh    FLOAT       │     │ price        FLOAT       │
│ demand_mw    FLOAT       │     │ unit         TEXT        │
│ timestamp    TIMESTAMPTZ │     │ timestamp    TIMESTAMPTZ │
│ source       TEXT        │     │ source       TEXT        │
└──────────────────────────┘     └──────────────────────────┘

┌──────────────────────────┐
│ generation_mix           │
├──────────────────────────┤
│ id           UUID PK     │
│ region_id    UUID FK     │
│ fuel_type    TEXT        │  (gas, nuclear, wind, solar, coal, hydro)
│ generation_mw FLOAT      │
│ timestamp    TIMESTAMPTZ │
└──────────────────────────┘
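In Prisma, the PostGIS geography columns cannot be expressed with a native scalar type. A sketch of how the datacenters table above might look in schema.prisma — field mappings here are illustrative, not the project's actual schema file:

```prisma
// Sketch only: Prisma has no native PostGIS type, so the geography column
// is declared with Unsupported(...) and is read/written via TypedSQL.
model Datacenter {
  id         String   @id @default(uuid()) @db.Uuid
  name       String
  operator   String
  location   Unsupported("geography(Point, 4326)")
  capacityMw Float    @map("capacity_mw")
  status     String
  yearOpened Int      @map("year_opened")
  regionId   String   @map("region_id") @db.Uuid
  createdAt  DateTime @default(now()) @map("created_at") @db.Timestamptz()

  @@map("datacenters")
}
```

This is exactly why TypedSQL earns its place in the stack: Prisma's generated client cannot query an `Unsupported` column, but a .sql file with `ST_AsGeoJSON` can.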

TypedSQL Queries (prisma/sql/)

Examples of PostGIS queries that get typed wrappers:

findDatacentersInRegion.sql

-- @param {String} $1:regionCode
SELECT
  d.id, d.name, d.operator, d.capacity_mw, d.status, d.year_opened,
  ST_AsGeoJSON(d.location)::TEXT as location_geojson
FROM datacenters d
JOIN grid_regions r ON d.region_id = r.id
WHERE r.code = $1
ORDER BY d.capacity_mw DESC

findNearbyDatacenters.sql

-- @param {Float} $1:lat
-- @param {Float} $2:lng
-- @param {Float} $3:radiusKm
SELECT
  d.id, d.name, d.operator, d.capacity_mw,
  ST_AsGeoJSON(d.location)::TEXT as location_geojson,
  ST_Distance(d.location, ST_MakePoint($2, $1)::geography) / 1000 as distance_km
FROM datacenters d
WHERE ST_DWithin(d.location, ST_MakePoint($2, $1)::geography, $3 * 1000)
ORDER BY distance_km

getRegionPriceHeatmap.sql

-- Aggregate prices and datacenters separately: a single three-way join
-- would fan out rows (each DC repeated per price row), inflating SUM/AVG.
SELECT
  r.code, r.name,
  ST_AsGeoJSON(r.boundary)::TEXT as boundary_geojson,
  p.avg_price, p.max_price, p.avg_demand,
  d.datacenter_count, d.total_dc_capacity_mw
FROM grid_regions r
LEFT JOIN LATERAL (
  SELECT AVG(ep.price_mwh) as avg_price,
         MAX(ep.price_mwh) as max_price,
         AVG(ep.demand_mw) as avg_demand
  FROM electricity_prices ep
  WHERE ep.region_id = r.id
    AND ep.timestamp > NOW() - INTERVAL '24 hours'
) p ON TRUE
LEFT JOIN LATERAL (
  SELECT COUNT(*)::INT as datacenter_count,
         COALESCE(SUM(dc.capacity_mw), 0) as total_dc_capacity_mw
  FROM datacenters dc
  WHERE dc.region_id = r.id
) d ON TRUE

Real-Time Candy

These features make the dashboard feel alive — like a trading floor terminal, not a static report.

Live Price Ticker Tape

A scrolling horizontal banner across the top of every page showing current regional electricity prices and commodity spot prices — styled like a financial news ticker. Green/red coloring for price direction. Always visible, always updating.

Animated Number Transitions

Hero metrics (avg electricity price, gas spot, total DC capacity) use smooth count-up/count-down animations when data updates. Numbers don't just appear — they roll to the new value. Uses framer-motion animate with spring physics.

Pulsing Map Markers

Datacenter markers on the Google Map emit a soft radial pulse animation when their region's electricity price exceeds its 30-day average. The faster the pulse, the bigger the price deviation. A calm map means stable prices; a map full of pulsing dots means something interesting is happening.

GPU Cost Calculator (Live)

A sticky widget: "Running 1,000 H100 GPUs right now costs $X,XXX/hr in Virginia vs $Y,YYY/hr in Texas." Updates with live regional prices. Users can adjust GPU count with a slider. Makes the abstract price data immediately tangible — this is what it actually costs to train AI right now.
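The underlying arithmetic is simple enough to sketch. The per-GPU wattage and PUE figures below are assumptions for illustration; the live $/MWh input comes from the regional price data:

```typescript
// Hypothetical cost model for the calculator widget. TDP figures are
// nameplate assumptions, not measured draw.
const GPU_WATTS: Record<string, number> = {
  h100_sxm: 700,
  b200: 1000,
};

function gpuCostPerHour(
  gpuCount: number,
  wattsPerGpu: number,
  pue: number,          // datacenter power usage effectiveness, e.g. ~1.3
  pricePerMwh: number,  // live regional electricity price in $/MWh
): number {
  const mw = (gpuCount * wattsPerGpu * pue) / 1_000_000; // watts -> MW
  return mw * pricePerMwh; // MW * $/MWh * 1 hour = $
}

// e.g. 1,000 H100s at $50/MWh with PUE 1.3 -> 0.91 MW -> $45.50/hr
```

Running the same cluster against two regions' live prices yields the Virginia-vs-Texas comparison the widget displays.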

Grid Stress Gauges

Radial gauge components per region showing current demand / peak capacity as a percentage. Styled like a speedometer — green zone, yellow zone, red zone. When a region creeps past 85% capacity utilization, the gauge glows red. Immediate visual signal for grid stress.

Price Spike Toasts

sonner toast notifications that pop up when any region's price crosses a configurable threshold (e.g., >$100/MWh) or hits a new 30-day high. Persistent in the bottom-right corner. Gives that "breaking news" feeling.

Auto-Refresh with Countdown

A subtle countdown timer in the nav: "Next refresh in 47s". Data auto-refreshes on a configurable interval (default 60s). Uses Next.js router.refresh() to re-run server components without a full page reload. The countdown itself is a client component with requestAnimationFrame for smooth ticking.

Ambient Region Glow

On the map, grid region polygons don't just use static fill colors — they have a subtle CSS animation that "breathes" (opacity oscillation). Higher-priced regions breathe faster and brighter. The map looks alive at a glance.

Pages & Views

1. Dashboard Home (/)

  • Hero metrics: Live national avg electricity price, natural gas spot, total DC capacity
  • Price change sparklines: 24h/7d/30d trends for key indicators
  • Recent alerts: Notable price spikes or demand records
  • Quick map preview: Thumbnail of the full map with DC hotspots

2. Interactive Map (/map)

  • Google Maps with terrain dark styling (Map ID configured in Google Cloud Console, stored in .env as NEXT_PUBLIC_GOOGLE_MAP_ID)
  • Datacenter markers: Clustered markers sized by capacity (MW), colored by operator
  • Regional overlays: Grid region polygons colored by current electricity price (heatmap)
  • Click interactions: Click a region to see detail panel (prices, demand, generation mix, DC list)
  • Click a datacenter: See operator, capacity, year opened, regional context
  • Filters: By operator, capacity range, region, time period

3. Price Trends (/trends)

  • Multi-line charts: Regional electricity prices over time (selectable regions)
  • Commodity overlay: Natural gas / crude oil prices on secondary axis
  • AI milestone annotations: Vertical markers for ChatGPT launch, major cluster announcements
  • Correlation view: Scatter plot of DC capacity vs regional price
  • Time range selector: 1M, 3M, 6M, 1Y, ALL

4. Demand Analysis (/demand)

  • Regional demand growth: Bar/line charts showing demand trends by ISO region
  • Peak demand tracking: Historical peak demand records
  • Forecast overlay: EIA demand forecasts where available
  • DC impact estimation: Estimated datacenter load as percentage of regional demand

5. Generation Mix (/generation)

  • Stacked area charts: Generation by fuel type per region over time
  • Renewable vs fossil split: How DC-heavy regions compare
  • Carbon intensity proxy: Generation mix as indicator of grid cleanliness

Project Structure

bonus4/
├── docker-compose.yml
├── .env                          # API keys (EIA, FRED, Google Maps)
├── .prettierrc.js                # (existing)
├── tsconfig.json                 # (existing)
├── eslint.config.js              # (existing)
├── next.config.ts
├── package.json
├── prisma/
│   ├── schema.prisma
│   ├── migrations/
│   ├── seed.ts                   # Datacenter locations, region boundaries, AI milestones
│   └── sql/                      # TypedSQL queries
│       ├── findDatacentersInRegion.sql
│       ├── findNearbyDatacenters.sql
│       ├── getRegionPriceHeatmap.sql
│       ├── getLatestPrices.sql
│       ├── getPriceTrends.sql
│       ├── getDemandByRegion.sql
│       └── getGenerationMix.sql
├── src/
│   ├── app/
│   │   ├── layout.tsx
│   │   ├── page.tsx              # Dashboard home
│   │   ├── map/
│   │   │   └── page.tsx
│   │   ├── trends/
│   │   │   └── page.tsx
│   │   ├── demand/
│   │   │   └── page.tsx
│   │   ├── generation/
│   │   │   └── page.tsx
│   │   └── api/
│   │       └── ingest/           # Data ingestion endpoints (cron-triggered)
│   │           ├── electricity/route.ts
│   │           ├── commodities/route.ts
│   │           └── generation/route.ts
│   ├── components/
│   │   ├── ui/                   # shadcn/ui components
│   │   ├── map/
│   │   │   ├── energy-map.tsx    # Main Google Maps component
│   │   │   ├── datacenter-marker.tsx
│   │   │   ├── region-overlay.tsx
│   │   │   └── map-controls.tsx
│   │   ├── charts/
│   │   │   ├── price-chart.tsx
│   │   │   ├── demand-chart.tsx
│   │   │   ├── generation-chart.tsx
│   │   │   └── sparkline.tsx
│   │   ├── dashboard/
│   │   │   ├── metric-card.tsx
│   │   │   └── alerts-feed.tsx
│   │   └── layout/
│   │       ├── nav.tsx
│   │       └── footer.tsx
│   ├── lib/
│   │   ├── db.ts                 # Prisma client singleton
│   │   ├── api/
│   │   │   ├── eia.ts            # EIA API client + Zod schemas
│   │   │   └── fred.ts           # FRED API client + Zod schemas
│   │   ├── schemas/              # Shared Zod schemas
│   │   │   ├── electricity.ts
│   │   │   ├── commodities.ts
│   │   │   └── geo.ts
│   │   └── utils.ts
│   ├── actions/                  # Server Actions (typed server→client boundary)
│   │   ├── prices.ts
│   │   ├── datacenters.ts
│   │   ├── demand.ts
│   │   └── generation.ts
│   └── types/
│       └── index.ts              # Shared type definitions
├── scripts/
│   └── backfill.ts               # One-time historical data backfill (idempotent)
├── data/                            # Curated seed data files
│   ├── datacenters.geojson       # Datacenter locations
│   ├── grid-regions.geojson      # ISO/RTO boundary polygons
│   └── ai-milestones.json        # Timeline of major AI announcements
├── Assignment.md
├── CLAUDE.md
└── SPEC.md

Docker Compose

services:
  db:
    image: postgis/postgis:18-3.5
    ports:
      - "5433:5432"
    environment:
      POSTGRES_DB: energy_dashboard
      POSTGRES_USER: energy
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:

Port 5433 externally (5432 is occupied by another project).

Timezone Strategy

All timestamps are stored and processed in UTC. Period.

  • Database: All TIMESTAMPTZ columns store UTC. PostgreSQL handles this natively.
  • EIA API: Returns UTC — no conversion needed at ingestion.
  • FRED API: Returns date-only strings (no timezone) — treat as UTC midnight.
  • Display: Convert to the relevant market's local time only at render time. ERCOT → Central, PJM → Eastern, CAISO → Pacific. Use a shared formatMarketTime(utcDate, regionCode) utility.
  • Charts: X-axis labels show market-local time with timezone abbreviation (e.g., "2:00 PM CT").

Energy markets care deeply about local time — a 3pm demand spike in Texas means something different than 3pm UTC. But internal consistency requires UTC everywhere except the final display layer.
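A minimal sketch of the formatMarketTime utility, using the built-in Intl API. The region-to-IANA-timezone mapping is an assumption for illustration (note Intl produces "CST"/"CDT" rather than the bare "CT" shown above):

```typescript
// Sketch of src/lib formatMarketTime -- region mapping is illustrative.
const MARKET_TZ: Record<string, string> = {
  ERCOT: "America/Chicago",     // Central
  PJM: "America/New_York",      // Eastern
  CAISO: "America/Los_Angeles", // Pacific
};

function formatMarketTime(utcDate: Date, regionCode: string): string {
  const timeZone = MARKET_TZ[regionCode] ?? "UTC";
  return new Intl.DateTimeFormat("en-US", {
    hour: "numeric",
    minute: "2-digit",
    timeZone,
    timeZoneName: "short", // renders "CST"/"CDT" etc.
  }).format(utcDate);
}
```

For example, 20:00 UTC on a January date renders as "2:00 PM CST" for ERCOT and "3:00 PM EST" for PJM — same instant, market-local display.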

Data Ingestion Strategy

The dashboard needs to feel live without hammering external APIs. The strategy is: fetch once, cache in Postgres, serve from cache, refresh on a schedule.

  1. On-demand + cache: Server actions check Postgres first. If the latest cached data is within the TTL window (e.g., 30 min for electricity, 6 hours for commodities), serve from cache. Otherwise, fetch fresh from EIA/FRED, validate with Zod, upsert into Postgres, and return.
  2. API route ingestion: /api/ingest/* routes provide a manual trigger and a cron-compatible endpoint for bulk data pulls. Useful for backfilling historical data and for production scheduled ingestion.
  3. Seed data: Datacenter locations and grid region boundaries are relatively static. Loaded via prisma db seed from curated JSON/GeoJSON files.
  4. Rate limit awareness: EIA allows ~9k req/hr (generous), FRED allows 120/min. With caching, we'll typically make <100 EIA requests/hour even under heavy use. The real bottleneck is EIA's 5,000-row-per-query limit — pagination handled in the API client.
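The serve-from-cache decision in step 1 reduces to a single freshness check. A minimal sketch — the TTL constants mirror the examples above, and the function names are illustrative:

```typescript
// Per-data-type TTLs from the strategy above (minutes).
const TTL_MINUTES = { electricity: 30, commodities: 360 } as const;

// Serve from Postgres if the newest cached row is younger than the TTL;
// otherwise fall through to a fresh EIA/FRED fetch + upsert.
function isFresh(
  lastFetchedAt: Date | null,
  ttlMinutes: number,
  now: Date = new Date(),
): boolean {
  if (lastFetchedAt === null) return false; // nothing cached yet
  return now.getTime() - lastFetchedAt.getTime() < ttlMinutes * 60_000;
}
```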

Historical Backfill

On first run, the database is empty and the charts have nothing to show. The ingestion routes must support a backfill mode that pulls 6-12 months of historical data:

  • EIA bulk download: Use the EIA's bulk data endpoints (no API key required for bulk CSVs) to pull historical electricity prices, demand, and generation mix. Parse and upsert into Postgres. This is a one-time operation run via a script, not on every startup.
  • FRED historical: FRED supports date range queries — pull the full series in one request per commodity.
  • Backfill script: scripts/backfill.ts — a standalone script that populates historical data. Run once after the initial migration. Should be idempotent (safe to re-run).
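Idempotency in the upsert path is a natural fit for Postgres `ON CONFLICT`. A sketch for electricity prices — this assumes a `UNIQUE (region_id, timestamp, source)` constraint, which the schema above does not explicitly declare:

```sql
-- Hypothetical upsert making the backfill safe to re-run; assumes a
-- UNIQUE (region_id, timestamp, source) constraint on electricity_prices.
INSERT INTO electricity_prices (id, region_id, price_mwh, demand_mw, timestamp, source)
VALUES (gen_random_uuid(), $1, $2, $3, $4, $5)
ON CONFLICT (region_id, timestamp, source)
DO UPDATE SET price_mwh = EXCLUDED.price_mwh,
              demand_mw = EXCLUDED.demand_mw;
```

Re-running the script then overwrites existing rows with identical data instead of duplicating them.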

Testing Strategy

No unit tests. Unit tests in a project like this devolve into testing constants and mocking everything — they verify nothing useful and waste time. Instead:

Integration Tests

  • Test the real data pipeline: EIA API → Zod validation → Postgres upsert → query → typed result
  • Test TypedSQL queries against an actual PostGIS database (use the Docker Compose instance)
  • Test Server Actions return correctly shaped data with superjson serialization intact
  • Use bun test with the built-in test runner

E2E Tests

  • Test that each page renders without crashing
  • Test that the map loads with markers
  • Test that charts render with data
  • Test navigation between pages
  • Use Playwright (or agent-browser for manual verification during development)

Smoke Test (minimum bar)

At the very least, every page must:

  1. Render without a React error boundary firing
  2. Show real data (not just loading spinners)
  3. Have no TypeScript errors (bunx tsc --noEmit)
  4. Have no lint errors (bun run lint)

The Reviewer and Tester agents enforce this on every task. This is the real test suite — humans (and agents) verifying actual behavior, not asserting that true === true.

Implementation Phases

Phase 1: Foundation

  • Scaffold Next.js 16 project (in /tmp, then copy into bonus4 dir)
  • Integrate existing prettier/eslint/tsconfig — install their deps, verify they work
  • Install and configure shadcn/ui + Tailwind 4 (dark theme)
  • Docker Compose for PostgreSQL 18 + PostGIS 3.5
  • Prisma schema + initial migration (with CREATE EXTENSION postgis)
  • Research & curate seed data (parallelizable with the above):
    • Datacenter locations GeoJSON (DataCenterMap.com, public announcements)
    • ISO/RTO boundary polygons GeoJSON (EIA, HIFLD)
    • AI milestones timeline JSON
  • Seed script (prisma db seed) to load curated data
  • .gitignore (node_modules, .next, .env, pgdata, etc.)
  • git init + initial commit
  • Verify: project builds, lints clean, database has seed data, dev server starts

Phase 2: Data Layer

  • EIA API client with Zod validation (electricity prices, demand, generation)
  • EIA commodity client (natural gas, oil, coal)
  • FRED API client with Zod validation
  • Historical backfill script (scripts/backfill.ts) — bulk download 6-12 months, idempotent
  • TypedSQL queries for all geospatial operations
  • Server Actions with superjson for typed server→client data passing
  • Ingestion API routes (/api/ingest/*)
  • Integration test: full pipeline from API fetch → Postgres → Server Action → typed result

Phase 3: Dashboard UI

  • App layout (nav with ticker tape slot, sidebar, footer with disclaimer)
  • Dashboard home with metric cards + animated number transitions
  • Google Maps integration (terrain dark Map ID, centered on US)
  • Datacenter markers (AdvancedMarker, clustered, sized by capacity)
  • Region polygon overlays with price-based fill coloring
  • Click interactions (region detail panel, DC detail panel)
  • E2E smoke test: every page renders, map loads, markers visible

Phase 4: Charts & Analysis

  • Price trend charts (Recharts via shadcn/ui, multi-line, time range selector)
  • Commodity overlay (natural gas / oil on secondary axis)
  • Demand analysis views (regional growth, peak tracking)
  • Generation mix charts (stacked area by fuel type)
  • AI milestone annotations (vertical markers on charts)
  • Correlation view (scatter: DC capacity vs regional price)

Phase 5: Polish & Real-Time Candy

  • Live price ticker tape (scrolling banner, every page)
  • Pulsing map markers (price deviation triggers pulse)
  • GPU cost calculator widget (live, slider for GPU count)
  • Grid stress gauges (radial, per region)
  • Price spike toasts (sonner notifications)
  • Auto-refresh with countdown timer
  • Ambient region glow (breathing animation)
  • Responsive design (desktop + tablet)
  • Loading states (loading.tsx) + error boundaries (error.tsx)
  • Disclaimer footer (educational/informational purposes, not financial advice)
  • README with installation and setup docs

Phase 6: Post-Review Enhancements

Based on comprehensive review and user feedback, the following enhancements address data coverage gaps, map UX improvements, chart fixes, and navigation performance.

6.1 Data Coverage Expansion

Datacenter Inventory (Target: 80-120 facilities)

The current 38 datacenters are insufficient. Expand to comprehensive US coverage:

  • North Carolina (CRITICAL — this is a UNC Chapel Hill project): Add 10+ facilities including Apple Maiden, Google Lenoir, Microsoft Catawba County, AWS Richmond County ($10B campus), Digital Realty Charlotte, Compass Durham, Compass Statesville, T5 Kings Mountain, DartPoints Asheville, American Tower Raleigh.
  • Missing hyperscaler campuses: Meta has 20+ US campuses (we have 5), Google has 23 (we have 5), AWS has hundreds (we have 5), Microsoft has 34+ (we have 6). Add at least the major 500+ MW campuses.
  • Missing operators: Apple (5 US campuses), Cloudflare (37+ US edge locations), Switch (Las Vegas, Reno), Vantage (NoVA, Santa Clara), Compass, Stack Infrastructure, EdgeCore, T5.
  • Geographic gaps: Fill Southeast (SC, TN, FL, KY, AL), Northwest (WA, ID), and Midwest (NE, WI, MN).
  • Sources: Meta datacenter page (atmeta.com), Google datacenter page, DataCenterMap.com, Baxtel.com, press releases for 2024-2025 announcements, IM3 Open Source Data Center Atlas (DOE/PNNL).

Power Plant Data (NEW)

Add US power plants to the map from the EIA US Energy Atlas:

  • Source: https://atlas.eia.gov/datasets/eia::power-plants/about — GeoJSON download
  • Filter: Plants >= 50 MW nameplate capacity
  • Fields: name, operator, fuel type, capacity MW, latitude, longitude, state
  • Schema: New power_plants table in Prisma schema with location geography column
  • Map display: Smaller markers (distinct from datacenters) colored by fuel type, with a different shape (e.g., diamond or triangle)
  • Seed: Download, filter, and load into data/power-plants.geojson, add to seed script

Southeast Grid Regions (NEW)

The Southeast US has no ISO/RTO — it's served by vertically integrated utilities. Add these as grid regions:

  • DUKE (Duke Energy Carolinas + Progress): NC/SC focus, EIA respondent codes DUK and CPLE
  • SOCO (Southern Company): GA/AL, EIA respondent code SOCO
  • TVA (Tennessee Valley Authority): TN and surrounding states, EIA respondent code TVA
  • Requires approximate boundary polygons in data/grid-regions.geojson
  • Add state-to-region mappings for retail price data
  • Add to backfill respondent code list

Historical Data Depth

Extend the backfill window from 6 months to 2 years:

  • EIA hourly demand data is available back to July 2015
  • EIA monthly retail prices back to January 2001
  • FRED commodity prices back to the 1990s
  • 2 years captures the full AI boom narrative (GPT-3 June 2020, ChatGPT Nov 2022, GPT-4 March 2023)
  • For chart performance, pre-aggregate to daily for data older than 3 months
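The daily pre-aggregation in the last bullet could be a scheduled query (or materialized view) along these lines — table and column names follow the schema above, but the 3-month cutoff logic is a sketch:

```sql
-- Illustrative pre-aggregation: collapse hourly rows older than 3 months
-- to one row per region per day for faster long-range charts.
SELECT
  region_id,
  date_trunc('day', timestamp) AS day,
  AVG(price_mwh) AS avg_price_mwh,
  MAX(price_mwh) AS max_price_mwh,
  AVG(demand_mw) AS avg_demand_mw
FROM electricity_prices
WHERE timestamp < NOW() - INTERVAL '3 months'
GROUP BY region_id, date_trunc('day', timestamp);
```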

6.2 Map UX Overhaul

Default Center and Zoom

  • Center on Chapel Hill, NC: { lat: 35.9132, lng: -79.0558 }
  • Zoom level 6 (shows NC, VA, SC, GA, TN — the most data-dense region)

Dark Mode (2 prop changes)

  • Add colorScheme={ColorScheme.DARK} to the <Map> component (Google's built-in dark theme)
  • Set disableDefaultUI={true} to hide streetview, fullscreen, map type controls (copyright stays)
  • Import ColorScheme from @vis.gl/react-google-maps
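Putting the three bullets together, the `<Map>` usage would look roughly like this (surrounding props are illustrative, taken from the defaults described in this phase):

```tsx
// Sketch of the two prop changes in src/components/map/energy-map.tsx.
import { Map, ColorScheme } from "@vis.gl/react-google-maps";

<Map
  mapId={process.env.NEXT_PUBLIC_GOOGLE_MAP_ID}
  colorScheme={ColorScheme.DARK}   // Google's built-in dark theme
  disableDefaultUI={true}          // hides streetview/fullscreen/map-type controls
  defaultCenter={{ lat: 35.9132, lng: -79.0558 }} // Chapel Hill, NC
  defaultZoom={6}
/>;
```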

Map Legend (NEW component)

Create src/components/map/map-legend.tsx:

  • Position: absolute bottom-right of map container
  • Price color gradient bar: blue → cyan → yellow → red with tick labels ($0, $20, $50, $100+/MWh)
  • Marker size scale: 3-4 circles with MW labels (50, 200, 500 MW)
  • Pulsing indicator explanation
  • Grid stress glow explanation
  • Style: zinc-900/90 bg, backdrop-blur, matching existing map controls panel

Floating Region Price Labels (NEW)

Render AdvancedMarker at each region's centroid showing:

  • Region code (PJM, ERCOT, etc.)
  • Current average price ($XX.XX/MWh)
  • Small colored border matching heatmap color
  • This is the single highest-impact change for map readability

Breathing Animation Tuning

  • Slow the breathing period from 0.5-1.25s to 6-8s (0.125 Hz)
  • Only breathe regions where demand/capacity > 85% (stressed) — calm regions stay static
  • Reduce animation interval from 50ms to 200ms (5 FPS)
  • Reduce amplitude to very subtle (+/- 0.03 to 0.07)
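The tuned parameters above reduce to one small function. The base opacity is an illustrative assumption; period, amplitude, and the 85% stress gate come from the bullets above:

```typescript
const PERIOD_MS = 7_000;   // 6-8s breathing period (vs the old 0.5-1.25s)
const AMPLITUDE = 0.05;    // within the +/- 0.03 to 0.07 range above
const BASE_OPACITY = 0.35; // illustrative resting fill opacity

// Sampled every ~200ms by the animation loop; calm regions never move.
function regionFillOpacity(nowMs: number, utilization: number): number {
  if (utilization <= 0.85) return BASE_OPACITY; // only stressed regions breathe
  return BASE_OPACITY + AMPLITUDE * Math.sin((2 * Math.PI * nowMs) / PERIOD_MS);
}
```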

Enhanced Datacenter Markers

  • Show capacity (MW) label inside markers >= 200 MW
  • Different visual treatment by status: operational (solid), under construction (dashed border), planned (hollow ring)
  • Lower pulsing threshold already applied (3%)

6.3 Chart Fixes

Generation Chart Timestamp Fix

  • Change X-axis dataKey from dateLabel (time-only, duplicates across days) to timestamp (unique epoch ms)
  • Add context-aware tickFormatter: 24h shows "3 PM", 7d/30d shows "Jan 15 3PM", 90d/1y shows "Jan 15"
  • Update tooltip labelFormatter to include full date + time

Correlation Chart Dark Theme Labels

  • Add fill: 'hsl(var(--muted-foreground))' to XAxis and YAxis tick props
  • Currently defaults to #666666 which is invisible on dark background

6.4 GPU Calculator Update

Default GPU Model

  • Change default from H100 SXM to B200 (1,000W TDP)
  • B200 is the current-gen datacenter GPU most customers are deploying

NVIDIA R200 (Rubin) — Add if specs confirmed

  • Announced at GTC 2025: 1,800W TDP
  • Recent reports suggest revised to 2,300W TDP
  • Add as "R200 (Rubin)" at 1,800W (official announced spec)
  • 288 GB HBM4, NVLink 6, 2H2026 availability

6.5 Navigation Performance

Granular Suspense Boundaries

Replace full-page loading skeletons with per-section Suspense boundaries:

  • Extract each data-fetching section into its own async Server Component
  • Wrap each in <Suspense fallback={<SectionSkeleton />}>
  • Page shell (headers, layout, tabs) renders instantly
  • Individual sections stream in as data resolves
  • Apply to all 5 pages: dashboard, map, trends, demand, generation