Lab 02 — Visualization

Power, Pixels,
and Proportions

Two data visualizations exploring GPU hardware efficiency and U.S. data center electricity consumption — drawn from real datasets including LBNL 2024, the IEA Energy & AI report, and the MDPI Kappa-Energy Index paper.

Sources: LBNL 2024 U.S. Data Center Report · IEA Energy & AI Dataset · MDPI Kappa-Energy Index (2025) · Kaggle GPU Benchmarks

GPU Architecture:
Performance vs. Energy Cost

Six deep learning architectures benchmarked on two NVIDIA GPUs. Each row shows how training energy and top-1 accuracy interact — the ideal model sits at low energy, high accuracy. Data from the MDPI Kappa-Energy Index study (2025), Tables 3–7.

SOURCE — MDPI Sensors 25(3):846 · DOI:10.3390/s25030846
Architecture | Params (M) | TITAN Xp Energy (Wh) | GTX 1080 Ti Energy (Wh) | Top-1 Accuracy (%) | Kappa-Energy Index | Efficiency Tier
AlexNet | 61.1 | 14.2 | 16.8 | 56.5 | 3.98 | Low
VGG-16 | 138.4 | 210.5 | 248.3 | 71.6 | 0.34 | Very Low
ResNet-18 | 11.7 | 28.7 | 33.4 | 69.8 | 2.43 | High
EfficientNet-B3 | 12.2 | 41.3 | 49.6 | 82.1 | 1.99 | High
ConvNeXt-T | 28.6 | 67.9 | 79.2 | 82.1 | 1.21 | Medium
Swin Transformer | 28.3 | 89.4 | 104.8 | 81.3 | 0.91 | Medium-Low
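A quick sanity check on the index column: the published values match Top-1 accuracy divided by TITAN Xp training energy to two decimals, so that ratio is used below as a working proxy (the paper's exact formula may include additional terms):

```python
# Recompute a Kappa-Energy-style ratio (Top-1 % per Wh, TITAN Xp) from
# the table above; the published index matches this to two decimals.
rows = [
    ("AlexNet",           14.2, 56.5, 3.98),
    ("VGG-16",           210.5, 71.6, 0.34),
    ("ResNet-18",         28.7, 69.8, 2.43),
    ("EfficientNet-B3",   41.3, 82.1, 1.99),
    ("ConvNeXt-T",        67.9, 82.1, 1.21),
    ("Swin Transformer",  89.4, 81.3, 0.91),
]
for name, energy_wh, top1_pct, published in rows:
    ratio = round(top1_pct / energy_wh, 2)
    assert ratio == published, name  # every row reproduces the table value
    print(f"{name:17s} {ratio:5.2f}")
```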
Energy-Accuracy Trade-off Map
Training energy consumption (Wh, TITAN Xp) vs. ImageNet Top-1 Accuracy. Bubble size = parameter count. Ideal quadrant: bottom-right (low energy, high accuracy).
Legend: Highly efficient · Moderate trade-off · Energy-intensive. Bubble area ∝ parameter count.
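For readers reproducing the chart, a minimal matplotlib sketch of this map from the table's TITAN Xp column (styling and the bubble-size scale factor are assumptions, not the page's actual rendering):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend; write to a file instead of a window
import matplotlib.pyplot as plt

# (model, training energy Wh on TITAN Xp, Top-1 %, params M) from the table
models = [
    ("AlexNet",          14.2, 56.5,  61.1),
    ("VGG-16",          210.5, 71.6, 138.4),
    ("ResNet-18",        28.7, 69.8,  11.7),
    ("EfficientNet-B3",  41.3, 82.1,  12.2),
    ("ConvNeXt-T",       67.9, 82.1,  28.6),
    ("Swin Transformer", 89.4, 81.3,  28.3),
]
fig, ax = plt.subplots(figsize=(7, 5))
for name, energy, acc, params in models:
    # accuracy on x, energy on y, so the ideal quadrant is bottom-right
    ax.scatter(acc, energy, s=params * 8, alpha=0.5)  # marker area ∝ params
    ax.annotate(name, (acc, energy), fontsize=8,
                xytext=(4, 4), textcoords="offset points")
ax.set_xlabel("ImageNet Top-1 accuracy (%)")
ax.set_ylabel("Training energy (Wh, TITAN Xp)")
ax.set_title("Energy-Accuracy Trade-off Map")
fig.savefig("tradeoff_map.png", dpi=150)
```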
Key Insight

ResNet-18 and EfficientNet-B3 dominate the efficiency frontier — EfficientNet-B3 reaches the highest accuracy (82.1%, tied with ConvNeXt-T) at just 41 Wh, roughly 5× less energy than VGG-16 while scoring 10.5 percentage points higher. VGG-16 sits alone in the "energy trap" quadrant, with the worst energy-per-accuracy ratio of all tested architectures.

U.S. Data Center Electricity
Consumption, 2000–2023

Twenty-three years of tracked electricity demand, disaggregated by facility type: traditional enterprise, colocation, and hyperscale/cloud. The rise of hyperscale is the dominant structural shift — and AI-optimized clusters are accelerating it further. From the LBNL 2024 report, Figure 2.1 & Table 2.1.

SOURCE — LBNL 2024 U.S. Data Center Energy Usage Report · eta-publications.lbl.gov
Year | Traditional Enterprise (TWh) | Colocation (TWh) | Hyperscale / Cloud (TWh) | Total (TWh) | YoY Growth | Hyperscale Share
Stacked Area: Electricity by Facility Type
Total U.S. data center electricity (TWh). The green surge from 2015 onward reflects hyperscale build-out driven by cloud and, increasingly, AI workloads.
Legend: Traditional Enterprise · Colocation · Hyperscale / Cloud
Key Insight

U.S. data center electricity demand grew from ~61 TWh in 2000 to ~176 TWh in 2023 — a ~190% increase. But the composition shifted dramatically: hyperscale now accounts for ~55% of total consumption, up from near-zero in 2010. Critically, even as total demand surged, efficiency improvements (better PUE, server consolidation) prevented consumption from growing proportionally with compute capacity.
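The headline growth figures check out arithmetically; a small back-of-envelope sketch:

```python
# Growth implied by the LBNL figures: ~61 TWh (2000) to ~176 TWh (2023).
start_twh, end_twh, years = 61.0, 176.0, 23
growth_pct = (end_twh - start_twh) / start_twh * 100
cagr_pct = ((end_twh / start_twh) ** (1 / years) - 1) * 100
print(f"Total growth: {growth_pct:.0f}%")   # → 189% (~190%)
print(f"Implied CAGR: {cagr_pct:.1f}%/yr")  # → 4.7%/yr
```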

GPU Specs Dashboard

Interactive analytics across GPU hardware generations — compute density, memory bandwidth proxies, and clock relationships. Filter by manufacturer, release year, and sort metric.

SOURCE — Kaggle GPU Benchmarks · wthoman API

GPU Models (total records loaded)
Avg Unified Shaders (compute-heavy filtered set)
Median Memory (useful for tier clustering)
Top GPU Clock

Memory vs. Unified Shaders

Bubble size = memory bus width. Ideal: high shaders, large memory.

Legend: NVIDIA · Intel · AMD

GPU Clock vs. Memory Clock

Horizontal bars sorted by GPU clock (MHz).

Legend: GPU clock · Mem clock

Filtered Records

Manufacturer | Product | Year | Mem (GB) | Bus Width | GPU Clock | Mem Clock | Shaders | Chip

Global Data Center
Energy Demand

IEA Electricity 2024 projections — how much electricity data centers consumed in 2022 and where demand is heading by 2026 across regions, plus a breakdown of how that electricity is split inside a typical facility.

SOURCE — IEA, Electricity 2024: Analysis and Forecast to 2026 · jhu30699

Global demand, 2022: 460 TWh (actual figure from IEA report)
IEA base case, 2026: ~800 TWh (range: 620–1,050 TWh)
Internal energy split: 40/40/20 (Computing / Cooling / Other IT)
Regional snapshot — 2026 forecast (TWh)

Global Electricity Demand from Data Centers

IEA 2022 actual vs 2026 low / base / high scenario (TWh)

[Bar chart: 2022 actual 460 · 2026 low 620 · 2026 base ~805 · 2026 high 1,050 · TWh/year]
Actual / base scenario
Low / high scenario

Electricity Use Inside a Data Center

Typical breakdown by component type (IEA)

Computing — 40% · Cooling — 40% · Other IT — 20% (share of total electricity use)

Cooling matches computing in electricity draw — a core finding that supports efficiency gains from better PUE practices and hardware co-design.
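Assuming "Computing" plus "Other IT" counts as the IT load and cooling is pure overhead, the 40/40/20 split implies a fleet-average PUE of about 1.67 (a rough sketch; real PUE definitions also fold in power-distribution losses):

```python
# PUE implied by the 40/40/20 split, under the assumption that
# "Computing" + "Other IT" form the IT load and "Cooling" is overhead.
computing, cooling, other_it = 0.40, 0.40, 0.20
pue = (computing + cooling + other_it) / (computing + other_it)
print(f"Implied PUE ≈ {pue:.2f}")  # → 1.67
```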

Underlying Data

Dataset | 2022 | 2026 | Notes

The Data Center
Electricity Surge

Regional electricity consumption 2017–2030 and GPU hardware efficiency landscape — tracing how demand is growing by geography and how compute efficiency has evolved across NVIDIA architectures.

SOURCE — IEA Electricity 2024 · NVIDIA datasheets · opederso

Global Data Center Electricity by Region

TWh · actual and IEA base-case projections

Region | 2022 TWh | 2024 TWh | 2030 TWh | 2024–2030 Growth | Share (2024) | Energy Mix | Carbon Intensity
United States | 130 | 183 | 426 | +133% | ~45% | Gas 41%, Nuclear 20% | High
China | 88 | 104 | 279 | +168% | ~25% | Coal-heavy | Very High
Europe (EU+UK) | 56 | 70 | 115 | +64% | ~15% | Renewables-leading | Medium
Japan | 14 | 19 | 34 | +79% | ~5% | Gas & Nuclear | High
Rest of World | 40 | 39 | 91 | +133% | ~10% | Mixed | Varies
Global Total | 460 | 415 | 945 | +128% | 100% | Gas 40%+ proj. | High

GPU Hardware Efficiency Comparison

NVIDIA architectures from Volta to Hopper — TDP, compute throughput, and efficiency per watt

GPU | Architecture | Year | TDP (W) | FP16 TFLOPS | VRAM (GB) | Mem BW (TB/s) | TFLOPS/W | Tier
V100 SXM2 | Volta | 2017 | 300 | 112 | 32 | 0.90 | 0.37 | Data Center
RTX 3090 | Ampere (consumer) | 2020 | 350 | 142 | 24 | 0.94 | 0.41 | Prosumer
A100 SXM4 | Ampere | 2020 | 400 | 312 | 80 | 2.00 | 0.78 | Data Center
RTX 4090 | Ada Lovelace | 2022 | 450 | 330 | 24 | 1.01 | 0.73 | Prosumer / Edge
H100 SXM5 | Hopper | 2022 | 700 | 990 | 80 | 3.35 | 1.41 | Data Center (AI)
H200 SXM | Hopper | 2024 | 700 | 990 | 141 | 4.80 | 1.41 | Data Center (AI)
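The efficiency column is simply peak FP16 throughput divided by TDP; a quick check against the table:

```python
# Verify the TFLOPS/W column: FP16 throughput divided by TDP.
gpus = [
    ("V100 SXM2", 300, 112, 0.37),
    ("RTX 3090",  350, 142, 0.41),
    ("A100 SXM4", 400, 312, 0.78),
    ("RTX 4090",  450, 330, 0.73),
    ("H100 SXM5", 700, 990, 1.41),
    ("H200 SXM",  700, 990, 1.41),
]
for name, tdp_w, fp16_tflops, published in gpus:
    eff = round(fp16_tflops / tdp_w, 2)
    assert eff == published, name  # each row reproduces the table value
    print(f"{name:10s} {eff:.2f} TFLOPS/W")
```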

Regional Electricity Demand, 2017–2030

Stacked area · TWh · IEA base-case projection (dashed after 2024)

Legend: United States · China · Europe · Japan · Rest of World

GPU Efficiency vs. Power Draw

X: TDP (W) · Y: FP16 TFLOPS/W · Bubble size: memory bandwidth

Legend: Data center GPU · Consumer / prosumer

Regional Concentration & Growth Trends

Three interactive views — a regional constellation, a growth ladder with forecast band, and a proportional ribbon — that together map where AI data center electricity use sits today and where it is heading by 2030.

Underlying Data Grid

Source figures from the IEA Energy and AI materials and the 2024 LBNL United States Data Center Energy Usage Report.

Metric Group | Label | Region | Year | Value | Units | Status | Source

Regional Share Constellation

Circle area scaled to TWh. Hover or click each region for details. The center circle represents the full 415 TWh global total.


Global data center electricity is concentrated — the U.S. holds ~45% of the 2024 total, followed by China (25%) and Europe (15%).

Growth Ladder + U.S. Forecast Band

Bars show actual and IEA base-case values. The gradient ribbon represents the LBNL U.S. 2028 uncertainty range — click it for details.


The IEA base case has global demand more than doubling by 2030. The U.S. 2028 band (325–580 TWh) underscores forecast uncertainty.

Share Ribbon

The same regional share data in proportional form. Hover each segment for the exact share and implied TWh.


Concentration

A few regions carry the bulk of global data center electricity use, so grid choices in those regions shape the whole global story.

Acceleration

Moving from 415 TWh in 2024 to 945 TWh in the IEA base case for 2030 shows how quickly this load can expand in just six years.
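That 415 → 945 TWh jump corresponds to an implied compound annual growth rate of roughly 15% (a back-of-envelope figure that assumes smooth year-over-year growth):

```python
# Implied CAGR for global data center demand, 415 TWh (2024) → 945 TWh (2030).
start, end, years = 415.0, 945.0, 6
cagr_pct = ((end / start) ** (1 / years) - 1) * 100
print(f"Implied CAGR: {cagr_pct:.1f}%/yr")  # → 14.7%/yr
```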

Range

The U.S. 2028 band of 325–580 TWh is a reminder that forecasting infrastructure demand is a range problem, not a single-number problem.