Posts Tagged ‘costseq3’

Inside the Data Centers That Train A.I. and Drain the Electrical Grid

December 28, 2025

QT:{{”
Users ask questions, prompting the A.I. to produce individual units of intelligence called “tokens.” A token might be a small square of pixels or a fragment of a word. To write a college term paper, an A.I. might produce about five thousand tokens, consuming enough electricity to run a microwave oven at full power for about three minutes. As A.I. fields increasingly complex requests—for video, for audio, for therapy—the need for computing power will increase many times over.

Multiply that by the more than eight hundred million people who use ChatGPT every week, and the data-center explosion makes sense. ChatGPT is now more popular than Wikipedia.

Data centers “are perhaps bigger, by an order of magnitude, than anything we’ve connected to the grid before,” he said. “If you think about the city of Philadelphia, its load is about one gigawatt. Now imagine adding one-gigawatt-sized data centers to the grid, and not just one, but multiples of them.”

The modern approach to A.I. development has been to vacuum up any online data available—including audio, video, practically all published work in English, and more than three billion web pages—and let lawyers sort through the mess.

But there is now talk of a data shortage. There are thought to be about four hundred trillion words on the indexed internet, but, as the OpenAI co-founder Andrej Karpathy has noted, much of that is “total garbage.” High-quality text is harder to find. If trends continue, researchers say, A.I. developers could exhaust the usable supply of human text between 2026 and 2032.

“}}

Witt, S. (2025, October 27). Inside the data centers that train A.I. and drain the electrical grid. The New Yorker.
https://www.newyorker.com/magazine/2025/11/03/inside-the-data-centers-that-train-ai-and-drain-the-electrical-grid

Nice numbers on energy usage
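The quote's figures imply a per-token energy cost that is easy to back out. A minimal check, assuming a "full power" microwave draws about 1,100 W (a typical figure, not stated in the article):

```python
# Back-of-envelope check of the New Yorker quote's energy figures.
# MICROWAVE_WATTS is an assumed typical draw; the token and minute
# counts come from the quote itself.
MICROWAVE_WATTS = 1_100      # assumed, not from the article
MINUTES = 3                  # from the quote
TOKENS_PER_PAPER = 5_000     # from the quote

joules_total = MICROWAVE_WATTS * MINUTES * 60
joules_per_token = joules_total / TOKENS_PER_PAPER
kwh_total = joules_total / 3.6e6   # 1 kWh = 3.6 MJ

print(f"{joules_total:,.0f} J total ({kwh_total:.3f} kWh)")
print(f"{joules_per_token:.1f} J per token")
```

Under that assumption, one term paper costs roughly 0.055 kWh, or about 40 J per token — small per request, but large once multiplied across hundreds of millions of weekly users.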

Chris on X: “GPT-5.1 (Thinking High) is about 300 times cheaper per task than o3-preview (Low) while scoring only a few points lower on ARC-AGI-1. 1 year later intelligence has gotten 300 times cheaper. This is why I can’t stand people who say “wahh the models too expensive” it will become https://t.co/VkfepKVTgV” / X

December 23, 2025

https://x.com/chatgpt21/status/1990516566073729362

AI models are helping dirty industries go green

April 21, 2025

https://www.economist.com/science-and-technology/2025/04/10/ai-models-are-helping-dirty-industries-go-green

Limits to economic growth | Nature Physics

December 24, 2024

https://www.nature.com/articles/s41567-022-01652-6

Murphy, T. W. (2022). Limits to economic growth. Nature Physics, 18(8), 844–847. https://doi.org/10.1038/s41567-022-01652-6

The state of academic publishing in 3 graphs, 5 trends, and 4 thoughts | Dynamic Ecology

December 22, 2024

QT:{{”
Publishing is growing exponentially – While the number of scientists is also growing exponentially, it is at a slower rate than papers. We are producing more papers per scientist every year. This is a profoundly important fact. Every ecologist knows the power and unsustainability of exponential growth. This also makes it abundantly clear that the publishers only deserve half the blame. Scientists have created a Red Queen situation in which we’re aggressively chasing opportunities to publish. Do we really need 1,000,000 (about +40%) more publications than 10 years ago! (Figure 1).
“}}

https://dynamicecology.wordpress.com/2024/04/29/the-state-of-academic-publishing-in-3-graphs-5-trends-and-4-thoughts/

What is scaling? – ScienceDirect

December 8, 2024

https://www.sciencedirect.com/science/article/pii/S0883902623000691

Scaling Laws for Neural Language Models

November 16, 2024

https://arxiv.org/pdf/2001.08361

Kaplan, J., McCandlish, S., Henighan, T., Brown, T. B., Chess, B., Child, R., Gray, S., Radford, A., Wu, J., & Amodei, D. (2020). Scaling laws for neural language models. arXiv:2001.08361.
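The paper's headline result is that test loss falls as a power law in non-embedding parameter count, L(N) = (N_c/N)^α_N. A minimal sketch using the constants reported in the paper (α_N ≈ 0.076, N_c ≈ 8.8×10¹³; values here are a rough reading of the paper, not an exact reproduction):

```python
# Parameter-count scaling law from Kaplan et al. (2020),
# L(N) = (N_c / N) ** alpha_N, with the paper's reported constants.
ALPHA_N = 0.076       # per the paper
N_C = 8.8e13          # non-embedding parameters, per the paper

def loss(n_params: float) -> float:
    """Predicted test loss (nats/token) for n_params non-embedding parameters."""
    return (N_C / n_params) ** ALPHA_N

# Doubling model size multiplies loss by 2 ** (-alpha_N) ≈ 0.95,
# i.e. roughly a 5% loss reduction per doubling.
for n in (1e8, 1e9, 1e10):
    print(f"N = {n:.0e}: L ≈ {loss(n):.3f}")
```

The small exponent is the whole story of the scaling era: each constant-factor loss improvement requires a multiplicative increase in model size, hence the compute (and electricity) appetite described in the New Yorker piece above.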

Supersized AI – ScienceDirect

November 2, 2024

https://www.sciencedirect.com/science/article/abs/pii/S0262407921018017

Rorvig, M. (2021). Supersized AI. The New Scientist, 251(3355), 36–40. https://doi.org/10.1016/s0262-4079(21)01801-7

How We’ll Reach a 1 Trillion Transistor GPU – IEEE Spectrum

August 19, 2024

https://spectrum.ieee.org/trillion-transistor-gpu

Nice scaling graphs

Where the internet lives | Feb 3rd 2024 | The Economist

March 10, 2024

https://www.economist.com/technology-quarterly/2024-02-03

Nice discussion on the scaling economics of data centers