https://www.google.com/search?q=will+patents+on+short+read+sequencing+expire+soon
QT:{{" Yes, several foundational patents related to short-read sequencing, particularly those held by Illumina regarding reversible terminators and cluster generation, have begun to expire or are set to expire in the coming years (e.g., through 2024–2029). This has already spurred increased competition and the entry of new market players. "}}
Posts Tagged ‘costseq3’
Inside the Data Centers That Train A.I. and Drain the Electrical Grid
December 28, 2025
QT:{{"
Users ask questions, prompting the A.I. to produce individual units of intelligence called “tokens.” A token might be a small square of pixels or a fragment of a word. To write a college term paper, an A.I. might produce about five thousand tokens, consuming enough electricity to run a microwave oven at full power for about three minutes. As A.I. fields increasingly complex requests—for video, for audio, for therapy—the need for computing power will increase many times over.
Multiply that by the more than eight hundred million people who use ChatGPT every week, and the data-center explosion makes sense. ChatGPT is now more popular than Wikipedia
…
Data centers “are perhaps bigger, by an order of magnitude, than anything we’ve connected to the grid before,” he said. “If you think about the city of Philadelphia, its load is about one gigawatt. Now imagine adding one-gigawatt-sized data centers to the grid, and not just one, but multiples of them.”
…
The modern approach to A.I. development has been to vacuum up any online data available—including audio, video, practically all published work in English, and more than three billion web pages—and let lawyers sort through the mess.
But there is now talk of a data shortage. There are thought to be about four hundred trillion words on the indexed internet, but, as the OpenAI co-founder Andrej Karpathy has noted, much of that is “total garbage.” High-quality text is harder to find. If trends continue, researchers say, A.I. developers could exhaust the usable supply of human text between 2026 and 2032.
“}}
Witt, S. (2025, October 27). Inside the data centers that train A.I. and drain the electrical grid. The New Yorker.
https://www.newyorker.com/magazine/2025/11/03/inside-the-data-centers-that-train-ai-and-drain-the-electrical-grid
Nice numbers on energy usage
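A quick back-of-envelope check of the figures quoted above. The microwave wattage (~1,100 W) is an assumption, since the article only says "full power"; the token count, duration, and user count come from the quote.

```python
# Energy per token implied by the article's microwave analogy.
MICROWAVE_W = 1100          # assumed full-power draw of a typical microwave (W)
SECONDS = 3 * 60            # "about three minutes"
TOKENS_PER_PAPER = 5_000    # "about five thousand tokens" for a term paper

joules_per_paper = MICROWAVE_W * SECONDS              # ~198 kJ per paper
joules_per_token = joules_per_paper / TOKENS_PER_PAPER
print(f"~{joules_per_token:.0f} J/token "
      f"(~{joules_per_token / 3600 * 1000:.0f} mWh/token)")

# Scale: if each of ChatGPT's ~800M weekly users generated one such paper,
# the weekly energy draw would be:
weekly_kwh = 800e6 * joules_per_paper / 3.6e6         # J -> kWh
print(f"~{weekly_kwh / 1e6:.0f} GWh per week")
```

Under these assumptions that is roughly 40 J per token and on the order of tens of GWh per week at one "term paper" per user, which helps make the article's gigawatt-scale data-center figures concrete.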
AI models are helping dirty industries go green
April 21, 2025
Limits to economic growth | Nature Physics
December 24, 2024
https://www.nature.com/articles/s41567-022-01652-6
Murphy, T. W. (2022). Limits to economic growth. Nature Physics, 18(8), 844–847. https://doi.org/10.1038/s41567-022-01652-6
The state of academic publishing in 3 graphs, 6 trends, and 4 thoughts | Dynamic Ecology
December 22, 2024
QT:{{"
Publishing is growing exponentially – While the number of scientists is also growing exponentially, it is at a slower rate than papers. We are producing more papers per scientist every year. This is a profoundly important fact. Every ecologist knows the power and unsustainability of exponential growth. This also makes it abundantly clear that the publishers only deserve half the blame. Scientists have created a Red Queen situation in which we’re aggressively chasing opportunities to publish. Do we really need 1,000,000 (about +40%) more publications than 10 years ago! (Figure 1).
“}}
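The quote's growth figures can be sanity-checked with a short calculation. Assuming "+40% over 10 years" and that the +40% corresponds to roughly 1,000,000 extra papers per year (both taken from the quote):

```python
import math

GROWTH_FACTOR = 1.4   # ~40% more papers than 10 years ago
YEARS = 10
EXTRA_PAPERS = 1_000_000   # the quoted absolute increase per year

annual_rate = math.log(GROWTH_FACTOR) / YEARS     # continuous growth rate
doubling_time = math.log(2) / annual_rate

# Baseline implied by the quote: the extra papers are 40% of the old total.
papers_then = EXTRA_PAPERS / (GROWTH_FACTOR - 1)
papers_now = papers_then * GROWTH_FACTOR

print(f"annual growth ~{annual_rate * 100:.1f}%/yr, "
      f"doubling every ~{doubling_time:.0f} years")
print(f"implied output: ~{papers_then / 1e6:.1f}M papers/yr a decade ago, "
      f"~{papers_now / 1e6:.1f}M now")
```

So the quoted figures imply roughly 3.4% annual growth and a doubling time near 20 years, starting from about 2.5M papers per year a decade ago.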
What is scaling? – ScienceDirect
December 8, 2024
Scaling Laws for Neural Language Models
November 16, 2024
https://arxiv.org/pdf/2001.08361
Kaplan, J., McCandlish, S., Henighan, T., Brown, T. B., Chess, B., Child, R., Gray, S., Radford, A., Wu, J., & Amodei, D. (2020). Scaling laws for neural language models. arXiv. https://arxiv.org/abs/2001.08361
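The paper's central result is a power-law fit of test loss against model size. A minimal sketch of the parameter-count law L(N) = (N_c / N)^α_N, using constants approximately as reported in the paper (α_N ≈ 0.076, N_c ≈ 8.8 × 10^13 non-embedding parameters; treat the exact values as illustrative):

```python
# Power-law scaling of test loss with model size, per Kaplan et al. (2020).
ALPHA_N = 0.076   # fitted exponent (approximate value from the paper)
N_C = 8.8e13      # fitted constant, in non-embedding parameters (approximate)

def loss_from_params(n_params: float) -> float:
    """Predicted test loss (nats/token) for n_params non-embedding parameters."""
    return (N_C / n_params) ** ALPHA_N

for n in (1e8, 1e9, 1e10, 1e11):
    print(f"N = {n:.0e}: predicted loss ~{loss_from_params(n):.2f}")
```

The small exponent is the key point: each 10x increase in parameters shaves only a modest, predictable fraction off the loss, which is why the compute budgets described above keep growing.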
Supersized AI – ScienceDirect
November 2, 2024
https://www.sciencedirect.com/science/article/abs/pii/S0262407921018017
Rorvig, M. (2021). Supersized AI. New Scientist, 251(3355), 36–40. https://doi.org/10.1016/s0262-4079(21)01801-7
How We’ll Reach a 1 Trillion Transistor GPU – IEEE Spectrum
August 19, 2024
https://spectrum.ieee.org/trillion-transistor-gpu
Nice scaling graphs