Posts Tagged ‘from’

Stackelberg competition – Wikipedia

June 20, 2021

https://en.wikipedia.org/wiki/Stackelberg_competition

Reconciling modern machine-learning practice and the classical bias–variance trade-off

May 31, 2021

QT:{{“U-shaped bias–variance trade-off curve has shaped our view of model selection and directed applications of learning algorithms in practice. “}}
Nice discussion of the limitations of the bias–variance trade-off for #DeepLearning
https://www.pnas.org/content/116/32/15849
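The classical U-shaped curve the paper pushes back on is easy to reproduce on toy data. A minimal sketch (synthetic data, NumPy only; the degrees, sample sizes, and noise level are arbitrary choices, not from the paper): fit polynomials of growing degree and compare train vs. test error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: noisy sine on the train split, clean sine on the test split.
x_train = np.sort(rng.uniform(0, 3, 20))
x_test = np.sort(rng.uniform(0, 3, 200))
y_train = np.sin(2 * x_train) + rng.normal(0, 0.2, size=x_train.size)
y_test = np.sin(2 * x_test)

def mse(deg):
    """Fit a degree-`deg` polynomial on train; return (train_mse, test_mse)."""
    coeffs = np.polyfit(x_train, y_train, deg)
    pred_tr = np.polyval(coeffs, x_train)
    pred_te = np.polyval(coeffs, x_test)
    return np.mean((pred_tr - y_train) ** 2), np.mean((pred_te - y_test) ** 2)

degrees = range(1, 16)
train_err, test_err = zip(*(mse(d) for d in degrees))
# Train error keeps falling as capacity grows; test error traces the
# classical U: it improves up to a moderate degree, then climbs as the
# high-degree fits start chasing the noise.
```

The paper's point is that this picture breaks down for very large ("interpolating") models, where test error can come back down again — the toy above only shows the classical regime.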

How to Make Oobleck – A Simple Recipe for Making Slime | Live Science

May 19, 2021

QT:{{”
Want to have fun with physics and even “walk on water”? Try making a mixture of cornstarch and water called oobleck. It makes a great science project or is just fun to play with. Oobleck is a non-Newtonian fluid. “}}

https://www.livescience.com/21536-oobleck-recipe.html
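The “non-Newtonian” behaviour mentioned in the quote is commonly modelled with the power-law (Ostwald–de Waele) relation, where apparent viscosity varies with shear rate as η = K·γ̇^(n−1); n > 1 gives shear thickening, which is why oobleck stiffens when struck. A minimal sketch — the K and n values below are illustrative, not measured values for oobleck:

```python
def apparent_viscosity(shear_rate, K, n):
    """Power-law fluid: eta = K * shear_rate**(n - 1).
    n > 1: shear-thickening (oobleck); n < 1: shear-thinning; n == 1: Newtonian."""
    return K * shear_rate ** (n - 1)

# Illustrative shear-thickening parameters (not fitted to real oobleck data).
K, n = 2.0, 1.8
slow = apparent_viscosity(0.5, K, n)   # gentle stirring
fast = apparent_viscosity(50.0, K, n)  # a sharp slap
# With n > 1, viscosity rises with shear rate: hit it hard and it resists
# like a solid; press slowly and your hand sinks in.
```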

sleep datasets with possible public or expert access

May 6, 2021

STAGES (n=30,000)
https://academic.oup.com/sleep/article/41/suppl_1/A124/4988361

Others are somewhat smaller:
https://sleepdata.org/datasets/hchs (N=16,000; actigraphy but not genomics)

https://sleepdata.org/datasets/mesa (N=6,800; actigraphy but not genomics; strength is longitudinal follow-up)

Robo-writers: the rise and risks of language-generating AI

April 17, 2021

https://www.nature.com/articles/d41586-021-00530-0

GPT-3

QT:{{”
A neural network’s size — and therefore its power — is roughly measured by how many parameters it has. These numbers define the strengths of the connections between neurons. More neurons and more connections means more parameters; GPT-3 has 175 billion. The next-largest language model of its kind has 17 billion (see ‘Larger language models’). (In January, Google released a model with 1.6 trillion parameters, but it’s a ‘sparse’ model, meaning each parameter does less work. In terms of performance, this is equivalent to a ‘dense’ model that has between 10 billion and 100 billion parameters, says William Fedus, a researcher at the University of Montreal, Canada, and Google.)
“}}
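The parameter arithmetic in the quote — one parameter per connection, plus biases — is easy to make concrete with a toy dense network. A minimal sketch; the layer sizes below are made up for illustration and have nothing to do with GPT-3’s actual architecture:

```python
# A dense layer connecting n_in neurons to n_out neurons has
# n_in * n_out weights (one per connection) plus n_out biases.
def dense_params(n_in, n_out):
    return n_in * n_out + n_out

# Hypothetical 3-layer network: 1024 -> 512 -> 256 -> 10.
layers = [(1024, 512), (512, 256), (256, 10)]
total = sum(dense_params(i, o) for i, o in layers)
# Scaling any layer's width scales its parameter count multiplicatively,
# which is how models reach the billions quoted above.
```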

Smart cities built with smart materials | Science

April 8, 2021

Is plain-old asphalt a smart, self-healing material?
https://science.sciencemag.org/content/371/6535/1200

Human local adaptation of the TRPM8 cold receptor along a latitudinal cline

April 7, 2021

Stumbled onto this paper. Thought the conclusion that Europeans were more cold-sensitive due to TRPM8 was quite counter-intuitive, but interesting nevertheless.
https://journals.plos.org/plosgenetics/article?id=10.1371/journal.pgen.1007298

cold receptor

Downie – YouTube Video Downloader for macOS – Charlie Monroe Software

March 29, 2021

https://software.charliemonroe.net/downie/

Massive Google-funded COVID database will track variants and immunity

March 29, 2021

https://www.nature.com/articles/d41586-021-00490-5

VideoHive – Stock Footage & Video Effects

March 27, 2021

https://videohive.net/