QT:{{”
Want to have fun with physics and even “walk on water”? Try making a mixture of cornstarch and water called oobleck. It makes a great science project or is just fun to play with. Oobleck is a non-Newtonian fluid. “}}
Posts Tagged ‘fromnpc’
How to Make Oobleck – A Simple Recipe for Making Slime | Live Science
May 19, 2021 Neural interface translates thoughts into type
May 17, 2021 Robo-writers: the rise and risks of language-generating AI
April 17, 2021 https://www.nature.com/articles/d41586-021-00530-0
GPT3
QT:{{”
A neural network’s size — and therefore its power — is roughly measured by how many parameters it has. These numbers define the strengths of the connections between neurons. More neurons and more connections means more parameters; GPT-3 has 175 billion. The next-largest language model of its kind has 17 billion (see ‘Larger language models’). (In January, Google released a model with 1.6 trillion parameters, but it’s a ‘sparse’ model, meaning each parameter does less work. In terms of performance, this is equivalent to a ‘dense’ model that has between 10 billion and 100 billion parameters, says William Fedus, a researcher at the University of Montreal, Canada, and Google.)
“}}
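The excerpt's point that more neurons and more connections mean more parameters can be made concrete with a little arithmetic. This is a minimal illustrative sketch, not GPT-3's actual architecture: the layer widths below are made-up numbers, and `dense_params` is a hypothetical helper for a single fully connected layer.

```python
# Sketch: how parameter counts grow with layer size (illustrative only;
# the dimensions below are invented, not GPT-3's real configuration).

def dense_params(n_in: int, n_out: int) -> int:
    # A fully connected layer has one weight per input-output connection,
    # plus one bias per output neuron.
    return n_in * n_out + n_out

# Doubling the width of both sides roughly quadruples the parameter count,
# because parameters scale with the *product* of the layer widths.
small = dense_params(1000, 1000)  # 1_001_000
large = dense_params(2000, 2000)  # 4_002_000
print(small, large, large / small)
```

This quadratic growth is why dense models become expensive so quickly, and it hints at the sparse-model trade-off mentioned above: a sparse model can carry far more parameters on paper while activating only a fraction of them per input.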
Vera Rubin – Wikipedia
March 25, 2021 The coronavirus is here to stay — here’s what that means
February 21, 2021 The ethical questions that haunt facial-recognition research
December 3, 2020 https://www.nature.com/articles/d41586-020-03187-3
I thought many of the issues discussed in this article are potentially applicable to genomics.