Posts Tagged ‘npc’

Oral delivery of systemic monoclonal antibodies, peptides and small molecules using gastric auto-injectors | Nature Biotechnology

September 25, 2021

https://www.nature.com/articles/s41587-021-01024-0

An easily swallowed capsule injects drugs straight into the gut https://www.nature.com/articles/d41586-021-02443-4

The brain circuit that encourages eating for pleasure : Research Highlights

September 4, 2021

https://www.nature.com/articles/d41586-020-02485-0

Why sports concussions are worse for women

September 4, 2021

QT:{{”
Smith’s team knew from imaging and brain-tissue studies that axon fibres from the brains of female rats and humans are slimmer than those from males. They wanted to know more about the differences and what effect they might have on brain injury, so they cultured rat neurons and then damaged them by exposing them to a rapid air blast. In the neurons from female rats, the axons were smaller and the microtubules narrower and more susceptible to damage than in the cells from males. The same was true for cultured human neurons5.
“}}
https://www.nature.com/articles/d41586-021-02089-2

How to Make Oobleck – A Simple Recipe for Making Slime | Live Science

May 19, 2021

QT:{{”
Want to have fun with physics and even “walk on water”? Try making a mixture of cornstarch and water called oobleck. It makes a great science project or is just fun to play with. Oobleck is a non-Newtonian fluid.
“}}

https://www.livescience.com/21536-oobleck-recipe.html

Neural interface translates thoughts into type

May 17, 2021

https://www.nature.com/articles/d41586-021-00776-8

Robo-writers: the rise and risks of language-generating AI

April 17, 2021

https://www.nature.com/articles/d41586-021-00530-0

GPT-3

QT:{{”
A neural network’s size — and therefore its power — is roughly measured by how many parameters it has. These numbers define the strengths of the connections between neurons. More neurons and more connections means more parameters; GPT-3 has 175 billion. The next-largest language model of its kind has 17 billion (see ‘Larger language models’). (In January, Google released a model with 1.6 trillion parameters, but it’s a ‘sparse’ model, meaning each parameter does less work. In terms of performance, this is equivalent to a ‘dense’ model that has between 10 billion and 100 billion parameters, says William Fedus, a researcher at the University of Montreal, Canada, and Google.)
“}}
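The quote's point that "more neurons and more connections means more parameters" is easy to make concrete. A minimal sketch of how parameters accumulate across a dense (fully connected) feed-forward stack — the layer sizes here are hypothetical, purely for illustration:

```python
# Illustrative only: how parameter counts grow in a toy dense network.
# Each pair of adjacent layers contributes a weight matrix plus a bias
# vector; the totals below use made-up layer sizes, not GPT-3's.

def dense_param_count(layer_sizes):
    """Total weights + biases for consecutive fully connected layers."""
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out  # weight matrix + bias vector
    return total

# Doubling every layer width roughly quadruples the parameter count,
# since each weight matrix scales with the product of adjacent widths.
print(dense_param_count([4, 8, 2]))    # 4*8+8 + 8*2+2 = 58
print(dense_param_count([8, 16, 4]))   # 8*16+16 + 16*4+4 = 212
```

This also gives intuition for the sparse-vs-dense comparison in the quote: a sparse model can have a far larger nominal parameter count while activating only a fraction of those parameters per input, so its effective capacity resembles a much smaller dense model.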

Massive Google-funded COVID database will track variants and immunity

March 29, 2021

https://www.nature.com/articles/d41586-021-00490-5

Vera Rubin – Wikipedia

March 25, 2021

https://en.wikipedia.org/wiki/Vera_Rubin

The coronavirus is here to stay — here’s what that means

February 21, 2021

endemic
https://www.nature.com/articles/d41586-021-00396-2

How COVID unlocked the power of RNA vaccines

January 21, 2021

.@ElieDolgin’s great feature on the development of new mRNA vaccines highlights how important breakthroughs in lipid nanoparticles were. Interesting that a lot of the key research appears to be funded by @Darpa.

https://www.nature.com/articles/d41586-021-00019-w