Posts Tagged ‘gpt3’

A.I. Is Mastering Language. Should We Trust What It Says? – The New York Times
April 24, 2022

I don’t trust it, but, at the same time, I am willing to let it help me write. Also, I suspect this will make grading term papers quite a challenge in the future.
https://www.nytimes.com/2022/04/15/magazine/ai-language.html?smid=tw-share

OpenAI API
March 26, 2022

Interesting stuff; try the “playground”.
Compose AI: Automate Your Writing
March 26, 2022

Teaching computers to read and speak chemistry
May 7, 2021

Robo-writers: the rise and risks of language-generating AI
April 17, 2021

https://www.nature.com/articles/d41586-021-00530-0
GPT3
QT:{{”
A neural network’s size — and therefore its power — is roughly measured by how many parameters it has. These numbers define the strengths of the connections between neurons. More neurons and more connections means more parameters; GPT-3 has 175 billion. The next-largest language model of its kind has 17 billion (see ‘Larger language models’). (In January, Google released a model with 1.6 trillion parameters, but it’s a ‘sparse’ model, meaning each parameter does less work. In terms of performance, this is equivalent to a ‘dense’ model that has between 10 billion and 100 billion parameters, says William Fedus, a researcher at the University of Montreal, Canada, and Google.)
“}}
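The quoted passage measures a network’s size by its parameter count: one “connection strength” per neuron pair, so more neurons and more connections mean more parameters. As a rough illustration (a toy fully connected network, not GPT-3’s actual transformer architecture), the count can be sketched like this:

```python
def dense_param_count(layer_sizes):
    """Count parameters in a toy fully connected network.

    Each layer contributes (inputs x outputs) weights -- one
    connection strength per neuron pair -- plus one bias per output,
    so widening any layer grows the count roughly quadratically.
    """
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out
    return total

# A small 784 -> 512 -> 512 -> 10 network: about 0.67 million parameters,
# versus GPT-3's 175 billion.
print(dense_param_count([784, 512, 512, 10]))  # 669706
```

The layer sizes here are arbitrary, chosen only to show how quickly the connection count dominates the neuron count.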