Posts Tagged ‘x57l’

Two Silicon Valley genomic data experts announce partnership in Miami – Refresh Miami

September 26, 2021

Making machine learning trustworthy | Science

August 29, 2021

Expanding Access to Large-Scale Genomic Data While Promoting Privacy: A Game Theoretic Approach: The American Journal of Human Genetics

August 29, 2021

Expanding Access to Large-Scale Genomic Data While Promoting Privacy: A Game Theoretic Approach

Zhiyu Wan
Yevgeniy Vorobeychik
Weiyi Xia
Ellen Wright Clayton
Murat Kantarcioglu
Bradley Malin

Published: January 5, 2017

Extract URLs or Link Text from a Google Sheets Cell – YouTube

August 1, 2021

Allele-Specific QTL Fine Mapping with PLASMA: The American Journal of Human Genetics

July 17, 2021

MetaBrain QTL

July 17, 2021

Brain expression quantitative trait locus and network analysis reveals downstream effects and putative drivers for brain-related diseases by N. de Klein et al., bioRxiv, 2021.



June 10, 2021

Only a tenth of the human genome is studied | The Economist

April 28, 2021

There are roughly 20,000 genes in the human genome. Understanding genes and the proteins they encode can help to unravel the causes of diseases, and inspire new drugs to treat them. But most research focuses on only about ten percent of genes. Thomas Stoeger, Luis Amaral and their colleagues at Northwestern University in Illinois used machine learning to investigate why that might be.

First, the team assembled a database of 430 biochemical features of both the genes themselves (such as the levels at which they are expressed in different cells) and the proteins for which they code (for example, their solubility). When they fed these data to their algorithm, they were able to explain about 40% of the difference in the attention paid to each gene (measured by the number of papers published) using just 15 features. Essentially, there were more papers on abundantly expressed genes that encode stable proteins. That suggests researchers—perhaps not unreasonably—focus on genes that are easier to study. Oddly, though, the pattern of publication has not changed much since 2000, despite the completion of the Human Genome Project in 2003 and huge advances in DNA-sequencing technology.
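The "explained about 40% of the difference" figure is an R² (coefficient of determination) from a model predicting publication counts from gene features. A minimal sketch of that idea, using entirely synthetic data (one hypothetical feature, "expression level", rather than the study's 430) and a closed-form least-squares fit:

```python
import random

random.seed(0)

# Hypothetical synthetic data: for each "gene", an expression level
# and a publication count that partly depends on it, plus noise.
n = 200
expression = [random.uniform(0, 10) for _ in range(n)]
papers = [3.0 * x + random.gauss(0, 8) for x in expression]

# Ordinary least squares for a single predictor (closed form).
mean_x = sum(expression) / n
mean_y = sum(papers) / n
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(expression, papers))
sxx = sum((x - mean_x) ** 2 for x in expression)
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# R^2: fraction of variance in publication counts the model accounts for,
# analogous to the ~40% the study reports for its 15 features.
pred = [slope * x + intercept for x in expression]
ss_res = sum((y - p) ** 2 for y, p in zip(papers, pred))
ss_tot = sum((y - mean_y) ** 2 for y in papers)
r2 = 1 - ss_res / ss_tot
print(round(r2, 2))
```

The actual study used machine learning over hundreds of features; this only illustrates what "explaining X% of the difference" means numerically.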

100+ GPT-3 Examples, Demos, Apps, Showcase, and NLP Use-cases | GPT-3 Demo

April 17, 2021

Robo-writers: the rise and risks of language-generating AI

April 17, 2021


A neural network’s size — and therefore its power — is roughly measured by how many parameters it has. These numbers define the strengths of the connections between neurons. More neurons and more connections mean more parameters; GPT-3 has 175 billion. The next-largest language model of its kind has 17 billion (see ‘Larger language models’). (In January, Google released a model with 1.6 trillion parameters, but it’s a ‘sparse’ model, meaning each parameter does less work. In terms of performance, this is equivalent to a ‘dense’ model that has between 10 billion and 100 billion parameters, says William Fedus, a researcher at the University of Montreal, Canada, and Google.)
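The parameter count the excerpt refers to is just the total number of learned weights and biases across a network's layers. A minimal sketch for a plain dense (fully connected) network — the layer sizes below are hypothetical toy values, not GPT-3's actual architecture:

```python
def dense_param_count(layer_sizes):
    """Sum weights (in * out) and biases (out) over each consecutive
    pair of layers in a fully connected network."""
    return sum(i * o + o for i, o in zip(layer_sizes, layer_sizes[1:]))

# A small 784 -> 128 -> 10 classifier:
# (784*128 + 128) + (128*10 + 10) = 101,770 parameters.
print(dense_param_count([784, 128, 10]))  # → 101770
```

Transformer models like GPT-3 reach billions of parameters by stacking many wide layers of this kind (plus attention weights), while a sparse model activates only a subset of its parameters per input, which is why its raw count overstates its effective capacity.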