Posts Tagged ‘teaching’

Random Forests Algorithm explained with a real-life example and some Python code | by Carolina Bento | Towards Data Science

March 19, 2023

https://towardsdatascience.com/random-forests-algorithm-explained-with-a-real-life-example-and-some-python-code-affbfa5a942c

Significant Statistics | Podcast on Spotify

September 3, 2022

https://open.spotify.com/show/0LDwx7dFiQLltpJT9GDe11

Videos can be found on YouTube Channel:
https://www.youtube.com/channel/UCHVyc1NJuYvzpoom-L3nBpg/
More info and notes on Website: https://blogs.lt.vt.edu/jmrussell/

Opioids and the Science of Addiction

February 8, 2021

https://teach.genetics.utah.edu/content/addiction/lessonplans/

Course Demand Statistics for CBB567 in Spring ’20

January 18, 2020

https://ivy.yale.edu/course-stats/course/courseDetail?termCode=202001&courseNumbers=CB%26B+567%2FMB%26B+567%2FS%26DS+567&subjectCode=CB%26B

Bayesian Networks | December 2010 | Communications of the ACM

January 4, 2020

https://m-cacm.acm.org/magazines/2010/12/102122-bayesian-networks/fulltext

Midsummer Course Sharpens Skills in Informatics and Data Science | Yale School of Medicine

August 11, 2019

https://medicine.yale.edu/news/article.aspx?id=20962

Introduction to Proteins: Course presentations and more

June 30, 2019

… all presentations, tables, animations, and exercises of the second edition of Introduction to Proteins: Structure, Function, and Motion are now freely available on the new book website:
http://ibis.tau.ac.il/wiki/nir_bental/index.php/Introduction_to_Proteins_Book

Deep learning and process understanding for data-driven Earth system science | Nature

March 4, 2019

https://www.nature.com/articles/s41586-019-0912-1
Perspective | Published: 13 February 2019
Markus Reichstein, Gustau Camps-Valls, Bjorn Stevens, Martin Jung, Joachim Denzler, Nuno Carvalhais & Prabhat. Deep learning and process understanding for data-driven Earth system science. Nature 566, 195–204 (2019).

QT:[[”
Figure 3 presents a system-modelling view that seeks to integrate machine learning into a system model. As an alternative perspective, system knowledge can be integrated into a machine learning framework. This may include design of the network architecture36,79, physical constraints in the cost function for optimization58, or expansion of the training dataset for undersampled domains (that is, physically based data augmentation)80.

Surrogate modelling or emulation
See Fig. 3 (circle 5). Emulation of the full (or specific parts of) a physical model can be useful for computational efficiency and tractability reasons. Machine learning emulators, once trained, can achieve simulations orders of magnitude faster than the original physical model without sacrificing much accuracy. This allows for fast sensitivity analysis, model parameter calibration, and derivation of confidence intervals for the estimates.

(2) Replacing a ‘physical’ sub-model with a machine learning model
See Fig. 3 (circle 2). If formulations of a submodel are of semi-empirical nature, where the functional form has little theoretical basis (for example, biological processes), this submodel can be replaced by a machine learning model if a sufficient number of observations are available. This leads to a hybrid model, which combines the strengths of physical modelling (theoretical foundations, interpretable compartments) and machine learning (data-adaptiveness).

Integration with physical modelling
Historically, physical modelling and machine learning have often been treated as two different fields with very different scientific paradigms (theory-driven versus data-driven). Yet, in fact these approaches are complementary, with physical approaches in principle being directly interpretable and offering the potential of extrapolation beyond observed conditions, whereas data-driven approaches are highly flexible in adapting to data and are amenable to finding unexpected patterns (surprises).

A success story in the geosciences is weather prediction, which has greatly improved through the integration of better theory, increased computational power, and established observational systems, which allow for the assimilation of large amounts of data into the modelling system2. Nevertheless, we can accurately predict the evolution of the weather on a timescale of days, not months.
“]]
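
As a concrete companion to the emulation paragraph quoted above, here is a minimal, hypothetical sketch of surrogate modelling: an ML regressor is trained on input/output pairs from an (expensive) physical model and then used as a fast stand-in. The toy simulator, the scikit-learn estimator, and all parameter values are illustrative assumptions, not anything taken from the paper.

```python
# Surrogate modelling / emulation sketch (hypothetical): train an ML emulator
# on runs of an expensive physical model, then use it for fast prediction.
# The "physical model" below is a toy stand-in, not a real Earth-system model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def physical_model(params):
    """Stand-in for an expensive simulator: maps two parameters to one output."""
    x, y = params
    return np.sin(3 * x) * np.exp(-y ** 2) + 0.1 * x * y

# 1. Generate training data by running the simulator on sampled inputs.
rng = np.random.default_rng(0)
X_train = rng.uniform(-1, 1, size=(500, 2))
y_train = np.array([physical_model(p) for p in X_train])

# 2. Fit the emulator (much cheaper to evaluate than the simulator, once trained).
emulator = RandomForestRegressor(n_estimators=200, random_state=0)
emulator.fit(X_train, y_train)

# 3. Use the emulator for fast sensitivity analysis, calibration, etc.
X_test = rng.uniform(-1, 1, size=(5, 2))
print(emulator.predict(X_test))              # fast surrogate predictions
print([physical_model(p) for p in X_test])   # reference simulator values
```

Once trained, the emulator can be evaluated many thousands of times (for parameter calibration or confidence intervals) at a fraction of the simulator's cost, which is the point the quoted passage makes.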

# REFs that I liked
ref 80

ref 57
Karpatne, A. et al. Theory-guided data science: a new paradigm for scientific discovery from data. IEEE Trans. Knowl. Data Eng. 29, 2318–2331 (2017).

# some key BULLETS

• Complementarity of physical & ML approaches
–“Physical approaches in principle being directly interpretable and offering the potential of extrapolation beyond observed conditions, whereas data-driven approaches are highly flexible in adapting to data”

• Hybrid #1: Physical knowledge can be integrated into the ML framework (a minimal sketch follows this list)
–Network architecture
–Physical constraints in the cost function
–Expansion of the training dataset for undersampled domains (i.e. physically based data augmentation)

• Hybrid #2: ML into physical modelling – e.g. emulation of specific parts of a physical model for computational efficiency (see the emulator sketch above)
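
Picking up Hybrid #1, here is a minimal sketch of "physical constraints in the cost function": the total loss is a data-fit term plus a penalty for violating a known constraint. The linear model, the non-negativity constraint, and the lambda_phys weight are illustrative assumptions, not the paper's formulation.

```python
# Physics-constrained loss sketch (hypothetical): cost = data-fit term + penalty
# for violating a physical constraint (here, predictions must be non-negative).
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 3))
y = X @ np.array([1.5, 0.5, 2.0]) + 0.05 * rng.normal(size=200)  # toy observations

def loss(w, lambda_phys=10.0):
    pred = X @ w
    data_term = np.mean((pred - y) ** 2)                   # fit to observations
    physics_term = np.mean(np.clip(-pred, 0, None) ** 2)   # penalize negative predictions
    return data_term + lambda_phys * physics_term

# Plain finite-difference gradient descent, just to show the combined objective.
w = np.zeros(3)
for _ in range(2000):
    grad = np.array([(loss(w + step) - loss(w - step)) / 2e-4
                     for step in np.eye(3) * 1e-4])
    w -= 0.1 * grad
print(w)  # weights fitted under both the data term and the physics penalty
```

The same pattern carries over to neural networks: the physics term simply becomes one more differentiable component of the training loss.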

MIT Computational Biology: Genomes, Networks, Evolution, Health – Fall 2018 – 6.047/6.878/HST.507 – YouTube

December 22, 2018

https://www.youtube.com/playlist?list=PLypiXJdtIca6GBQwDTo4bIEDV8F4RcAgt

Explaining Odds Ratios

November 16, 2018

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2938757/
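
The linked article explains odds ratios in a clinical context; as a quick companion, here is a small sketch that computes an odds ratio and an approximate 95% confidence interval from a hypothetical 2×2 exposure/outcome table (the counts are made up purely for illustration).

```python
# Odds ratio from a hypothetical 2x2 table (counts are illustrative only):
#                 outcome   no outcome
# exposed           a=20        b=80
# unexposed         c=10        d=90
import math

a, b, c, d = 20, 80, 10, 90
odds_ratio = (a * d) / (b * c)                 # (a/b) / (c/d) = 2.25

# Approximate 95% CI on the log-odds scale (Woolf's method).
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```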