Poetry is just the evidence of life. If your life is burning well, poetry is just the ash. (Leonard Cohen)

statistics: beautiful



In Silico Flurries: Computing a world of snow. Scientific American. 23 December 2017


statistics + data

Nature Methods: Points of Significance

Points of Significance column in Nature Methods. (Launch of Points of Significance)
1 | Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of significance: Statistics vs machine learning. Nature Methods 15:233–234.
2 | Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of significance: Machine learning: supervised methods. Nature Methods 15:5–6.
3 | Bzdok, D., Krzywinski, M. & Altman, N. (2017) Points of significance: Machine learning: a primer. Nature Methods 14:1119–1120.
4 | Altman, N. & Krzywinski, M. (2017) Points of significance: Ensemble methods: Bagging and random forests. Nature Methods 14:933–934.
5 | Krzywinski, M. & Altman, N. (2017) Points of significance: Classification and regression trees. Nature Methods 14:757–758.
6 | Lever, J., Krzywinski, M. & Altman, N. (2017) Points of significance: Principal component analysis. Nature Methods 14:641–642.
7 | Altman, N. & Krzywinski, M. (2017) Points of significance: Clustering. Nature Methods 14:545–546.
8 | Altman, N. & Krzywinski, M. (2017) Points of significance: Tabular data. Nature Methods 14:329–330.
9 | Altman, N. & Krzywinski, M. (2017) Points of significance: Interpreting P values. Nature Methods 14:213–214.
10 | Altman, N. & Krzywinski, M. (2017) Points of significance: P values and the search for significance. Nature Methods 14:3–4.
11 | Lever, J., Krzywinski, M. & Altman, N. (2016) Points of significance: Regularization. Nature Methods 13:803–804.
12 | Lever, J., Krzywinski, M. & Altman, N. (2016) Points of significance: Model selection and overfitting. Nature Methods 13:703–704.
13 | Lever, J., Krzywinski, M. & Altman, N. (2016) Points of significance: Classifier evaluation. Nature Methods 13:603–604.
14 | Lever, J., Krzywinski, M. & Altman, N. (2016) Points of significance: Logistic regression. Nature Methods 13:541–542.
15 | Altman, N. & Krzywinski, M. (2016) Points of significance: Regression diagnostics. Nature Methods 13:385–386.
16 | Altman, N. & Krzywinski, M. (2016) Points of significance: Analyzing outliers: Influential or nuisance. Nature Methods 13:281–282.
17 | Krzywinski, M. & Altman, N. (2015) Points of significance: Multiple linear regression. Nature Methods 12:1103–1104.
18 | Altman, N. & Krzywinski, M. (2015) Points of significance: Simple linear regression. Nature Methods 12:999–1000.
19 | Altman, N. & Krzywinski, M. (2015) Points of significance: Association, correlation and causation. Nature Methods 12:899–900.
20 | Puga, J.L., Krzywinski, M. & Altman, N. (2015) Points of significance: Bayesian networks. Nature Methods 12:799–800.
21 | Kulesa, A., Krzywinski, M., Blainey, P. & Altman, N. (2015) Points of significance: Sampling distributions and the bootstrap. Nature Methods 12:477–478.
22 | Puga, J.L., Krzywinski, M. & Altman, N. (2015) Points of significance: Bayesian statistics. Nature Methods 12:377–378.
23 | Puga, J.L., Krzywinski, M. & Altman, N. (2015) Points of significance: Bayes' theorem. Nature Methods 12:277–278.
24 | Altman, N. & Krzywinski, M. (2015) Points of significance: Split plot design. Nature Methods 12:165–166.
25 | Altman, N. & Krzywinski, M. (2015) Points of significance: Sources of variation. Nature Methods 12:5–6.
26 | Krzywinski, M. & Altman, N. (2014) Points of significance: Two factor designs. Nature Methods 11:1187–1188.
27 | Krzywinski, M., Altman, N. & Blainey, P. (2014) Points of significance: Nested designs. Nature Methods 11:977–978.
28 | Blainey, P., Krzywinski, M. & Altman, N. (2014) Points of significance: Replication. Nature Methods 11:879–880.
29 | Krzywinski, M. & Altman, N. (2014) Points of significance: Analysis of variance (ANOVA) and blocking. Nature Methods 11:699–700.
30 | Krzywinski, M. & Altman, N. (2014) Points of significance: Designing comparative experiments. Nature Methods 11:597–598.
31 | Krzywinski, M. & Altman, N. (2014) Points of significance: Non-parametric tests. Nature Methods 11:467–468.
32 | Krzywinski, M. & Altman, N. (2014) Points of significance: Comparing samples — Part II — Multiple testing. Nature Methods 11:355–356.
33 | Krzywinski, M. & Altman, N. (2014) Points of significance: Comparing samples — Part I — t-tests. Nature Methods 11:215–216.
34 | Krzywinski, M. & Altman, N. (2014) Points of significance: Visualizing samples with box plots. Nature Methods 11:119–120.
35 | Krzywinski, M. & Altman, N. (2013) Points of significance: Power and sample size. Nature Methods 10:1139–1140.
36 | Krzywinski, M. & Altman, N. (2013) Points of significance: Significance, P values and t-tests. Nature Methods 10:1041–1042.
37 | Krzywinski, M. & Altman, N. (2013) Points of significance: Error bars. Nature Methods 10:921–922.
38 | Krzywinski, M. & Altman, N. (2013) Points of significance: Importance of being uncertain. Nature Methods 10:809–810.

news + thoughts

Statistics vs Machine Learning

Tue 03-04-2018
We conclude our series on Machine Learning with a comparison of two approaches: classical statistical inference and machine learning. The boundary between them is subject to debate, but important generalizations can be made.

Inference creates a mathematical model of the data-generation process to formalize understanding or test a hypothesis about how the system behaves. Prediction aims at forecasting unobserved outcomes or future behavior. Typically we want to do both: to know how biological processes work and to predict what will happen next. Inference and ML are complementary in pointing us to biologically meaningful conclusions.

Nature Methods Points of Significance column: Statistics vs machine learning. (read)

Statistics asks us to choose a model that incorporates our knowledge of the system, and ML requires us to choose a predictive algorithm by relying on its empirical capabilities. Justification for an inference model typically rests on whether we feel it adequately captures the essence of the system. The choice of pattern-learning algorithms often depends on measures of past performance in similar scenarios.
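As a rough illustration of this contrast (not taken from the column), the sketch below analyzes the same simulated data both ways: an explicit linear model whose coefficients and P values we interpret, and a flexible algorithm judged only by its held-out prediction accuracy. The choice of statsmodels and a random forest here is an arbitrary assumption for the example.

```python
# Minimal sketch: inference vs prediction on the same simulated data.
import numpy as np
import statsmodels.api as sm
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))                                  # three predictors
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(size=n)       # only two matter

# Inference: fit an explicit model and interpret effect sizes and P values.
ols = sm.OLS(y, sm.add_constant(X)).fit()
print(ols.summary())

# Prediction: fit a flexible algorithm and judge it only by held-out performance.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out R^2:", r2_score(y_te, rf.predict(X_te)))
```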

Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of Significance: Statistics vs machine learning. Nature Methods 15:233–234.

Background reading

Bzdok, D., Krzywinski, M. & Altman, N. (2017) Points of Significance: Machine learning: a primer. Nature Methods 14:1119–1120.

Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of Significance: Machine learning: supervised methods. Nature Methods 15:5–6.

...more about the Points of Significance column

Happy 2018 `\pi` Day—Boonies, burbs and boutiques of `\pi`

Wed 14-03-2018

Celebrate `\pi` Day (March 14th) and go to brand new places. Together with Jake Lever, this year we shrink the world and play with road maps.

Streets from across the world are seamlessly stitched together. Finally, a halva shop on the same block!

A great 10 km run loop between Istanbul, Copenhagen, San Francisco and Dublin. Stop off for halva, smørrebrød, espresso and a Guinness on the way. (details)

Intriguing and personal patterns of urban development for each city appear in the Boonies, Burbs and Boutiques series.

In the Boonies, Burbs and Boutiques of `\pi` we draw progressively denser patches using the digit sequence 159 to inform density. (details)

No color—just lines. Lines from Marrakesh, Prague, Istanbul, Nice and other destinations for the mind and the heart.

Roads from cities rearranged according to the digits of `\pi`. (details)

The art is featured in the Pi City post on the Scientific American SA Visual blog.

Check out art from previous years: 2013 `\pi` Day, 2014 `\pi` Day, 2015 `\pi` Day, 2016 `\pi` Day and 2017 `\pi` Day.

Machine learning: supervised methods (SVM & kNN)

Thu 18-01-2018
Supervised learning algorithms extract general principles from observed examples guided by a specific prediction objective.

We examine two very common supervised machine learning methods: linear support vector machines (SVM) and k-nearest neighbors (kNN).

SVM is often less computationally demanding than kNN and is easier to interpret, but it can identify only a limited set of patterns. On the other hand, kNN can find very complex patterns, but its output is more challenging to interpret.

Nature Methods Points of Significance column: Machine learning: supervised methods (SVM & kNN). (read)

We illustrate SVM using a data set in which points fall into two categories, which are separated in SVM by a straight line "margin". SVM can be tuned using a parameter that influences the width and location of the margin, permitting points to fall within the margin or on the wrong side of the margin. We then show how kNN relaxes explicit boundary definitions, such as the straight line in SVM, and how kNN too can be tuned to create more robust classification.
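A minimal sketch of these two tuning knobs, assuming scikit-learn and a simulated two-class data set (the specific values of C and k below are illustrative choices, not the column's):

```python
# Minimal sketch: the SVM cost parameter C and the kNN neighbour count k.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 100
X = np.vstack([rng.normal(loc=-1, size=(n, 2)),              # class 0
               rng.normal(loc=+1, size=(n, 2))])             # class 1
y = np.array([0] * n + [1] * n)

# Linear SVM: a small C widens the margin and tolerates points inside it or on
# the wrong side; a large C enforces a narrower, stricter margin.
for C in (0.01, 1, 100):
    acc = cross_val_score(SVC(kernel="linear", C=C), X, y, cv=5).mean()
    print("SVM  C=%-5g accuracy=%.2f" % (C, acc))

# kNN: no explicit boundary; larger k averages over more neighbours and gives a
# smoother, more robust classification.
for k in (1, 5, 25):
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5).mean()
    print("kNN  k=%-5d accuracy=%.2f" % (k, acc))
```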

Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of Significance: Machine learning: supervised methods. Nature Methods 15:5–6.

Background reading

Bzdok, D., Krzywinski, M. & Altman, N. (2017) Points of Significance: Machine learning: a primer. Nature Methods 14:1119–1120.

...more about the Points of Significance column

Human Versus Machine

Tue 16-01-2018
Balancing subjective design with objective optimization.

In a Nature graphics blog article, I present my process behind designing the stark black-and-white Nature 10 cover.

Nature 10, 18 December 2017

Machine learning: a primer

Thu 18-01-2018
Machine learning extracts patterns from data without explicit instructions.

In this primer, we focus on essential ML principles: a modeling strategy that lets the data speak for themselves, to the extent possible.

The benefits of ML arise from its use of a large number of tuning parameters or weights, which control the algorithm’s complexity and are estimated from the data using numerical optimization. Often ML algorithms are motivated by heuristics such as models of interacting neurons or natural evolution—even if the underlying mechanism of the biological system being studied is substantially different. The utility of ML algorithms is typically assessed empirically by how well extracted patterns generalize to new observations.

Nature Methods Points of Significance column: Machine learning: a primer. (read)

We present a data scenario in which we fit a model with 5 predictors using polynomials and show what to expect from ML when noise and sample size vary. We also demonstrate the consequences of excluding an important predictor or including a spurious one.
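The column's simulation is not reproduced here, but a minimal sketch along the same lines, assuming scikit-learn and a single-predictor stand-in for the 5-predictor scenario, shows how the held-out error of a polynomial fit responds to sample size and noise:

```python
# Minimal sketch: held-out error of a polynomial fit as noise and sample size vary.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(2)

def heldout_mse(n, noise, degree=3):
    x = rng.uniform(-2, 2, size=(n, 1))
    y = x[:, 0] ** 3 - 2 * x[:, 0] + rng.normal(scale=noise, size=n)   # true cubic signal
    x_tr, x_te, y_tr, y_te = train_test_split(x, y, test_size=0.5, random_state=0)
    fit = make_pipeline(PolynomialFeatures(degree), LinearRegression()).fit(x_tr, y_tr)
    return mean_squared_error(y_te, fit.predict(x_te))

for n in (20, 100, 1000):
    for noise in (0.1, 1.0):
        print("n=%4d  noise=%.1f  test MSE=%.3f" % (n, noise, heldout_mse(n, noise)))
```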

Bzdok, D., Krzywinski, M. & Altman, N. (2017) Points of Significance: Machine learning: a primer. Nature Methods 14:1119–1120.

...more about the Points of Significance column

Snowflake simulation

Tue 16-01-2018
Symmetric, beautiful and unique.

Just in time for the season, I've simulated a snow-pile of snowflakes based on the Gravner-Griffeath model.

A few of the beautiful snowflakes generated by the Gravner-Griffeath model. (explore)

The work is described in In Silico Flurries: Computing a world of snow, a wintertime tale co-authored with Jake Lever on the Scientific American SA Blog.

Gravner, J. & Griffeath, D. (2007) Modeling Snow Crystal Growth II: A mesoscopic lattice map with plausible dynamics.
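The full model tracks crystal, boundary and diffusive mass with separate diffusion, freezing, attachment and melting steps. The toy below is only a much-simplified sketch in the same spirit (hexagonal lattice, vapor diffusion, neighbour-dependent attachment); it is not the code behind these snowflakes, and its thresholds are arbitrary.

```python
# Toy snowflake automaton loosely inspired by Gravner-Griffeath: vapor diffuses
# on a hexagonal lattice and boundary cells attach depending on frozen neighbours.
import numpy as np

N, RHO, STEPS = 201, 0.5, 400
# six hex neighbours in axial coordinates (stored in an ordinary square array)
NEIGHBOURS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

vapor = np.full((N, N), RHO)
frozen = np.zeros((N, N), dtype=bool)
frozen[N // 2, N // 2] = True           # seed crystal at the centre
vapor[N // 2, N // 2] = 0.0

def neighbour_sum(a):
    """Sum over the six hex neighbours of every cell (edges wrap; the grid is
    much larger than the crystal, so wrap-around has little effect)."""
    return sum(np.roll(np.roll(a, dq, axis=0), dr, axis=1) for dq, dr in NEIGHBOURS)

for _ in range(STEPS):
    # diffusion: vapor relaxes toward the local hex-neighbourhood average
    vapor = np.where(frozen, 0.0, (vapor + neighbour_sum(vapor)) / 7.0)
    # attachment: cells in concavities (>= 3 frozen neighbours) always attach;
    # otherwise a boundary cell attaches only once enough vapor has accumulated
    nfrozen = neighbour_sum(frozen.astype(int))
    freeze = (~frozen) & (nfrozen >= 1) & ((nfrozen >= 3) | (vapor > 0.4))
    frozen |= freeze
    vapor = np.where(freeze, 0.0, vapor)

print("crystal cells after %d steps: %d" % (STEPS, int(frozen.sum())))
```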

Genes that make us sick

Wed 22-11-2017
Where disease hides in the genome.

My illustration of the location of genes in the human genome that are implicated in disease appears in The Objects that Power the Global Economy, a book by Quartz.

The location of genes implicated in disease in the human genome, shown here as a spiral. (more...)

Ensemble methods: Bagging and random forests

Wed 22-11-2017
Many heads are better than one.

We introduce two common ensemble methods: bagging and random forests. Both methods repeat a statistical analysis on many bootstrap samples to improve the accuracy of the predictor. Our column shows these methods as applied to Classification and Regression Trees.

Nature Methods Points of Significance column: Ensemble methods: Bagging and random forests. (read)

For example, with bagged regression trees we sample the space of predictor values more finely, because the tree fitted to each bootstrap sample splits at potentially different boundaries.

Random forests grow a large number of trees by not only drawing bootstrap samples but also randomly choosing which predictor variables are considered at each split.
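A minimal sketch of the two ensembles, assuming scikit-learn and simulated data (the estimator counts and the max_features value are illustrative choices):

```python
# Minimal sketch: a single regression tree vs bagged trees vs a random forest.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import BaggingRegressor, RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 300
X = rng.uniform(-2, 2, size=(n, 5))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.3, size=n)

models = {
    "single tree": DecisionTreeRegressor(random_state=0),
    # bagging: refit the same tree on many bootstrap samples and average the predictions
    "bagged trees": BaggingRegressor(DecisionTreeRegressor(), n_estimators=200, random_state=0),
    # random forest: bootstrap samples plus a random subset of predictors at each split
    "random forest": RandomForestRegressor(n_estimators=200, max_features=2, random_state=0),
}
for name, model in models.items():
    mse = -cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()
    print("%-13s cross-validated MSE = %.3f" % (name, mse))
```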

Altman, N. & Krzywinski, M. (2017) Points of Significance: Ensemble methods: Bagging and random forests. Nature Methods 14:933–934.

Background reading

Krzywinski, M. & Altman, N. (2017) Points of Significance: Classification and regression trees. Nature Methods 14:757–758.

...more about the Points of Significance column