The 2018 Pi Day art celebrates the 30th anniversary of `\pi` day and connects friends stitching road maps from around the world. Pack a sandwich and let's go!

`\pi` Day 2016 Art Posters


2018 `\pi` day shrinks the world and celebrates road trips by stitching streets from around the world together. In this version, we look at the boonies, burbs and boutiques of `\pi` by drawing progressively denser patches of streets. Let's go places.

Art from other years: 2017 `\pi` day, 2016 `\pi` approximation day, 2016 `\pi` day, 2015 `\pi` day, 2014 `\pi` approximation day, 2014 `\pi` day, 2013 `\pi` day and circular `\pi` art.

On March 14th celebrate `\pi` Day. Hug `\pi`—find a way to do it.

Those who favour `\tau=2\pi` will have to postpone celebrations until June 28th. That's what you get for thinking that `\pi` is wrong.

If you're not into details, you may opt to party on July 22nd, which is `\pi` approximation day (`\pi` ≈ 22/7). It's 20% more accurate than the official `\pi` day!
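
A quick check of that claim: `|3.14 - \pi| \approx 0.0016` while `|22/7 - \pi| \approx 0.0013`, so the error of `22/7` is roughly 20% smaller than that of `3.14`.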

Finally, if you believe that `\pi = 3`, you should read why `\pi` is not equal to 3.

All art posters are available for purchase.
I take custom requests.

This year's `\pi` day art collection celebrates not only the digits but also one of the fundamental forces in nature: gravity.

In February 2016, the first direct detection of gravitational waves, made at the Laser Interferometer Gravitational-Wave Observatory (LIGO), was announced.

The signal in the detector was sonified—a process by which any data can be encoded into sound to provide hints at patterns and structure that we might otherwise miss—and we finally heard what two black holes sound like. A buzz and chirp.

The art is featured in the Gravity of Pi article on the Scientific American SA Visual blog.

this year's theme music

All the art was processed while listening to Roses by Coeur de Pirate, a brilliant female French-Canadian songwriter, who sounds like a mix of Patricia Kaas and Lhasa. The lyrics Oublie-moi (Forget me) are fitting with this year's theme of gravity.

Mais laisse-moi tomber, laisse-nous tomber
Laisse la nuit trembler en moi
Laisse-moi tomber, laisse nous tomber
Cette fois

But let me fall, let us fall
Let the night tremble in me
Let me fall, let us fall
This time

The art is generated by running a simulation of gravity in which the digits of `\pi` are each assigned a mass and allowed to collide and orbit each other.

The mathematical details of the simulation can be found in the code section.

exploring the force of gravity in `\pi`

A simulation starts by taking `n` digits of `\pi` and arranging them uniformly around a circle. The mass of each digit `d_i` (e.g. 3) is given by `(1+d_i)^k`, where `k` is a mass power parameter (I use values from 0.01 to 3). For example, if `k=0.42` then the mass of 3 is `(1+3)^{0.42} = 1.79`.
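
A minimal sketch of this setup in Python (my own illustration, not the actual code linked in the code section); the first few digits are hard-coded and the circle radius is an arbitrary choice:

```python
import math

PI_DIGITS = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3]  # first 16 digits; extend as needed

def init_masses(n, k, radius=1.0):
    """Place the first n digits of pi uniformly on a circle and assign
    each digit d a mass of (1 + d)**k."""
    bodies = []
    for i, d in enumerate(PI_DIGITS[:n]):
        angle = 2 * math.pi * i / n
        bodies.append({
            "digit": d,
            "mass": (1 + d) ** k,
            "pos": [radius * math.cos(angle), radius * math.sin(angle)],
            "vel": [0.0, 0.0],
        })
    return bodies

print([round(b["mass"], 2) for b in init_masses(n=3, k=0.42)])  # [1.79, 1.34, 1.97]
```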

collapsing three digits—3.14 collide

The figure below shows the evolution of a simulation with `n=3` digits and `k=1`. The digits 3 and 4 collide to form the digit `3+4 = 7`, which immediately collides with 1 to form `7+1=8`. With only one mass left in the system, the simulation stops.
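
A sketch of the merge rule, continuing the code above. I assume that colliding bodies conserve mass and momentum and take on the digit `(d_1 + d_2) \mod 10` (the figure captions further down use the mod-10 sum); detecting a collision with a fixed distance threshold is also my simplification:

```python
def merge(a, b):
    """Combine two colliding bodies: conserve mass and momentum,
    and set the new digit to the sum of the digits mod 10."""
    m = a["mass"] + b["mass"]
    return {
        "digit": (a["digit"] + b["digit"]) % 10,
        "mass": m,
        "pos": [(a["mass"] * a["pos"][i] + b["mass"] * b["pos"][i]) / m for i in (0, 1)],
        "vel": [(a["mass"] * a["vel"][i] + b["mass"] * b["vel"][i]) / m for i in (0, 1)],
    }

def handle_collisions(bodies, r_collide=0.01):
    """Merge any pair of bodies that are closer than r_collide."""
    out, used = [], set()
    for i, a in enumerate(bodies):
        if i in used:
            continue
        for j in range(i + 1, len(bodies)):
            if j not in used and math.dist(a["pos"], bodies[j]["pos"]) < r_collide:
                a = merge(a, bodies[j])
                used.add(j)
        out.append(a)
    return out
```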


The evolution of a simulation of gravity using `n=3` digits of `\pi` and the mass power `k=1`. The masses are initialized with zero velocity.

adding initial velocity to each mass

When masses have initial velocities, the patterns quickly start to get interesting. In the figure above, the masses are initialized with zero velocity. As soon as the simulation starts, each mass immediately begins to move directly towards the center of mass of the other two masses.

When the initial velocity is non-zero, as in the figure below, the masses don't immediately collapse towards one another. Each mass first travels with its initial velocity, but the gravitational force immediately imparts an acceleration that alters it. In the examples below, only those simulations in which the masses collapsed within a time cutoff are shown.
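
A minimal sketch of a single time step, continuing the code above; simple Euler integration, the gravitational constant `G = 1`, the softening term and the time step are all my assumptions, not necessarily those of the actual code:

```python
def step(bodies, dt=0.001, G=1.0, eps=1e-6):
    """Advance all bodies by one Euler step under pairwise Newtonian gravity."""
    acc = [[0.0, 0.0] for _ in bodies]
    for i, a in enumerate(bodies):
        for j, b in enumerate(bodies):
            if i == j:
                continue
            dx = b["pos"][0] - a["pos"][0]
            dy = b["pos"][1] - a["pos"][1]
            r2 = dx * dx + dy * dy + eps              # softened squared distance
            s = G * b["mass"] / (r2 * math.sqrt(r2))  # acceleration of a is proportional to b's mass
            acc[i][0] += s * dx
            acc[i][1] += s * dy
    for i, a in enumerate(bodies):
        a["vel"][0] += acc[i][0] * dt
        a["vel"][1] += acc[i][1] * dt
        a["pos"][0] += a["vel"][0] * dt
        a["pos"][1] += a["vel"][1] * dt
```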


The evolution of a simulation of gravity using `n=3` digits of `\pi` and the mass power `k=1` in which all masses collapsed. The masses are initialized with a random velocity.

The evolution of 16 simulations of gravity using `n=3` digits of `\pi` and the mass power `k=1` in which all masses collapsed. The masses are initialized with a random velocity.

The evolution of 49 simulations of gravity using `n=3` digits of `\pi` and the mass power `k=1` in which all masses collapsed. The masses are initialized with a random velocity.

allowing the simulation to evolve

Depending on the initial velocities, some systems collapse very quickly, which doesn't make for interesting patterns.

For example, the simulations above evolved over 100,000 steps and in some cases the masses collapsed within 10,000 steps. In the figure below, I require that the system evolve for at least 15,000 steps before collapsing. Lovely doodles, don't you think?
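
Continuing the sketches above, that filter might look like this; `run_until_collapse` is my own hypothetical driver, not part of the actual code:

```python
def run_until_collapse(bodies, max_steps=100_000):
    """Step the system until a single mass remains; return the collapse step,
    or None if the system never fully collapses within max_steps."""
    for t in range(max_steps):
        step(bodies)
        bodies[:] = handle_collisions(bodies)
        if len(bodies) == 1:
            return t
    return None

def is_interesting(collapse_step, min_steps=15_000):
    """Keep only runs that collapse, but not before min_steps."""
    return collapse_step is not None and collapse_step >= min_steps
```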


The evolution of 36 simulations of gravity using `n=3` digits of `\pi` and the mass power `k=1` in which all masses collapsed after a minimum amount of time. The masses are initialized with a random velocity.

exploring ensembles

When a simulation is repeated with different initial conditions, the set of outcomes is called an ensemble.

Below, I repeat the simulation 100 times with `n=3` and `k=0.2`, each time with slightly different initial velocities. The velocities have their `x`- and `y`-components normally distributed with zero mean and a fixed variance. Each of the four ensembles has its simulations evolve over progressively more time steps: 5,000, 7,500, 10,000, and 20,000.
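
A sketch of one such ensemble, continuing the code above; the spread `sigma` of the velocity components is an assumption:

```python
import random

def random_velocity(sigma=0.05):
    """x- and y-components drawn from a zero-mean normal distribution."""
    return [random.gauss(0.0, sigma), random.gauss(0.0, sigma)]

def run_ensemble(n=3, k=0.2, runs=100, steps=20_000):
    """Repeat the simulation with different initial velocities and record
    how many masses remain after a fixed number of steps."""
    remaining = []
    for _ in range(runs):
        bodies = init_masses(n, k)
        for b in bodies:
            b["vel"] = random_velocity()
        for _ in range(steps):
            step(bodies)
            bodies = handle_collisions(bodies)
        remaining.append(len(bodies))
    return remaining
```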

You can see that with 5,000 steps the masses don't yet have a chance to collide. After 7,500 steps, there have been collisions in a small number of systems. The blue mass corresponds to 3 colliding with 4 and the green mass to 1 colliding with 4. After 10,000 steps, even more collisions are seen and in 3 cases we see total collapse (all three digits collided). After 20,000 steps, collisions and full collapses are more frequent still.


The evolution of 100 simulations of gravity over total time `t` using `n=3` digits of `\pi` and the mass power `k=0.2`. Within each ensemble, the masses are initialized with a different random velocity in each instance.

varying masses

The value of `k` greatly impacts the outcome of the simulation. When `k` is very small, all the digits have essentially the same mass. For example, when `k=0.01` the 0 has a mass of 1 and 9 has a mass of 1.02.

When `k` is large, the difference in masses is much greater. For example, for `k=2` the lightest mass is `(1+0)^2=1` and the heaviest `(1+9)^2=100`. Because the acceleration of a mass is proportional to the mass that is attracting it, in a pair of masses the lighter mass will accelerate faster.
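
In Newtonian terms, for a pair of masses `m_1` and `m_2` separated by a distance `r`, the accelerations are `a_1 = G m_2 / r^2` and `a_2 = G m_1 / r^2`, so `a_1 / a_2 = m_2 / m_1` and the lighter body always experiences the larger acceleration.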


Larger values of `k` create greater diversity among the masses. Shown are simulations of 36 digits with `k` values varying from 0.1 to 3. The total mass of the system, `\Sigma m`, is also shown.

increasing number of masses

As the number of digits is increased, the pattern of collapse doesn't qualitatively change.


Simulations for `n = 50, 100, 250` and `500` masses with `k = 0.5`.

gravity makes beautiful doodles

I ran a large number of simulations. For various values of `n` and `k`, I repeated the simulation several times to sample different initial velocities.


Thumbnails of `\pi` digit orbital simulations for various values of `n` and `k`.

Gravitational attraction paths of the first 100 digits of `\pi` for `k = 0.3`, `0.6` and `0.8` with initial velocities randomly set. Three instances of the simulation are shown, each with different initial velocities.

Gravitational attraction paths of the first 60 digits of `\pi` for `k = 1`. After 100,000 time steps, some masses are still orbiting within the canvas (e.g. the green mass at bottom right). The numbers next to the masses correspond to the digits (those around the circle are the first 50 digits of `\pi` and the others are the sums (mod 10) of digits that collided). Also shown next to each number is its mass, index and the indices of the masses that formed it.

Gravitational attraction paths of the first 50 digits of `\pi` for `k = 0.4`. The numbers next to the masses correspond to the digits (those around the circle are the first 50 digits of `\pi` and the others are the sums (mod 10) of digits that collided).

Below is a great example of how a stable orbital pattern of a pair of masses can be disrupted by the presence of another mass. You can see on the left that once the light red mass moves away from the orange/green pair, they settle into a stable pattern.


Gravitational attraction paths of the first 50 digits of `\pi` for `k = 0.9`. The numbers next to the masses correspond to the digits (those around the circle are the first 50 digits of `\pi` and others are the sum (mod 10) of digits that collided).

The figure below shows one of my favourite patterns. As the digits collide, three masses are left, and these leave the system. They remain under each other's gravitational influence but are moving too quickly to return to the canvas within the time of the simulation.


Gravitational attraction paths of the first 90 digits of `\pi` for `k = 0.8`. The digits collide, leaving three rapidly-moving masses, which leave the canvas.

how the idea developed

interactive gravity simulator

Use this fun interactive gravity simulator if you want to drop your own masses and watch them orbit.


news + thoughts

Curse(s) of dimensionality

Tue 05-06-2018
There is such a thing as too much of a good thing.

We discuss the many ways in which analysis can be confounded when data has a large number of dimensions (variables). Collectively, these are called the "curses of dimensionality".

Nature Methods Points of Significance column: Curse(s) of dimensionality.

Some of these are unintuitive, such as the fact that the volume of a hypersphere increases and then shrinks beyond about 7 dimensions, while the volume of the hypercube always increases. This means that high-dimensional space is "mostly corners" and the distance between points increases greatly with dimension. This has consequences for correlation and classification.
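
As a quick illustration (my own, not from the column), the volume of an `n`-ball of radius `r` is `V_n(r) = \pi^{n/2} r^n / \Gamma(n/2+1)`; a few lines of code show the rise and fall (for a unit radius the peak is near `n=5`; the exact peak dimension depends on the radius):

```python
import math

def ball_volume(n, r=1.0):
    """Volume of an n-dimensional ball of radius r."""
    return math.pi ** (n / 2) * r ** n / math.gamma(n / 2 + 1)

for n in range(1, 11):
    print(n, round(ball_volume(n), 3))
# for r = 1 the volume rises to about 5.26 at n = 5, then shrinks towards zero
```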

Altman, N. & Krzywinski, M. (2018) Points of Significance: Curse(s) of dimensionality. Nature Methods 15:399–400.

Statistics vs Machine Learning

Tue 03-04-2018
We conclude our series on Machine Learning with a comparison of two approaches: classical statistical inference and machine learning. The boundary between them is subject to debate, but important generalizations can be made.

Inference creates a mathematical model of the data-generation process to formalize understanding or test a hypothesis about how the system behaves. Prediction aims at forecasting unobserved outcomes or future behavior. Typically we want to do both: to know how biological processes work and what will happen next. Inference and ML are complementary in pointing us to biologically meaningful conclusions.

Nature Methods Points of Significance column: Statistics vs machine learning.

Statistics asks us to choose a model that incorporates our knowledge of the system, and ML requires us to choose a predictive algorithm by relying on its empirical capabilities. Justification for an inference model typically rests on whether we feel it adequately captures the essence of the system. The choice of pattern-learning algorithms often depends on measures of past performance in similar scenarios.

Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of Significance: Statistics vs machine learning. Nature Methods 15:233–234.

Background reading

Bzdok, D., Krzywinski, M. & Altman, N. (2017) Points of Significance: Machine learning: a primer. Nature Methods 14:1119–1120.

Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of Significance: Machine learning: supervised methods. Nature Methods 15:5–6.

...more about the Points of Significance column

Happy 2018 `\pi` Day—Boonies, burbs and boutiques of `\pi`

Wed 14-03-2018

Celebrate `\pi` Day (March 14th) and go to brand new places. Together with Jake Lever, this year we shrink the world and play with road maps.

Streets from across the world are seamlessly stitched together. Finally, a halva shop on the same block!

A great 10 km run loop between Istanbul, Copenhagen, San Francisco and Dublin. Stop off for halva, smørrebrød, espresso and a Guinness on the way.

Intriguing and personal patterns of urban development for each city appear in the Boonies, Burbs and Boutiques series.

In the Boonies, Burbs and Boutiques of `\pi` we draw progressively denser patches using the digit sequence 159 to inform density.

No color—just lines. Lines from Marrakesh, Prague, Istanbul, Nice and other destinations for the mind and the heart.

Roads from cities rearranged according to the digits of `\pi`.

The art is featured in the Pi City article on the Scientific American SA Visual blog.

Check out art from previous years: 2013 `\pi` Day, 2014 `\pi` Day, 2015 `\pi` Day, 2016 `\pi` Day and 2017 `\pi` Day.

Machine learning: supervised methods (SVM & kNN)

Thu 18-01-2018
Supervised learning algorithms extract general principles from observed examples guided by a specific prediction objective.

We examine two very common supervised machine learning methods: linear support vector machines (SVM) and k-nearest neighbors (kNN).

SVM is often less computationally demanding than kNN and is easier to interpret, but it can identify only a limited set of patterns. On the other hand, kNN can find very complex patterns, but its output is more challenging to interpret.

Nature Methods Points of Significance column: Machine learning: supervised methods (SVM & kNN).

We illustrate SVM using a data set in which points fall into two categories, which are separated in SVM by a straight line "margin". SVM can be tuned using a parameter that influences the width and location of the margin, permitting points to fall within the margin or on the wrong side of the margin. We then show how kNN relaxes explicit boundary definitions, such as the straight line in SVM, and how kNN too can be tuned to create more robust classification.

Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of Significance: Machine learning: supervised methods. Nature Methods 15:5–6.

Background reading

Bzdok, D., Krzywinski, M. & Altman, N. (2017) Points of Significance: Machine learning: a primer. Nature Methods 14:1119–1120.

...more about the Points of Significance column

Human Versus Machine

Tue 16-01-2018
Balancing subjective design with objective optimization.

In a Nature graphics blog article, I present my process behind designing the stark black-and-white Nature 10 cover.

Nature 10, 18 December 2017

Machine learning: a primer

Thu 18-01-2018
Machine learning extracts patterns from data without explicit instructions.

In this primer, we focus on essential ML principles—a modeling strategy to let the data speak for themselves, to the extent possible.

The benefits of ML arise from its use of a large number of tuning parameters or weights, which control the algorithm’s complexity and are estimated from the data using numerical optimization. Often ML algorithms are motivated by heuristics such as models of interacting neurons or natural evolution—even if the underlying mechanism of the biological system being studied is substantially different. The utility of ML algorithms is typically assessed empirically by how well extracted patterns generalize to new observations.

Nature Methods Points of Significance column: Machine learning: a primer.

We present a data scenario in which we fit a model with 5 predictors using polynomials and show what to expect from ML when noise and sample size vary. We also demonstrate the consequences of excluding an important predictor or including a spurious one.

Bzdok, D., Krzywinski, M. & Altman, N. (2017) Points of Significance: Machine learning: a primer. Nature Methods 14:1119–1120.

...more about the Points of Significance column