
Like paths? Got your lines twisted in a bunch?
Take a look at my 2014 Pi Day art that folds Pi.

Hilbert Curve Art, Hilbertonians and Monkeys

I collaborated with Scientific American to create a data graphic for the September 2014 issue. The graphic compared the genomes of the Denisovan, bonobo, chimp and gorilla, showing how our own genome is almost identical to the Denisovan's and closer to those of the bonobo and chimp than to the gorilla's.


Here you'll find Hilbert curve art, an introduction to Hilbertonians (the creatures that live on the curve), an explanation of the Scientific American graphic and downloadable SVG/EPS Hilbert curve files.

Hilbert curve

There are wheels within wheels in this village and fires within fires!
— Arthur Miller (The Crucible)

The Hilbert curve is one of many space-filling curves. It is a mapping between one dimension (e.g. a line) and multiple dimensions (e.g. a square, a cube, etc). It's useful because it preserves locality—points that are nearby on the line are usually mapped onto nearby points on the curve.

The Hilbert curve is a line that gives itself a hug.

It's a pretty strange mapping, to be sure. Although a point on a line maps uniquely onto the curve, the reverse is not true. At infinite order the curve intersects itself infinitely many times! This shouldn't be a surprise if you consider that the unit square has the same number of points as the unit line. Now that's the real surprise! So surprising, in fact, that it apparently destabilized the mind of Cantor, who made the initial discovery.
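If you want to play with the mapping yourself, here is a minimal Python sketch of the well-defined direction, from a position on the line to a cell in the square, using the standard bitwise conversion. The name `d2xy` and the convention that `n` is the number of cells per side are my own choices here.

```python
def d2xy(n, d):
    """Map position d (0 .. n*n - 1) along the Hilbert curve to an (x, y) cell
    in an n x n grid, where n is a power of 2 (n = 2**order)."""
    x = y = 0
    s = 1
    t = d
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate/flip the quadrant if needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx                      # move into the correct quadrant
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# consecutive positions on the line land in adjacent cells (the locality property)
print([d2xy(8, d) for d in range(4)])    # (0, 0), (0, 1), (1, 1), (1, 0)
```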

Brian Hayes has a great introduction (Crinkly Curves) to the Hilbert curve at American Scientist.

If manipulated so that its ends are adjacent, the Hilbert curve becomes the Moore curve.

constructing the hilbert curve

The order 1 curve is generated by dividing a square into quadrants and connecting the centers of the quadrants with three lines. Which three connections are made is arbitrary—different choices result in rotations of the curve.
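Higher orders are built from four rotated or reflected copies of the previous order. Here is a sketch of how that construction could be coded; `hilbert_points` is my own helper, and the particular reflections below are one of the arbitrary choices mentioned above, not necessarily the one used in my figures.

```python
def hilbert_points(order):
    """Centers of the 4**order cells visited by the order-`order` curve,
    in the order the curve visits them (cell coordinates, not pixels)."""
    if order == 1:
        # one arbitrary choice of the three connections: up, right, down
        return [(0, 0), (0, 1), (1, 1), (1, 0)]
    prev = hilbert_points(order - 1)     # four copies of the previous order ...
    s = 2 ** (order - 1)                 # ... each spanning an s x s quadrant
    return ([(y, x) for x, y in prev] +                     # lower-left, transposed
            [(x, y + s) for x, y in prev] +                  # upper-left, shifted up
            [(x + s, y + s) for x, y in prev] +              # upper-right, shifted up and right
            [(2 * s - 1 - y, s - 1 - x) for x, y in prev])   # lower-right, anti-transposed

print(hilbert_points(2)[:5])   # (0, 0), (1, 0), (1, 1), (0, 1), (0, 2) ...
```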

First 8 orders of the space-filling Hilbert curve. Each square is 144 x 144 pixels. (zoom)

The order 6 curve is the highest order whose structure can be discerned at this figure resolution. Though just barely. The length of this curve is about 64 times the width of the square, so about 9,216 pixels! That's tight packing.

By order 7 the structure can no longer be discerned in the 620-pixel-wide image (each square is 144 px wide). By order 8 the curve has 65,536 points, which exceeds the number of pixels in its square in the figure. A square of 256 x 256 pixels would be required to show all the points without downsampling.

Two order 10 curves have 1,048,576 points each and would approximately map onto all the pixels on an average monitor (1920 x 1200 pixels).

A curve of order 33 has `7.38 * 10^19` points and, if drawn as a square of average body height, would have points that are an atom's distance from one another (`10^{-10}` m).
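The arithmetic behind that claim, assuming a body height of 1.7 m:

```python
order   = 33
points  = 4 ** order             # ~7.38e19 points
side    = 1.7                    # metres; assumed "average body height"
cells   = 2 ** order             # cells along one side of the square
spacing = side / cells           # distance between neighbouring points
print(f"{points:.2e} points, {spacing:.1e} m apart")   # 7.38e+19 points, 2.0e-10 m apart
```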

mapping the line onto the square

By mapping the familiar rainbow onto the curve you can see how higher order curves "crinkle" (to borrow Brian's term) around the square.
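One way such a coloring could be produced, as a sketch: assign each point a hue proportional to its position along the curve. The helper `rainbow_colors` is my own and pairs with the `hilbert_points` sketch above.

```python
import colorsys

def rainbow_colors(points):
    """One RGB color per curve point, with hue proportional to the point's
    position along the curve, so the line's rainbow is carried onto the square."""
    n = len(points)
    return [colorsys.hsv_to_rgb(i / (n - 1), 1.0, 1.0) for i in range(n)]

# e.g. pair with the construction sketch above:
# pts = hilbert_points(6); colors = rainbow_colors(pts)
```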

First 8 orders of the space-filling Hilbert curve. Each square is 144 x 144 pixels. (zoom)

properties of the first 24 orders of the Hilbert curve

order  points  segments  length
`n`  `4^n`  `4^n - 1`  `2^n - 2^{-n}`
1 4 3 1.5
2 16 15 3.75
3 64 63 7.875
4 256 255 15.9375
5 1,024 1,023 31.96875
6 4,096 4,095 63.984375
7 16,384 16,383 127.9921875
8 65,536 65,535 255.99609375
9 262,144 262,143 511.998046875
10 1,048,576 1,048,575 1023.9990234375
11 4,194,304 4,194,303 2047.99951171875
12 16,777,216 16,777,215 4095.99975585938
13 67,108,864 67,108,863 8191.99987792969
14 268,435,456 268,435,455 16383.9999389648
15 1,073,741,824 1,073,741,823 32767.9999694824
16 4,294,967,296 4,294,967,295 65535.9999847412
17 17,179,869,184 17,179,869,183 131071.999992371
18 68,719,476,736 68,719,476,735 262143.999996185
19 274,877,906,944 274,877,906,943 524287.999998093
20 1,099,511,627,776 1,099,511,627,775 1048575.99999905
21 4,398,046,511,104 4,398,046,511,103 2097151.99999952
22 17,592,186,044,416 17,592,186,044,415 4194303.99999976
23 70,368,744,177,664 70,368,744,177,663 8388607.99999988
24 281,474,976,710,656 281,474,976,710,655 16777215.9999999
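The table follows directly from the header formulas: each of the `4^n - 1` segments spans one cell width, `2^{-n}` of the square's side, so the total length is `(4^n - 1) * 2^{-n} = 2^n - 2^{-n}`. A few lines of Python reproduce the rows:

```python
for n in range(1, 11):
    points   = 4 ** n
    segments = points - 1              # 4^n - 1
    length   = 2 ** n - 2 ** -n        # (4^n - 1) * 2^-n, in units of the square's width
    print(f"{n:2d}  {points:13,d}  {segments:13,d}  {length}")
```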

You can download the basic curve shapes for orders 1 to 10 and experiment yourself. Both square and circular forms are available.

news + thoughts

Convolutional neural networks

Thu 17-08-2023

Nature uses only the longest threads to weave her patterns, so that each small piece of her fabric reveals the organization of the entire tapestry. – Richard Feynman

Following up on our Neural network primer column, this month we explore a different kind of network architecture: a convolutional network.

The convolutional network replaces the hidden layer of a fully connected network (FCN) with one or more filters (a kind of neuron that looks at the input within a narrow window).

Nature Methods Points of Significance column: Convolutional neural networks. (read)

Even though convolutional networks have far fewer neurons than an FCN, they can perform substantially better for certain kinds of problems, such as sequence motif detection.
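As a toy illustration of why a filter suits motif detection (my own example, not code from the column): a small window of weights slides along a one-hot-encoded sequence and responds most strongly where its motif appears.

```python
import numpy as np

BASES = "ACGT"

def one_hot(seq):
    """Encode a DNA string as a length x 4 matrix of 0s and 1s."""
    return np.array([[b == x for x in BASES] for b in seq], dtype=float)

seq    = one_hot("TTGATACG")    # input sequence, 8 positions x 4 channels
motif  = one_hot("GATA")        # filter weights: a window of 4 positions
scores = [float((seq[i:i + 4] * motif).sum()) for i in range(len(seq) - 3)]
print(scores)                   # peaks (4.0) where GATA occurs, at position 2
```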

Derry, A., Krzywinski, M. & Altman, N. (2023) Points of significance: Convolutional neural networks. Nature Methods 20:.

Background reading

Derry, A., Krzywinski, M. & Altman, N. (2023) Points of significance: Neural network primer. Nature Methods 20:165–167.

Lever, J., Krzywinski, M. & Altman, N. (2016) Points of significance: Logistic regression. Nature Methods 13:541–542.

Neural network primer

Tue 10-01-2023

Nature is often hidden, sometimes overcome, seldom extinguished. —Francis Bacon

In the first of a series of columns about neural networks, we introduce them with an intuitive approach that draws from our discussion about logistic regression.

Nature Methods Points of Significance column: Neural network primer. (read)

Simple neural networks are just a chain of linear regressions. And, although neural network models can get very complicated, their essence can be understood in terms of relatively basic principles.

We show how neural network components (neurons) can be arranged in the network and discuss the idea of hidden layers. Using a simple data set, we show how even a 3-neuron neural network can already model relatively complicated data patterns.
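As a sketch of that idea (my own toy weights, not the column's data set): a 3-neuron network, two hidden neurons feeding one output, can model XOR, a pattern that no single linear or logistic regression can capture.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def tiny_net(x):
    """Two hidden neurons plus one output neuron; weights chosen by hand to model XOR."""
    h1 = sigmoid(20 * x[0] + 20 * x[1] - 10)   # roughly OR of the inputs
    h2 = sigmoid(20 * x[0] + 20 * x[1] - 30)   # roughly AND of the inputs
    return sigmoid(20 * h1 - 20 * h2 - 10)     # OR and not AND, i.e. XOR

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, round(float(tiny_net(x)), 3))     # ~0, ~1, ~1, ~0
```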

Derry, A., Krzywinski, M. & Altman, N. (2023) Points of significance: Neural network primer. Nature Methods 20:165–167.

Background reading

Lever, J., Krzywinski, M. & Altman, N. (2016) Points of significance: Logistic regression. Nature Methods 13:541–542.

Cell Genomics cover

Mon 16-01-2023

Our cover on the 11 January 2023 Cell Genomics issue depicts the process of determining parent-of-origin using differential methylation of alleles at imprinted regions (iDMRs), imagined as a circuit.

Designed in collaboration with Carlos Urzua.

Our Cell Genomics cover depicts parent-of-origin assignment as a circuit (volume 3, issue 1, 11 January 2023). (more)

Akbari, V. et al. Parent-of-origin detection and chromosome-scale haplotyping using long-read DNA methylation sequencing and Strand-seq (2023) Cell Genomics 3(1).

Browse my gallery of cover designs.

A catalogue of my journal and magazine cover designs. (more)

Science Advances cover

Thu 05-01-2023

My cover design on the 6 January 2023 Science Advances issue depicts DNA sequencing read translation in high-dimensional space. The image shows how 672 bases of sequencing barcodes, generated by three different single-cell RNA sequencing platforms, were encoded as oriented triangles on the faces of three 7-dimensional cubes.

More details about the design.

My Science Advances cover that encodes sequence onto hypercubes (volume 9, issue 1, 6 January 2023). (more)

Kijima, Y. et al. A universal sequencing read interpreter (2023) Science Advances 9.

Browse my gallery of cover designs.

A catalogue of my journal and magazine cover designs. (more)

Regression modeling of time-to-event data with censoring

Thu 17-08-2023

If you sit on the sofa for your entire life, you’re running a higher risk of getting heart disease and cancer. —Alex Honnold, American rock climber

In a follow-up to our Survival analysis — time-to-event data and censoring article, we look at how regression can be used to account for additional risk factors in survival analysis.

We explore accelerated failure time regression (AFTR) and the Cox Proportional Hazards model (Cox PH).

Nature Methods Points of Significance column: Regression modeling of time-to-event data with censoring. (read)

Dey, T., Lipsitz, S.R., Cooper, Z., Trinh, Q., Krzywinski, M. & Altman, N. (2022) Points of significance: Regression modeling of time-to-event data with censoring. Nature Methods 19:1513–1515.

Music video for Max Cooper's Ascent

Tue 25-10-2022

My 5-dimensional animation sets the visual stage for Max Cooper's Ascent from the album Unspoken Words. I have previously collaborated with Max on telling a story about infinity for his Yearning for the Infinite album.

I provide a walkthrough of the video, describe the animation system I created to generate the frames, and show you all the keyframes.

Frame 4897 from the music video for Max Cooper's Ascent.

The video recently premiered on YouTube.

Renders of the full scene are available as NFTs.

