On March 14th celebrate `\pi` Day. Hug `\pi`—find a way to do it.
Those who favour `\tau=2\pi` will have to postpone celebrations until June 28th. That's what you get for thinking that `\pi` is wrong. I sympathize with this position and have `\tau` day art too!
If you're not into details, you may opt to party on July 22nd, which is `\pi` approximation day (`\pi` ≈ 22/7). It's 20% more accurate than the official `\pi` day!
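For the sceptical, a quick check of that claim in plain Python: the absolute error of 22/7 is roughly 80% of the error of 3.14, i.e. about 20% smaller.

```python
# Quick sanity check of the "20% more accurate" claim: the absolute error of
# 22/7 is about 0.79 times the error of 3.14, i.e. roughly 20% smaller.
import math

print(abs(22 / 7 - math.pi))                        # ~0.00126
print(abs(3.14 - math.pi))                          # ~0.00159
print(abs(22 / 7 - math.pi) / abs(3.14 - math.pi))  # ~0.79
```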
Finally, if you believe that `\pi = 3`, you should read why `\pi` is not equal to 3.
Not a circle in sight in the 2015 `\pi` day art. Try to figure out how up to 612,330 digits are encoded before reading about the method. `\pi`'s irrational friends `\phi` and `e` are there too—golden and natural. Get it?
This year's `\pi` day is particularly special. The first ten digits, 3.141592653, spell out not only the date but also a precise time, if the date is written in the North American month-day-year convention: 3-14-15 9:26:53.
The art has been featured in Ana Swanson's Wonkblog article at the Washington Post—10 Stunning Images Show The Beauty Hidden in `\pi`.
I find this image deeply beautiful and deeply troubling, and I’ll try to explain why. —Max Cooper
The 7-level tree map was used for the Transcendental Tree Map track on Max Cooper's Yearning for the Infinite album. The album is an “audio/visual rendering of our obsession with the unobtainable”.
The video for the track was a collaboration between myself and Nick Cobby. The music contains layered loops whose lengths are based on prime numbers—as the track plays, some loops individually come in and out of phase with others, forming a longer loop. The full set never synchronizes though.
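To get a feel for the phasing, here is a small illustration in Python with hypothetical loop lengths (not the ones used in the track): any pair of prime-length loops realigns only after a number of beats equal to the product of their lengths, and the whole set realigns only after the product of all of them, far longer than the track.

```python
# Toy illustration of prime-length loops drifting in and out of phase.
# The loop lengths below are hypothetical, not the ones used in the track.
# Requires Python 3.9+ for math.lcm.
from math import lcm

loops = [2, 3, 5, 7, 11, 13]   # beats per loop (all prime)

print(lcm(2, 3))               # 6: the 2- and 3-beat loops realign every 6 beats
print(lcm(*loops))             # 30030: the full set realigns only after 30,030 beats
```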
The transcendental tree map encodes the first 20,244 digits of `\pi` = 3.1415...7012.
The video constructs and then chaotically deconstructs a 7-level tree map of the digits of `\pi`. This map is shown below and is similar to other images I made for the 2015 `\pi` day art, except that here the map is formatted for a 16:9 screen.
The video starts with an explicit construction of the map. The process begins by dividing the canvas with 3 vertical lines (the first digit of `\pi`), which forms 4 rectangles. Each of these four rectangles is then divided with 1, 4, 1 and 5 horizontal lines, the next four digits; since a rectangle cut by d lines yields d + 1 smaller rectangles, this forms 2 + 5 + 2 + 6 = 15 rectangles. Each of the 15 rectangles is divided by vertical lines according to the next 15 digits of `\pi`, and the process repeats, alternating between vertical and horizontal divisions, until the loop has been performed 7 times.
The division of each rectangle is not even—the positions of the lines are slightly jittered. This gives the map a more organic feel.
The number of digits encoded in each loop is 1, 4, 15, 98, 548, 2,962 and 16,616. In total, 17,180 vertical and 3,064 horizontal lines are drawn and these form the backbone of the map.
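If you want to play with the subdivision rule yourself, here is a minimal sketch in plain Python (the hard-coded digit list is simply the first 20 digits of `\pi`, enough to reproduce the first four level counts; nothing else about the art is modelled here). A digit d adds d lines and turns one rectangle into d + 1, and every rectangle produced at one level consumes one digit at the next.

```python
# Sketch of the tree map subdivision rule. A digit d divides a rectangle with
# d lines into d + 1 smaller rectangles; each rectangle produced at one level
# consumes one digit of pi at the next level. The hard-coded list holds the
# first 20 digits of pi, enough for the first four levels.
PI_DIGITS = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3,
             5, 8, 9, 7, 9, 3, 2, 3, 8, 4]

def level_counts(digits, levels):
    """Return how many digits are consumed at each tree map level."""
    counts, pos, n = [], 0, 1            # start with a single rectangle
    for _ in range(levels):
        if pos + n > len(digits):        # ran out of digits for this level
            break
        level = digits[pos:pos + n]
        counts.append(n)
        pos += n
        n = sum(d + 1 for d in level)    # rectangles feeding the next level
    return counts

print(level_counts(PI_DIGITS, 7))        # [1, 4, 15, 98]
```

Feeding in more digits of `\pi` continues the sequence with 548, 2,962 and 16,616, for the total of 20,244 digits encoded in the map.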
The video is created by layering numerous animations of the construction of the map, in which the rate and order of line growth are varied. Blinking rectangles indicate that the lines for a digit have been fully drawn.
Nature uses only the longest threads to weave her patterns, so that each small piece of her fabric reveals the organization of the entire tapestry. – Richard Feynman
Following up on our Neural network primer column, this month we explore a different kind of network architecture: a convolutional network.
The convolutional network replaces the hidden layer of a fully connected network (FCN) with one or more filters (a kind of neuron that looks at the input within a narrow window).
Even though convolutional networks have far fewer neurons than an FCN, they can perform substantially better for certain kinds of problems, such as sequence motif detection.
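To make the filter idea concrete, here is a toy sketch in NumPy with made-up weights (the sequence, the motif and the weights are illustrative and not taken from the column): a single filter is slid along a one-hot-encoded DNA sequence, so one small set of weights scans every position for a motif instead of dedicating separate neurons to each position, which is why the network needs far fewer parameters than an FCN.

```python
# Toy 1D convolution for motif detection. The sequence, motif and weights are
# illustrative only; real filters are learned during training.
import numpy as np

def one_hot(seq):
    """One-hot encode a DNA sequence (rows: A, C, G, T)."""
    idx = {"A": 0, "C": 1, "G": 2, "T": 3}
    x = np.zeros((4, len(seq)))
    for j, base in enumerate(seq):
        x[idx[base], j] = 1.0
    return x

x = one_hot("ACGTTACGTACG")   # the input sequence
w = one_hot("ACG")            # a single filter of width 3 that "prefers" ACG

# slide the filter along the sequence: one dot product per window
scores = np.array([np.sum(w * x[:, j:j + 3]) for j in range(x.shape[1] - 2)])
print(scores)                 # peaks of 3.0 mark the positions where ACG occurs
```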
Derry, A., Krzywinski, M. & Altman, N. (2023) Points of significance: Convolutional neural networks. Nature Methods 20.
Derry, A., Krzywinski, M. & Altman, N. (2023) Points of significance: Neural network primer. Nature Methods 20:165–167.
Lever, J., Krzywinski, M. & Altman, N. (2016) Points of significance: Logistic regression. Nature Methods 13:541–542.
Nature is often hidden, sometimes overcome, seldom extinguished. —Francis Bacon
In the first of a series of columns about neural networks, we introduce them with an intuitive approach that draws from our discussion about logistic regression.
At their simplest, neural networks are just chains of logistic regression units. And although neural network models can get very complicated, their essence can be understood in terms of relatively basic principles.
We show how neural network components (neurons) can be arranged in the network and discuss the idea of hidden layers. Using a simple data set, we show how even a 3-neuron neural network can already model relatively complicated data patterns.
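As a minimal sketch of what such a small network can do (NumPy with hand-picked weights; this is an illustration only, not the data set or the fitted weights from the column), two hidden neurons feed one output neuron, each behaving like a logistic regression unit, and together they reproduce the XOR pattern that no single linear unit can capture.

```python
# A 3-neuron network with hand-picked weights: two hidden neurons feed one
# output neuron, each a logistic regression unit. These weights reproduce the
# XOR pattern, which no single linear unit can model.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tiny_net(x1, x2):
    h1 = sigmoid(20 * x1 + 20 * x2 - 10)    # hidden neuron ~ OR(x1, x2)
    h2 = sigmoid(20 * x1 + 20 * x2 - 30)    # hidden neuron ~ AND(x1, x2)
    return sigmoid(20 * h1 - 20 * h2 - 10)  # output ~ OR and not AND = XOR

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, f"{tiny_net(a, b):.3f}")    # ~0, 1, 1, 0: the XOR pattern
```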
Derry, A., Krzywinski, M. & Altman, N. (2023) Points of significance: Neural network primer. Nature Methods 20:165–167.
Lever, J., Krzywinski, M. & Altman, N. (2016) Points of significance: Logistic regression. Nature Methods 13:541–542.
Our cover on the 11 January 2023 Cell Genomics issue depicts the process of determining parent-of-origin using differential methylation of alleles at imprinted regions (iDMRs), imagined as a circuit.
Designed in collaboration with Carlos Urzua.
Akbari, V. et al. Parent-of-origin detection and chromosome-scale haplotyping using long-read DNA methylation sequencing and Strand-seq (2023) Cell Genomics 3(1).
Browse my gallery of cover designs.
My cover design on the 6 January 2023 Science Advances issue depicts DNA sequencing read translation in high-dimensional space. The image shows how 672 bases of sequencing barcodes generated by three different single-cell RNA sequencing platforms are encoded as oriented triangles on the faces of three 7-dimensional cubes.
More details about the design.
Kijima, Y. et al. A universal sequencing read interpreter (2023) Science Advances 9.
Browse my gallery of cover designs.
If you sit on the sofa for your entire life, you’re running a higher risk of getting heart disease and cancer. —Alex Honnold, American rock climber
In a follow-up to our Survival analysis — time-to-event data and censoring article, we look at how regression can be used to account for additional risk factors in survival analysis.
We explore accelerated failure time regression (AFTR) and the Cox Proportional Hazards model (Cox PH).
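If you want to try this on your own data, here is a brief sketch of fitting a Cox proportional hazards model, assuming the third-party lifelines package and its bundled Rossi recidivism data set (neither is part of the column itself).

```python
# Sketch of a Cox proportional hazards fit with the lifelines package and its
# bundled Rossi recidivism data set (both assumptions, not from the column).
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()                               # time-to-event data with censoring
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
cph.print_summary()                             # hazard ratios for each covariate
```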
Dey, T., Lipsitz, S.R., Cooper, Z., Trinh, Q., Krzywinski, M. & Altman, N. (2022) Points of significance: Regression modeling of time-to-event data with censoring. Nature Methods 19:1513–1515.
My 5-dimensional animation sets the visual stage for Max Cooper's Ascent from the album Unspoken Words. I have previously collaborated with Max on telling a story about infinity for his Yearning for the Infinite album.
I provide a walkthrough of the video, describe the animation system I created to generate the frames, and show you all the keyframes.
The video recently premiered on YouTube.
Renders of the full scene are available as NFTs.