

null
from an undefined
place,
undefined
create (a place)
an account
of us
— Viorica Hrincu

Sometimes when you stare at the void, the void sends you a poem.

Universe—Superclusters and Voids

The Universe — Superclusters and Voids. The two supergalactic hemispheres showing Abell clusters (blue), superclusters (magenta) and voids (black) within a distance of 6,000 million light-years from the Milky Way.

The average density of the universe is about $10 \times 10^{-30} \text{ g/cm}^3$, or about 6 protons per cubic meter. This puts into perspective what we mean when we speak of voids as "underdense regions".
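As a quick sanity check of that figure, the conversion from protons per cubic meter to g/cm³ is a one-liner (a sketch; the proton mass value is CODATA's, not given in the text):

```python
# Convert ~6 protons per cubic meter to g/cm^3
m_p = 1.6726e-24              # proton mass in grams (CODATA value)
protons_per_m3 = 6
cm3_per_m3 = 1e6              # 1 m^3 = 10^6 cm^3
density = protons_per_m3 * m_p / cm3_per_m3
print("{:.1e} g/cm^3".format(density))   # ~1.0e-29 g/cm^3
```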

expressing distances in the universe

All distances on the poster are expressed in terms of the light travel distance.

light-travel and comoving distance

Distances in the universe can be expressed as either the light-travel distance or the comoving distance to the object. The first tells us how long light took to travel from the object to us.

For example, the furthest object observed is the galaxy GN-Z11, whose light-travel distance is about 13.3 billion light-years (Gly).

But because space has expanded while the light from GN-Z11 was travelling to us, the galaxy is now actually much further away. This is captured by its comoving distance, which accounts for the expansion of space and is 32.2 Gly for GN-Z11.

The redshift, $z$, is commonly used to specify distance, since it's a quantity that can be observed. For GN-Z11, $z = 11.09$.

calculating distances

To calculate these distances, the redshift $z$ is used along with a few cosmological parameters.

The Hubble parameter, $H(z)$, is the function used for these calculations. It can be derived from the Friedmann equation. $$H(z) = H_0 \sqrt { \Omega_r({1+z})^4 + \Omega_m({1+z})^3 + \Omega_k({1+z})^2 + \Omega_\Lambda }$$

The values of the parameters in $H(z)$ are being continually refined and the values of some depend on various assumptions. I use the Hubble constant $H_0 = 69.6 \text{ km/s/Mpc}$, mass density of relativistic particles $\Omega_r = 8.6 \times 10^{-5}$, mass density $\Omega_m = 0.286$, curvature $\Omega_k = 0$ and dark energy fraction $\Omega_\Lambda = 1 - \Omega_r - \Omega_m - \Omega_k = 0.713914$.

Bennett, C.L. et al. (2014) The 1% Concordance Hubble Constant. Astrophysical Journal 794:135.
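To make the formula concrete, the short sketch below evaluates $H(z)$ with the parameter values above (the function name `H` is my own, not from the full script):

```python
import math

H0 = 69.6              # Hubble constant, km/s/Mpc
Wr = 8.6e-5            # mass density of relativistic particles
Wm = 0.286             # mass density
Wk = 0                 # curvature
WV = 1 - Wr - Wm - Wk  # dark energy fraction

def H(z):
    """Hubble parameter H(z) in km/s/Mpc from the Friedmann equation."""
    return H0 * math.sqrt(Wr*(1+z)**4 + Wm*(1+z)**3 + Wk*(1+z)**2 + WV)

print(H(0))      # today, H(0) = H0 = 69.6 km/s/Mpc
print(H(11.09))  # ~1569 km/s/Mpc at GN-Z11's redshift, about 22.5x today's value
```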

Now, given a redshift $z$ and writing $E(z) \equiv H(z)$, the light-travel distance is $$d_T(z) = c \int_0^z \frac{dx}{({1+x})E(x)}$$

The age of the universe can be computed from this expression: the edge of the universe has an infinite redshift, so we can calculate its light-travel distance as $\lim_{z \rightarrow \infty} d_T(z)$.

The comoving distance to the object with redshift $z$ is $$d_C(z) = c \int_0^z \frac{dx}{E(x)}$$
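Both integrals are easy to evaluate numerically. The midpoint-rule sketch below is my own (not the poster's script, though it uses the same parameter values, with the integrand written as the dimensionless $E(z) = H(z)/H_0$ and a factor $c/H_0$ applied afterwards); for GN-Z11 at $z = 11.09$ it gives a light-travel distance of about 13.3 Gly and a comoving distance of about 32.2 Gly:

```python
import math

H0, Wr, Wm, Wk = 69.6, 8.59798189985467e-05, 0.286, 0
WV = 1 - Wr - Wm - Wk
c = 299792.458                  # speed of light, km/s
mult = (c/H0) * 3.26156 / 1e3   # converts units of c/H0 (Mpc) to Gly

def E(z):                       # dimensionless E(z) = H(z)/H0
    return math.sqrt(Wr*(1+z)**4 + Wm*(1+z)**3 + Wk*(1+z)**2 + WV)

def d_T(z, n=100000):           # light-travel distance, Gly (midpoint rule)
    return mult * sum(z/n / ((1+x)*E(x)) for x in ((i+0.5)*z/n for i in range(n)))

def d_C(z, n=100000):           # comoving distance, Gly (midpoint rule)
    return mult * sum(z/n / E(x) for x in ((i+0.5)*z/n for i in range(n)))

print(round(d_T(11.09), 1))     # 13.3
print(round(d_C(11.09), 1))     # 32.2
```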

It's convenient to express the above integrals by making a variable substitution. Using the scale factor $a = 1/(1+z)$, so that $E(a) = aH(z)$, $$E(a) = H_0 \sqrt { \frac{\Omega_r}{a^2} + \frac{\Omega_m}{a} + \Omega_k + a^2\Omega_\Lambda }$$

The light-travel distance is $$D_T(z) = c \int_a^1 \frac{dx}{E(x)}$$

The comoving distance is $$D_C(z) = c \int_a^1 \frac{dx}{xE(x)}$$

The light-travel distance to the edge of the universe is $$D_{T_U}(z) = c \int_0^1 \frac{dx}{E(x)}$$

and the light-travel distance from the edge of the universe to the object as we're observing it now is $$D_{T_0}(z) = c \int_0^a \frac{dx}{E(x)}$$

which, divided by $c$, can be interpreted as the age of the universe at the moment the object emitted the light that we're seeing now.

The proper size of the universe is the comoving distance to its edge, $$D_{C_U}(z) = c \int_0^1 \frac{dx}{xE(x)}$$
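As a sanity check on the substitution, the $z$-form and $a$-form of the comoving-distance integral should give the same number. A quick numerical comparison (my own sketch; `E_z` and `E_a` are the dimensionless versions of the two expressions for $E$):

```python
import math

H0, Wr, Wm, Wk = 69.6, 8.59798189985467e-05, 0.286, 0
WV = 1 - Wr - Wm - Wk

def E_z(z):   # H(z)/H0
    return math.sqrt(Wr*(1+z)**4 + Wm*(1+z)**3 + Wk*(1+z)**2 + WV)

def E_a(a):   # a*H(z)/H0, with a = 1/(1+z)
    return math.sqrt(Wr/a**2 + Wm/a + Wk + WV*a**2)

z, n = 1.0, 200000
a = 1/(1+z)
# comoving distance in units of c/H0, midpoint rule, both forms
dc_z = sum(z/n / E_z((i+0.5)*z/n) for i in range(n))
dc_a = sum((1-a)/n / (x*E_a(x)) for x in (a + (1-a)*(i+0.5)/n for i in range(n)))
print(abs(dc_z - dc_a) < 1e-6)  # True
```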

distance calculator

### Cosmological distance calculator
### Martin Krzywinski, 2018
#
# The full script supports command-line parameters
# http://mkweb.bcgsc.ca/universe-voids-and-superclusters/cosmology_distance.py

import math

z  = 1                    # redshift
a  = 1/(1+z)              # scale factor
Wm = 0.286                # mass density
Wr = 8.59798189985467e-05 # mass density of relativistic particles
Wk = 0                    # curvature
WV = 1 - Wm - Wr - Wk     # dark energy fraction
n  = 10000                # integration steps

# dimensionless Hubble function in the a-form, Ea(a) = a*H/H0
def Ea(a,Wr,Wm,Wk,WV):
    return math.sqrt(Wr/a**2 + Wm/a + Wk + WV*a**2)

H0 = 69.6                 # Hubble constant, km/s/Mpc
c  = 299792.458           # speed of light, km/s
pc = 3.26156              # parsec to light-year conversion
mult = (c/H0)*pc/1e3      # integrals are in units of c/H0; converts to Gy or Gly

sum_comoving = 0
sum_light    = 0
sum_univage  = 0
sum_univsize = 0
for i in range(n):
    f  = (i+0.5)/n
    x  = a + (1-a)*f      # a .. 1
    xx = f                # 0 .. 1
    ex  = Ea(x, Wr,Wm,Wk,WV)
    exx = Ea(xx,Wr,Wm,Wk,WV)
    sum_comoving += (1-a)/(x*ex)
    sum_light    += (1-a)/ex
    sum_univsize += 1/(xx*exx)
    sum_univage  += 1/exx

results = [mult*i for i in [sum_univage, sum_univsize, sum_univage-sum_light,
                            sum_light, sum_comoving]]
print("z {:.2f} U {:f} Gy {:f} Gly T0 {:f} Gy T {:f} Gly C {:f} Gly".format(z,*results))

Use the script to generate distances for a given redshift, $z$. For example,

# For galaxy GN-Z11, furthest object ever observed
./cosmology_distance.py -z 11.09
z 11.09 U 13.720 Gy 46.441 Gly T0 0.414 Gy T 13.306 Gly C 32.216 Gly

The galaxy GN-Z11 has a light-travel distance of 13.3 Gly and a comoving distance of 32.2 Gly. We're seeing it now as it was only 0.4 Gy after the beginning of the universe, which is now 13.7 Gy old; the comoving distance to the edge of the universe is 46.4 Gly.

# For quasar J1342+0928, furthest quasar ever observed
./cosmology_distance.py -z 7.54
z 7.54 U 13.720 Gy 46.441 Gly T0 0.699 Gy T 13.021 Gly C 29.355 Gly

The values for U (the age and size of the universe) will always be the same for a given set of cosmological parameters, regardless of the value of $z$. I include them in the output of the script for convenience.

These values match those generated by Ned Wright's online cosmology calculator for a flat universe.


Predicting with confidence and tolerance

Wed 07-11-2018
I abhor averages. I like the individual case. —J.D. Brandeis.

We focus on the important distinction between confidence intervals, typically used to express the uncertainty of a sampling statistic such as the mean, and prediction and tolerance intervals, which are used to make statements about the next value to be drawn from the population.

Confidence intervals provide coverage of a single point—the population mean—with the assurance that the probability of non-coverage is some acceptable value (e.g. 0.05). On the other hand, prediction and tolerance intervals both give information about typical values from the population and the percentage of the population expected to be in the interval. For example, a tolerance interval can be configured to tell us what fraction of sampled values (e.g. 95%) will fall into an interval some fraction of the time (e.g. 95%).
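As a rough numerical illustration of the difference (my own sketch using a large-sample normal approximation with the 1.96 critical value, not the column's exact method): an interval for the mean shrinks as $\sqrt{n}$, while an interval meant to cover 95% of individual values does not shrink at all.

```python
import math, random

random.seed(1)
n = 1000
data = [random.gauss(10, 2) for _ in range(n)]
mean = sum(data)/n
sd = math.sqrt(sum((x - mean)**2 for x in data)/(n - 1))

# ~95% confidence interval for the population mean
ci = (mean - 1.96*sd/math.sqrt(n), mean + 1.96*sd/math.sqrt(n))
# ~95% interval for a single new observation (prediction-style interval)
pi = (mean - 1.96*sd, mean + 1.96*sd)

print("CI width {:.2f}, individual-value interval width {:.2f}"
      .format(ci[1]-ci[0], pi[1]-pi[0]))
```

Here the second interval is wider by a factor of $\sqrt{n}$; a proper prediction interval additionally includes a $\sqrt{1+1/n}$ factor, and a tolerance interval further controls the probability that the stated coverage is achieved.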

Nature Methods Points of Significance column: Predicting with confidence and tolerance. (read)

Altman, N. & Krzywinski, M. (2018) Points of significance: Predicting with confidence and tolerance. Nature Methods 15:843–844.

Krzywinski, M. & Altman, N. (2013) Points of significance: Importance of being uncertain. Nature Methods 10:809–810.

4-day Circos course

Wed 31-10-2018

A 4-day introductory course on genome data parsing and visualization using Circos. Prepared for the Bioinformatics and Genome Analysis course at Institut Pasteur Tunis, Tunis, Tunisia.

Composite of the kinds of images you will learn to make in this course.

Oryza longistaminata genome cake

Mon 24-09-2018

Data visualization should be informative and, where possible, tasty.

Stefan Reuscher from Bioscience and Biotechnology Center at Nagoya University celebrates a publication with a Circos cake.

The cake shows an overview of the de-novo assembled genome of the wild rice species Oryza longistaminata.

Circos cake celebrating Reuscher et al. 2018 publication of the Oryza longistaminata genome.

Optimal experimental design

Tue 31-07-2018
Customize the experiment for the setting instead of adjusting the setting to fit a classical design.

The presence of constraints in experiments, such as sample size restrictions, awkward blocking or disallowed treatment combinations, may make using classical designs very difficult or impossible.

Optimal design is a powerful, general purpose alternative for high quality, statistically grounded designs under nonstandard conditions.

Nature Methods Points of Significance column: Optimal experimental design. (read)

We discuss two types of optimal designs (D-optimal and I-optimal) and show how they can be applied to a scenario with sample size and blocking constraints.

Smucker, B., Krzywinski, M. & Altman, N. (2018) Points of significance: Optimal experimental design. Nature Methods 15:599–600.

Krzywinski, M., Altman, N. (2014) Points of significance: Two factor designs. Nature Methods 11:1187–1188.

Krzywinski, M. & Altman, N. (2014) Points of significance: Analysis of variance (ANOVA) and blocking. Nature Methods 11:699–700.

Krzywinski, M. & Altman, N. (2014) Points of significance: Designing comparative experiments. Nature Methods 11:597–598.

The Whole Earth Cataloguer

Mon 30-07-2018
All the living things.

An illustration of the Tree of Life, showing some of the key branches.

The tree is drawn as a DNA double helix, with bases colored to encode ribosomal RNA genes from various organisms on the tree.

The circle of life. (read, zoom)

All living things on earth descended from a single organism called LUCA (last universal common ancestor) and inherited LUCA’s genetic code for basic biological functions, such as translating DNA and creating proteins. Constant genetic mutations shuffled and altered this inheritance and added new genetic material—a process that created the diversity of life we see today. The “tree of life” organizes all organisms based on the extent of shuffling and alteration between them. The full tree has millions of branches and every living organism has its own place at one of the leaves in the tree. The simplified tree shown here depicts all three kingdoms of life: bacteria, archaebacteria and eukaryota. For some organisms a grey bar shows when they first appeared in the tree in millions of years (Ma). The double helix winding around the tree encodes highly conserved ribosomal RNA genes from various organisms.

Johnson, H.L. (2018) The Whole Earth Cataloguer, Sactown, Jun/Jul, p. 89

Why we can't give up this odd way of typing

Mon 30-07-2018
All fingers report to home row.

An article about keyboard layouts and the history and persistence of QWERTY.

My Carpalx keyboard optimization software is mentioned along with my World's Most Difficult Layout: TNWMLC. True typing hell.

TNWMLC requires seriously flexible digits. It’s 87% more difficult than using a standard Qwerty keyboard, according to Martin Krzywinski, who created it (Credit: Ben Nelms). (read)

McDonald, T. (2018) Why we can't give up this odd way of typing, BBC, 25 May 2018.