
Typography geek? If you like the geometry and mathematics of these posters, you may enjoy something more lettered. Visions of type: Type Peep Show: The Private Curves of Letters posters.

The art of Pi ($\pi$), Phi ($\phi$) and $e$

This section contains various artwork based on $\pi$, $\phi$ and $e$ that I created over the years. $\pi$ Day and $\pi$ Approximation Day artwork is kept separate.

The accidental similarity number (ASN) is a kind of overlap between numbers. I came up with this concept after creating typographical art about the $\pi$-ness of $\pi$.

The poster shows the accidental similarity number for $\pi$, $\phi$ and $e$.

The accidental similarity number for $\pi$, $\phi$ and $e$, created from the first 1,000,000 digits of each number.
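The poster's exact construction isn't spelled out above, but here is a minimal sketch of one reading of the ASN: align the digit strings of the three numbers and keep the positions where all of them agree. The poster uses 1,000,000 digits of each number; for brevity the sketch computes 1,000 with mpmath.

```python
# One reading of the accidental similarity number (an assumption, not
# necessarily the poster's construction): keep the digits at positions
# where pi, phi and e all agree.
from mpmath import mp

mp.dps = 1005                        # working precision, with guard digits

def first_digits(x, n=1000):
    """First n digits of x as a string, decimal point removed."""
    return mp.nstr(x, n + 1).replace(".", "")[:n]

pi, phi, e = (first_digits(c) for c in (mp.pi, mp.phi, mp.e))

# A position contributes a digit only when all three expansions match.
asn = "".join(a for a, b, c in zip(pi, phi, e) if a == b == c)
print(f"{len(asn)} matching digits; ASN starts {asn[:20]}")
```

With three independent digit streams, any given position matches with probability about $1/100$, so roughly 10 of the first 1,000 positions contribute.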


Model Selection and Overfitting

Tue 13-09-2016

With four parameters I can fit an elephant and with five I can make him wiggle his trunk. —John von Neumann.

By increasing the complexity of a model, it is easy to make it fit the data perfectly. Does this mean that the model is perfectly suitable? No.

When a model has a relatively large number of parameters, it is likely to be influenced by the noise in the data, which varies across observations, as much as any underlying trend, which remains the same. Such a model is overfitted—it matches training data well but does not generalize to new observations.

Nature Methods Points of Significance column: Model Selection and Overfitting (read)

We discuss training, validation and testing data sets, and show how they can be used, with methods such as cross-validation, to avoid overfitting.
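As a quick illustration (my own sketch, not the column's code), here is how cross-validation exposes overfitting: training error keeps falling as a polynomial model gains parameters, while the cross-validated error bottoms out and then rises.

```python
# Model selection by k-fold cross-validation: pick the polynomial
# degree with the lowest cross-validated error, not the lowest
# training error.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 40)[:, None]
y = np.sin(2 * np.pi * x).ravel() + rng.normal(0, 0.3, 40)   # noisy truth

for degree in range(1, 10):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    # Training error always decreases with degree...
    train_mse = np.mean((model.fit(x, y).predict(x) - y) ** 2)
    # ...but 5-fold cross-validated error rises once the model overfits.
    cv_mse = -cross_val_score(model, x, y, cv=5,
                              scoring="neg_mean_squared_error").mean()
    print(f"degree {degree}: train MSE {train_mse:.3f}, CV MSE {cv_mse:.3f}")
```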

Altman, N. & Krzywinski, M. (2016) Points of Significance: Model Selection and Overfitting. Nature Methods 13:703-704.

Lever, J., Krzywinski, M. & Altman, N. (2016) Points of Significance: Classifier evaluation. Nature Methods 13:603-604.

Lever, J., Krzywinski, M. & Altman, N. (2016) Points of Significance: Logistic regression. Nature Methods 13:541-542.

Classifier Evaluation

Tue 13-09-2016

It is important to understand both what a classification metric expresses and what it hides.

We examine various metrics used to assess the performance of a classifier. We show that a single metric is insufficient to capture performance: for any metric, a variety of different scenarios yield the same value.

Nature Methods Points of Significance column: Classifier Evaluation (read)

We also discuss ROC curves and the area under the curve (AUC), and how their interpretation changes based on class balance.
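To make the "same value, different scenarios" point concrete, here is a small sketch with toy numbers (mine, not the column's): two confusion matrices with identical accuracy but very different precision and recall.

```python
# A single metric hides performance: two classifiers with the same
# accuracy can behave very differently on the positive class.
def metrics(tp, fp, fn, tn):
    total = tp + fp + fn + tn
    accuracy  = (tp + tn) / total
    precision = tp / (tp + fp) if tp + fp else float("nan")
    recall    = tp / (tp + fn) if tp + fn else float("nan")
    return accuracy, precision, recall

# Scenario A: balanced errors.  Scenario B: misses most positives.
for name, cm in [("A", (40, 10, 10, 40)), ("B", (10, 0, 20, 70))]:
    acc, prec, rec = metrics(*cm)
    print(f"{name}: accuracy={acc:.2f} precision={prec:.2f} recall={rec:.2f}")
```

Both scenarios report 80% accuracy, yet B finds only a third of the positives.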

Lever, J., Krzywinski, M. & Altman, N. (2016) Points of Significance: Classifier evaluation. Nature Methods 13:603-604.

Lever, J., Krzywinski, M. & Altman, N. (2016) Points of Significance: Logistic regression. Nature Methods 13:541-542.

Happy 2016 $\pi$ Approximation Day, roughly speaking

Sun 24-07-2016

Today is the day, and it's hardly an approximation. In fact, $22/7$ is a 20% more accurate representation of $\pi$ than $3.14$!

Time to celebrate, graphically. This year I do so with a perfect packing of circles that embodies the approximation.

By warping a circle by 8% along one axis, we can create a shape whose ratio of circumference to diameter, taken as twice the average radius, is exactly $22/7$.
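You can check that warp factor numerically. A minimal sketch, using Ramanujan's perimeter approximation and bisection (my choice of method, not necessarily how the poster was made):

```python
# Find the axis ratio of an ellipse whose circumference, divided by
# twice its average radius (a + b), equals 22/7.
import math

def circumference(a, b):
    # Ramanujan's second approximation to the ellipse perimeter.
    h = ((a - b) / (a + b)) ** 2
    return math.pi * (a + b) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))

# Bisect on b with a = 1: the ratio grows as the ellipse gets flatter.
lo, hi = 0.5, 1.0
for _ in range(60):
    b = (lo + hi) / 2
    if circumference(1, b) / (1 + b) > 22 / 7:
        lo = b          # too warped, relax toward a circle
    else:
        hi = b
print(f"b = {b:.4f}, warp = {(1 - b) * 100:.1f}%")   # roughly 8%
```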

If you prefer something more accurate, check out art from previous $\pi$ days: 2013 $\pi$ Day, 2014 $\pi$ Day, 2015 $\pi$ Day and 2016 $\pi$ Day.

Logistic Regression

Tue 13-09-2016

Regression can be used on categorical responses to estimate probabilities and to classify.

The next column in our series on regression deals with how regression can be used to classify categorical responses.

We show how linear regression can be used for classification and demonstrate that it can be unreliable in the presence of outliers. Using logistic regression, which fits a linear model to the log odds, improves robustness.
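Here is a toy sketch of that robustness claim (my own data, assuming scikit-learn): thresholding a least-squares line at 0.5 versus logistic regression, before and after adding a single extreme observation.

```python
# Compare the decision boundary of thresholded linear regression with
# that of logistic regression when an outlier is added.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

# Two slightly overlapping classes along one feature.
x = np.r_[np.linspace(0, 1.6, 10), np.linspace(1.4, 3, 10)][:, None]
y = np.r_[np.zeros(10), np.ones(10)]

def boundaries(x, y):
    lin = LinearRegression().fit(x, y)
    b_lin = (0.5 - lin.intercept_) / lin.coef_[0]    # fitted line crosses 0.5
    log = LogisticRegression().fit(x, y)
    b_log = -log.intercept_[0] / log.coef_[0, 0]     # fitted probability is 0.5
    return b_lin, b_log

print("no outlier:   linear %.2f  logistic %.2f" % boundaries(x, y))

# One extreme class-1 observation drags the linear boundary into the
# class-1 cluster; the logistic boundary barely moves.
x2, y2 = np.vstack([x, [[30.0]]]), np.r_[y, 1]
print("with outlier: linear %.2f  logistic %.2f" % boundaries(x2, y2))
```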

Nature Methods Points of Significance column: Logistic regression (read)

Logistic regression is solved numerically and, in most cases, the maximum-likelihood estimates are unique and optimal. However, when the classes are perfectly separable, the numerical approach fails: the likelihood can always be increased by making the coefficients larger, so no finite maximum-likelihood estimate exists.
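A minimal numerical sketch of the separable case (toy data, not the column's): the log-likelihood keeps improving as the slope grows, so the fit never settles on finite coefficients.

```python
# Perfect separation: x < 0 is class 0, x > 0 is class 1, so steeper
# and steeper logistic curves always fit better.
import numpy as np

x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([0, 0, 1, 1])

def log_likelihood(beta):
    p = 1 / (1 + np.exp(-beta * x))          # logistic model, no intercept
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

for beta in [1, 5, 10, 50]:
    print(f"beta={beta:3d}  log-likelihood={log_likelihood(beta):.6f}")
# The log-likelihood approaches 0 monotonically; there is no finite optimum.
```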

Lever, J., Krzywinski, M. & Altman, N. (2016) Points of Significance: Logistic regression. Nature Methods 13:541-542.