

# Snellen Optotype Font with Upper and Lowercase characters

In the process of designing my Snellen Eye Chart typographical posters, I came across the Snellen font by Andrew Howlett. I wasn't happy with all the letters, so I made attempts at giving the font an update.

Not being a font designer, I will likely get myself into trouble.

## snellen chart posters

While making my Snellen chart series, I entered the rabbit hole of optotype fonts ... and I can't get out!

A technically accurate Snellen chart using the four genetic bases A, T, G and C rendered as optotypes. The chart begins with the start codon ATG and ends with the stop codon TGA, which appears only once in the chart. Print at 16 in × 24 in. (BUY ARTWORK)
A technically accurate Snellen chart using the nautical flag alphabet rendered as optotypes. Print at 16 in × 24 in. (BUY ARTWORK)

The charts don't necessarily use the latest version of my Snellen font design, which fluctuates as my mood about some of the letters changes.

## optotype fonts

The optotype requirement is that letters be designed on a 5 × 5 grid, and have constant stroke width. This means that both lower and upper case letters need to share the grid and stroke. To stay compatible with the eyechart paradigm, letters should be as obvious as possible.
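The grid-and-stroke rule is easy to see in a bitmap form. The sketch below is my own toy encoding of a tumbling E, not the actual font outlines: each letter lives on a 5 × 5 grid and every stroke is exactly one grid unit wide, so strokes and gaps have the same size.

```python
# A 5 x 5 optotype grid: 1 marks a filled cell, 0 an empty one.
# Strokes and gaps are each one grid unit wide.
E = [
    [1, 1, 1, 1, 1],   # top bar
    [1, 0, 0, 0, 0],   # stem
    [1, 1, 1, 1, 1],   # middle bar
    [1, 0, 0, 0, 0],   # stem
    [1, 1, 1, 1, 1],   # bottom bar
]

def render(glyph):
    """Return the glyph as text, one character per grid cell."""
    return "\n".join("".join("#" if cell else "." for cell in row) for row in glyph)

print(render(E))
```

At this resolution the stroke width is fixed by construction, which is why lowercase letters must share the same grid rather than shrink.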

Lorrie Frear's article What are Optotypes? Eye Charts in Focus is a great read about optotypes and eye charts.

## Snellen Optotype Letter Design

### uppercase

The uppercase letter design uses Herman Snellen's original chart as inspiration.

I have modified the design by Andrew Howlett (see below) for some letters. All the changes are relatively minor: more serifs and consistent stroke width for bars on R and K.

### lowercase

The lowercase characters should be considered experimental.

The progress of my redesign is shown below. I would greatly appreciate feedback and suggestions!

The distribution contains both Andrew's version and my redesign.

v7.1 4-jun-2018 — Download Snellen optotype font

#### version 7.1 — 4 Jun 2018

Tidied all letter forms with Fontlab 6.

Snellen optotype font (version v7.1 4-jun-2018) that includes both upper and lower case characters, along with most punctuation and some symbols. Based on design by Andrew Howlett. (zoom, download Snellen optotype font)

#### version 7 — 6 Feb 2017

Fixed g and e. Thanks to Makeesha Fisher for suggestions.

Snellen optotype font (mk.v.7). Original design by Andrew Howlett (left) and my redesign (right), which includes both upper and lowercase letters as well as digits and symbols. (zoom, download Snellen optotype font)

#### version 6 — 5 Feb 2017

Adjusted serifs on f, j, l, o, t to extend the full width of the grid. Added a lot more symbols.

Snellen optotype font (mk.v.6). Original design by Andrew Howlett (left) and my redesign (right), which includes both upper and lowercase letters as well as digits and symbols. (zoom, download Snellen optotype font)

#### version 5 — 4 Feb 2017

Added lowercase, digits and symbols.

Snellen optotype font (mk.v.5). Original design by Andrew Howlett (left) and my redesign (right), which includes both upper and lowercase letters as well as digits and symbols. (zoom, download Snellen optotype font)

#### version 4 — 23 Feb 2017

Snellen optotype font (mk.v.4). Original design by Andrew Howlett (left) and my redesign (right), which includes both upper and lowercase letters as well as digits. (zoom)

#### version 3 — 22 Feb 2017

I'm exploring the lowercase characters. I don't know what I want to do with them yet. Should I make this into a more standard font, in which lowercase letters are smaller so that they fit their roles clearly when text is set in sentence case, or should the lowercase letters fill out the full optotype grid?

Snellen optotype font (mk.v.3). Original design by Andrew Howlett (left) and my redesign (right), which includes both upper and lowercase letters. (zoom)

#### version 2 — 22 Feb 2017

Flushed out some inconsistencies in the uppercase characters. Added serifs to more letters.

Now all the letters occupy the full 5 × 5 grid, including the I, whose serifs were widened to allow this. While this new uppercase I isn't as pretty as the old one, it makes the entire typeface more consistent with its optotype roots.

Still struggling with the G. In the original version, the descending stroke was cut off in the middle of a grid, which I didn't like.

The S has been fixed—thanks to Elanor Lutz for feedback.

I've color coded the characters slightly differently, drawing attention to ones that I feel need more thought.

The lowercase characters aren't color coded (yet) because ... most of them need help. Primarily, I'm vacillating between making them fill the full 5 × 5 square, just like the uppercase characters, and keeping them confined to a 4 × 4 square, which incurs a loss of legibility. If I make the letters the same size, it will be impossible to distinguish lowercase and uppercase characters in some cases (e.g. c, i). Perhaps this is desirable?

Snellen optotype font (mk.v.2). Original design by Andrew Howlett (left) and my redesign (right), which includes both upper and lowercase letters. (zoom)

#### version 1 — 22 Feb 2017

First attempt at lowercase characters.

Snellen optotype font (mk.v.1). Original design by Andrew Howlett (left) and my redesign (right), which includes both upper and lowercase letters. (zoom, download font)

# Curse(s) of dimensionality

Tue 05-06-2018
There is such a thing as too much of a good thing.

We discuss the many ways in which analysis can be confounded when data has a large number of dimensions (variables). Collectively, these are called the "curses of dimensionality".

Nature Methods Points of Significance column: Curse(s) of dimensionality. (read)

Some of these are unintuitive, such as the fact that the volume of the unit hypersphere increases up to about five dimensions and then shrinks, while the volume of the hypercube always increases. This means that high-dimensional space is "mostly corners" and the distance between points increases greatly with dimension. This has consequences for correlation and classification.
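The unit-ball volume formula $V_n = \pi^{n/2}/\Gamma(n/2+1)$ makes this easy to check numerically. The minimal sketch below is my own, not the column's code; it compares the ball of radius 1 with the cube of side 2 that encloses it.

```python
import math

def ball_volume(n, r=1.0):
    """Volume of an n-dimensional ball of radius r."""
    return math.pi ** (n / 2) / math.gamma(n / 2 + 1) * r ** n

def cube_volume(n, side=2.0):
    """Volume of the n-cube of the given side (side=2 circumscribes the unit ball)."""
    return side ** n

for n in range(1, 11):
    vb, vc = ball_volume(n), cube_volume(n)
    print(f"n={n:2d}  ball={vb:7.3f}  cube={vc:9.1f}  ball/cube={vb / vc:.5f}")
```

The ball's volume peaks at n = 5 (about 5.26) and then falls, while the cube's volume grows without bound, so the ball-to-cube ratio rushes toward zero: almost all of the cube's volume sits in its corners.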

Altman, N. & Krzywinski, M. (2018) Points of Significance: Curse(s) of dimensionality. Nature Methods 15:399–400.

# Statistics vs Machine Learning

Tue 03-04-2018
We conclude our series on Machine Learning with a comparison of two approaches: classical statistical inference and machine learning. The boundary between them is subject to debate, but important generalizations can be made.

Inference creates a mathematical model of the data-generation process to formalize understanding or test a hypothesis about how the system behaves. Prediction aims at forecasting unobserved outcomes or future behavior. Typically we want to do both: understand how biological processes work and forecast what will happen next. Inference and ML are complementary in pointing us to biologically meaningful conclusions.

Nature Methods Points of Significance column: Statistics vs machine learning. (read)

Statistics asks us to choose a model that incorporates our knowledge of the system, and ML requires us to choose a predictive algorithm by relying on its empirical capabilities. Justification for an inference model typically rests on whether we feel it adequately captures the essence of the system. The choice of pattern-learning algorithms often depends on measures of past performance in similar scenarios.
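A toy illustration of the two goals (my own minimal sketch, not from the column): fitting a straight line by least squares yields interpretable coefficients, which is the inference side, and the fitted line also serves as a rule for forecasting new observations, which is the prediction side.

```python
# Least-squares fit of y = a + b*x, standard library only.
def fit_line(xs, ys):
    """Return (intercept, slope) of the least-squares line through the points."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return intercept, slope

xs = [1, 2, 3, 4, 5]
ys = [3.1, 4.9, 7.2, 9.0, 10.8]   # roughly y = 1 + 2x plus noise

a, b = fit_line(xs, ys)
print(f"inference:  intercept = {a:.2f}, slope = {b:.2f}")  # how the system behaves
print(f"prediction: y(10) = {a + b * 10:.2f}")              # forecasting a new outcome
```

The same fitted object supports both uses; the difference lies in whether we interpret its parameters or only consume its outputs.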

Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of Significance: Statistics vs machine learning. Nature Methods 15:233–234.

Bzdok, D., Krzywinski, M. & Altman, N. (2017) Points of Significance: Machine learning: a primer. Nature Methods 14:1119–1120.

Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of Significance: Machine learning: supervised methods. Nature Methods 15:5–6.

# Happy 2018 $\pi$ Day—Boonies, burbs and boutiques of $\pi$

Wed 14-03-2018

Celebrate $\pi$ Day (March 14th) and go to brand new places. Together with Jake Lever, this year we shrink the world and play with road maps.

Streets from across the world are seamlessly stitched together. Finally, a halva shop on the same block!

A great 10 km run loop between Istanbul, Copenhagen, San Francisco and Dublin. Stop off for halva, smørrebrød, espresso and a Guinness on the way. (details)

Intriguing and personal patterns of urban development for each city appear in the Boonies, Burbs and Boutiques series.

In the Boonies, Burbs and Boutiques of $\pi$ we draw progressively denser patches using the digit sequence 159 to inform density. (details)

No color—just lines. Lines from Marrakesh, Prague, Istanbul, Nice and other destinations for the mind and the heart.

Roads from cities rearranged according to the digits of $\pi$. (details)

The art is featured in the Pi City post on the Scientific American SA Visual blog.

Check out art from previous years: 2013 $\pi$ Day, 2014 $\pi$ Day, 2015 $\pi$ Day, 2016 $\pi$ Day and 2017 $\pi$ Day.

# Machine learning: supervised methods (SVM & kNN)

Thu 18-01-2018
Supervised learning algorithms extract general principles from observed examples guided by a specific prediction objective.

We examine two very common supervised machine learning methods: linear support vector machines (SVM) and k-nearest neighbors (kNN).

SVM is often less computationally demanding than kNN and is easier to interpret, but it can identify only a limited set of patterns. On the other hand, kNN can find very complex patterns, but its output is more challenging to interpret.

Nature Methods Points of Significance column: Machine learning: supervised methods (SVM & kNN). (read)

We illustrate SVM using a data set in which points fall into two categories, which are separated in SVM by a straight line "margin". SVM can be tuned using a parameter that influences the width and location of the margin, permitting points to fall within the margin or on the wrong side of the margin. We then show how kNN relaxes explicit boundary definitions, such as the straight line in SVM, and how kNN too can be tuned to create more robust classification.
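kNN itself is simple enough to sketch in a few lines. The pure-Python toy below is illustrative only, not the column's implementation: a query point is assigned the majority label among its k nearest training points, and k is the tuning parameter, with larger k smoothing the decision boundary and making classification more robust to noisy points.

```python
from collections import Counter
import math

def knn_predict(points, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Sort training points by Euclidean distance to the query.
    dists = sorted((math.dist(p, query), lab) for p, lab in zip(points, labels))
    # Vote among the k closest.
    votes = Counter(lab for _, lab in dists[:k])
    return votes.most_common(1)[0][0]

# Two well-separated 2D clusters with categories "a" and "b".
points = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
labels = ["a", "a", "a", "b", "b", "b"]

print(knn_predict(points, labels, (0.5, 0.5), k=3))   # query near the "a" cluster
print(knn_predict(points, labels, (5.5, 5.5), k=3))   # query near the "b" cluster
```

Note there is no explicit boundary anywhere in the code: the classification rule is implicit in the training points themselves, which is exactly why kNN can trace very complex patterns but is harder to interpret than SVM's line.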

Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of Significance: Machine learning: supervised methods. Nature Methods 15:5–6.