Safe, fallen down this way, I want to be just what I am. Safe at last.


In Silico Flurries: Computing a world of snow. Scientific American. 23 December 2017

# daily quotation server archives

In the late 1990s I started (a good decade for starts) a daily quotation server project at www.quoteserver.ca. The domain is now defunct, but some pages are partially viewable on the Wayback Machine.

Below is the list of quotes I had collected by the end of the project's life. Most are about love—duh—and a few are jolly jests from funny trenches. You know, that place where mustard gas makes your eyes water.

The quotes weren’t scraped from quote archives—each is meaningful and hand-picked.

## the quote archive

And now for the full list of 1,600 other things worth reading. Such as everything Dorothy Parker has written and ... yes, even the Pinky and the Brain quotes, which are a special kind of special.

Quote collections about love, heart, desire, life, death, god, mind, science.

Feeling lucky? Read 10 random quotes. Well, will you, punk?

## A subset of random quotes

119
Tirez le rideau, la farce est jouée. (Draw the curtain, the farce is played out.)
280
Science is a cemetery of dead ideas.
1180
Wherefore I say: O love, as summer goes,
I must be gone, steal forth with silent drums,
That you may hail anew the bird and rose
When I come back to you, as summer comes.
Else will you seek, at some not distant time,
Even your summer in another clime.
Sonnets, xxvii
1184
Think not for this, however, the poor treason
Of my stout blood against my staggering brain,
I shall remember you with love, or season
My scorn with pity,—let me make it plain:
I find this frenzy insufficient reason
For conversation when we meet again.
Sonnets, xli
1266
I’ll never be a bride,
Nor yet celibate,
So I’m living now with Pride—
A cold bedmate.
He must not hear nor see,
Nor could he forgive
That Sorrow still visits me
Each day I live.
Light of Love
1322
i like my body when it is with your
body. It is so quite new a thing.
7
1374
I don’t care what is written about me so long
as it isn’t true.
1406
Dogs come when they’re called; cats take a message
and get back to you.
1502
I must down to the seas again to the vagrant gypsy life.
To the gull’s way and the whale’s way where the wind’s like a whetted knife;
And all I ask is a merry yarn from a laughing fellow-rover,
And quiet sleep and a sweet dream when the long trick’s over.
Sea Fever
1591
Henceforth there will be such a oneness between us that when one weeps
the other will taste salt.

# Curse(s) of dimensionality

Tue 05-06-2018
There is such a thing as too much of a good thing.

We discuss the many ways in which analysis can be confounded when data has a large number of dimensions (variables). Collectively, these are called the "curses of dimensionality".

Nature Methods Points of Significance column: Curse(s) of dimensionality. (read)

Some of these are unintuitive, such as the fact that the volume of the unit hypersphere increases with dimension and then shrinks beyond about 5 dimensions, while the volume of its circumscribing hypercube always increases. This means that high-dimensional space is "mostly corners" and that the distance between points grows rapidly with dimension. This has consequences for correlation and classification.
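This behavior is easy to check numerically. Below is a minimal Python sketch (function names are illustrative, not from the column) that evaluates the volume of a $d$-dimensional unit ball, $V_d = \pi^{d/2}/\Gamma(d/2+1)$, and of the side-2 cube that circumscribes it:

```python
from math import gamma, pi

def hypersphere_volume(d, r=1.0):
    # Volume of a d-dimensional ball of radius r: pi^(d/2) / Gamma(d/2 + 1) * r^d
    return pi ** (d / 2) / gamma(d / 2 + 1) * r ** d

def hypercube_volume(d, side=2.0):
    # Volume of a d-cube; side = 2 circumscribes the unit ball
    return side ** d

volumes = {d: hypersphere_volume(d) for d in range(1, 21)}
peak = max(volumes, key=volumes.get)  # dimension at which the ball's volume is largest

# Fraction of the circumscribing cube that lies outside the inscribed ball ("corners")
corner_fraction = 1 - hypersphere_volume(10) / hypercube_volume(10)
```

For a unit-radius ball the volume peaks at $d = 5$ and then decays toward zero, while the cube's volume grows as $2^d$, so the fraction of the cube occupied by its corners quickly approaches 1.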

Altman, N. & Krzywinski, M. (2018) Points of Significance: Curse(s) of dimensionality. Nature Methods 15:399–400.

# Statistics vs Machine Learning

Tue 03-04-2018
We conclude our series on Machine Learning with a comparison of two approaches: classical statistical inference and machine learning. The boundary between them is subject to debate, but important generalizations can be made.

Inference creates a mathematical model of the data-generation process to formalize understanding or to test a hypothesis about how the system behaves. Prediction aims to forecast unobserved outcomes or future behavior. Typically we want to do both: to know how biological processes work and to predict what will happen next. Inference and ML are complementary in pointing us to biologically meaningful conclusions.

Nature Methods Points of Significance column: Statistics vs machine learning. (read)

Statistics asks us to choose a model that incorporates our knowledge of the system, and ML requires us to choose a predictive algorithm by relying on its empirical capabilities. Justification for an inference model typically rests on whether we feel it adequately captures the essence of the system. The choice of pattern-learning algorithms often depends on measures of past performance in similar scenarios.

Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of Significance: Statistics vs machine learning. Nature Methods 15:233–234.

Bzdok, D., Krzywinski, M. & Altman, N. (2017) Points of Significance: Machine learning: a primer. Nature Methods 14:1119–1120.

Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of Significance: Machine learning: supervised methods. Nature Methods 15:5–6.

# Happy 2018 $\pi$ Day—Boonies, burbs and boutiques of $\pi$

Wed 14-03-2018

Celebrate $\pi$ Day (March 14th) and go to brand new places. Together with Jake Lever, this year we shrink the world and play with road maps.

Streets are seamlessly stitched together from across the world. Finally, a halva shop on the same block!

A great 10 km run loop between Istanbul, Copenhagen, San Francisco and Dublin. Stop off for halva, smørrebrød, espresso and a Guinness on the way. (details)

Intriguing and personal patterns of urban development for each city appear in the Boonies, Burbs and Boutiques series.

In the Boonies, Burbs and Boutiques of $\pi$ we draw progressively denser patches using the digit sequence 159 to inform density. (details)
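As a sketch of how a digit sequence can drive drawing density (the exact mapping used in the art isn't spelled out here, so the linear scaling below is an assumption):

```python
# Digits of pi, hard-coded to keep the sketch dependency-free.
PI_DIGITS = "314159265358979323846264338327950288"

def digit_densities(digits, max_density=1.0):
    # Assumed mapping: each digit 0-9 scales linearly to a fraction of the
    # maximum density, so a patch for digit 9 is drawn at full density
    # and a patch for digit 0 is left empty.
    return [int(c) / 9 * max_density for c in digits]

densities = digit_densities(PI_DIGITS[:10])
```

Each value in `densities` could then set, for example, the number of road segments drawn in the corresponding patch.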

No color—just lines. Lines from Marrakesh, Prague, Istanbul, Nice and other destinations for the mind and the heart.

Roads from cities rearranged according to the digits of $\pi$. (details)

The art is featured in the Pi City on the Scientific American SA Visual blog.

Check out art from previous years: 2013 $\pi$ Day, 2014 $\pi$ Day, 2015 $\pi$ Day, 2016 $\pi$ Day and 2017 $\pi$ Day.

# Machine learning: supervised methods (SVM & kNN)

Thu 18-01-2018
Supervised learning algorithms extract general principles from observed examples guided by a specific prediction objective.

We examine two very common supervised machine learning methods: linear support vector machines (SVM) and k-nearest neighbors (kNN).

SVM is often less computationally demanding than kNN and is easier to interpret, but it can identify only a limited set of patterns. On the other hand, kNN can find very complex patterns, but its output is more challenging to interpret.

Nature Methods Points of Significance column: Machine learning: supervised methods (SVM & kNN). (read)

We illustrate SVM using a data set in which points fall into two categories that SVM separates with a straight-line boundary flanked by a "margin". SVM can be tuned with a parameter that influences the width and location of the margin, permitting some points to fall within the margin or on the wrong side of the boundary. We then show how kNN relaxes the need for an explicit boundary, such as SVM's straight line, and how kNN too can be tuned to make classification more robust.
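kNN is simple enough to sketch in a few lines. The toy data, Euclidean distance and majority vote below are illustrative assumptions, not the column's example (in libraries such as scikit-learn the SVM margin is tuned via the parameter `C`, and kNN via the number of neighbors `k`):

```python
from collections import Counter

def knn_predict(train, labels, x, k=3):
    # Rank training points by squared Euclidean distance to x,
    # then classify x by majority vote among the k nearest.
    nearest = sorted(range(len(train)),
                     key=lambda i: sum((a - b) ** 2 for a, b in zip(train[i], x)))
    votes = Counter(labels[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]

# Two well-separated categories in the plane (hypothetical data).
train = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
labels = ["A", "A", "A", "B", "B", "B"]
```

Small `k` tracks the training data closely and can overfit; larger `k` smooths the effective decision boundary, which is the tuning trade-off described above.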

Bzdok, D., Krzywinski, M. & Altman, N. (2018) Points of Significance: Machine learning: supervised methods. Nature Methods 15:5–6.