Fixed layout, added new post

master
Levi Pearson 2014-10-22 23:33:43 -06:00
parent 259b67b4a0
commit b78e1a7643
2 changed files with 136 additions and 0 deletions

@@ -372,3 +372,9 @@ typography = do
    marginBottom $ px baselinePx
    "list-style-type" -: "none"

  blockquote ? do
    backgroundColor "#F9F9F9"
    borderLeft solid (px 3) accent1
    paddingLeft (em 1)

@@ -0,0 +1,130 @@
---
title: Operator Sections in Haskell: A History
---
Operator Sections
-----------------
I was explaining the Haskell notational trick for partially applying
the *second* argument of a two-parameter function: back-quotes turn a
named function into an infix operator, and the operator section syntax
then fixes one of its arguments. For example, you can express the
function that gives a value modulo 10 as:

~~~ { .haskell }
(`mod` 10)
~~~
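
Just to spell the trick out, here is a tiny, self-contained example of
my own (`lastDigit` is a name I made up for illustration):

~~~ { .haskell }
-- A right section of the back-quoted mod: the *second* argument is
-- fixed at 10, while the first is left open.
lastDigit :: Int -> Int
lastDigit = (`mod` 10)

-- The same function written out longhand, for comparison.
lastDigit' :: Int -> Int
lastDigit' x = x `mod` 10

main :: IO ()
main = print (lastDigit 1234, lastDigit' 1234)  -- prints (4,4)
~~~
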
The question came up of just where the idea for operator sections came
from. Because this is precisely the kind of useless information I
can't help but be curious about, I resolved to find an answer. And
after a bit of digging, I was successful.

Haskell History
---------------
Our first stop on the journey through functional programming history
is at the wonderful paper by several of the major contributors to
Haskell, [A History of Haskell: Being Lazy With Class][HH]. Since I came
across it, this has been my first source for the answers to questions
of Haskell history, and once again it didn't fail me. From section 4.2:

> The solution to this problem was to use a generalised notion of
> *sections*, a notation that first appeared in David Wile's
> dissertation (Wile, 1973) and was then disseminated via IFIP
> WG2.1--among others to Bird, who adopted it in his work, and Turner,
> who introduced it into Miranda.

Turner, of course, refers to David Turner, the man behind the
languages SASL, KRC, and Miranda. Miranda was a commercial product of
Turner's company Research Software Limited, and was the most mature
and widely used of the family of non-strict functional languages at
the time. The business needs of Research Software and the desire of
the functional language community for a standard language didn't
*quite* converge, though, so Haskell arose as a non-commercial
alternative.

So, Haskell got operator sections (as it got a great deal of its
syntax) from Miranda. That's not very surprising and didn't really
satisfy my curiosity, so I followed up on the next breadcrumb in the
trail, Wile's dissertation.

[HH]: http://research.microsoft.com/en-us/um/people/simonpj/papers/history-of-haskell/ "A History of Haskell: being lazy with class"

One More Step
-------------
The document in question is entitled [A Generative, Nested-Sequential
Basis for General Purpose Programming Languages][WD], and was submitted
to Carnegie-Mellon University in November, 1973 by David Sheridan
Wile. It describes the idea of taming the wild pointers of data
structures via similar structuring techniques to those that were being
applied to tame the wild control flow of GOTO-based code, and cites
the languages BLISS, APL, and LISP as primary influences.

When first asked about operator sections, I guessed that the influence
had come somehow through APL, so I was gratified to see my instinct
validated. In fact, the notation presented borrows heavily from APL
but marries it with ideas from non-strict functional programming such
as natural representations of infinite data and the way that such data
mediates the interaction of co-routines.

Sure enough, on page 16 we find the following:

> Partially instantiated functions are called "sections" in
> mathematical literature, and we adopt the term here for
> convenience. The nature of sections is ambiguous: they are both
> program and data, and attempts to define them as one or the other
> rely on a preconceived implementation.

And on page 30, he explains further:

> A unique primitive operation which produces primitive operators is
> also permitted; this operation is termed "partial
> instantiation". The "section" or "partially instantiated function"
> was motivated in Chapter 1 as a natural mechanism for expressing
> data structure concepts of restriction. In fact, they play a much
> more significant role in the basis in that many programs are
> sequences of partially instantiated functions. In particular, we
> allow the partial instantiation of any binary operator to produce
> either a left- or right-unary operator.
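
In Haskell, partially instantiating either argument of a binary
operator gives exactly the left and right sections we have today
(whether Wile's "left-" and "right-" labels line up with Haskell's
naming, I'm not sure). A quick sketch of my own, with made-up names:

~~~ { .haskell }
-- Partially instantiating (^) at its left argument: two to the power
-- of whatever comes next.
powerOfTwo :: Int -> Int
powerOfTwo = (2 ^)

-- Partially instantiating (^) at its right argument: square whatever
-- comes next.
square :: Int -> Int
square = (^ 2)

main :: IO ()
main = print (map powerOfTwo [0..4], map square [0..4])
-- prints ([1,2,4,8,16],[0,1,4,9,16])
~~~
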
So, that's certainly the source of Haskell's notion of operator
sections! But what about that reference to the mathematical
literature? Can we trace the idea back further?

[WD]: http://digitalcollections.library.cmu.edu/awweb/awarchive?type=file&item=362714 "A Generative, Nested-Sequential Basis for General Purpose Programming Languages"

Recursive Functions
-------------------
This turns out to be [Theory of Recursive Functions and Effective
Computability][RF], by Hartley Rogers, Jr. Rogers was a Ph.D. student
under none other than Alonzo Church, father of the Lambda
Calculus. This text began as a set of notes published by the MIT Math
department in '57 and grew into its first publication as a book in '67,
during a period of rapid progress in its field.

The citation of the book points to a page in section 5.4 on projection
theorems relating to recursively enumerable sets. Specifically, the
definition of "section" is given in terms of a $k$-ary projection
relation $R$ in which one of the $k$ terms, $n$, is fixed. Since binary
operators form such a relation, fixing one of the parameters qualifies
the result to be called a section of the overall relation described by
the operator.
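
To connect that back to operator sections, here is a small sketch of my
own (not from Rogers; the names are made up): view a binary operator as
a three-place relation between its arguments and its result, and then
fix one of the argument positions.

~~~ { .haskell }
-- The graph of mod as a ternary relation: holds exactly when
-- x `mod` y == z.
modRel :: Int -> Int -> Int -> Bool
modRel x y z = x `mod` y == z

-- Fixing the second position at 10 gives a section of that relation;
-- it relates x and z exactly when the Haskell section (`mod` 10)
-- maps x to z.
modBy10 :: Int -> Int -> Bool
modBy10 x z = modRel x 10 z

main :: IO ()
main = print (modBy10 23 3, modBy10 23 4)  -- prints (True,False)
~~~
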
[RF]: http://mitpress.mit.edu/books/theory-recursive-functions-and-effective-computability "Theory of Recursive Functions and Effective Computability | The MIT Press"

Relation to Categorical Sections?
---------------------------------
I recalled having heard some other mathematical definition of the term
section, and on a hunch I looked it up at [nLab][NL], which is a site
discussing Category Theory. It turns out that a [section][NS] in that
context is a right inverse to a map; i.e. a section of f: A -> B is a
map s: B -> A such that following s with f gives B -> A -> B, the
identity on B. That doesn't seem to be related to what Rogers described,
but perhaps I'm missing some deeper connection.
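
For the record, here is what that right-inverse property looks like in
Haskell terms; this is a sketch of my own, and `proj`/`sect` are names
I made up:

~~~ { .haskell }
-- A map from pairs to their first component...
proj :: (Int, String) -> Int
proj = fst

-- ...and a section of it: a right inverse, so that proj . sect is the
-- identity. Any choice of second component would do.
sect :: Int -> (Int, String)
sect n = (n, "whatever")

main :: IO ()
main = print (map (proj . sect) [1, 2, 3])  -- prints [1,2,3]
~~~
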
[NL]: http://ncatlab.org/nlab/show/HomePage "nLab: Home Page"
[NS]: http://ncatlab.org/nlab/show/section "nLab: section"