Archive for the 'Hacking' Category


Sunday, December 11th, 2005

Although it might seem all quiet on the Epigram front, in actual fact there is something of a blitz going on at HQ, and I thought I’d let you in on some of what has been going on.

As you can see in the last post, Conor changed his mind slightly about what Terms were going to be. In those pictures, Up terms are those you can synthesize a type for (and are now called Synth); Down terms are those you have to push the type into and check (and are now called Check).
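The Synth/Check split can be sketched for a toy simply typed lambda calculus; this is an illustration with assumed names, not the Epigram code:

```haskell
-- A sketch of the Synth/Check (Up/Down) split for a toy simply typed
-- lambda calculus with de Bruijn indices; names are illustrative.
data Ty = Base | Ty :-> Ty deriving (Show, Eq)

-- Synth ("Up") terms: we can synthesize their type.
data Synth = Var Int | App Synth Check | Ann Check Ty
-- Check ("Down") terms: we push a type in and check.
data Check = Lam Check | Emb Synth

synth :: [Ty] -> Synth -> Maybe Ty
synth g (Var i)
  | i < length g = Just (g !! i)
  | otherwise    = Nothing
synth g (App f a) = case synth g f of
  Just (s :-> t) | check g a s -> Just t
  _                            -> Nothing
synth g (Ann t ty)
  | check g t ty = Just ty
  | otherwise    = Nothing

check :: [Ty] -> Check -> Ty -> Bool
check g (Lam b) (s :-> t) = check (s : g) b t
check g (Emb e) ty        = synth g e == Just ty
check _ _ _               = False
```

Note how the information flows: annotations turn a Check term into a Synth term, and embedding goes the other way at the cost of an equality test on types.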

Since changing to this representation would break a lot of what is already there, we took our experiment ‘offline’: Conor, James, Wouter and I set to work on implementing all the kit we needed for the new underlying terms (and values). When I say offline, I really mean we created ourselves a darcs project, as it seemed the perfect experiment for our plan to move wholesale to darcs. The results of the tinkering and the experiment can be grabbed by entering ‘darcs get’. It’s not finished, but a few days’ work bore many fruits.

Next up, holes and Datatypes.


Thursday, July 28th, 2005

In between coats of paint, I’ve managed to get a bit of concurrent editor-elaborator communication going, without possessing either an editor or an elaborator. The fake elaborator I wrote just likes everything and invents nothing, which should serve to let folk fiddle with editing technology. I’m just a-thinking that we should actually figure out how to elaborate module imports (of pre-elaborated objects) before we do anything cleverer. But anyhow, check it out!


Monday, June 27th, 2005

Scoping traverses concrete syntax, threading some effects:

  • reading the scope, being (concretely) an alist of strings to namerefs; what is its abstract interface?
  • generating new namerefs, due either to an explicit binding (in a pattern or a shadow), or to an implicit binding (a previously unseen name in an lhs or a sig)
  • accumulating the set of new namerefs: when we see a name which is not in an explicit binding position, we should look it up if we can, and generate a new nameref if not; we must accumulate the set of these names so we can add them to the relevant shadow; this way, the elaborator will know exactly where everything is to be bound
  • building imperative syntax trees: we may as well get our node identifiers from the ref supplier; also, we can equip trees with backpointers, and possibly other editor-related gubbins

Scoping will also need to generate binary applications, handling any weird operators we might allow. I’ll have a stab at the datatype of nodes in a bit.
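The effects listed above might be stacked up with monad transformers like this; a sketch with assumed names (ScopeM, resolve and friends are illustrative, not the real scoper):

```haskell
-- Reader for the scope, State for the nameref supply, and Writer to
-- accumulate the namerefs we had to invent for previously unseen names.
import Control.Monad.Reader
import Control.Monad.State
import Control.Monad.Writer

type NameRef = Int
type Scope   = [(String, NameRef)]   -- concretely, an alist

type ScopeM = ReaderT Scope (StateT NameRef (Writer [(String, NameRef)]))

-- generate a new nameref from the supply
fresh :: ScopeM NameRef
fresh = do
  n <- get
  put (n + 1)
  return n

-- look a name up; if it is unseen, generate a nameref and record it
-- so it can later be added to the relevant shadow
resolve :: String -> ScopeM NameRef
resolve x = do
  sc <- ask
  case lookup x sc of
    Just r  -> return r
    Nothing -> do
      r <- fresh
      tell [(x, r)]
      return r

runScope :: Scope -> NameRef -> ScopeM a -> ((a, NameRef), [(String, NameRef)])
runScope sc n m = runWriter (runStateT (runReaderT m sc) n)
```

Keeping the scope in a Reader also answers the "abstract interface" question cheaply: downstream code only ever sees `ask` and `lookup`.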

Phases of source analysis

Tuesday, June 7th, 2005

Figuring out the phases of source analysis to be done outside the elaborator will help figure out the details in the concrete syntax, so here goes.

  • the ‘lexer’ takes String to lists of tokens (actually with location info, thanks Peter);
  • the ‘grouper’ takes token lists to Doc, the type of documents with their brackets resolved;
  • the ‘parser’ takes Doc and produces unmarked concrete syntax with a Doc at each shed and (String, Int) for each identifier x^i (where x is short for x^0); we might imagine a more sophisticated editor which produces this format directly; remark—applications
    in this format are flat sequences of ‘small’ terms;
  • the ‘scoper’ takes this unvarnished concrete syntax and
    • resolves the identifiers as variables, given the bindings in scope—the elaborator will provide the context of variables, but the front end must maintain the map to identifiers; hole—figure out how to represent locally bound variables and unbound variables; forbid shadowing in left-hand sides (but allow it in binders?);
    • resolves the structure of applications into binary form, dealing with operator precedence or whatever
    • labels the nodes with editor marks;
    • erases the sheds; concrete syntax should be polymorphic in the shed type, and so should the scoper, meaning that it can’t reveal the contents of sheds;
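As a toy illustration of the ‘grouper’ phase, here is a bracket-resolver over plain string tokens; the Doc type and the names are assumed for the sketch, not the real ones:

```haskell
-- A toy 'grouper': turn a flat token list into a Doc with its
-- brackets resolved (simplified: string tokens, round brackets only).
data Doc = Leaf String | Group [Doc] deriving (Show, Eq)

-- collect docs up to a closing bracket; return them and the remainder
groupTokens :: [String] -> ([Doc], [String])
groupTokens []         = ([], [])
groupTokens (")" : ts) = ([], ts)
groupTokens ("(" : ts) =
  let (inner, rest)  = groupTokens ts
      (docs,  rest') = groupTokens rest
  in  (Group inner : docs, rest')
groupTokens (t : ts)   =
  let (docs, rest) = groupTokens ts
  in  (Leaf t : docs, rest)

grouper :: [String] -> Doc
grouper ts = Group (fst (groupTokens ts))
```

The point of doing this before parsing proper is that the parser then works on a tree whose bracket structure is already trustworthy.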


Representing Concrete Syntax

Monday, June 6th, 2005

We need a generic protocol for communicating and elaborating concrete syntax. This creates a problem. How specific should we be wrt the syntactic categories of Epigram? My inclination is not at all. We either need a generic technology which will adapt to arbitrary syntactic categories, or we need to flatten the syntactic categories. At the moment, I don’t feel like figuring out how to do the former. So I suggest we adopt some very general setup like this

type Concrete shed = (Mark, CTree shed)
data CTree shed
  = CShed shed
  | CNode Node [Concrete shed]


  • It’s polymorphic in the contents of sheds, but the elaborator is only ever sent Concrete ()
  • The type Mark is some equality type which belongs to the UI: it is used to mark information coming back from the elaborator as concrete syntax is processed.
  • The type Node represents both the terminal and nonterminal symbols, whatever they may be. Blah code will need to interrogate this type in arbitrary ways, so things might get a touch existential. As in
    data Blah
      = ...
      | ∀α. Case (Node → Maybe α, [String]) (α → Blah) Blah

    The function tests if the node suits the success continuation. If so, the strings name the subnodes and the success continuation executes; if not, the failure case runs. Of course, the whole thing suspends if we have a shed rather than a node.
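Instantiating the setup with placeholder choices for Mark and Node shows the shed-polymorphism at work: the eraser below produces the Concrete () that the elaborator is sent. A sketch only; Mark and Node here stand in for whatever the UI actually supplies.

```haskell
-- Placeholder instantiations, for illustration only.
type Mark = Int      -- really some equality type belonging to the UI
type Node = String   -- really terminal/nonterminal symbols

type Concrete shed = (Mark, CTree shed)
data CTree shed
  = CShed shed
  | CNode Node [Concrete shed]
  deriving (Show, Eq)

-- the elaborator is only ever sent Concrete (): sheds are erased,
-- but the marks survive so replies can be routed back
erase :: Concrete shed -> Concrete ()
erase (m, CShed _)    = (m, CShed ())
erase (m, CNode n cs) = (m, CNode n (map erase cs))
```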

Parsing to this representation requires that we can generate suitable elements of Mark (I suspect this may involve refs.) The real exciting stuff in the parser is more likely to involve figuring names out—more on that anon.

Printing. Hmm.


Wednesday, May 18th, 2005

Further to recent remarks, I have now implemented a toy elaborator for Hutton’s Razor (the language of integers with addition). You can find it here. Seems to work. The elaborator itself is

hutton :: [Value] → Blah
hutton _ =
  Conc (MNode PLUS ["x", "y"] $
    Bind ("x", Call hutton []) $ λx →
    Bind ("y", Call hutton []) $ λy →
    Solve (plus x y)) $
  Conc (MNum $ λi → Solve (VC i Nothing)) $
  Error "unrecognized input"
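For reference, the object language here, Hutton's Razor itself, is just integers with addition; a direct evaluator fits in a few lines:

```haskell
-- Hutton's Razor: the language of integers with addition.
-- A toy companion to the elaborator above, not its code.
data Razor = Num Int | Add Razor Razor deriving Show

eval :: Razor -> Int
eval (Num i)   = i
eval (Add x y) = eval x + eval y
```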

More chat about this later.


Thursday, February 24th, 2005

Hugs and GHC seem to disagree on this one:

ERROR "./DeBruijn.lhs":111 - Cannot justify constraints in explicitly typed binding
*** Expression : (\)
*** Type : (Named a, Bind b) => Bwd a -> b -> b
*** Given context : (Named a, Bind b)
*** Constraints : Named a

Peter directed me towards: [email protected]/msg15390.html

So presumably some more type annotations are required. Can’t work out what though…

Fear and Parsing in 2D Syntax

Thursday, February 24th, 2005

Over the past couple of weeks I’ve been working on improving the error messages you get from the Parser; they will now give you some idea of what was being tried when the error occurred, and where you might find the error in your code.

This is not the finished article, but I thought it best to commit it as soon as it compiled and ran. For instance, I haven’t yet used the new parser combinators in the parser for frames, so for the moment its error messages may be a little misleading.

Please feel free to try out testConcrete!


Thursday, February 17th, 2005

After a lovely time with mfix making cyclic data structures, so that Nat knows about its bits which are typed in terms of Nat, I then had a horrid time chasing down a peculiar bug. It turns out that

(t :->> w) @@ e = t @@ e :->> w @@ e

ain’t the same as

(t :->> w) @@ e = (t @@ e) :->> (w @@ e)

which is what I meant. Differently grouped, it loops rather tightly. Took me ages to find it. Now that I’ve nailed that little sod, I can confirm that 2+2 is indeed 4.
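To reconstruct the pitfall (with assumed fixities and a toy type, not the real Term code): if @@ binds more tightly than :->>, the parentheses are redundant and both spellings mean the intended thing; if it binds less tightly, the unparenthesised right-hand side regroups as t @@ (e :->> (w @@ e)) and the recursion chews on a bigger term each time.

```haskell
-- Assumed fixities: application-like @@ tighter than the arrow.
infixr 6 :->>
infixl 9 @@

data T = V Int | T :->> T deriving (Show, Eq)

(@@) :: T -> T -> T
V n        @@ _ = V n
(t :->> w) @@ e = (t @@ e) :->> (w @@ e)   -- the intended grouping
-- with infixl 9 @@, writing  t @@ e :->> w @@ e  parses the same way;
-- with @@ looser than :->>, it would parse as  t @@ (e :->> (w @@ e))
```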

I’ll commit the changes tomorrow, when I’ve yanked out all the tracing.

Yesterday’s stuff

Thursday, February 17th, 2005

You may have noticed that Term.lhs is a little shorter. I just shifted everything not strictly necessary for the syntax and operational semantics into other files, viz.

  • TermTools.lhs
    with yer abstractor and yer instantiator // and yer discharger |- and yer quantifier-stripper -| and a’that. Most especially, the β-behaviour of typed terms (which must only be used in a type-correct way). See here. This is what the typechecker will call to assemble values with which it’s happy. See here.
  • Normal.lhs now has the weak-head and full normalization algs. The latter will be tweaked to use typed β next time I commit.
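The abstractor/instantiator kit might look roughly like this over a bare de Bruijn representation; names and representation are assumed, and the sketch further assumes the instantiating term is closed, so no index shifting is needed:

```haskell
-- A guess at the shape of the TermTools kit, not the real code.
data Tm = F String | B Int | Tm :$ Tm | Lam Tm deriving (Show, Eq)

-- abstractor: capture occurrences of a free name under a new binder
abstract :: String -> Tm -> Tm
abstract x = Lam . go 0 where
  go i (F y) | x == y = B i
  go _ t@(F _)        = t
  go _ t@(B _)        = t
  go i (f :$ a)       = go i f :$ go i a
  go i (Lam b)        = Lam (go (i + 1) b)

-- instantiator: fill the outermost binder with a (closed) term
instantiate :: Tm -> Tm -> Tm
instantiate (Lam b) e = go 0 b where
  go i (B j) | i == j = e
  go _ t@(B _)        = t
  go _ t@(F _)        = t
  go i (f :$ a)       = go i f :$ go i a
  go i (Lam b')       = Lam (go (i + 1) b')
instantiate t _       = t
```

Abstraction followed by instantiation at the same name is the identity, which is the sanity check these operators usually get.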

You’ll also have noticed that I’m up to something, down the bottom of the file, with some form of Code. More on that in a bit.

Some loud pictures from today’s meeting

Tuesday, February 15th, 2005
  • Matters arising (Conor's whiteboard 0)
  • Infer strike one (Conor's whiteboards 1 and 2)
  • Infer strike two (Conor's whiteboards 3 and 4)
  • Equality (Conor's whiteboards 5, 6 and 7)

Toy compiler

Thursday, February 3rd, 2005

I’ve been writing a toy compiler, which sort of half works now, except that there are no optimisations, the generated code is quite bad, and the virtual machine is too slow - you can get it from . It spits out C, and you have to do a bit of work to get your program to do anything, but it’s a start (and at least there’s an example). I was going to spit out GHC core, but then I realised that writing abstract machines was too much fun.

Anyway, I thought I’d let you know just to show that I was actively doing something ;) .

It’s probably still worth having a GHC core back end, and certainly generating .hs/.hi files. But I think this way is going to work better for compile-time evaluation.

Progress towards object files

Wednesday, January 26th, 2005

More better documentation tomorrow, but I’ve hacked up the abstract syntax and operational semantics of a framework for definitions, holes and modules with parameters. I’ve implemented a parser from a textual format. The abstract syntax uses long names and is fully λ-lifted, whilst the concrete syntax is suitably localised. The code is a bit of a mess, but I’m too tired to tidy it now. I hasten to add that I haven’t got a checker for object files, just a somewhat trusting reader. Of course, in a lazy language, you can cache the semantics on loading the text as long as you promise not to use it until you’ve checked the corresponding syntax…
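The lazy caching trick at the end is worth spelling out in miniature; all the types and names below are assumed for illustration. Pair the parsed syntax with its semantics at load time: laziness means the semantics is never actually computed unless somebody demands it, which they promise to do only after checking succeeds.

```haskell
-- A miniature of the lazy-loading promise, with a one-constructor
-- language standing in for real object files.
data Syntax = Lit Int deriving Show
type Value  = Int

parse :: String -> Syntax
parse = Lit . read

eval :: Syntax -> Value
eval (Lit n) = n

-- a deliberately fussy checker for the demo
checkSyn :: Syntax -> Bool
checkSyn (Lit n) = n >= 0

-- cache the semantics at load time; it is a thunk, not a computation
load :: String -> (Syntax, Value)
load txt = (stx, eval stx) where stx = parse txt

-- only demand the cached value once the syntax has been checked
use :: String -> Maybe Value
use txt = let (stx, v) = load txt
          in if checkSyn stx then Just v else Nothing
```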

Test cases

Tuesday, January 25th, 2005

What’s happening with these?

It would be nice to have a collection of samples in files, so we can all see the intended form of input, and do automatic testing later. Repeatedly typing in test cases isn’t good.

A spot of tidying

Monday, January 24th, 2005

In a fit of anal retentiveness, I’ve spent this morning tidying up yesterday’s developments. In particular, I’ve separated out DeBruijn.lhs as a more generic bunch of utilities. Correspondingly, Scope has become a functor (I’ve also added a constructor for vacuous scopes, as these give rise to optimized execution for λ and cute print forms for ∀ and ∃). For any idiom which counts how many binders you’re under, the operator ‘shiftyMap’ pushes functions inside scopes, doing the right thing. Yesterday’s nu operation has been slightly refactored to make it less value specific. You’ll also be delighted to find that I’ve added a few more comments…
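A guess at the shape of the Scope functor with its vacuous case (representation assumed; in particular I am assuming a vacuous body lives outside the binder, so the binder-counting map does not bump the count there):

```haskell
-- Assumed representation, in the spirit of DeBruijn.lhs as described.
data Scope t = Scope t      -- body may mention the bound variable
             | Vacuous t    -- body ignores it: cheaper execution for
                            -- lambda, cute print forms for forall/exists
             deriving (Show, Eq)

instance Functor Scope where
  fmap f (Scope t)   = Scope (f t)
  fmap f (Vacuous t) = Vacuous (f t)

-- push a depth-indexed function under the binder, doing the right
-- thing with the count of binders you're under
shiftyMap :: (Int -> a -> b) -> Int -> Scope a -> Scope b
shiftyMap f i (Scope t)   = Scope (f (i + 1) t)
shiftyMap f i (Vacuous t) = Vacuous (f i t)
```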


Thursday, January 20th, 2005

I’ve just created Term.lhs. I’ll be filling it up with glue very shortly.