I’m looking at something deeply twisted this week. It’s called Homespring. Homespring is interesting for two reasons. First, it’s got a sort of reverse flavor to it: it consists of active agents in a passive structure. So, for example, you can’t write anything like an explicit control-flow construct, because that would be a dynamic change in the behavior of the program; in Homespring, you need to trick the agents into doing what you want using the static, unchanging structure of the program. And second, it challenges you to write your program in the form of a poem! And the icing on the cake? The agents are salmon,
and the passive structure is a network of streams.
Category Archives: Good Math
Another Example Sheaf: Vector Fields on Manifolds
There’s another classic example of sheaves; this one is restricted to manifolds, rather than general topological spaces. But it provides the key to why we can do calculus on a manifold. For any manifold, there is a sheaf of vector fields over the manifold.
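To make that a bit more concrete, here’s a sketch in my own notation (not from the original post) of the assignment: to each open set we attach the vector fields defined on it, and the restriction maps are literal restriction of the field.

```latex
% For a smooth manifold M and an open set U \subseteq M, take the smooth
% sections of the tangent bundle over U:
\mathcal{V}(U) \;=\; \{\, X : U \to TM \mid X \text{ smooth},\ X(p) \in T_pM \text{ for all } p \in U \,\}

% The restriction maps just restrict a vector field to a smaller open set:
\rho_{UV} : \mathcal{V}(U) \to \mathcal{V}(V), \qquad X \mapsto X|_V
  \quad \text{for open } V \subseteq U.
```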
Probabilistic Complexity
As I’ve mentioned in the past, complexity theory isn’t really one of my favorite topics. As a professional computer scientist, knowing and loving complexity up to the level of NP-completeness
is practically a requirement. But once you start to get beyond P and NP, there are thousands of complexity classes that have been proposed for one reason or another, and really understanding all of them can get remarkably difficult, and what’s worse, can feel like an utterly pointless exercise,
particularly if the writer/teacher you’re learning from isn’t very good.
I’m not going to write a long series of posts on the more esoteric complexity classes. But there
are a few that are interesting, and might even have some relevance to actual practical problems in
selecting algorithms for a software system.
Basic Complexity Classes: P and NP
Now that we’ve gone through a very basic introduction to computational complexity, we’re ready
to take a high-level glimpse at some of the more interesting things that arise from it. The one
that you’ll hear about most often is “P vs NP”, which is what I’m going to write about today.
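As a tiny preview of the asymmetry at the heart of P vs NP, here’s a sketch in Haskell using subset-sum (my example, not from the post): *checking* a proposed solution takes polynomial time, while the obvious way of *finding* one tries exponentially many subsets.

```haskell
import Data.List (nub, subsequences)

-- Verifier: given a certificate (a list of distinct indices into xs),
-- check in polynomial time whether the chosen elements sum to the target.
verifySubsetSum :: Int -> [Int] -> [Int] -> Bool
verifySubsetSum target xs idxs =
  all (\i -> i >= 0 && i < length xs) idxs
    && idxs == nub idxs                       -- no index used twice
    && sum [xs !! i | i <- idxs] == target

-- Solver: brute force over all 2^n subsets -- exponential time.
solveSubsetSum :: Int -> [Int] -> Maybe [Int]
solveSubsetSum target xs =
  case filter ((== target) . sum) (subsequences xs) of
    (s : _) -> Just s
    []      -> Nothing
```

Nobody knows whether the verifier’s easiness can always be turned into an easy solver; that question is exactly P vs NP.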
Basic Computational Complexity
In the comments thread of the post on Turing Equivalent vs Turing Complete, there’ve been a flurry of questions and comments about some stuff involving computational complexity. So I thought it would be interesting to write a couple of posts about computational complexity. I’m not going to get into a lot of detail, because I would need to dig out some books that I really don’t want to read again :-). But I can do the basics without them. It’ll take a few posts to get through this.
Pathological Macros for BrainF**k
Today, I have something really fun and twisted for you. It’s my favorite variation on
BrainF**k, called “BFM”, which is short for “BrainFunct-Macro”. It’s a doubly-Turing-equivalent language – it’s got two complete and distinct Turing equivalent computing systems in it: it’s
got regular BF on the bottom. But before the BF layer ever runs the program,
the program is pre-processed by a macro expander that is itself a complete lambda calculus.
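To give a feel for just how small the BF layer is, here’s a minimal BF interpreter sketched in Haskell (my own sketch: it handles only the `> < + - . [ ]` operators, skips input, and says nothing about BFM’s macro layer).

```haskell
import Data.Char (chr)

-- A parsed BF program: the five simple ops, plus nested loops.
data Op = MoveR | MoveL | Inc | Dec | Out | Loop [Op]

-- Parse matching brackets into a nested structure.
parse :: String -> [Op]
parse = fst . go
  where
    go []          = ([], [])
    go (']' : rest) = ([], rest)
    go ('[' : rest) = let (body, rest')  = go rest
                          (ops,  rest'') = go rest'
                      in (Loop body : ops, rest'')
    go (c : rest)   = let (ops, rest') = go rest
                      in (maybe ops (: ops) (simple c), rest')
    simple '>' = Just MoveR
    simple '<' = Just MoveL
    simple '+' = Just Inc
    simple '-' = Just Dec
    simple '.' = Just Out
    simple _   = Nothing    -- anything else is a comment

-- The tape is a zipper: cells to the left, the current cell, cells to the right.
data Tape = Tape [Int] Int [Int]

blankTape :: Tape
blankTape = Tape (repeat 0) 0 (repeat 0)

-- Execute a program, threading the tape and collecting output.
exec :: [Op] -> Tape -> (Tape, String)
exec []          t                = (t, "")
exec (op : rest) t@(Tape ls c rs) = case op of
  MoveR -> exec rest (let (r : rs') = rs in Tape (c : ls) r rs')
  MoveL -> exec rest (let (l : ls') = ls in Tape ls' l (c : rs))
  Inc   -> exec rest (Tape ls (c + 1) rs)
  Dec   -> exec rest (Tape ls (c - 1) rs)
  Out   -> let (t', out) = exec rest t in (t', chr c : out)
  Loop body
    | c == 0    -> exec rest t
    | otherwise -> let (t',  out)  = exec body t
                       (t'', out') = exec (op : rest) t'
                   in (t'', out ++ out')

run :: String -> String
run prog = snd (exec (parse prog) blankTape)
```

For example, `run "++++++++[>++++++++<-]>+."` builds 65 in the second cell (8 × 8 + 1) and prints `"A"`.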
Turing Equivalent vs. Turing Complete
In my discussion with Sal Cordova in this post, one point came up which I thought was interesting, and worth taking the time to flesh out as a separate post. It’s about the distinction
between a Turing equivalent computing system, and a Turing complete computation. It’s true
that in informal use, we often tend to muddy the line between these two related but distinct concepts. But in fact, they are distinct, and the difference between them can be extremely important. In some sense, it’s the difference between “capable of” and “requires”; another way of looking at it is
“sufficient” versus “necessary”.
Examples of Sheaves
Since the posts on sheaves have been more than a bit confusing, I’m going to take
the time to go through a couple of examples of real sheaves that are used in
algebraic topology and related fields. Today’s example will be the most canonical one:
a sheaf of continuous functions over a topological space. This can be done for *any* topological space, because continuous functions restrict naturally to open subsets, and continuous functions that agree on the overlaps of open sets can be glued together into a single continuous function.
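In symbols (my notation, not the post’s), the canonical sheaf assigns to each open set the continuous real-valued functions on it; restriction and gluing are what make the assignment a sheaf:

```latex
% The assignment, for each open set U in the space X:
\mathcal{C}(U) \;=\; \{\, f : U \to \mathbb{R} \mid f \text{ continuous} \,\}

% Restriction: for open V \subseteq U, the map f \mapsto f|_V.

% Gluing: if \{U_i\} is an open cover of U and the f_i \in \mathcal{C}(U_i) satisfy
f_i|_{U_i \cap U_j} = f_j|_{U_i \cap U_j} \quad \text{for all } i, j,
% then there is a unique f \in \mathcal{C}(U) with f|_{U_i} = f_i for every i.
```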
Balanced Binary Trees in Haskell
So, we’ve built up some pretty nifty binary trees – we can use the binary tree both as the
basis of an implementation of a set, or as an implementation of a dictionary. But our
implementation has had one major problem: it’s got absolutely no way to maintain balance. What
that means is that depending on the order in which things are inserted into the tree, we might
have excellent performance, or we might be no better than a linear list. For example, look at
these trees. As you can see, trees built from the same values can wind up looking quite different. With a good insert order, you can wind up with a nicely balanced tree: the minimum distance from root to leaf is 3; the maximum is 4. On the other hand, take the same values, insert them in a different order, and you
get a rotten tree; the minimum distance from root to leaf is 1, and the maximum is 7. So depending on
luck, you can get a tree that gives you good performance, or one that ends up giving you no better than a plain old list.
Today we’re going to look at fixing that problem. That’s really more of a lesson in data
structures than it is in Haskell, but we’ll need to write more complicated and interesting
data structure manipulation code than we have so far, and it’ll be a lollapalooza of pattern
matching. What we’re going to do is turn our implementation into a red-black tree.
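As a preview of where this is going, here’s a compact sketch of red-black insertion in the style of Okasaki’s functional formulation (my sketch, not necessarily the exact code the post builds up to):

```haskell
data Color = Red | Black deriving (Eq, Show)
data Tree a = Leaf | Node Color (Tree a) a (Tree a) deriving Show

member :: Ord a => a -> Tree a -> Bool
member _ Leaf = False
member x (Node _ l y r)
  | x < y     = member x l
  | x > y     = member x r
  | otherwise = True

-- Insert, then force the root black to preserve the red-black invariants.
insert :: Ord a => a -> Tree a -> Tree a
insert x t = blacken (ins t)
  where
    ins Leaf = Node Red Leaf x Leaf
    ins n@(Node color l y r)
      | x < y     = balance color (ins l) y r
      | x > y     = balance color l y (ins r)
      | otherwise = n
    blacken (Node _ l y r) = Node Black l y r
    blacken Leaf           = Leaf

-- The heart of rebalancing: each of the four red-red violation shapes is
-- rewritten, by pattern matching, into the same balanced configuration.
balance :: Color -> Tree a -> a -> Tree a -> Tree a
balance Black (Node Red (Node Red a x b) y c) z d =
  Node Red (Node Black a x b) y (Node Black c z d)
balance Black (Node Red a x (Node Red b y c)) z d =
  Node Red (Node Black a x b) y (Node Black c z d)
balance Black a x (Node Red (Node Red b y c) z d) =
  Node Red (Node Black a x b) y (Node Black c z d)
balance Black a x (Node Red b y (Node Red c z d)) =
  Node Red (Node Black a x b) y (Node Black c z d)
balance color l x r = Node color l x r

-- Build a tree from a list; even a sorted insert order stays balanced.
fromList :: Ord a => [a] -> Tree a
fromList = foldl (flip insert) Leaf

height :: Tree a -> Int
height Leaf           = 0
height (Node _ l _ r) = 1 + max (height l) (height r)
```

Inserting the keys 1 through 7 in sorted order — the worst case for the naive tree — yields a tree of height 3 rather than a degenerate list of depth 7, and it’s the `balance` patterns that do all the work.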
A Second Stab at Sheaves
I’ve mostly been taking it easy this week, since readership is way down during the holidays, and I’m stuck at home with my kids, who don’t generally give me a lot of time for sitting
and reading math books. But I think I’ve finally got time to get back to the stuff
I originally messed up about sheaves.
I’ll start by talking about the intuition behind the idea of sheaves. The basic idea of
a sheaf is to provide a way of taking some local property of a topological space, and
demonstrating that it holds everywhere. The classic example of this is manifolds, where the *local* property of being almost Euclidean around a single point is extended to being almost Euclidean around *all* points.
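For reference, that local property can be stated precisely (the standard definition, in my phrasing): a space is locally Euclidean when every point has a neighborhood that looks like ordinary n-dimensional space.

```latex
% M is locally Euclidean of dimension n if, for every point p \in M, there is
% an open neighborhood U \ni p and a homeomorphism
\varphi : U \to \varphi(U) \subseteq \mathbb{R}^n
% identifying U with an open subset of \mathbb{R}^n.
```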