Books for Young Mathgeeks: "A Place for Zero"

I recently had the opportunity to get hold of a collection of children’s picture books with math stories. A fellow scienceblogger had been contacted by a publisher, who offered to send review copies of their books to interested SBers.

The publisher turned out to be the folks who publish the “Sir Cumference” books. My wife bought me a copy of the first of that series as a joke, and my daughter immediately appropriated it, and absolutely loved it. So I requested copies of a large bunch of their math adventures, and I’ll be posting reviews as my daughter and I finish them.

The first one that we read together is “A Place for Zero”, by Angeline Sparagna Lopresti. My daughter picked this one because of the artwork: it’s done in a really attractive style – simple enough to be engaging, and yet complex enough to really be a part of the story.

Continue reading

Mathematical proof that God Spoke Creation (if you buy his book)

One of my fellow SBers, Kevin over at Dr. Joan Bushwell’s Chimpanzee Refuge wrote a scathing article reviewing an incredibly bad anti-evolution blog. There’s no way that I can compete with Kevin’s writing on the topic – you should really check it out for a great example of just how to take a moronic creationist, and reduce him to a whimpering puddle of protoplasm.

But while looking at the site that Kevin shredded, I came across a link to another really, really bad site, and this one is clearly in my territory:
Science Proves Creation, a site set up by an individual named “Samuel J. Hunt”. Mr. Hunt claims to have developed a mathematical proof that the universe was created by God’s words.

Continue reading

Homotopy

I’ve been working on a couple of articles talking about homology, which is an interesting (but difficult) topic in algebraic topology. While I was writing, I used a metaphor based on a technique from homotopy, and realized that while I’ve referred to it obliquely, I’ve never actually talked about homotopy.

When we talked about homeomorphisms, we talked about how two spaces are homeomorphic (aka topologically equivalent) if and only if one can be continuously deformed into the other – that is, roughly speaking, transformed by bending, twisting, stretching, or squashing, so long as nothing gets torn.

Homotopy is a formal equivalent of homeomorphism for functions between topological spaces, rather than between the spaces themselves. Two continuous functions f and g are homotopic if and only if f can be continuously transformed into g.

The neat thing about the formal definition of homotopy is that it finally gives us a strong formal handle on what this continuous deformation stuff means in strictly formal terms.

So, let’s dive in and hit the formalism.

Suppose we’ve got two topological spaces, S and T, and two continuous functions f,g : S → T. A homotopy is a function h which associates every value in the unit interval [0,1] with a function from S to T. So we can treat h as a function from S×[0,1] → T, where ∀x: h(x,0)=f(x) and h(x,1)=g(x). For any given value x, then, h(x,·) is a curve from f(x) to g(x).
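To make that concrete, here’s a minimal Python sketch under some simplifying assumptions of my own: take S to be the unit interval and T to be the real line, so that the straight-line homotopy h(x,t) = (1−t)·f(x) + t·g(x) is available. (It only works because the target space is convex – this is an illustration, not the general construction.)

    import math

    # A minimal sketch (my own illustration): S = [0,1], T = the real line, and
    # the straight-line homotopy h(x,t) = (1-t)*f(x) + t*g(x). This only works
    # because the target space is convex; it is not the general construction.

    def f(x):
        return math.sin(math.pi * x)

    def g(x):
        return x * x

    def h(x, t):
        """Straight-line homotopy from f to g: h(x,0) = f(x), h(x,1) = g(x)."""
        return (1 - t) * f(x) + t * g(x)

    # Sanity-check the endpoint conditions from the definition.
    for x in (0.0, 0.25, 0.5, 1.0):
        assert abs(h(x, 0) - f(x)) < 1e-12
        assert abs(h(x, 1) - g(x)) < 1e-12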

Thus – expressed simply, the homotopy is a function that precisely describes the transformation between the two homotopic functions. Homotopy defines an equivalence relation on continuous functions: two continuous functions between topological spaces are homotopic if there is a homotopy between them. (This paragraph originally included an extremely confusing typo – in the first sentence, I repeatedly wrote “homology” where I meant “homotopy”. Thanks to commenter elspi for the catch!)

We can also define a type of homotopy equivalence between topological spaces. Suppose again that we have two topological spaces S and T. S and T are homotopically equivalent if there are continuous functions f : S → T and g : T → S where g∘f is homotopic to the identity function on S, 1S, and f∘g is homotopic to the identity function on T, 1T. The functions f and g are called homotopy equivalences.

This gives us a nice way of really formalizing the idea of continuous deformation of spaces in homeomorphism – every homeomorphism is also a homotopy equivalence. But it doesn’t go both ways – there are homotopy equivalences that are not homeomorphisms.

The reason why is interesting: if you look at our homotopy definition, the equivalence is based on continuous deformations – including contraction. So, for example, a ball is not homeomorphic to a point – but it is homotopically equivalent. The contraction all the way from the ball to the point doesn’t violate anything about the homotopical equivalence. In fact, there’s a special name for the topological spaces that are homotopically equivalent to a single point: they’re called contractible spaces. (Originally, I erroneously wrote “sphere” instead of “ball” in this paragraph. I can’t even blame it on a typo – I just screwed up. Thanks to commenter John Armstrong for the catch.)
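To see the previous definition at work on the ball/point example, here’s a small Python sketch (the maps f, g, and H below are my own names and choices, not anything standard): f collapses the closed unit ball to a one-point space, g sends that point back to the origin, and H is the straight-line contraction witnessing that g∘f is homotopic to the identity on the ball.

    import numpy as np

    # A sketch of the ball/point homotopy equivalence (my own illustration).
    # f collapses the closed unit ball B to a one-point space {pt}; g includes
    # that point back into B as the origin. f∘g is literally the identity on
    # {pt}, and H below is a homotopy from the identity on B (at t=0) to g∘f,
    # the constant map at the origin (at t=1).

    def f(x):
        return "pt"                      # the unique point of the one-point space

    def g(p):
        return np.zeros(2)               # send the point to the origin of the ball

    def H(x, t):
        """Straight-line contraction: H(x,0) = x, H(x,1) = g(f(x)) = 0."""
        return (1 - t) * np.asarray(x)

    x = np.array([0.6, -0.3])            # some point of the closed unit ball
    assert np.allclose(H(x, 0), x)       # t=0: the identity map
    assert np.allclose(H(x, 1), g(f(x))) # t=1: the constant map at the origin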

Addendum: Commenter elspi mentioned another wonderful example of a homotopy equivalence that isn’t a homeomorphism, and I thought it was a good enough example that I wish I’d included it in the original post, so I’m promoting it here. The Möbius band is homotopically equivalent to a circle – collapse the band down to its central circle, and the twist “disappears”, leaving you with a circle. But it’s pretty obvious that the Möbius band is not homeomorphic to a circle! Thanks again, elspi – great example!

Clear Object-Oriented Programming? Not in Glass

Today’s bit of programming insanity is a bit of a novelty: it’s an object-oriented programming language called Glass, with an interpreter available here. So far in all of my Friday Pathological Programming columns, I haven’t written about a single object-oriented language. But Glass is something
special. It’s actually sort of a warped cross between Smalltalk and Forth – two things that should never have gotten together; in the words of the language designer, Gregor Richards, “No other language is implemented like this, because it would be idiotic to do so.”

Continue reading

Basics: Discrete vs Continuous

One thing that I frequently touch on casually as I’m writing this blog is the distinction between continuous mathematics, and discrete mathematics. As people who’ve been watching some of my mistakes in the topology posts can attest, I’m much more comfortable with discrete math than continuous.

Continue reading

Not Quite Basics: Sorting Algorithms

Multiple people have written to me, after seeing yesterday’s algorithms basics post, asking me to say more about sorting algorithms. I have to say that it’s not my favorite topic – sorting is one of those old bugaboos that you can’t avoid, but which gets really dull after a while. But there is a kernel of interest to it – sorting can be used to demonstrate a lot of interesting ideas about
computational complexity.
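As a taste of what that kernel looks like, here’s a minimal sketch (my illustration, not part of the original post): an O(n²) insertion sort next to an O(n log n) merge sort, the standard pair for showing how the choice of algorithm changes the asymptotic cost of the same task.

    import random

    def insertion_sort(items):
        """Worst-case O(n^2) comparisons: an element may swap past all its predecessors."""
        items = list(items)
        for i in range(1, len(items)):
            j = i
            while j > 0 and items[j - 1] > items[j]:
                items[j - 1], items[j] = items[j], items[j - 1]
                j -= 1
        return items

    def merge_sort(items):
        """O(n log n) comparisons: split in half, sort recursively, merge."""
        if len(items) <= 1:
            return list(items)
        mid = len(items) // 2
        left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
        merged = []
        while left and right:
            merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
        return merged + left + right

    data = random.sample(range(1000), 200)
    assert insertion_sort(data) == merge_sort(data) == sorted(data)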

Continue reading

Basics: Algorithm

A kind reader pointed out that I frequently mention algorithms, but that I haven’t defined them in the basics posts. To me, they’re so fundamental to the stuff I do for a living that I completely forgot that they’re an interesting idea.

It’s really pretty simple: an algorithm is a description of a mechanical procedure for performing a mathematical task. But that’s actually a fairly sophisticated idea – the general concept of things that describe a procedure, and which can be discussed, described, and reasoned about as entities in their own right, is something quite different from nearly anything that came before it.
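To pin that down with the oldest example in the book (my illustration, not from the post): Euclid’s procedure for the greatest common divisor is a mechanical procedure in exactly this sense – a finite description that anyone, or any machine, can follow step by step.

    def gcd(a, b):
        """Euclid's algorithm: replace (a, b) with (b, a mod b) until the remainder is 0."""
        while b != 0:
            a, b = b, a % b
        return a

    assert gcd(48, 36) == 12
    assert gcd(17, 5) == 1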

Continue reading

The Jackpot of Crankery: Woo Physics, Woo Medicine, Woo Politics, and Woo Math

Over in the thread about Engineer Borg and his whacked-out electromagnetic theory
of gravity, a commenter popped up and pointed at the website of someone named Tom Bearden, who supposedly has shown how to generate free “vacuum” energy using electronic and/or electromagnetic devices.

I hadn’t heard of Dr. Bearden before, and promised to take a look at his website.

So I went and took a look. And wow, I hit the jackpot! This is an absolute masterwork of crackpottery. Dr. Bearden’s lunacy covers just about every conceivable topic, from conspiracy theories, to HIV denialism, to wacky physics, magical woo healing devices, post-Soviet KGB collaborations with the Japanese government to shoot down American planes and manipulate weather….

To give you a bit of flavor: he’s got a bibliography of information that allegedly supports his theories. If you take a look at it, the first thing you see is listed as “National Science Foundation letter favorably reviewing Bearden Paper”. What that link actually contains is a scanned letter from the NSF replying to an email sent by Dr. Bearden – a basic standardized form letter inviting him to submit an actual proposal, and warning that he’d better include some proof that his perpetual motion machine really works, along with an explanation of how.

Continue reading

A Math Geek on Dr. Egnor's Evasions of Evolutionary Information

PZ has already commented on this, but I thought that I’d throw in my two cents. A surgeon, Dr. Michael Egnor, posted a bunch of comments on a Time magazine blog that was criticizing ID. Dr. Egnor’s response to the criticism was to ask: “How much new information can Darwinian mechanisms generate?”

Continue reading

The Most Pathological Machine I've Ever Seen: Tag

What we have here is a truly warped language.

Back in the very early days of what eventually became computer science, many of the people working in the field invented all sorts of automata and computing formalisms. The one that I’ve always found the most confounding is the Tag machine invented by Emil Post.

The tag machine is simple to the point of triviality. The machine is a queue of characters (with one character designated as “Halt”), and a set of rules. Each rule has a distinct character that selects the rule, and a string of characters. At each step, the machine looks at the first character of the queue, and selects the rule associated with that character. If the character is “Halt”, the machine just stops, and whatever is left on the queue is the result of the computation. Otherwise, it appends the selected rule’s string of characters to the end of the queue, and then removes a fixed number of characters from the front of the queue. The machines are called “n-Tag” machines, where “n” is the number of characters dropped each step.

That’s it. Look at the front of the queue, use it to pick a string of characters to append, and then remove and discard the first N characters from the queue.
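Here’s a minimal Python sketch of that machine, following the description above (the rule encoding and the toy example are my own, not anything from Post or from the language described below):

    from collections import deque

    def run_tag(n, rules, start, halt="H", max_steps=10_000):
        """Simulate an n-tag machine: read the head symbol, append its production,
        then drop n symbols from the front. Stops on the halt symbol, when fewer
        than n symbols remain, or after max_steps."""
        q = deque(start)
        for _ in range(max_steps):
            if len(q) < n or q[0] == halt:
                break
            q.extend(rules[q[0]])     # append the production for the head symbol
            for _ in range(n):        # then discard n symbols from the front
                q.popleft()
        return "".join(q)

    # A tiny 2-tag example: starting from "aaa", the machine halts after three
    # steps with "HbH" left on the queue.
    print(run_tag(2, {"a": "bH", "b": "a"}, "aaa"))   # prints HbH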

For any N≥2, a Post N-tag machine is Turing equivalent.

Like I said above – I’ve always found the Post Tag machine to be thoroughly confounding. I can grasp why it’s Turing equivalent, but for the life of me, I’ve never been able to figure out how to actually implement anything interesting on one. So, I figured, there are tons of esoteric/pathological language fans out there who love nothing more than the challenge of writing interesting programs for bizarre languages; I’ll write a Post-tag language, throw it to the wolves, and see what happens!

Continue reading