Stepping Up Division By Zero to Perfect Encryption

An alert reader sent me a link to a really dreadful piece of drek. In some ways, it’s a
rehash of the “Nullity” nonsense from a couple of years ago, but with a new spin.

If you don’t remember nullity, it was the attempt of one idiot to define division by zero.
He claimed to have “solved” the great problem of dividing by zero, and by doing so, to be able
to do all manner of amazing things, such as building better computers that would be less prone
to bugs.

Today’s garbage is in the
same vein: another guy, this one named Jeff Cook, claims to have “solved” the problem of
division by zero. But he goes further: he claims that this gives him a way to prove the Riemann
hypothesis, to rapidly crack RSA public key encryption, and to devise a new “theoretically
unbreakable” encryption algorithm.

The grandiosity of this Mr. Cook is astonishing. He’s started a company (which is looking
for investors!); here’s a quote from his company’s homepage:

Great scientific discoveries mark the milestones of human history.

Such are the accomplishments achieved by the men and women of Singularics. Standing on the
shoulders of giants such as Albert Einstein and Bernhard Riemann, we have reached up through
nature’s veil and seen what lies hidden there more clearly than anyone else before us. Our
discoveries have yielded a new mathematical framework, one that provides a profound
understanding of nature’s basic mechanics. We have discovered The Science of
Singularics™
, the study of the singularity.

We have already found a variety of important applications of Singularic Technology™,
but perhaps the most immediately useful are Neutronic Encryption™, a new theoretically
unbreakable public key encryption algorithm and Singularic Power™, a new form of clean
power generation.

Neutronic Encryption, our next generation public key encryption algorithm, will play a
vital role in the digital age by ensuring that the electronic information of governments,
industry and individuals is kept secure and private in a world where cyber-terrorism is on the
rise.

We have also developed a new primary power generation system capable of delivering
abundant, clean and inexpensive energy that can satisfy power requirements on any scale.
Singularic Power production technology generates zero pollution and can therefore play an
instrumental role in promoting a harmonious coexistence between human civilization and the
Earth’s fragile ecosystem.

To date, our analysis of the mathematics and physics at the singularity has lead us to
eight important new inventions, most notably in the fields of information security and clean
energy. All eight inventions (patents pending), have significant and immediate application in
the global market.

It is our vision to use these advances to bring about great improvements for everyone
through new technology, intelligently applied.

Mr. Cook doesn’t have too high an opinion of himself, does he?

Continue reading

Gap Buffers, or, Don't Get Tied Up With Ropes?

Last week, I promised to continue my discussion of ropes. I’m going to
break that promise. But it’s in a good cause.

If you’re a decent engineer, one of the basic principles you should
follow is keeping things as simple as possible. One of the essential skills
for keeping things simple is realizing when you’re making something
overly complicated – even if it’s cool, and fun, and interesting.

Working out the rope code for more complex operations, I found myself
thinking that it seemed far too complex for what I was doing. What I’m
doing is writing myself a text editor. I’m not writing a string
processing framework for managing gigabytes of data; I’m just writing a
text editor.

So I decided to try, for the purposes of testing, to look at one of
the simpler data structures. A classic structure for implementing editor
buffers is called a gap buffer. I’ll explain the details below.
The drawback of a gap buffer is that large cursor motions are relatively
expensive – the cost of moving the cursor is proportional to the distance
that you move it.
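
To make that concrete, here’s a minimal sketch of the classic design in Java – an illustration of the general technique, not my actual editor code. The buffer keeps its text in a single array with a gap at the cursor; inserts just fill the gap, and moving the cursor shifts characters across the gap one at a time, which is exactly where the motion cost comes from:

    // A minimal gap buffer: one array, with an empty "gap" at the cursor.
    class GapBuffer {
        private char[] buf = new char[16];
        private int gapStart = 0;          // the cursor position
        private int gapEnd = buf.length;   // first character after the gap

        int length() { return buf.length - (gapEnd - gapStart); }

        // Insert at the cursor: just drop the character into the gap.
        void insert(char c) {
            if (gapStart == gapEnd) grow();
            buf[gapStart++] = c;
        }

        // Move the cursor: shift characters across the gap one at a time,
        // so the cost is proportional to the distance moved.
        void moveTo(int pos) {
            while (gapStart > pos) buf[--gapEnd] = buf[--gapStart];
            while (gapStart < pos) buf[gapStart++] = buf[gapEnd++];
        }

        // Delete the character after the cursor: just widen the gap.
        void delete() {
            if (gapEnd < buf.length) gapEnd++;
        }

        // A deliberately simple resizing strategy: double and copy.
        private void grow() {
            char[] bigger = new char[buf.length * 2];
            System.arraycopy(buf, 0, bigger, 0, gapStart);
            int tail = buf.length - gapEnd;
            System.arraycopy(buf, gapEnd, bigger, bigger.length - tail, tail);
            gapEnd = bigger.length - tail;
            buf = bigger;
        }
    }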

So I wrote a test. I implemented a basic gap buffer. Then I built a
test that did 300,000 inserts of five characters; then slid the cursor
from the end, to the beginning, back to the end, back to the beginning,
and back to the end again. Total execution time? 150 milliseconds. And
that includes resizing the buffer 12 times, because of a deliberately
stupid buffer sizing and copying strategy.
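
Reconstructed as a sketch (not the exact code I ran), the test looked roughly like this:

    // Roughly the test described above: 300,000 five-character inserts,
    // then four full cursor sweeps across the buffer.
    public class GapBufferBench {
        public static void main(String[] args) {
            GapBuffer gb = new GapBuffer();
            long start = System.nanoTime();
            for (int i = 0; i < 300_000; i++)
                for (char c : "abcde".toCharArray()) gb.insert(c);
            int n = gb.length();
            gb.moveTo(0); gb.moveTo(n);   // end -> beginning -> end
            gb.moveTo(0); gb.moveTo(n);   // -> beginning -> end again
            System.out.printf("%d chars in %.0f ms%n",
                              n, (System.nanoTime() - start) / 1e6);
        }
    }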

And with that, out the window went the ropes. Because in Java,
using a slapped-together, sloppy piece of test code, which does things in a dumb
brute-force way, on a workload much heavier than anything an editor
will encounter in routine use, the gap buffer achieves performance that is more than
adequately fast. (Based on some experiments I did back at IBM, delays of 1/10th
of a second are roughly when people start to notice that an editor is slow. If you can respond in less than 1/10th of a second, people don’t perceive a troublesome delay.)
If the gap buffer can achieve the needed performance that easily, then there’s
absolutely no reason to do anything more complicated.

Continue reading

Financial Morons, and Quadratics vs. Linears

I wasn’t going to write about this, because I really don’t have much to add. But people keep mailing it to me, so in order to shut you all up, I’ll chip in.

As everyone knows by now, we’re in the midst of a really horrible
financial disaster. I’ve argued in the past on this blog that the root cause of the entire disaster is pure, simple stupidity on the part of people in the financial business. People gave out mortgages that any
sane, rational person would have considered ridiculous. And then they built huge, elaborate financial structures on top of those mortgages, pretending that by piling layer upon layer, loan upon loan,
they were somehow creating something that could be considered real wealth.

[Chart: bank market capitalizations before and after the crash – the JP Morgan chart discussed below]

They gave themselves bonuses that boggled the mind. Even after the whole ridiculous system came tumbling down, they continue to give themselves ridiculous bonuses. Insane bonuses. They’ve been writing themselves checks for millions of dollars to continue to operate their
businesses – even after taking billions of dollars in loans from the government to prevent them from going out of business. I consider
this to be downright criminal. But even if it’s not criminal, it’s
incredibly stupid. The very people who ran those firms right to the edge of bankruptcy, who nearly took down our entire financial system,
are being rewarded. Not only are they being allowed to continue
to run the businesses that they pretty much destroyed, but they’ve
been paying themselves an astonishing amount of money to do it. And now they’re complaining bitterly about the fact that the government
wants to limit them to a paltry half-million dollars of salary per year.

They argue that they must be allowed to earn more than that. Because after all, the people who run those businesses are special. They’re “the best and the brightest”. They’re
extra-smart. No one else could possibly run those businesses. We can’t rely on anyone who’d accept a puny half-mil – they won’t be smart enough. They don’t have the special knowledge of the business that these people do.

There’s one minor problem with that argument: it doesn’t work. A couple of weeks ago, some idiot at JP Morgan circulated a chart that was supposed to summarize just how bad the financial disaster has been. The chart circulated for a couple of weeks – bounced from mailbox to mailbox, sent from one financial genius to another.

Only the chart was blatantly, obviously, trivially wrong, and anyone who had the slightest damned clue of the assets those businesses managed – i.e., the kind of thing that the idiot who drew the chart was supposed to know – should have been able to tell at a glance how wrong it was. But they didn’t. In fact, the damned thing didn’t stop circulating until (of all people) Bob Cringely
flamed it. Go look at the chart – it’s up at the top of this post.

Continue reading

It Never Stops: Another Silly Creationist Argument

As I’ve mentioned before, I get lots of email from all sorts of people. Lots of it is interesting, and lots of it is stupid. This morning, when I was checking my mail, I found an email from a creationist in my mailbox, which puts forth a “proof” against atheism that I hadn’t seen before. It’s about as idiotic as most creationist arguments are,
but it’s new to me, and it’s interesting to shred it from the viewpoint of mathematical logic.

Here’s his argument:

Now to the main point, and somewhat more interesting stuff. I recently ran across a proof (perhaps not in the mathematical sense; I don’t know) against atheism. Atheism, both in the forms I’ve encountered and in logical necessity, requires that matter/energy is eternal, since the atheist would by necessity argue against any point at which matter/energy came into existence. Nothing does not create something. For anything to be eternal and be in the present, it must by necessity have come from eternity past. However, between eternity past and the present, there is an infinite amount of time. Therefore, traveling forward from eternity past, one could never reach the present day. Therefore, atheism cannot be true, and a definite point at which matter/energy came into existence is necessary.

What’s wrong with this? Where to begin?

Continue reading

Metric Abuse – aka Lying with Statistics

I’m behind the curve a bit here, but I’ve seen and heard a bunch of
people making really sleazy arguments about the current financial stimulus
package working its way through congress, and those arguments are a perfect
example of one of the classic ways of abusing statistics. I keep mentioning metric errors – this is another kind of metric error. The difference between this and some of the other examples that I’ve shown is that this is deliberately dishonest – that is, instead of accidentally using the wrong metric to get a wrong answer, in this case, we’ve got someone deliberately taking one metric, and pretending that it’s an entirely different metric in order to produce a desired result.

As I said, this case involves the current financial stimulus package that’s working its way through congress. I want to put politics aside here: when it comes to things like this financial stimulus, there’s plenty of room for disagreement.
Economic crises like the one we’re dealing with right now are really uncharted territory – they’re very rare, and the ones that we have records of have each had enough unique properties that we don’t have a very good collection of evidence
to use to draw solid conclusions about how recoveries from them work. This isn’t like
physics, where we tend to have tons and tons of data from repeatable experiments; we’re looking at a realm where there are a lot of reasonable theories, and there isn’t enough evidence to say, conclusively, which (if any) of them is correct. There are multiple good-faith arguments that propose vastly different ways of trying
to dig us out of this disastrous hole that we’re currently stuck in.

Of course, it’s also possible to argue in bad faith, by
creating phony arguments. And that’s the subject of this post: a bad-faith
argument that presents real statistics in misleading ways.

Continue reading

Sloppy Arguments about Mutation Rates

My friend Razib, who is one of my
fellow ScienceBloggers, sent me a
link to an interesting attempt
by a creationist at arguing why evolution can’t
possibly work.

I say interesting, because it’s at least a little bit unusual in
its approach. It’s not just the same old regurgitation of the same talking
points. It’s still not a great argument, but it’s at least not as boring as
staring at the same stupid arguments over and over again. Alas, it’s not entirely
new, either. It’s an argument that the mutation rates required for humans to have evolved from a primate ancestor would have to be impossibly high.

Continue reading

The Continuum Hypothesis Solved: All Infinities are the Same? Nope.

Of all of the work in the history of mathematics, nothing seems to attract so much controversy, or even outright hatred, as Cantor’s diagonalization. The idea of comparing the sizes of different infinities – and worse, of actually concluding that there are different infinities, where some infinities are larger than others – drives some people absolutely crazy. As a result, countless people bothered by this have tried to come up with all sorts of arguments about why Cantor was wrong, and there’s only one infinity.

Today’s post is another example of that. This one is sort of special. Unless I’m very much mistaken, the author of this sent me his argument by email last year, and I actually exchanged several messages with him, before he concluded, roughly “We’ll just have to agree to disagree.” (I didn’t keep the email, so I’m not certain, but it’s exactly the same argument, and the author’s name is vaguely familiar. If I’m wrong, I apologize.)

Anyway, this author actually went ahead and wrote the argument up as a full technical paper, and submitted it to arXiv, where you can download it in all its glory. I’ll be honest, and admit that I’m a little bit impressed by this. The proof is still completely wrong, and the arguments that surround it range from wrong to, well, not even wrong. But at least the author has the chutzpah to treat his work seriously, and submit it to a place where it can actually be reviewed, instead of ranting about conspiracies.

For those who aren’t familiar with the work of Cantor, you can read my article on it here. A short summary is that Cantor invented set theory, and then used it to study the construction of finite and infinite sets, and their relationships with numbers. One of the very surprising conclusions was that you can compare the size of infinite sets: two sets have the same size if there’s a way to create a one-to-one mapping between their members. An infinite set A is larger than another infinite set B if every possible mapping from members of B to members of A will exclude at least one member of A. Using that idea, Cantor showed that if you try to create a mapping from the integers to the real numbers, for any possible mapping, you can generate a real number that isn’t included in that mapping – and therefore, the set of reals is larger than the set of integers, even though both are infinite.
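
For anyone who hasn’t seen it, the diagonal construction itself fits in a few lines. This is my paraphrase in standard notation, not a quote from Cantor or from the paper:

    \text{Suppose } f : \mathbb{N} \to (0,1) \text{ is any mapping, with decimal expansions }
    f(n) = 0.d_{n,1}\,d_{n,2}\,d_{n,3}\ldots

    \text{Define } x = 0.x_1 x_2 x_3 \ldots \text{ by }
    x_n = \begin{cases} 5 & \text{if } d_{n,n} \neq 5 \\ 6 & \text{if } d_{n,n} = 5 \end{cases}

    \text{Then } x \text{ differs from } f(n) \text{ in its } n\text{th digit for every } n,
    \text{ so } x \text{ is not in the range of } f.

No matter what mapping you pick, the construction produces a real number that the mapping misses – which is exactly the sense in which the reals outnumber the integers. (Using the digits 5 and 6 also dodges the 0.4999… = 0.5 ambiguity, since x can’t end in all zeros or all nines.)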

This really bothers people, including our intrepid author. In his introduction, he gives his motivation:

Cantor’s theory mentioned in fact that there were several dimensions for infinity. This, however, is questionable. Infinity can be thought as an absolute concept and there should not exist several dimensions for the infinite.

Philosophically, the idea of multiple infinities is uncomfortable. Our intuitive notion of infinity is of an absolute, transcendent concept, and the idea of being able to differentiate – or worse, to be able to compare the sizes of different infinities – seems wrong.

Of course, what seems wrong isn’t necessarily wrong. It seems wrong that the mass of something can change depending on how fast it’s moving. It seems even more wrong that looked at from different viewpoints, the same object can have different masses. But that doesn’t change the fact that it’s true. Reality – and even worse, abstract mathematics – isn’t constrained by what makes us comfortable.

Back to the paper. In the very next sentence, he goes completely off the rails.

Continue reading

Ropes: Twining Together Strings for Editors

It’s been a while since I’ve written about any data structures. But it just so happens that I’m now working on implementing a really interesting and broadly useful data structure, called a Rope.

A bit of background, to lead in. I’ve got this love-hate relationship with some of the development tools that Rob Pike has built. (Rob is one of the Unix guys from Bell Labs, and was one of the principal people involved in both the Plan9 and Inferno operating systems.) Rob has implemented some amazing development tools. The two that fascinate me are called Sam and Acme. The best and worst features of both are a sort of extreme, elegant minimalism. There’s no bloat in Rob’s tools, no eye-candy, no redundancy. They’re built to do a job, and do it well – but not to do any more than their intended job. (This can be contrasted against Emacs, which is a text editor that’s grown into a virtual operating system.) The positive side of this is that they’re incredibly effective, and they demonstrate just how simple a programmer’s text editor should be. I’ve never used another tool that is more effective than Acme or Sam. In all seriousness, I can do more of my work more easily in Sam than I can in Emacs (which is my everyday editor). But on the other hand, that extreme minimalist aesthetic has the effect of strictly eliminating any overlaps: there’s one way to do things, and if you don’t like it, tough. In the case of Acme and Sam, that meant that you used the mouse for damn-near everything. You couldn’t even use the up and down arrows to move the cursor!

This is a non-starter for me. As I’ve mentioned once or twice, I’ve got terrible RSI in my wrists. I can’t use the mouse that much. I like to keep my hands on my keyboard. I don’t mind using the mouse when it’s appropriate, but moving my hand from the keyboard to the mouse every time I want to move the cursor? No. No damned way. Just writing this much of this post, I would have had to go back and forth between the keyboard and mouse over 50 times. (I was counting, but gave up when I hit 50.) A full day of that, and I’d be in serious pain.

I recently got reminded of Acme, because my new project at work involves using a programming language developed by Rob Pike. And Acme would really be incredibly useful for my new project. But I can’t use it. So I decided to bite the bullet, and use my free time to put together an Acme-like tool. (Most of the pieces that you need for a prototype of a tool like that are available as open-source components, so it’s just a matter of assembling them. Still a very non-trivial task, but a possible one.)

Which, finally, leads us back to today’s data structure. The fundamental piece of a text editor is the data structure that you use to represent the text that you’re editing. For simplicity, I’m going to use Emacs terminology, and refer to the editable contents of a file as a Buffer.

How do you represent a buffer?

As usual with data structures, you start by asking: What do I need it to do? What performance characteristics are important?

In the case of a text buffer, you can get by with a fairly small set of basic operations (sketched as a Java interface after the list):

  • Fast concatenation: concatenating blocks of text needs to be really fast.
  • Fast insert: given a point in a block of text, you need to be able to quickly insert text at that point.
  • Fast delete: given two points in a block of text, you need to be able to quickly remove the text between those points.
  • Reasonably fast traversal: Lots of algorithms, ranging from printing out the text to searching it, are based on linear traversals of the contents. This doesn’t have to be incredibly fast – it’s an intrinsically linear process, and it’s usually done in the context of something with a non-trivial cost (I/O, regular-expression scanning). But you can’t afford to make it expensive.
  • Size: you need to be able to store effectively unlimited amounts of text, without significant performance degradation in the operations described above.
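
Written down in Java, that contract looks something like the sketch below. This is my own illustration, with invented names – just a way to pin down the operations, not any particular editor’s API:

    import java.util.Iterator;

    // A hypothetical editor-buffer contract; the names are illustrative.
    interface Buffer extends Iterable<Character> {
        Buffer concat(Buffer other);              // fast concatenation
        void insert(int pos, CharSequence text);  // fast insert at a point
        void delete(int from, int to);            // fast delete of a range
        int length();                             // effectively unlimited size
        Iterator<Character> iterator();           // reasonably fast linear traversal
    }

Different structures implement the same contract with very different costs – which is exactly what makes the choice of representation interesting.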

Continue reading

Lottery Probabilities and Clueless Reporters

A simple, silly, but entertaining example of mathematical illiteracy by way of the Associated Press:

OMAHA, Neb. (AP) — The odds are against something this odd. But a Nebraska Lottery official says there was no mistake: The same three numbers in Nebraska’s Pick 3 lottery were drawn two nights in a row this week.

Lottery spokesman Brian Rockey said one of two lottery computers that randomly generate numbers produced the numbers 1, 9 and 6 — in that order — for Monday night’s Pick 3 drawing. Rockey says the next night, the lottery’s other computer produced the same three numbers in the same sequence.

The odds of such an occurrence? One in a million.

Close… Only off by three orders of magnitude…
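
Just to spell out the arithmetic (my own back-of-the-envelope, using the standard Pick 3 rules: three ordered digits, and the two nights’ drawings independent):

    P(\text{next drawing repeats the previous one}) = \left(\tfrac{1}{10}\right)^{3} = \tfrac{1}{1000}

    P(\text{two drawings both match one pre-chosen sequence}) = \left(\tfrac{1}{1000}\right)^{2} = \tfrac{1}{1{,}000{,}000}

The one-in-a-million figure presumably comes from the second calculation. But the question the story is actually asking is the first one: given Monday’s numbers, what are the odds that the next drawing repeats them? One in a thousand.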

Continue reading