Monthly Archives: October 2009

Another chaos theory post is in progress. But while I was working on it, a couple of comments arrived on some old posts. In general, I’d reply on those posts if I thought it was worth it. But the two comments are interesting not because they actually lend anything to the discussion to which they are attached, but because they are perfect demonstrations of two of the most common forms of crackpottery – what I call the “Education? I don’t need no stinkin’ education” school, and the “I’m so smart that I don’t even need to read your arguments” school.

Chaos and Initial Conditions
One thing that I wanted to do when writing about Chaos is take a bit of time to really home in on each of the basic properties of chaos, and take a more detailed look at what they mean.
To refresh your memory, for a dynamical system to be chaotic, it needs to have three basic properties:
- Sensitivity to initial conditions,
- Dense periodic orbits, and
- Topological mixing.
The phrase “sensitivity to initial conditions” is actually a fairly poor description of what we really want to say about chaotic systems. Lots of things are sensitive to initial conditions, but are definitely not chaotic.
Before I get into it, I want to explain why I’m obsessing over this condition. It is, in many ways, the least important condition of chaos! But here I am obsessing over it.
As I said in the first post in the series, it’s the most widely known property of chaos. But I hate the way that it’s usually described. It’s just wrong. What chaos means by sensitivity to initial conditions is really quite different from the more general concept of sensitivity to initial conditions.
To illustrate, I need to get a bit formal, and really define “sensitivity to initial conditions”.
To start, we’ve got a dynamical system, which we’ll call f. To give us a way of talking about “differences”, we’ll establish a measure on f. Without going into full detail, a measure is a function m which maps each point x in the phase space of f to a real number m(x), and which has the property that points that are close together in f have measure values which are close together.
Given two points x and y in the phase space of f, the distance between those points is the absolute value of the difference of their measures: d(x, y) = |m(x) − m(y)|.
So, we’ve got our dynamical system, with a measure over it for defining distances. One more bit of notation, and we’ll be ready to get to the important part. When we start our system at an initial point x, we’ll write the resulting system f_x, so that f_x(t) is its state at time t.
What sensitivity to initial conditions means is that no matter how close together two initial points x and y are, if you run the system for long enough starting at each point, the results will be separated by as large a value as you want. Phrased informally, that’s actually confusing; but when you formalize it, it actually gets simpler to understand:
Take the system f with measure m. Then f is sensitive to initial conditions if and only if:
- Select any two distinct points x and y, as close together as you like: |m(x) − m(y)| can be arbitrarily small, as long as it isn’t zero.
- Let diff(t) = |m(f_x(t)) − m(f_y(t))|. (Let diff(t) be the distance between f_x and f_y at time t.)
- Then for every bound G, there is a time T with diff(T) > G. (No matter what value you choose for G, at some point in time T, diff(T) will be larger than G.)
Now – reading that, a naive understanding would be that diff(T) increases monotonically as T increases – that is, that for any two times t1 and t2, if t1 < t2, then diff(t1) < diff(t2). But that’s not what chaos means. To see the difference, let’s compare two systems: one chaotic, one not. For the chaotic one, we’ll use the logistic map, f(x) = R·x·(1−x), with R in its chaotic range. And for our non-chaotic system, we’ll use a simple quadratic iteration, f(x) = x², with starting values slightly greater than 1.
Think about arbitrarily small differences in starting values. In the quadratic iteration, even if you start off with a minuscule difference – starting at v0=1.00001 and v1=1.00002 – you’ll get a divergence. They’ll start off very close together – after 10 steps, they only differ by about 0.01. But they rapidly start to diverge. After 15 steps, they differ by about 0.5. By 16 steps, they differ by about 1.8; by 20 steps, they differ by about 1.2×10⁹! That’s clearly a huge sensitivity to initial conditions – an initial difference of 1×10⁻⁵, and in just 20 steps, their difference is measured in billions. Pick any arbitrarily large number that you want, and if you scan far enough out, you’ll get a difference bigger than it. But there’s nothing chaotic about it – it’s just an incredibly rapidly growing curve!
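To see those numbers for yourself, here’s a quick sketch of my own (not from the original post) that just iterates x → x² from the two starting points and prints how far apart they’ve drifted. The exact values depend a little on rounding, but the explosive, one-way growth is unmistakable:

```python
# Iterate the non-chaotic quadratic map x -> x^2 from two nearby starting
# points and watch the gap between them blow up. (Illustrative sketch only.)

def square_iterate(x0, steps):
    x = x0
    for _ in range(steps):
        x = x * x
    return x

v0, v1 = 1.00001, 1.00002
for steps in (10, 15, 16, 20):
    gap = abs(square_iterate(v1, steps) - square_iterate(v0, steps))
    print(f"after {steps:2d} steps, difference = {gap:.6g}")

# The gap grows monotonically and without bound -- sensitive to initial
# conditions in the naive sense, but there's nothing chaotic about it.
```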
In contrast, the logistic curve is amazing. Look far enough out, and you can find a point in time where the difference in measure between starting at 0.00001 and 0.00002 is as large as you could possibly want; but also, look far enough out past that divergence point, and you’ll find a point in time where the difference is as small as you could possibly want! The measure values of systems starting at x and y will sometimes be close together, and sometimes far apart. They’ll continually vary – sometimes getting closer together, sometimes getting farther apart. At some point in time, they’ll be arbitrarily far apart. At other times, they’ll be arbitrarily close together.
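Here’s a companion sketch (again my own; the post doesn’t give a specific growth rate, so I’m assuming R = 4, a value well inside the chaotic regime) that tracks the gap between two logistic-map trajectories started at 0.00001 and 0.00002. Unlike the quadratic case, the gap doesn’t just grow: it keeps swinging between nearly 1 and nearly 0.

```python
# Two trajectories of the logistic map f(x) = R*x*(1-x), started a hair apart.
# Assumption on my part: R = 4, a fully chaotic value.

R = 4.0

def logistic_trajectory(x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(R * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.00001, 2000)
b = logistic_trajectory(0.00002, 2000)
gaps = [abs(x - y) for x, y in zip(a, b)]

# Once the trajectories decorrelate, the gap keeps swinging between
# nearly 0 and nearly 1 -- it never settles into "always far apart".
late = gaps[100:]
print("largest gap after step 100: ", max(late))   # close to 1
print("smallest gap after step 100:", min(late))   # close to 0
```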
That’s a major hallmark of chaos. It’s not just that given arbitrarily close together starting points, they’ll eventually be far apart. That’s not chaotic. It’s that they’ll be far apart at some times, and close together at other times.
Chaos encompasses the so-called butterfly effect: a butterfly flapping its wings in the Amazon could cause an ice age a thousand years later. But it also encompasses the sterile elephant effect: a herd of a million rampaging giant elephants crushing a forest could end up having virtually no effect at all a thousand years later.
That’s the fascination of chaotic systems. They’re completely deterministic, and yet completely unpredictable. What makes them so amazing is how they’re a combination of incredible simplicity and incredible complexity. How many systems can you think of that are really much simpler to define than the logistic map? But how many have outcomes that are harder to predict?
Chaos: Bifurcation and Predictable Unpredictability
Let’s look at one of the classic chaos examples, which demonstrates just how simple a chaotic system can be. It really doesn’t take much at all to push a system from being nice and smoothly predictable to being completely crazy.
This example comes from mathematical biology, and it generates a graph commonly known as the logistic map. The question behind the graph is: how can I predict what the stable population of a particular species will be over time?
If there was an unlimited amount of food, and there were no predators, then it would be pretty easy. You’d have a pretty straightforward exponential growth curve. You’d have a constant, R, which is the growth rate. R would be determined by two factors: the rate of reproduction, and the rate of death from old age. With that number, you could put together a simple exponential curve – and presto, you’d have an accurate description of the population over time.
But reality isn’t that simple. There’s a finite amount of resources – that is, a finite amount of food for your population to consume. So there’s a maximum number of individuals that could possibly survive – if you get more than that, some will die until the population shrinks below that maximum threshold. Plus, there are factors like predators and disease, which reduce the available population of reproducing individuals. The growth rate only considers “How many children will be generated per member of the population?”; predators cull the population, which effectively reduces the growth rate. But it’s not a straightforward relationship: the number of individuals that will be consumed by predators and disease is related to the size of the population!
Modeling this reasonably well turns out to be really simple. You take the maximum population based on resources, Pmax. You then describe the population at any given point in time as a population ratio: a fraction of Pmax. So if your environment could sustain one million individuals, and the population is really 500,000, then you’d describe the population ratio as 1/2.
Now, you can describe the population at time T with a recurrence relation:
P(t+1) = R × P(t) × (1 − P(t))
That simple equation isn’t perfect, but its results are impressively close to accurate. It’s good enough to be very useful for studying population growth.
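As a quick illustration (a sketch of my own, with an arbitrarily chosen growth rate, not something from the original post), here’s what iterating that recurrence looks like for a single value of R: the population ratio settles down to a stable value.

```python
# Iterate the population recurrence P(t+1) = R * P(t) * (1 - P(t)) for one
# growth rate and watch the population ratio settle. (Illustrative sketch;
# R = 2.6 and the starting ratio are my own arbitrary choices.)

R = 2.6
p = 0.1          # start at 10% of the maximum sustainable population
for t in range(1, 21):
    p = R * p * (1 - p)
    if t % 5 == 0:
        print(f"t = {t:2d}: population ratio = {p:.4f}")

# For this R, the ratio converges to the fixed point 1 - 1/R, about 0.6154.
```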
So, what happens when you look at the behavior of that function as you vary R? You find that below a certain threshold value, the population falls to zero. Cross that threshold, and you get a nice increasing curve, which is roughly what you’d expect. Up until you hit R=3. Then it splits, and you get an oscillation between two different values. If you keep increasing R, it will split again – your population will oscillate between 4 different values. A bit farther, and it will split again, to eight values. And then things start getting really wacky – because the curves converge on one another, and even start to overlap: you’ve reached chaos territory. On a graph of the function, at that point, the graph becomes a black blur, and things become almost completely unpredictable. It looks like the beautiful diagram at the top of this post that I copied from Wikipedia (it’s much more detailed than anything I could create on my own).
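You can watch those splits happen numerically. Here’s a rough sketch (mine, not the author’s; the particular R values and the rounding tolerance are arbitrary choices) that discards the transient and then counts how many distinct values the population keeps cycling through:

```python
# How the long-run behavior of P(t+1) = R * P(t) * (1 - P(t)) changes with R.
# (Illustrative sketch; the sample R values below are my own choices.)

def long_run_values(r, warmup=2000, samples=64):
    """Discard the transient, then collect the values the population visits."""
    x = 0.5
    for _ in range(warmup):
        x = r * x * (1 - x)
    seen = []
    for _ in range(samples):
        x = r * x * (1 - x)
        seen.append(round(x, 4))
    return sorted(set(seen))

for r in (0.8, 2.8, 3.2, 3.5, 3.56, 3.7):
    vals = long_run_values(r)
    label = f"{len(vals)} value(s)" if len(vals) <= 8 else "many values (chaotic band)"
    print(f"R = {r}: {label} -> {vals[:8]}")
```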
But here’s where it gets really amazing.
Take a look at that graph. You can see that it looks fractal. With a graph like that, we can look for something called a self-similarity scaling factor. The idea of an SS-scaling factor is that we’ve got a system with strong self-similarity. If we scale the graph up or down, what’s the scaling factor at which a scaled version of the graph will exactly overlap with the un-scaled graph?
For this population curve, the SSSF turns out to be about 4.669.
What’s the SSSF for the Mandelbrot set? 4.669.
In fact, the SSSF for nearly all bifurcating systems that we see, and their related fractals, is virtually always exactly 4.669. There’s a basic structure which underlies all systems of this sort.
What’s this sort? Basically, it’s a dynamical system with a quadratic maximum. In other words, if you look at the recurrence relation for the dynamical system, it’s got a quadratic factor, and it’s got a maximum value. The equation for our population system can be written P(t+1) = R×P(t) − R×P(t)², which is obviously quadratic, and it always produces a value between zero and one, so it’s got a fixed maximum value. Pick any chaotic dynamical system with a quadratic maximum, and you’ll find this constant in it. Any dynamical system with those properties will have a recurrence structure with a scaling factor of 4.669.
That number, 4.669, is called the Feigenbaum constant, after Mitchell Feigenbaum, who first discovered it. Most people believe that it’s a transcendental number, but no one is sure! We’re not really sure quite where the number comes from, which makes it difficult to determine whether or not it’s really transcendental!
But it’s damned useful. By knowing that a system is subject to recurrence at a rate determined by Feigenbaum’s constant, we know exactly when that system will become chaotic. We don’t need to continue to observe it as it scales up to see when the system will go chaotic – we can predict exactly when it will happen just by virtue of the structure of the system. Feigenbaum’s constant predictably tells us when a system will become unpredictable.
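To make that concrete, here’s a rough numerical sketch (my own, not from the post; the warmup length, tolerance, and search brackets are arbitrary choices) that estimates the constant for the population map directly. It hunts for the first few values of R where the period doubles, then takes ratios of the gaps between them:

```python
# Estimate the Feigenbaum constant for P(t+1) = R*P(t)*(1-P(t)) by locating
# the first few period-doubling values of R and taking ratios of the gaps
# between them. (Rough illustrative sketch; numeric parameters are arbitrary.)

def attractor_period(r, max_period=64, warmup=20000, tol=1e-7):
    """Run past the transient, then find the smallest period of the attractor."""
    x = 0.5
    for _ in range(warmup):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(2 * max_period):
        x = r * x * (1 - x)
        orbit.append(x)
    for p in range(1, max_period + 1):
        if all(abs(orbit[i] - orbit[i + p]) < tol for i in range(len(orbit) - p)):
            return p
    return None  # no short period found

def first_r_with_period(target, r_lo, r_hi):
    """Bisect for the smallest R where the attractor's period reaches `target`.

    Relies on the period being non-decreasing in R below the chaotic threshold,
    and on r_hi already having period >= target.
    """
    for _ in range(40):
        mid = (r_lo + r_hi) / 2
        p = attractor_period(mid)
        if p is not None and p >= target:
            r_hi = mid
        else:
            r_lo = mid
    return r_hi

doubling_rs = []
lo = 2.5
for k in range(1, 6):                      # onsets of period 2, 4, 8, 16, 32
    rk = first_r_with_period(2 ** k, lo, 3.5695)
    doubling_rs.append(rk)
    lo = rk

for i in range(2, len(doubling_rs)):
    prev_gap = doubling_rs[i - 1] - doubling_rs[i - 2]
    next_gap = doubling_rs[i] - doubling_rs[i - 1]
    print(f"gap ratio: {prev_gap / next_gap:.3f}")   # heads toward ~4.669
```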
Sorry, Denise – but God didn't make numbers
I was planning on ignoring this one, but tons of readers have been writing
to me about the latest inanity spouting from the keyboard of Discovery
Institute’s flunky, Denise O’Leary.
Here’s what she had to say:
Even though I am not a creationist by any reasonable definition, I sometimes get pegged as the local gap tooth creationist moron. (But then I don’t have gaps in my teeth either. Check unretouched photos.)

As the best gap tooth they could come up with, a local TV station interviewed me about “superstition” the other day.

The issue turned out to be superstition related to numbers. Were they hoping I’d fall in?

The skinny: Some local people want their house numbers changed because they feel the current number assignment is “unlucky.”

Look, guys, numbers here are assigned on a strict directional rota. If the number bugs you so much, move.

Don’t mess up the street directory for everyone else. Paramedics, fire chiefs, police chiefs, et cetera, might need a directory they can make sense of. You might be glad for that yourself one day.

Anyway, I didn’t get a chance to say this on the program so I will now: No numbers are evil or unlucky. All numbers are – in my view – created by God to march in a strict series or else a discoverable* series, and that is what makes mathematics possible. And mathematics is evidence for design, not superstition.

The interview may never have aired. I tend to flub the gap-tooth creationist moron role, so interviews with me are often not aired.

* I am thinking here of numbers like pi, that just go on and on and never shut up, but you can work with them anyway. (You just decide where you want to cut the mike.)
Saturday Recipe: Chicken Mole Enchiladas
I forgot to take a picture of this dish – so Physioprof, shut up 🙂
I don’t even pretend that this is an authentic mexican mole. It’s
something that I whipped together because I felt like a mole, and
I worked from very vague memories of a mole recipe I read years ago,
and ad-libbed this. So it’s absolutely not authentic – but it is
yummy.
Ingredients
- 2 pounds chicken thighs, bone in, skin on.
- One large onion, diced.
- 2 cloves garlic, minced.
- 1 teaspoon coriander powder.
- 1 teaspoon cumin powder.
- 1/4 teaspoon cinnamon powder.
- 1 teaspoon mexican oregano.
- 1/2 teaspoon epazote.
- 1 chipotle pepper in adobo sauce, finely minced.
- 1 large dried ancho chili pepper.
- 1 dried serrano chile pepper.
- One can diced tomatoes.
- 2 ounces dark chocolate, chopped.
- 1/4 cup tequila.
- 1 dozen corn tortillas, lightly toasted.
- 1 tablespoon whole almonds.
- chicken stock.
- Cheese. (I use cheddar; you should use a mexican queso blanco,
but I don’t have access to a decent one.)
Instructions
- Put a pan on high heat. When it’s good and hot, start adding chicken thighs, skin side down, to the dry pan. (You’re going to get fat from the chicken skin.) Brown them well on both sides, then remove.
- Reduce the heat to medium, and add the onions to the pan with the chicken fat. Stir, and let them cook for several minutes until they’re translucent.
- Take the dried peppers, remove the seeds, and crush/chop them finely. (Depending on the peppers, they may be brittle, in which case you’ll need to just crush them in a mortar and pestle; or they may be leathery, in which case you’ll need to mince them.)
- Add the garlic, chipotle, and dried chilis to the onions, and let them cook for about 3 minutes.
- Add the tequila, and let it cook until most of the liquid has evaporated.
- Add the can of tomatoes, the cumin, the cinnamon, and the coriander. Stir it to mix, and then re-add the chicken. Add chicken stock until the chicken is covered.
- Let it simmer on medium-low heat for about 20 minutes.
- Turn off the heat, and remove the chicken from the sauce. Set it aside and let it cool.
- In small portions, move the sauce to a blender, and puree it to a smooth sauce.
- Put the pureed sauce back into the pan, and turn the heat on low. Let it simmer for another 10 minutes.
- Pull the chicken meat from the thighs, and shred it. Move it into another pan. Add a couple of tablespoons of the sauce and a cup of chicken stock, and simmer it for half an hour.
- Shred one half of a corn tortilla into the blender, and add the almonds. Add just enough chicken stock to cover them, and puree until smooth.
- Add the pureed tortilla and almonds into the sauce, and stir them in. Let it cook until the sauce starts to thicken.
- Lower the heat on the sauce to low. Add the chocolate to the sauce, and stir until it’s melted and well-blended in.
- Taste the sauce, and add salt, black pepper, and sugar to taste.
- Toast the tortillas lightly until they’re softened.
- Into each tortilla, spoon a couple of teaspoons of the shredded chicken, roll it, and then put it into a baking dish.
- Spoon the sauce over the filled tortillas. Don’t overdo it – you want them nicely coated, but not drowned.
- Shred cheese over the top of the sauce.
- Bake the casserole with the tortillas for 30 minutes at 350 degrees.
Serve it with a nice mexican rice and beans.
Humans not yet Perfect? There must be a god involved!
First, a quick status note: the blog has been really slow lately because I
fell behind schedule on my book, and I’ve been putting all of my free time
into catching up. I’m finally pretty much caught up, so I should have time to
get back to the Chaos theory posts. I need a few days of study time to get
myself back up to speed, and then some actual good contentful posts should
start showing up.
In the meantime, for your entertainment, I’ve been looking at a really
silly website that was sent to me by a reader with entirely too much free time
on his hands. It’s another one of those supposed proofs of the existence of
God and the correctness of fundamentalist Christianity. In a typically humble
(and ungrammatical) fashion, the site is called “4 Step Perfect Proof for God
of the Bible, above all other claims on the uncreated creator”. And to give
the author a minuscule amount of credit, it’s not an argument that
I recall seeing before. It’s a crappy argument, but at least it’s
a sort-of novel crappy argument that I haven’t seen before.
The basic idea of it? The fact that we are not perfect means that we must have been created by a perfect God. Is it me, or is there something a bit weird about that argument?
Saturday Recipe: Home-Made Roasted Tomato Salsa
Lately, Fridays have just been too busy for me to get around to posting a recipe. So I decided to switch my recipe posts to Saturday. I’ll try to be reliable about posting a recipe every Saturday.
I tried making homemade salsa for the first time about two months ago. Once I’d made a batch of homemade, that was pretty much the end of buying salsa. It’s really easy to make, and fresh is just so much better than anything out of a jar. When it takes just five minutes of cooking to make, there’s just no reason to pay someone else for a jar of something that’s not nearly as good.
This recipe isn’t much to look at. It’s a tomato salsa – it looks pretty much like a salsa you’d buy in a store, except that it’s a paler pink, because the tomatoes weren’t cooked down. But in terms of taste, it’s an absolute knockout.
The original version of this recipe came from Mark Miller’s Salsa cookbook, which is a fantastic little book. But since I first made it, I’ve made enough changes that it’s really a very different salsa. Obviously, I like mine better :-).
The Conservative Rewrite of the Bible
This is really off-topic for GM/BM, but I just can’t resist
mocking the astonishing stupidity of the Conservapedia folks.
I’m sure you’ve heard by now that Andy Schlafly and his pals are
working on a “new translation” of the bible. They say that they need to do this
in order to remove liberal bias, which is “the single biggest distortion in modern
Bible translations”. You see, “translation bias in converting the original language
to the modern one” is the largest source of what they call translation errors, and it
“requires conservative principles to reduce and eliminate”.
Plenty of people have mocked the foolishness of this. So many, in fact, that
I can’t decide which one to link to! But what’s been left out of all of the mockings
that I’ve seen so far is one incredibly important point.
What the “Conservative Bible Project” is doing is not translating
the bible. It is rewriting the bible to make it say what they want it to
say, without regard for what it actually says. These people, who insist
that every word of their holy texts must be taken as absolute literal truth
without interpretation, are rewriting their bible to make it say
what they want it to say.
Friday Random Ten, 10/2
- Dead Soul Tribe, “Goodbye City Life”: mediocre prog metal. Not bad, but nothing special either.
- Dave Matthews Band, “Lying in the Hands of God”: I know, lots of people think I’m crazy to like DMB. But I do. And I find this song terribly depressing. One of the members of the DMB was an amazing saxophone player named LeRoi Moore. Moore’s saxophone playing was absolutely fantastic – incredibly skillful, tasteful, with a huge range. Moore was killed in an auto accident, and his place was taken in live shows by Jeff Coffin from the Flecktones. Coffin is, in my opinion, a godawful gimmicky player with no taste, no style, and who knows one volume setting: way too loud. This track uses old samples of Moore from before he died – the last time we’ll get to hear his beautiful playing.
- Marillion, “The Space” (electric): this one is actually a double. I just got the digital version of Marillion’s new album, which consists of acoustic rewrites of a selection of their old songs. This is one of the tracks that they chose. The original version is from “Season’s End”, the band’s first recording with Steve Hogarth as the lead singer. It’s a great song – one of the best from that album. The original version is very interesting – because it’s recognizably Marillion, and yet there’s a huge difference to the sound of the song compared to the stuff they’d been performing with Fish on vocals – and that basic difference emerged all at once on this album, and stayed with them through the dozen albums since. Like I said, it’s classic Marillion, with beautiful transitions, elegant instrumental breaks, intricate structure. A lovely song, which is carried by Hogarth’s vocals, Kelly’s keyboards, and Rothery’s electric guitar.
- Marillion, “The Space” (acoustic rewrite): An amazing difference. From an incredibly dense electric song, to a sparse, intimate acoustic one. It’s not just an acoustic remix, but a really deep rewrite of the song. The rhythm of the vocals has changed. The main vocals are now sung mainly against acoustic bass guitar and a bit of rhythmic chunking on the guitar. Everything is much more syncopated. It’s hard to believe it’s the same song. I need a few more listens – but I think I actually prefer this newer version – the rhythmic changes and the sparse arrangement just increase the emotional impact of the song. It’s really quite impressive.
- IQ, “Breathtaker”: Bit of a jarring change after the acoustic version of “The Space”. But IQ is one of the very best neo-progressive bands out there. Like Marillion, they started off as a Genesis sound-alike, but grew into their own sound. Great song, from “Subterranea”, the IQ album to buy if you’ve never heard them before.
- Isis, “From Sinking”: Post-rock, from one of the harder/louder post-rock bands. Isis is a bit of a harder listen for many people, because they include death-metal-style screeched vocals, which can really grate. But their overall sound is brilliant – it’s worth getting over the vocals to enjoy them.
- Dirty Three, “Feral”: Another big transition, but still post-rock. Dirty Three is a mostly-acoustic post-rock ensemble from the more classical end of the spectrum. Their compositional style is much more minimalistic than a lot of others. But it’s beautiful stuff. Highly recommended.
- The Flower Kings, “Flight 999 Brimstone Air”: What can I say about the Flower Kings that I haven’t said before? They’re a neo-progressive band that’s fit to drop the neo – they could stand up well next to pretty much any of the original wave of prog in both quality and creativity. This is a typical instrumental track from them. If you’ve never listened to the Flower Kings, give them a try. It’s pure brilliance.
- Isotope 217, “New Beyond”: This is hard to classify. It might be sort-of progressive rock. It might be sort-of odd jazz fusion. I just don’t even know where to put it. It’s a recent acquisition, and to be honest, I haven’t formed a firm opinion of it yet. (That could be good or bad. Much of my favorite music is stuff that I wasn’t sure about at first. I tend to like things that challenge me as a listener, and so that sometimes means listening a few times to absorb it.)
- Abigail’s Ghost, “d_letion”: Abigail’s Ghost was recommended to me by a reader as an American neo-prog band that I’d probably like. Unfortunately, I’m really not wild about it. I don’t know if this album is typical of their sound. But I really don’t like this one.
Sloppy Dualism on Bad Astronomy
In the history of this blog, I’ve gone after lots of religious folks. I’ve mocked
lots and lots of Christians, a few Muslims, some Jews, some new-agers, and even one
stupid Hindu.
Today, I’m doing something that’s probably going to get me into trouble
with a lot of readers. I’m going to mock a very well-known atheist. No, not PZ.
As much as I disagree with PZ, as far as I can tell, he’s consistent about his
worldview.
Over at Bad Astronomy, Phil Plait has been a major voice for skepticism and
a vocal proponent of atheism. He has, quite rightly, gone after people of all stripes
for foolishness and silly supernaturalism. He’s frequently talked about how silly he
thinks religion is. All well and good.
But Phil just really
screwed up. And I’ve got to call him on it.