Friday Recipe: Chinese-Style Roasted Beef Short Ribs

It’s been a while since I posted a recipe, and last week, I came up
with a real winner, so I thought I’d share it.

I absolutely love beef short ribs. They’re one of the nicest cuts
of beef – they’ve got lots of meat, but they’re well marbled with fat, and they’re up against the bone, which gives them extra flavor. When cooked well, they’ve got an amazing flavor and a wonderful texture.

This recipe produces the best short ribs I’ve ever had. It’s based,
loosely, on a Chinese recipe, but it’s cooked more in a Western style.
There’s one unusual ingredient, which is a Chinese sauce that I’ve mentioned before on the blog, called sha cha sauce. It’s made from brill fish, dried shrimp,
garlic, and chili peppers. You can get it in a Chinese grocery store. The English label is, unfortunately, “barbecue sauce”, but you can identify it
by the ingredient list, and by the picture of the jar.

[Image: a jar of sha cha sauce]

Ingredients

  • 4 lbs short ribs, bone in, cut flanken style. (That means
    cut perpendicular to the bone, in chunks about 2 inches long.)
  • 1 large onion.
  • 4 cloves garlic (more if you really like garlic).
  • 1 cup soy sauce.
  • 1 cup beef stock.
  • 1 cup dry gin.
  • 4 tablespoons sugar.
  • 1 teaspoon sha cha sauce.

    Instructions

    1. Put the garlic and onion into a food processor, and
      run it until they’re nicely chopped. Then add the liquids to
      the processor, and run it until the garlic and onions are a puree
      mixed into the liquids.
    2. Put the short ribs into an oven-safe deep dish, and cover them with
      the liquid. Put this into the fridge for a few hours to marinate.
    3. Heat the oven to 350°F, and put the marinated short ribs into the
      oven – marinade and all. Cook for 3 hours, taking the dish out and basting every 30 minutes.
    4. By now, you’ve got some very well-cooked short ribs, sitting in the marinade, along with a huge amount of fat that cooked out of them. Take them out of the liquid, and set them aside.
    5. Strain the liquid, and skim the fat. What’s left is a very strong, but very flavorful sauce.
    6. Put the short ribs back into the now-empty dish. Give them a light baste
      with the sauce. Heat the oven up to broil, and when it’s hot, put the
      short ribs back in, just long enough to brown and crisp the outside.

    And they’re ready to eat. Serve them with the sauce on the side, along
    with rice and some stir-fried vegetables.

    Comments Should be Working

    Seed’s tech guy did a reset and restart of the server, and it appears that now I’m able to turn off registration without completely disabling comments. So everyone who’s been having trouble commenting, please give it a try again, and let me know if you have any trouble.

    Friday Random 10, 3/20

    1. Valley of the Giants, “Back to God’s Country”: I mentioned Valley of the Giants a few weeks ago, as one of my favorite post-rock bands. A few weeks of listening to them incessantly hasn’t changed that. They’re absolutely brilliant. This track is very typical
      of them; it’s got a slow start, with an almost droning main melody. And they take that,
      and develop it, through rhythm and harmony, until it’s almost unrecognizable. And then
      everything changes.
    2. Hawkwind, “World of Tiers”: typical Hawkwind. If you like them, you’ll like
      this. If you don’t, you won’t.
    3. The Flower Kings, “Rumble Fish Twist”: a live track by the Flower Kings. Every time I go for a while without listening to tFK, I’m amazed when I turn them on. Roine Stolt and company are just so incredible. To me, there’s a kind of near-perfection about the Flower Kings’ work that no one else comes close to.
    4. Kruzenshtern and Parahod, “Focus Pocus”: Some of the strangest stuff I’ve ever
      listened to. K&P are somewhere between progressive Klezmer, Jazz, and noise… They’re really amazing, but hard to describe or classify. If you can find a copy of one of their CDs, I highly recommend it, but they’re very hard to find.
    5. Gong, “Infinitea”: This band is yet another example of the “How did I not know about these guys?” phenomenon. They’ve been around for quite a while, coming out of
      the Canterbury scene. They’re basically a spinoff of sorts from Soft Machine. They are
      a really amazing progressive band, from the Jazzy side of things. They’ve been doing stuff
      since the 70s, and are still making new albums now.
    6. The Reasoning, “Dark Angel”: This is a band that I can’t make up my mind about. They’re neo-prog. They’ve got brilliant moments, and they’ve got a lot of moments that are rather dull. I can’t quite decide what I think on balance; I need to listen to them a bit more. On the good side, they’ve got three members with good (but very different) voices, and do a lot of really nice vocal harmony work, which is unusual.
    7. Uriah Heep, “What Kind of God?”: A great disappointment. I’ve heard about
      Uriah Heep for the longest time, and I finally got around to buying one of their albums. I find it just intolerably dull. Really profoundly mediocre music.
    8. Sonic Youth, “Silver Rocket (live)”: old Sonic Youth. I really love SY, and I
      think that their songwriting has gotten stronger over the years. But there’s still a raw
      energy to their early stuff which the newer material can’t match. It’s still the same sound, and the
      older songs sometimes tend towards being a bit on the simple side, but there’s still
      something really special in their older material.
    9. Sylvan, “Strange Emotion”: And another mixed bag. I was looking at other reviews of Sylvan, and someone described them as “Emo Prog”. Not a bad description. It’s definitely neo-prog, with the kinds of sound and structure that you’d expect; but it’s got that mopey, self-absorbed feeling of emo-dreck.
    10. The Wishing Tree, “Ostara”: And still another mixed one. This is Steve Rothery’s band. (Rothery is the guitarist from Marillion.) I’m a huge Rothery fan – he’s got both
      fantastic technical chops, and also fantastic musical taste. He’s not just a loud fancy
      guitarist; he’s a very musical guitarist. He’s got an extremely distinctive style,
      and yet also manages to fit himself into whatever’s going on around him. This album has
      some absolutely wonderful material; but it’s also got a lot of really dull
      derivative stuff. The singer (Hannah Stobart) has a really beautiful voice, but she
      doesn’t have her own style. She always sounds like she’s trying to be someone else. Mostly that’s Kate Bush, but at times, she sounds like she’s trying to be Tori Amos, or
      Melissa Etheridge. But you can almost always listen to her and say “She’s trying to
      sound like X”. On the whole, I like them, but think they’d be much better if Ms. Stobart
      just figured out how to sound like herself.

    Mr. Spock is Not Logical (book draft excerpt)

    As I mentioned, I’ll be posting drafts of various sections of my book here on the blog. This is a rough draft of the introduction to a chapter on logic. I would be extremely grateful for comments, critiques, and corrections.

    I’m a big science fiction fan. In fact, my whole family is pretty
    much a gaggle of sci-fi geeks. When I was growing up, every
    Saturday at 6pm was Star Trek time, when a local channel showed
    re-runs of the original series. When Saturday came around, we
    always made sure we were home by 6, and we’d all gather in front of
    the TV to watch Trek. But there’s one thing for which I’ll never
    forgive Gene Roddenberry or Star Trek: “Logic”. As in,
    Mr. Spock saying “But that would not be logical.”

    The reason that this bugs me so much is that it’s taught a
    huge number of people that “logical” means the same
    thing as “reasonable”. Almost every time I hear anyone
    say that something is logical, they don’t mean that it’s logical –
    in fact, they mean almost exactly the opposite – that it
    seems correct based on intuition and common sense.

    If you’re being strict about the definition, then saying that
    something is logical by itself is an almost meaningless
    statement. Because what it means for some statement to be
    logical is really that that statement is inferable
    from a set of axioms in some formal reasoning system. If you don’t
    know what formal system, and you don’t know what axioms, then the
    statement that something is logical is absolutely meaningless. And
    even if you do know what system and what axioms you’re talking
    about, the things that people often call “logical” are
    not things that are actually inferable from the axioms.
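
To make that concrete, here’s a toy illustration of “inferable from a set of axioms” (my own sketch, not part of the chapter): a brute-force propositional-logic checker that tests whether a claim holds in every truth assignment that satisfies the axioms. (Strictly, that’s semantic entailment; for propositional logic it coincides with what’s derivable from the axioms.) Given the axioms P and P → Q, the claim Q follows; with no axioms at all, it doesn’t.

```python
from itertools import product

def implies(p, q):
    # Truth table for "p implies q".
    return (not p) or q

def follows(axioms, claim, variables):
    """Return True if `claim` holds in every truth assignment
    that satisfies all of the `axioms`."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(axiom(env) for axiom in axioms) and not claim(env):
            return False
    return True

# Axioms: P, and P -> Q.  Claim: Q.
axioms = [lambda env: env["P"], lambda env: implies(env["P"], env["Q"])]
claim = lambda env: env["Q"]

print(follows(axioms, claim, ["P", "Q"]))  # True: Q follows from these axioms
print(follows([], claim, ["P", "Q"]))      # False: with no axioms, Q doesn't follow
```

Change the axioms, and you change what counts as “logical”, which is exactly the point.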

    Continue reading

    The Blog and… the Book!

    I’d like to apologize for the slowness of the blog. Fortunately, there’s a very good reason: I’ve got a book contract! “Good Math” will be published by “The Pragmatic Programmers” press. The exact publication date isn’t set yet, but my schedule plans for a complete draft of the book by summer. (And I used the scheduling rules proposed by one of my favorite managers. He said that when a programmer gives you an estimate of how long something should take, multiply it by two and increase the unit. So if they say it’ll take a day, assume two weeks. If they say a week, assume two months. In my experience, it’s actually a really good predictor.)

    Anyway… For the last couple of weeks, I’ve been setting up a new computer to use for writing the book (gotta keep my Google work and my private work separate!), finishing the first three chapters, and trying to get comfortable with the PP markup system.

    While I’m working on the book, I’m going to be posting drafts of some sections as posts on the blog. As a result, you’ll see some re-runs of older posts in a slightly different format. There will also be some brand new material in the book format. The book draft posts will be clearly marked, and for those, even more than usual, I’d appreciate feedback and corrections.

    Of course, I’ll also be posting non-book related stuff. For example, I hope to have a new data structures post ready this evening. As a result of my work on the book, I’m back on a Haskell binge, and I’m working up a post about a fascinating functional data structure called a finger-tree.

    Perverse Incentives

    A lot of people, reading the reporting on the current financial
    disaster, have been writing me to ask what people mean when they talk
    about incentives. The traders, the bankers, the fund managers, and all
    of the other folks involved in this giant cluster-fuck aren’t
    stupid. So naturally, the question keeps coming up: why would they go
    along with it? And the answer that we keep hearing is something along
    the lines of “perverse incentives”.

    The basic idea is that the way that the people in the industry got paid,
    it was actually in their interest to do things that they knew would
    eventually cause a disaster. How could that work?
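
As a teaser, here’s a deliberately crude toy model (my own sketch, with made-up numbers, not anything from the rest of the post): a trader who is paid a bonus on this year’s booked profit, with no clawback if the position blows up later. Under that pay structure, the strategy that’s catastrophic for the firm in expectation is still the better deal for the trader.

```python
# Toy model of a perverse incentive: a bonus on gains, no clawback on losses.
# All numbers are invented purely for illustration.

BONUS_RATE = 0.10  # the trader keeps 10% of any profit they book

def trader_pay(firm_profit):
    # The trader shares in gains, but doesn't share in losses.
    return BONUS_RATE * max(firm_profit, 0.0)

def expected_values(outcomes):
    # outcomes: list of (probability, firm_profit) pairs
    firm = sum(p * profit for p, profit in outcomes)
    trader = sum(p * trader_pay(profit) for p, profit in outcomes)
    return firm, trader

safe = [(1.0, 1_000_000)]                          # a sure $1M profit
risky = [(0.9, 10_000_000), (0.1, -200_000_000)]   # usually $10M, occasionally ruinous

for name, outcomes in (("safe", safe), ("risky", risky)):
    firm, trader = expected_values(outcomes)
    print(f"{name:5s}: expected firm profit {firm:>14,.0f}, expected trader bonus {trader:>10,.0f}")
```

The risky strategy has a hugely negative expected value for the firm, but the trader’s expected bonus is far larger than with the safe one, so the “rational” individual choice is the one that eventually blows everything up.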

    Continue reading

    I Get Mail: Iterative Compression

    Like a lot of other bloggers, I often get annoying email from people. This week, I’ve been dealing with a particularly annoying jerk, who’s been bothering me for multiple reasons. First, he wants me to “lay off” the Christians (because if I don’t, God’s gonna get me). Second, he wants to convince me to become a Christian. And third, he wants to sell me on his brilliant new compression scheme.

    See, aside from the religious stuff, he’s a technical visionary. He’s invented a method where he can take a source document, and repeatedly compress it, making it smaller each time.

    This is a stupid idea that I’ve seen entirely too many times. But instead of just making fun of it, I thought it would be interesting to explain in detail why it doesn’t work. It touches on a bunch of basic facts about how data compression works, and provides a nice excuse for me to write a bit about compression.

    [Diagram: the basic compression scheme (basic-compression.png)]

    The basic idea of data compression is that you’re eliminating redundancies in the original text. You can’t discard information. Mathematically, a compression function is an invertible function C from an array of characters to an array of characters (or you could use bits if you prefer), such that if y=C(x), then on the average input, the length of y is smaller than the length of x.

    An ideal compression system is one where for all possible values of x, C(x) is shorter than x. C is your compressor; and since C is reversible, it has a unique inverse function C⁻¹ such that C⁻¹(C(x)) = x. An illustration of this basic compression system is in the diagram to the side.
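
To see why “compress it again, and it gets smaller every time” can’t work, here’s a quick experiment you can run yourself (my own sketch using Python’s standard zlib module, not anything from my correspondent): compress some redundant text, then feed the compressed output back through the compressor. The first pass squeezes out the redundancy; after that, there’s essentially no redundancy left, and further passes stop shrinking the data (and typically make it slightly larger).

```python
import zlib

# A highly redundant input: one sentence repeated many times.
data = b"the quick brown fox jumps over the lazy dog. " * 200

for i in range(1, 6):
    compressed = zlib.compress(data, 9)
    print(f"pass {i}: {len(data):6d} bytes -> {len(compressed):6d} bytes")
    data = compressed  # feed the compressed output back in
```

The underlying reason is a counting argument: there are more strings of length n than there are strings shorter than n, so no invertible function can map every input to a shorter output.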

    Continue reading

    Commenting Problems

    Just a quick status notice: a bunch of commenters have been having problems with the system demanding authentication to be able to comment. I’m trying to fix it with the help of the SB tech folks. My first attempt made things worse, and made it impossible for anyone to comment. I’m trying to re-enable comments now, but since I’m not sure what disabled them, I’m not sure what will work. Commenting with TypeKey authentication will be re-enabled ASAP; and commenting without authentication will be re-enabled as soon as the SB techs can figure out what’s causing the authentication requirement.

    Tax Thresholds: Why the horror stories about the Obama tax plan are lies

    Watching news reports about President Obama’s proposed tax changes,
    I’ve seen a number of variations on a very annoying theme, which involves
    a very stupid math error.

    A typical example is this story on ABC news, which contains a non-correction
    correction:

    President Barack Obama’s tax proposal — which promises to increase taxes for those families with incomes of $250,000 or more — has some Americans brainstorming ways to decrease their pay in an attempt to avoid paying higher taxes on every dollar they earn over the quarter million dollar mark.

    A 63-year-old attorney based in Lafayette, La., who asked not to be named, told ABCNews.com that she plans to cut back on her business to get her annual income under the quarter million mark should the Obama tax plan be passed by Congress and become law.

    “We are going to try to figure out how to make our income $249,999.00,”
    she said.

    “We have to find a way out where we can make just what we need to just under the line so we can benefit from Obama’s tax plan,” she added. “Why kill yourself working if you’re going to give it all away to people who aren’t working as hard?”

    The original version of this article continued to follow this basic theme. The
    updated article pretends to correct it, while still basically maintaining the
    same focus.

    The idea behind this, and similar stories, is that raising the income tax
    rate on people earning over $250,000 per year creates a threshold, where earning
    more than that threshold will result in your taking home less
    after-taxes pay than if you earned less.
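
The mistake is in how marginal tax brackets work: a higher rate on income over $250,000 applies only to the dollars above $250,000, not to your entire income. Here’s a tiny sketch (the rates and brackets below are made up purely for illustration; they are not the actual proposal):

```python
# Marginal brackets: each rate applies only to the income that falls inside its bracket.
# These rates and thresholds are invented for illustration.
BRACKETS = [
    (250_000, 0.33),       # 33% on income up to $250,000
    (float("inf"), 0.36),  # 36% only on income above $250,000
]

def tax(income):
    owed, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income <= lower:
            break
        owed += rate * (min(income, upper) - lower)
        lower = upper
    return owed

for income in (249_999, 250_001, 260_000):
    print(f"income {income:>7,}: tax {tax(income):>10,.2f}, take-home {income - tax(income):>12,.2f}")
```

Crossing the threshold never reduces your take-home pay; only the marginal dollars above the line get taxed at the higher rate.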

    Continue reading

    Basics: Significant Figures

    After my post the other day about rounding errors, I got a ton of
    requests to explain the idea of significant figures. That’s
    actually a very interesting topic.

    The idea of significant figures is that when you’re doing
    experimental work, you’re taking measurements – and measurements
    always have a limited precision. The fact that your measurements – the
    inputs to any calculation or analysis that you do – have limited
    precision, means that the results of your calculations likewise have
    limited precision. Significant figures (or significant digits, or just “sigfigs” for short) are a method of tracking measurement
    precision, in a way that allows you to propagate your precision limits
    throughout your calculation.

    Before getting to the rules for sigfigs, it’s helpful to show why
    they matter. Suppose that you’re measuring the radius of a circle, in
    order to compute its area. You take a ruler, and eyeball it, and end
    up with the circle’s radius as about 6.2 centimeters. Now you go to
    compute the area: A = πr², where π = 3.141592653589793… So what’s the area of the
    circle? If you do it the straightforward way, you’ll end up with a
    result of 120.76282160399165 cm².

    The problem is, your original measurement of the radius was
    far too crude to produce a result of that precision. The real
    area of the circle could easily be as high as 128, or as low as
    113, assuming typical measurement errors. So claiming that your
    measurements produced an area calculated to 17 digits of precision is
    just ridiculous.
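
Here’s a quick sketch of that point (my own illustration, not part of the post): if the eyeballed measurement is really 6.2 ± 0.2 cm, propagating that uncertainty through A = πr² gives a range wide enough to swamp everything past the first couple of digits.

```python
import math

radius = 6.2       # measured radius, in cm
uncertainty = 0.2  # a plausible error for an eyeballed ruler reading, in cm

area = math.pi * radius ** 2
low = math.pi * (radius - uncertainty) ** 2
high = math.pi * (radius + uncertainty) ** 2

print(f"naive area:               {area:.14f} cm^2")  # far more digits than the measurement supports
print(f"plausible range:          {low:.1f} to {high:.1f} cm^2")
print(f"to 2 significant figures: {round(area, -1):.0f} cm^2")
```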

    Continue reading