Sunday, August 31, 2025

Meltdown

In the old days, they used to say light speed was the limit, the absolute limit. Not even information could travel faster than c, let alone matter. They went off on the wrong path trying to at least make quantum entanglement work for them, but they never could. They had no idea what we could accomplish when we finally got on the right path.

The doctor had the idea, but my design team and I made it happen. We made everything happen. We put human bootprints on Proxima Centauri b! Light, ambling along like a slug, would take four years to get there – the team we sent got there in seconds, and back safe just as fast. We can go anywhere.

The first tests were so colossally successful that we hired whole teams of new technicians – including Betsy, with her cute eyes – to expand the system. We’re scaling it up until we can colonize, that’s the end goal.

“The tremendous reward is worth the risk,” the doctor said.

The consequences of actual criticality would be absolutely catastrophic, in the most literal sense. Cataclysmic. Apocalyptic. You know in Norse mythology how the gods are all supposed to die in Ragnarök at the end of time? Well, good luck to the gods themselves if this thing goes critical. We’re not just talking about any simple explosion. This would be nothing so prosaic as mere entropy; it would be so much worse than even the mere heat death of the universe. When we talk about this, we start throwing around words like rift and void and we need a new tense for verbs that happen after there’s no such thing as time anymore.

“Infinitesimal risk,” I said, “harder to get closer to zero risk than this. You take a thousand more likely risks when you shower, get dressed, and eat breakfast in the morning. A million. So yes, worth the almost zero risk.”

I built so many safeguards into the system. It’s idiot-proof and bandit-proof. If you leave it running too long, it shuts down. If it runs too hot, it shuts down. If there’s a fire, it shuts down. If it doesn’t hear from the control hub for an hour, it shuts down. If any safeguard is tampered with, it shuts down. If bad guys get into the facility – well, they can’t, not without about a million bullet-holes, but if we pretend there aren’t all the autoturret emplacements, the fact remains that if bad guys get close enough to worry anybody in the building, and anybody hits any of the handy buttons that are everywhere in the facility, it shuts down.

Even if you get past the buttons, you can’t make it go critical from any one place. There are so many redundant fail-safes in the whole system that you’d pretty much need somebody in a hundred places disabling every safeguard and backup system and alarm and governor and auto-shutoff. And each safeguard has half a dozen alarms that go off if you tamper with it. And only one person, the head of the design team, even really knows anything about every safeguard. And you’d have to disable every safeguard in one day, before everything gets automatically reset to base settings at midnight.

My team and I really did design it to be absolutely impossible for it to go critical through malice or negligence or any combination thereof. Impossible.

But Betsy. She’s got cute eyes and she’s smart enough to understand the math behind the system, so of course I liked her, so I invited her over; it should have been as simple as that.

But you should have heard what she said about my collection when she saw it! She called it weird! The note of mockery in her voice! Then she laughed about it! And she turned and left right out my door, snickering all the way, and went home!

Now I really don't want to talk to her today.

I don’t want to deal with her.

Or with people.

Ever again.
