The initial murder case Kovacs investigates in both the book and the show involves a wealthy man who is killed but has his backup mind inserted into a backup clone, so he is "recovered". The victim therefore has no memory of his own killing, since the backup was taken well before the murder rather than, as would be optimal, just prior to death. There is also a big show-only subplot involving Kovacs' sister and clones.
Imagine that you are a consultant. You get a call that starts with, "Hi, this is Joe Schmoe and Schmoe Law Firm. I need a new billing system. Can you build me one?"
And you respond by saying that you can, but you need to do a _lot_ of work with him to spec this billing system out. You can't just build "a new billing system" without any more details. You tell him that this will take many hours of work between the two of you where you ask him questions, write a spec, get his feedback, and repeat that a number of times.
At this point, he says "wow, that sounds like a ton of work for me just to get started", and he gives up.
AI does not fix any of this, and this is the thing that I think most people will not want to do, and that's why I think this blog post is making a very good point. The amount of work it takes to build a new software system, even with a super competent programmer as a partner, is still quite significant. And it requires thinking about hundreds of tiny little details in a way that drives a lot of people nuts. They will only do it if they _really_ have to do it.
Think payroll. I used to think payroll was relatively simple. Then I spent some time on the Government of Canada's Phoenix pay system (go ahead... Google and weep). And it's... insane. The system has been live for a decade and still regularly gets hit with some weird scenario from some department that nobody foresaw and that wasn't captured in requirements, but that, upon review by business analysts, turns out to be valid. Bob was a CS5 in the Department of Defence and speaks French, so he gets a bilingual bonus; his boss was away for half a day, so Bob gets acting CS5 pay; he's in the Public Service Alliance union, so these are his dues; and it's the second Tuesday of a month and a blue moon. But then Bob got moved to the Department of Agriculture, and three months later realized that his previous manager at Defence didn't put in his promotion on time, so now you have to figure out his retro pay for when he was at Defence even though everything on his file now carries the Agriculture labour agreement and codes and rates, etc. etc. etc. And this made-up example is a fraction of the complex examples.
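To make the retro-pay wrinkle concrete: Bob's promotion was entered late, so months of pay must be recomputed at the old department's rates. A toy sketch in Rust, with every rate, grade, and day count invented for illustration (a real system would have to track the correct labour agreement, union dues, bonuses, and effective dates for each day):

```rust
// Hypothetical retro pay: the difference between what Bob was paid at the
// old (pre-promotion) daily rate and what he should have been paid at the
// new rate, over the affected working days. All numbers are made up.

fn retro_owed(old_daily: f64, new_daily: f64, days: u32) -> f64 {
    (new_daily - old_daily) * days as f64
}

fn main() {
    let cs4_daily = 320.00; // invented pre-promotion rate
    let cs5_daily = 355.00; // invented post-promotion rate
    let days = 63;          // ~3 months of working days paid at the wrong rate
    println!("retro pay owed: ${:.2}", retro_owed(cs4_daily, cs5_daily, days));
    // (355 - 320) * 63 = $2205.00
}
```

The point of the anecdote is that this one-liner is the easy part; the hard part is knowing which agreement, rate, and effective date applies to each of those 63 days.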
Clear and comprehensive requirements are always the tricky bit, at least in business software. The Twilight Zone covered it perfectly and presciently decades before AI, with genies taking your requests literally and giving you unpredictable and usually negative outcomes.
AI also won't fix the diffusion of responsibility you see in companies through outsourcing, offshoring, or matrix organizations... or the need to go through committees to decide whether they should change the ssh root access that has abc123 as the password.
I'd love to see this for Taiwan. I'm here now (on my Nth trip where N >= 11), and my impression is that 7-11 is the clear winner in all the cities I've been to, followed by Family Mart and then some stragglers like Hilife and OK Mart (it's ok).
My personal favorite is probably Family Mart, because they have multiple very delicious vegan rice balls to choose from.
Yeah, it's quite difficult to walk for any length of time without encountering one, usually a 7-11. They're everywhere. And they actually have some decent food. It's a bizarre thing to experience for an American like myself.
I was wondering about this myself. My guess is no, since AFAIK the only way to do this sort of manual memory management is to use unsafe code. But there are also things like the [bumpalo](https://docs.rs/bumpalo/latest/bumpalo) crate in Rust, so maybe you wouldn't need to do this sort of thing by hand, in which case you're as leak-free as the bumpalo crate.
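The arena idea can be sketched in safe Rust with an index-based toy version: values are bump-allocated into a backing `Vec` and all freed at once when the arena is dropped. This is only an illustration of the pattern; the real bumpalo crate hands out references from raw memory chunks and is far more general:

```rust
// Toy index-based arena: alloc() appends and returns a handle, get()
// resolves it, and dropping the arena frees everything in one shot.

struct Arena<T> {
    items: Vec<T>,
}

impl<T> Arena<T> {
    fn new() -> Self {
        Arena { items: Vec::new() }
    }

    // "Allocate" a value into the arena; the handle is just an index.
    fn alloc(&mut self, value: T) -> usize {
        self.items.push(value);
        self.items.len() - 1
    }

    fn get(&self, handle: usize) -> &T {
        &self.items[handle]
    }
}

fn main() {
    let mut arena = Arena::new();
    let a = arena.alloc("hello");
    let b = arena.alloc("world");
    println!("{} {}", arena.get(a), arena.get(b)); // hello world
    // arena drops here: every allocation is freed together, no per-value leaks.
}
```

Using indices instead of references sidesteps the borrow-checker gymnastics (and unsafe code) that a reference-returning bump allocator needs, at the cost of an extra lookup.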
Same for database transaction roll back and roll forward actions.
And most enterprises, including banks, use databases.
So by bad luck, you may get a couple of transactions reversed in order of time, such as a $20 debit incorrectly happening before a $10 credit, when your bank balance was only $10 prior to both those transactions. So your balance temporarily goes negative.
Now imagine if all those amounts were ten thousand times higher ...
I was just starting off in life (kid, girl, job, apartment, bank, debit card, bills) when I made a 63-cent error with my bank account. Which is to say: If all of the negative debits and all of the positive credits were summed, then the account would have been in the negative by 63 cents.
I screwed up. And in a fair and just world, I'd owe the bank some extra tithing for quite clearly having spent some money that I very definitely did not have. Maybe $20 for the error and $10 per day spent in the red, plus the 63 cents, or something along those lines.
But it wasn't that way. Because the transactions were processed in batches that were re-ordered to be highest-first, I discovered that I owed the bank a little more than $430.
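The highest-first trick is easy to see with invented numbers: the same day's debits, replayed in chronological order versus re-sorted largest-first, produce different overdraft-fee counts. A sketch assuming a flat $30 fee per transaction that lands while the balance is negative (the real fee schedule and amounts here are hypothetical):

```rust
// Apply a day's transactions (negative = debit) to a starting balance and
// total the overdraft fees, charging one flat fee per overdrawing transaction.

const OVERDRAFT_FEE: f64 = 30.0;

fn total_fees(start: f64, txns: &[f64]) -> f64 {
    let mut balance = start;
    let mut fees = 0.0;
    for &t in txns {
        balance += t;
        if balance < 0.0 {
            fees += OVERDRAFT_FEE;
        }
    }
    fees
}

fn main() {
    let start = 100.0;
    // The debits in the order they actually happened: only the last overdraws.
    let chronological = [-5.0, -10.0, -15.0, -90.0];
    // The same debits re-sorted highest-first: the big one empties the
    // account immediately, so every following debit overdraws too.
    let highest_first = [-90.0, -15.0, -10.0, -5.0];

    println!("chronological fees: ${}", total_fees(start, &chronological)); // $30
    println!("highest-first fees: ${}", total_fees(start, &highest_first)); // $90
}
```

Same four debits, same starting balance, triple the fees; that is how a 63-cent error turns into $430.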
At the time (around 25 years ago, now) that was an absolute mountain of money to me.
The banker at the branch I went into was unapologetic and crass about the fees.
Looking over my transactions, they said "You know how to do math, right? You should have known that the account was overdrawn, but you were just out spending money willy-nilly all over town anyway."
I replied with something like "I made a single error of 63 cents. The rest of what you said is an invention."
This went back and forth for a bit before I successfully managed to leave the building without any handcuffs, fire trucks, or ambulances becoming involved -- and I still owed them more than $430.
The lesson I learned was very simple: Seriously, fuck those guys.
($12 billion in 2024, huh? That's all? Maybe the no-fee fintechs are winning.)
I regret to inform you that as a consequence of that sustained time travel, your mind and body will be slowly deteriorating and you’ll sooner or later end up dead.
I couldn't comment on the causal hazards but since time is currently having an outage they've got an improved shot at getting away with it. I say go for it.
> What's an example of a language that's as bad as Perl, that's used as widely as Perl was, that's still in use today?
PHP? I don't know how widely it's still used, but I'd guess it's more widely used than Perl. Also, PHP is not "as bad" as Perl. It's much, much, much worse. It's Perl without the charm.
I was writing a comment asking if it was really easier. Then I took a look at Cython. Yes, this looks easier than Perl's XS, which I have some experience with! There are ways to do something similar in Perl these days, notably https://metacpan.org/pod/FFI::Platypus. But these are relatively new (starting in the 2010s) compared to the history of Perl, and Cython goes back to the early 2000s.
Somewhere in the continuum from SWIG through XS and on to Platypus there are also the Inline modules these days. They allow one to put inline sections of other languages into Perl code the way many language tools used to allow one to inline assembly into C or Pascal code.
There are some of these modules for other languages than those listed here, a lot of them as high level as Perl (including Raku and even another Perl system for some reason).
True. For whatever reason, these never displaced XS. For wrapping C libraries in particular, it's not clear to me how much Inline::C helps. You're still stuck using a lot of Perl C API calls, AFAICT, which I think is the biggest challenge of using XS (I still have nightmares from trying to figure out where and when to add `sv_2mortal` in my XS code).
One or two calls into a library with a simple C interface isn't that bad with Inline. You just use Inline to handle the Perl-to-C-to-Perl part, and actually do the interfacing in the inline C. It's a lot messier if you're doing complex things and bringing complex data structures back to Perl, or having Perl make a lot of intermediate decisions with a lot of round trips. But if you use Perl to get the data ready to pass in, pass it in, do all the work in C that you need the library for, and then pass the results back to Perl once, it's not terrible.
I’ve tried not to get into XS, so I couldn’t really compare.
Using Inline::C with your own C code in places where Perl is too much overhead is certainly easier than wrapping a complex existing library with a lot of interactions across the boundary.
FFI::Platypus or something like it really is the way of the future though for existing C calling convention libraries in other languages.
I don't remember there being anything about growing replacement clones, but it would make sense given the other tech in the story.