I would fix that bug but the complete rewrite that management has had me working on for the past two years will make it obsolete anyway.
Mom, put down the phone, I’m using the modem!
That’s when you break out valgrind, because you’re certainly using uninitialized memory.
I’m trying to remember the last time I actually had a core file. I think core dumps have been disabled by default on Linux since at least 2000.
I don’t use Ruby anymore, but I still use irb every day as a command-line calculator.
“Tradition is just dead people’s baggage.” (Doug Stanhope)
Known to cause heisenbugs: bugs that disappear when you try to measure them with a debugger or a printf.
Yeah, back before GitHub existed, we used SourceForge to host open source, and you had to use CVS. Then later Subversion.
One of the people reverse engineering the M1 GPU for Asahi Linux is a catgirl vtuber: https://www.youtube.com/asahilina
Nah… wrap entire templates in @if statements.
It’s kinda amazing how someone can work so hard to sabotage their own public image.
My problem with C/C++ is that the people behind the spec have sacrificed our sanity in the name of “compiler optimization”. Signed overflow behaves the same on every CPU on the planet, so why is it undefined behaviour? Even more insane, the spec requires intN_t to be implemented in two’s complement… but signed overflow is still undefined, because compilers want to pretend they run on pixie dust instead of real hardware.
And yet it’s still easy to write spaghetti code in Java. Just abuse inheritance. Where is this function implemented? No one knows but the compiler!