I got one of those desks with a vertical pneumatic lift so I can stack the computers vertically in a rack and just raise/lower it so the right one is at eye height
FTS? fuck that
I wonder what the modern price would be if you applied inflation from the time that idiom was first popularized.
MAAAN would be a much better acronym though
That’s when you update your sig with your address and a link to a local delivery venue
Alternatively all 504 Gateway Timeout
That joke was constant in the early 00s.
They probably have a bunch of 1-hour ‘books’ that drag the average down, since shorter is cheaper, to help pad out their numbers.
Looking at my personal library, the median length audiobook is The Last Wish at a tad over 10 hours. So it’d be equal to 1.5 books going by that, not the worst marketing exaggeration I’ve ever seen.
Data size and user expectations are the main differences. It’s possible, but there’d be a lot of latency and overhead just for scrolling down a page with a bunch of images. Maybe there’s fancy stuff you could do by batching images together and reusing connection pools, but it feels Sisyphean.
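Back-of-the-envelope on the latency point (all numbers here are made up for illustration, not measurements): the killer is that each torrent swarm pays a setup cost, whereas HTTP images ride an already-open pooled connection.

```python
# Toy model (made-up numbers) of why per-image torrents add latency
# compared with plain HTTP over a reused keep-alive connection.

def page_load_time(num_images, per_image_rtt, per_image_setup=0.0):
    # setup is paid once per image swarm (tracker/DHT lookup, peer handshake);
    # an HTTP fetch on a pooled connection has effectively zero extra setup
    return num_images * (per_image_setup + per_image_rtt)

# 20 thumbnails on one page, 50 ms RTT, ~2 s hypothetical swarm setup each
http_time = page_load_time(20, per_image_rtt=0.05)                          # 1.0 s
torrent_time = page_load_time(20, per_image_rtt=0.05, per_image_setup=2.0)  # 41.0 s
```

Even if swarm setup were 10x cheaper than this guess, serial per-image setup still dominates a simple scroll.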
Mastodon and Lemmy handle this in slightly different ways. Mastodon (according to the link) replicates media on every instance, while Lemmy (mostly) only replicates thumbnails. That means a popular post doesn’t concentrate load on one server on Mastodon, but it does on Lemmy. On the other hand, Mastodon has a higher aggregate cost due to all the replicated data, which is what the linked proposal solves by making it sublinear.
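To make the “sublinear” bit concrete (a toy model with hypothetical numbers, not real fediverse stats): full replication scales storage with the number of instances, while a shared deduplicated store with a fixed replication factor doesn’t.

```python
# Hypothetical illustration: aggregate fediverse storage under Mastodon-style
# full replication vs a shared, deduplicated media store.

def full_replication_gb(num_instances, media_gb):
    # every instance keeps its own copy of the media
    return num_instances * media_gb

def shared_store_gb(num_instances, media_gb, replicas=3):
    # one deduplicated store with a fixed durability replication factor;
    # cost is independent of instance count, i.e. sublinear in it
    return replicas * media_gb

# 1000 instances, 50 GB of hot media
print(full_replication_gb(1000, 50))  # 50000 GB in aggregate
print(shared_store_gb(1000, 50))      # 150 GB in aggregate
```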
If the torrent is instance-to-instance I don’t see any real benefit (and instance-to-client is infeasible). On the Mastodon side you still have data duplication driving storage costs and bandwidth usage regardless of whether it’s delivered via direct HTTP or torrent. On the Lemmy side it wouldn’t gain much (asymmetric load tracks subscription count, so it isn’t very bursty) but would add a lot of non-determinism and complexity to the already fragile federation process.
Conventional solutions like a cache/CDN/object storage, or switching to a shared hosting solution (decoupled from instances, like your link proposes), seem like a more feasible way to address things.
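For the cache option, something as simple as a reverse-proxy cache in front of the media endpoint already absorbs hot-post load. A minimal sketch, assuming a Lemmy-style setup with pict-rs behind nginx (the paths, zone name, and upstream port here are all hypothetical):

```nginx
# Cache hot media at the edge so a popular post doesn't hammer the backend.
proxy_cache_path /var/cache/nginx/media levels=1:2 keys_zone=media:10m
                 max_size=10g inactive=7d use_temp_path=off;

server {
    location /pictrs/ {
        proxy_cache media;
        proxy_cache_valid 200 7d;      # media is immutable, cache it long
        proxy_pass http://127.0.0.1:8080;
    }
}
```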
I was not expecting that. Trump continues to shock.
Oh wow, news site comments are always fun.
Jewish voters vote for leftists by a vast margin. They must switch their vote to Trump and Conservatives. Otherwise they are doomed.
Trump for Prime Minister!
No
You could hire a team of security experts to audit it for you
I love the concept. I hate many of the language design choices.
The demo was so fucking creepy. I would rather be in a dark room surrounded by Victorian dolls that sometimes seem to turn their heads towards you and blink.
Looking at the NASA and Webb sites, it appears this is a poorly cropped version of pictures from over a year ago, not something new like the article claims.
There are absolutely laptops with fingerprint sensors.
I’d say the main reason it’s more common in phones than computers is the different markets. Phones are mostly consumer purchases; the business market is smaller and the software is more locked down, so a software disable is more likely to suffice for those cases. Laptops, by contrast, are increasingly dominated by business use cases. Businesses have IT groups that care about security and would prefer models without biometrics.
Secondarily, you log in to your phone a lot more often than to your laptop, so the convenience factor matters less for laptops. As a result, people don’t consider a fingerprint sensor as mandatory a requirement as they do with phones.
GPL FAQ: https://www.gnu.org/licenses/gpl-faq.html#NonFreeTools
In the old days, proprietary compilers were the norm. If Blue is of value, an open source equivalent will be made eventually. But looking at the Blue examples and the sdesk repo, I doubt it.
Going just by the examples, Blue itself seems more like an incomplete templating/code-generation layer for getting some syntax sugar than anything else. You write Blue targeting C, use its higher-level constructs where you can, include C headers and snippets of raw C for everything you can’t express in Blue, and finally transpile the Blue into C, which is then compiled conventionally.