Completely true, but also compression can make anything look bad. I’ve seen 480p look better than 1080p simply because the 480p was using a higher bitrate, while the 1080p wasn’t encoded with enough, relatively speaking.
I use my phone every day at the office so I don’t need to get my wallet out of my jacket when going to the canteen to buy lunch. It’s literally the reason I started paying with my phone. Too many times I forgot my card…
New car smell, it’s awful. Sort of stale plastic if I were to describe it.
Amplified by long trips on bad roads as a kid. Guaranteed to make you feel like vomiting on some sections. Now when I anticipate or pack for a trip I tend to smell it again, even though I’m not even in a car.
It’s worth adding that I greatly prefer MS Authenticator-style authentication, since I don’t have to find the right entry, read the auth code, and then type it on the other computer. Instead MS pops a notification and you either type or select the right number, verify with a fingerprint, and done. Much more convenient.
It often tells you what you’re logging into and where the login attempt is coming from, so there are a few extra layers of security for those aware enough to check those details.
Played a lot of Rainbow Six Siege, where you have to shoot those 360° security cameras when you’re attacking. So now I’m trained to spot them on instinct.
Pretty sure that site is satire, iirc with some right-leaning bias, but my memory is vague.
So it’s political ads, but at least fake politics :P
Python, C#, Rust
Used a bit of C++ and Matlab, but saying I know them is a stretch really.
Not to completely spring to IKEA’s defense here, but I heard they really were affected by production and shipping problems during covid. It’s reasonable that prices would go up, and at least it’s good that they’re going down again.
The peak might be higher for induction.
Not in the US, so the electrical grid is different, but induction on boost can draw much more wattage for short periods, tripping the breaker. In my case the circuit was 16A if I remember correctly, while a powerful induction hob should be on 25A.
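A quick back-of-envelope sketch of why the circuit rating matters. Assuming a 230 V single-phase circuit (typical outside the US; the voltage figure is my assumption, not from the comment):

```python
# Power = voltage * current; a breaker limits the continuous current.
# 230 V is an assumption for a non-US single-phase circuit.
def max_watts(volts: float, amps: float) -> float:
    """Rough maximum power a circuit can deliver before the breaker trips."""
    return volts * amps

print(max_watts(230, 16))  # 16 A circuit: 3680.0 W
print(max_watts(230, 25))  # 25 A circuit: 5750.0 W
```

An induction hob on boost can briefly pull well past the 16 A budget, which is exactly when the breaker goes.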
Reminds me of folding cardboard boxes. If you are taking a flat piece and make a box of it, are you folding a box or unfolding the cardboard. Or both. And when you do the reverse, you do the same, do you not?
I think it goes out of the cycle somewhat, kinda like all that is held up
If 100L rains down per day and you use 10L for cooling, you’ll still have 100L flowing, but now only 90L for actual use. And if datacenters take up a significant amount of the total, then you have a lot less to use elsewhere, such as watering fields for example.
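Writing the toy numbers out (these are the comment’s illustrative figures, not real datacenter data):

```python
# Illustrative numbers only: daily inflow vs. water diverted for cooling.
daily_inflow_l = 100
cooling_use_l = 10

usable_l = daily_inflow_l - cooling_use_l
print(usable_l)  # 90 L left over for fields, drinking water, etc.
```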
There used to be a boxed ice cream with blueberry and egg yolk flavor. Loved it as a kid, but it got discontinued by the ice cream truck that had it. Never found a replacement. I probably wouldn’t even like it now, since I’ve forgotten the taste at this point.
Why wait and hope for C++ to get where modern languages are now? I know there’s value in the shared experience in C++ that if adapted would make it stronger, but I can only see a development of C++ having to force a drop of a lot of outdated stuff to even get started on being more suitable.
But the language is just not comfortable to me. From the large amount of things that cause undefined behavior, to the god-awful header files which I hate with a passion, to tough error messages and such. I also ran into a fun collision between C++ in Visual Studio and using it with CMake in CLion.
I’ve just started looking at Rust for fun, and outside of not understanding all the error messages about bounds yet, figuring out what type of string I should use or pass, and the slow climb up the skill curve, it’s pretty nice. Installing stuff is as easy as copy-pasting the name into the cargo file!
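For example, a dependency line in `Cargo.toml` looks like this (`rand` is just a placeholder example crate, not something from the comment):

```toml
# Cargo.toml — add the crate name and a version under [dependencies];
# cargo fetches and builds it on the next `cargo build`.
[dependencies]
rand = "0.8"
```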
Rust is just the prospective replacement for C++ though; people act like the White House said that C++ should be replaced by Rust now. But they just recommend it and other languages. C# will do for a lot of people who don’t need the performance and detail that the aforementioned languages target. Python targets a whole different use, but it’s often combined with the faster ones.
C++ will live a long time, and if its popularity dies down it will surely be very profitable to be a developer on the critical systems that use it many years from now. I just don’t think an evolution of C++ is going to bring what the world needs, particularly because of the large number of existing memory-related security vulnerabilities. If things were fine as they are now, this recommendation would not have been made to begin with.
I made do with my IDE, even after getting a developer job. Outside shenanigans involving a committed password, and the occasional empty commit to trigger a build job on GitHub without requiring a new review to be approved, I still don’t use the commandline a lot.
But it’s true, if you managed to commit and push, you are OK. Even the IDE will make fixing most merges simple.
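The empty-commit trick mentioned above, sketched in a throwaway repo (assumes `git` is installed; in a real project you’d just run the commit and then `git push` to kick off the build):

```shell
# Demo in a temp repo; --allow-empty records a commit with no file changes,
# which is enough to trigger CI on push.
cd "$(mktemp -d)"
git init -q .
git -c user.email=you@example.com -c user.name=you \
    commit --allow-empty -m "chore: trigger CI"
git log --oneline   # shows the single empty commit
```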
It’s probably more common that scientific notation is used, so 3.2 × 10^4 or simply 3.2e4. From the little physics I had, you often used kilometers instead of something like megameters, or just lightyears once you got to a big enough scale.
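Most programming languages accept that `e` notation directly; a quick Python check:

```python
# 3.2e4 is literal syntax for 3.2 * 10**4.
x = 3.2e4
print(x)           # 32000.0
print(f"{x:.1e}")  # formatted back to scientific form: 3.2e+04
```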
Already been explained a few times, but GPU encoders are hardware with fixed options, with some leeway in presets and such. They are specialized to handle a set of profiles.
They use methods which work well in the specialized hardware. They don’t have the memory a software encoder can use to, for example, comb through a large number of frames, but they can specialize the encoding flow and hardware for the calculations. Hardware encoders cannot do everything software encoders do, nor can they be as thorough, because of those constraints.
Even the decoders are like that. For example, my player will crash trying to hardware-decode AV1 encoded with super resolution frames: frames that have a lower resolution and are supposed to be upscaled by the decoder (a feature in AV1 that hardware decoder profiles do not support, afaik).
I stopped auto-updating the third time my goddamn app was force-closed while I was using it. Either an update for the app itself or the damn WebView. It’s been many years since then, so I’m not sure if things have changed, but man, it was frustrating having things just go poof in the middle of something.
It’s just the sum. Monitors have 8 bits per color, making 24 bits per pixel, which gives the millions mentioned. 16-bit is actually 5 bits per color, with the leftover bit usually given to green (5-6-5). But this has downsides, as explained in the article, when going from a higher bit depth to a lower one.
HDR is 10 bits per color, and upwards for extreme uses. So it’s sorta true they are 24- or 30-bit, but usually that isn’t how they’re described; people normally talk about the bit depth of an individual color channel.
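The color counts can be sanity-checked (assuming 3 channels, R/G/B):

```python
# Number of representable colors for a given bit depth per channel.
def colors(bits_per_channel: int) -> int:
    return 2 ** (bits_per_channel * 3)

print(colors(8))   # 24-bit total: 16777216, the "millions of colors"
print(colors(10))  # 30-bit total: 1073741824, HDR's billion-plus
```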
I don’t have this issue, except once when I got my first desktop connected with wired internet. Turns out, yeah the wired internet (or the adapter/driver) can actually wake the computer… Turned it off and been mostly problem free from wakeups.
I kinda get where he is coming from though. AI is being crammed into everything, and especially into things it’s not currently suited for.
After learning about machine learning, you kinda realize that unlike “regular programs”, ML gives you “roughly what you want” answers. Approximations, really. This is all fine and good for generating images, for example, because minor details being off from what you wanted probably isn’t too bad. A chatbot itself isn’t wrong here either, because there are many ways to say the same thing. The important thing is that there is a definite step after that where you evaluate the result. In simpler ML you can even figure out the specifics of the process, but for the most part we evaluate what the LLM said, or whether the image is accurate to our expectations. But we can’t control or constrain the output to exactly our needs, because our restrictions are largely just input into an almost-finished approximation engine.
The problem is that companies take these approximation engines, put them in their products, and consider their output fact. Like AI chatbots doing customer support and making things up, such as the user who was told about airline rules that didn’t exist, or the search engines that parrot jokes or harmful advice. Sure, you and I might realize that these things come from a machine that doesn’t actually think about its answers, but others don’t. And throwing a “*this might be wrong because it’s AI” on it is not an acceptable waiver of accountability.
Despite this, I use ChatGPT and Gemini a lot to help me program; they get a lot of things wrong but also do great. It’s a great tool, exactly because I step in after the approximation step, review, and decide. I’m aware of the limits. But putting these things in front of “users” without a review step means you’re advertising that you’re either unaware of this flaw, or you’ve done the cost-benefit analysis and decided that if nothing else it’ll generate interest during the hype.
There is huge potential, but throwing AI into situations where facts are needed, when it’s only making rough guesses, is the wrong way to go about it.