  • I kinda get where he is coming from, though. AI is being crammed into everything, especially into things it is not currently suited for.

    After learning about machine learning, you kind of realize that, unlike “regular programs”, ML gives you answers that are only “roughly what you want”. Approximations, really. That is fine for generating images, for example, because minor details being off from what you wanted probably isn’t too bad. A chatbot isn’t a wrong fit either, because there are many ways to say the same thing. The important thing is that there is a definite step afterwards where you evaluate the result. In simpler ML you can even figure out the specifics of the process, but for the most part we just evaluate what the LLM said, or whether the image matches our expectations. We can’t control or constrain the output to exactly our needs, because our restrictions are largely just more input into an almost-finished approximation engine.

    The problem is that companies take these approximation engines, put them in their products, and treat their output as fact. Like AI chatbots doing customer support and making things up, such as the airline customer who was told about rules that didn’t exist, or the search engines that parrot jokes and harmful advice. Sure, you and I might realize that these things come from a machine that doesn’t actually think about its answers, but others don’t. And slapping a “*this might be wrong because it’s AI” label on it is not an acceptable waiver of accountability.

    Despite this, I use ChatGPT and Gemini a lot to help me program. They get a lot of things wrong, but they also do great. It’s a great tool precisely because I step in after the approximation step to review and decide; I’m aware of the limits. But putting these things in front of “users” without a review step means you are advertising that you are either unaware of this flaw, or that you have done the cost-benefit analysis and decided that, if nothing else, it will generate interest during the hype.

    There is huge potential, but throwing AI at situations where facts are needed, when all it can do is make rough guesses, is the wrong way to go about it. (A toy sketch of what I mean by a review step follows below.)
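
    As a toy illustration of that review step, here is a minimal Rust sketch. `ask_model`, `answer_customer`, and the policy check are hypothetical stand-ins of my own, not any real API:

        /// Hypothetical stand-in for a call to some LLM; returns an unverified draft.
        fn ask_model(question: &str) -> String {
            format!("Draft answer for: {question}")
        }

        /// The review step: a draft only goes out if it is grounded in vetted
        /// policy text; otherwise hand off to a human instead of presenting a
        /// guess as fact.
        fn answer_customer(question: &str, vetted_policies: &[&str]) -> String {
            let draft = ask_model(question);
            if vetted_policies.iter().any(|p| draft.contains(*p)) {
                draft
            } else {
                String::from("Let me forward you to a human agent.")
            }
        }

        fn main() {
            let policies = ["Refunds are possible within 24 hours of booking."];
            // This draft is not grounded in any vetted policy, so it is not
            // shown as fact; a human gets the question instead.
            println!("{}", answer_customer("Can I get a refund?", &policies));
        }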

  • It’s worth adding that I greatly prefer MS Auth-style authentication, since I don’t have to find the right entry, read off the auth code, and then type it on the other computer. Instead, MS pops a notification and you either type or select the right number, verify with a fingerprint, and you’re done. Much more convenient.

    It often tells you what you are logging into and where the attempt is coming from, so it adds a few extra layers of security for those aware enough to check those details.
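
    The number-matching idea itself is tiny; here is a toy Rust sketch of the flow (my own simplification, not Microsoft’s actual implementation):

        /// Approval succeeds only if the user picks, on the phone, the number
        /// shown on the machine the login is actually happening from.
        fn approve(shown_on_screen: u8, picked_on_phone: u8) -> bool {
            // A blind "Approve" tap is not enough; you have to read the screen.
            shown_on_screen == picked_on_phone
        }

        fn main() {
            let shown = 42;             // displayed on the login page
            let choices = [17, 42, 88]; // offered in the authenticator app
            let picked = choices[1];    // the user selects the matching number
            println!("login approved: {}", approve(shown, picked));
        }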

  • Why wait and hope for C++ to get to where modern languages are now? I know there’s value in the accumulated experience around C++ that, if adapted, would make it stronger, but I can only see an evolution of C++ having to force out a lot of outdated stuff to even get started on being more suitable.

    But the language is just not comfortable to me: the sheer number of things that trigger undefined behavior, the god-awful header files which I hate with a passion, the tough error messages, and so on. I also ran into a fun collision between C++ in Visual Studio and using it with CMake in CLion.

    I’ve just started looking at Rust for fun, and aside from not yet understanding all the error messages about bounds, figuring out what type of string I should use or pass (see the sketch below), and the slow climb up the skill curve, it’s pretty nice. Installing stuff is as easy as copy-pasting the name into the Cargo file!
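
    For anyone else stuck on the string question, the usual rule of thumb is: borrow `&str` in parameters, return an owned `String`. A minimal sketch:

        // `String` owns its data; `&str` borrows a view into someone else's.
        fn shout(s: &str) -> String {   // borrow in, owned value out
            s.to_uppercase()
        }

        fn main() {
            let owned: String = String::from("hello");
            let borrowed: &str = "world"; // string literals are &str
            // &String coerces to &str automatically, so both work:
            println!("{} {}", shout(&owned), shout(borrowed));
        }

    And the dependency part really is that short: add a line like `rand = "0.8"` under `[dependencies]` in Cargo.toml, or run `cargo add rand`.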

    Rust is just the prospective replacement for C++, though; people act like the White House said that C++ must be replaced by Rust right now. But they merely recommend it among other languages: C# will do for a lot of people who don’t need the performance and low-level control that the aforementioned languages target, and Python targets a whole different use case, though it is often combined with the faster ones.

    C++ will live a long time, and if its popularity dies down it will surely be very profitable to be a developer on the critical systems that still use it many years from now. I just don’t think an evolution of C++ is going to bring what the world needs, particularly given the large number of existing memory-related security vulnerabilities. If things were fine as they are now, this recommendation would not have been made to begin with.

  • I make do with my IDE, even after getting a developer job. Outside of shenanigans involving a committed password, and the occasional empty commit to trigger a build job on GitHub without requiring a new review approval, I still don’t use the command line a lot.

    But it’s true: if you managed to commit and push, you are OK. Even the IDE makes fixing most merges simple.
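
    For reference, the empty-commit trick is a one-liner (the commit message is just an example):

        git commit --allow-empty -m "retrigger CI"
        git push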

  • It’s already been explained a few times, but GPU encoders are fixed-function hardware with a limited set of options, with some leeway in presets and such. They are specialized to handle a specific set of profiles.

    They use methods that work well in specialized hardware. They don’t have the memory that a software encoder can use to, for example, comb through a large number of frames, but they can tailor the encoding flow and hardware to the calculations. Hardware encoders cannot do everything software encoders do, nor can they be as thorough, because of those constraints.

    Even the decoders are like that. For example, my player will crash trying to hardware-decode AV1 encoded with super-resolution frames: frames stored at a lower resolution that are supposed to be upscaled by the decoder (a feature in AV1 that hardware decoder profiles do not support, afaik).
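
    To make the two paths concrete, this is roughly what they look like in ffmpeg; encoder names and supported options depend on your build and GPU (`av1_nvenc`, for example, needs a recent NVIDIA card):

        # software AV1 encode: flexible, can spend lots of CPU and memory per frame
        ffmpeg -i input.mkv -c:v libsvtav1 -preset 5 -crf 32 out_sw.mkv

        # hardware AV1 encode: fast, but limited to what the fixed-function silicon supports
        ffmpeg -i input.mkv -c:v av1_nvenc -preset p5 -cq 32 out_hw.mkv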