At least there was a distinction between the web of documents (WWW) and shipped apps with a custom canvas. Rendering apps with the web’s DOM is stupid. It makes websites a mess and relies on everyone using the same monoculture of browsers (like we now have Chromium, WebKit and Gecko, all nearly identical).

If a browser does not support one feature (like CSS’s transform), the whole house of cards collapses. It’s like making ASCII art in Notepad and then expecting everyone to use the same Notepad app with the same font and style just so the art’s proportions don’t break.

We need to split the web into websites and webapps: webapps being browser-dependent or fully custom canvases, and websites being an immutable, human-readable and editable format.

  • Lmaydev@programming.dev

    They were an absolute nightmare for security. That could now be mitigated with better design, but attacks are much more sophisticated nowadays.

    You just straight don’t want a website executing things on your computer. It’s got too many potential problems.

    Visiting a website would allow hackers to execute a likely vulnerable application on your system.

    This is exactly why everyone moved to HTML5, with websites trapped inside the browser’s sandbox. They literally have no access to your system.

    The other issue is that you don’t want to have to install software just to visit a website. So the sites that use it will straight away be at a disadvantage with less tech-savvy or more privacy-focused users.

    • smileyhead@discuss.tchncs.deOP

      That’s why I added “standardized and open source” in the title. I don’t want random things executed either, and JavaScript is exactly that, just isolated. My post is about building apps heavily reliant on the whole HTML/CSS/JS spec working exactly the same, only to bend and hack it into something like a custom canvas. In other words, modern sandboxed applets, if standardized, built-in and open source, would be much healthier for the web ecosystem. Why? Because to open such an app a browser needs a compiler, OpenGL and HTTP support, instead of facing the impossible task of implementing all the current CSS and JS APIs.

      • Lmaydev@programming.dev

        I don’t see how creating new specs and standards would help with the issues mentioned. Namely specs and standards that have to be implemented.

        OpenGL is already accessible on webpages. As are canvases.
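
        Something along these lines already runs in any mainstream browser (a minimal sketch; the canvas id and the fallback message are just illustrative):

            // Minimal sketch: WebGL is already exposed through the <canvas> element.
            // 'app-canvas' is an illustrative id, not anything required by the API.
            const canvas = document.getElementById('app-canvas');
            const gl = canvas.getContext('webgl2') || canvas.getContext('webgl');
            if (gl) {
              gl.clearColor(0.0, 0.0, 0.0, 1.0); // paint the canvas black
              gl.clear(gl.COLOR_BUFFER_BIT);
            } else {
              console.warn('WebGL not available; fall back to 2D canvas or static content');
            }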

        The only way to achieve it is to have an installable plugin that browsers just embed. Which is exactly what we had before and comes with the issues I’ve mentioned.

        This also requires everyone agreeing to a single spec. Which hasn’t even been achieved with CSS/js.

        • smileyhead@discuss.tchncs.deOP

          With OpenGL there is actually not much need for a new spec :). What we need to stop is <div><div><div><div>... with complicated JavaScript boilerplate. And yes, I know the dev is paid to make a new function by a boss that does not care if it breaks what the web was created for, but I’ll have my rant.

  • DoomBot5@lemmy.world

    You downloaded and installed Flash on your machine, including all its security vulnerabilities. In fact, Flash security became a meme on the early internet for how bad it was.

    • smileyhead@discuss.tchncs.deOP

      It’s not about how you get the functionality, whether it’s built-in, a plugin or additional OS-level software. It’s rather about using HTML+CSS as if they will forever have only one implementation (WebKit, and Gecko adapted to it), which is super unhealthy for the web.

      • DoomBot5@lemmy.world

        Old web stuff is either super broken, or just looks ugly because modern standards have evolved significantly. I don’t see how what’s done today is any different.

        • smileyhead@discuss.tchncs.deOP

          All modern standards are great. HTML5 should enable far more user control over the look of the website.

          But how we use the standards is the problem. We treat them as if Chromium/Gecko were the one everlasting implementation and hack around it. An example is animated icons done not with a simple .webp file, but with many nested divs and hacky CSS, which is going to work… unless it doesn’t.

  • severien@lemmy.world

    A major issue with Flash and Java was that they were foreign citizens - they didn’t interoperate with the rest of the web platform. With webapps you can inspect the application, change CSS styles or fonts, and compose and lay them out with other content. Not possible with Flash/Java.
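
    For instance, you can open the devtools console on any webapp and restyle it on the spot (a minimal sketch; the selector and font are just illustrative):

        // Run in the browser devtools console: restyle a webapp like any other page.
        // The selector and the font choice are illustrative, nothing standard.
        document.querySelectorAll('p, li').forEach((el) => {
          el.style.fontFamily = 'OpenDyslexic, sans-serif'; // swap in a font you prefer
          el.style.fontSize = '1.2rem';
        });
        // Nothing comparable was possible inside a Flash or Java applet blob.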

    • grue@lemmy.world

      With Java at least, that was by design and arguably a good thing. Instead of the bastardized “web apps” we have now that try to shoehorn an application into a web page, Java Web Start was designed to run a full-featured desktop application (complete with Swing UI that mimicked the native OS’s UI) with its own windows and such, just launched from a hyperlink instead of needing to be installed.

      The only real problem with it was that “AJAX” style tech hadn’t been invented yet, so it had to download the whole thing before it could run instead of streaming parts of the app on the fly, and (I think) tended to interact with server-side code with RPC calls instead of the REST style APIs that folks prefer these days.

      In other words, it failed mostly because it was ahead of its time, and Electron apps/PWAs are merely a poor reinvention of it.

      • severien@lemmy.world

        The only real problem with it was that “AJAX” style tech hadn’t been invented yet, so it had to download the whole thing before it could run instead of streaming parts of the app on the fly

        AJAX is a JavaScript-specific technology; Java applets had access to the full network stack, so they could do whatever they wanted in this regard. Java natively supports custom classloaders which can dynamically load classes from the network, but that isn’t widely used and I don’t know if applets leveraged it.

        and (I think) tended to interact with server-side code with RPC calls instead of the REST style APIs that folks prefer these days.

        Java applets had access to the full network stack, so you could use REST-style or RPC-style network calls if you wanted. Java applets also had native RPC capability (with the network being transparent), perhaps that’s what you mean. But all of this is an implementation detail invisible to the user and not a reason why applets sucked.

        In other words, it failed mostly because it was ahead of its time, and Electron apps/PWAs are merely a poor reinvention of it.

        I disagree. They sucked because they were kind of something between a desktop app and a web app, largely combining the disadvantages of both.

        • grue@lemmy.world

          AJAX is a JavaScript-specific technology; Java applets had access to the full network stack, so they could do whatever they wanted in this regard.

          Java applets had access to the full network stack, so you could use REST-style or RPC-style network calls if you wanted. Java applets also had native RPC capability (with the network being transparent), perhaps that’s what you mean.

          That’s why I said “AJAX-style” and not “AJAX.” Although it would’ve been technically possible to do whatever kind of communication they wanted, folks hadn’t really thought of trying to stream parts of the app itself after the rest of it was already running the way they do with JavaScript stuff. You had to wait for the entire .jar to download before it would start, when what it really needed was the ability to download a little stub .jar, start running, and then stream classes on the fly as you accessed them.

          • severien@lemmy.world

            folks hadn’t really thought of trying to stream parts of the app itself after the rest of it was already running the way they do with JavaScript stuff.

            It kind of seems like you have some confusion in the terminology. AJAX doesn’t mean streaming the app parts dynamically, it’s just a client-server request controlled by JavaScript, originally used mostly to pull/post data, not code (the X means XML). Lazy loading application parts is a newer concept mainstreamed by SPAs / bundlers and can be done with AJAX/XHR or other means (injecting script tags or await import).
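
            A minimal sketch of that kind of lazy loading with a dynamic import (the module path, the Editor class and the element ids are illustrative, not real APIs):

                // Lazy loading an application part via dynamic import.
                // './heavy-editor.js' and Editor are illustrative names.
                async function openEditor() {
                  const { Editor } = await import('./heavy-editor.js'); // fetched only on first use
                  new Editor(document.getElementById('editor-root')).mount();
                }
                document.getElementById('open-editor-button')
                  ?.addEventListener('click', openEditor);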

            You had to wait for the entire .jar to download before it would start, when what it really needed was the ability to download a little stub .jar, start running, and then stream classes on the fly as you accessed them.

            As mentioned above, native support to do that has been baked into Java since 1.0. It’s possible some applets even used it. Those that didn’t - their problem. But this practice wasn’t really common in the JS apps of that time either (apps weren’t typically SPAs, but still).

  • grue@lemmy.world

    How do the rules work here? Am I supposed to upvote opinions I agree with, or ones I disagree with?

    Either way, I 100% agree that trying to shoehorn an app into a document format is fundamentally dumb. I’m glad to see somebody other than me saying it, for once!

    • smileyhead@discuss.tchncs.deOP

      I don’t know either. This is the same scheme as on Reddit, so supposedly you upvote if the post fits the community and sparks good discussion, and downvote when it doesn’t, even if you disagree. But most people use it as like and dislike, as that’s more natural.

      For a long time I thought I hated web development, but I couldn’t name why. HTTP is great, HTML is great, CSS is great, JS is great, REST APIs are great… etc. I hate two things: the lack of a clear JavaScript licence format, and what Internet Explorer did to us, which is monoculture and the idea that we can hack text documents into totally custom app interfaces.

      • grue@lemmy.world

        This is the same scheme as on Reddit, so supposedly you upvote if the post fits the community and sparks good discussion, and downvote when it doesn’t, even if you disagree. But most people use it as like and dislike, as that’s more natural.

        The way that r/UnpopularOpinion was supposed to work was non-standard: you were supposed to upvote if and only if you disagreed.

        For a long time I thought I hated web development, but I couldn’t name why. HTTP is great, HTML is great, CSS is great, JS is great, REST APIs are great… etc. I hate two things: the lack of a clear JavaScript licence format, and what Internet Explorer did to us, which is monoculture and the idea that we can hack text documents into totally custom app interfaces.

        There’s one part of that I disagree with: JavaScript is not great. It was hacked together in a week and it shows. To the extent that it’s usable, it’s only because devs were forced to waste millions upon millions of man-hours bolting shit on in an attempt to fix it after the fact.

        The world would’ve been much better off if Brendan Eich had fucked off and Netscape had embedded Scheme or Python instead (which were, in fact, the other options being considered).

  • kinsnik@lemmy.world

    The modern web doesn’t rely on a monoculture of browsers, it relies on an open standard. The monoculture is a consequence of one single company getting too much power over the internet. Flash and Java also weren’t open source or standardized; that is why the web standards were created.

    I do agree that the modern web is a bloated mess, but that is a different topic.

    • smileyhead@discuss.tchncs.deOP

      Of course everything is standardized and we all love that. But building complex apps on top of this standard is temporary and defective unless everyone is using the same browser engine. That is because even if HTML, CSS and JS are always the same, details of the implementation are not, and it’s impossible to support all the nuances. Good luck creating new browsers that can run Vue or React.

      • kinsnik@lemmy.world

        details of the implementation are not, and it’s impossible to support all the nuances

        What are you talking about? Modern web development is hardly a problem (because of the standardization). I am a front-end engineer, I deal with website development every day, and I can count on one hand the times I had browser-specific issues in the last 5 years.

        You know the times that I had to deal with technology-specific issues? With third-party vendors for screen readers, which are not as standardized as the web.

        The issues the web had in the past, where it was impossible to support all the varieties, were intentionally caused by Microsoft creating their own implementation of things for IE (sometimes because there was no standard yet, sometimes against the standard). There have been attempts from Google more recently to add extra incompatibility (like making Google Drive offline only work in Chrome), but nothing as bad as what Microsoft used to do.

        Good luck creating new browsers that can run Vue or React

        What does that mean? Vue and React are just libraries; as long as you implement all the required features, they will work. It would be a ton of work, which is why there are only 2 (3 if you consider WebKit and Chromium different engines) implementations.

  • nek0d3r@lemmy.world

    For how much web content relies on pretty much just Chromium, it’s no comparison to the old days of competing standards between browsers. It’s still somewhat frustrating, but I’d much rather have what we have now than what we had before.

    • andyburke@kbin.social

      My young friends are forgetting the days of IE where we really did almost lose the web. It’s been a constant struggle, always, against the corporate interests who only see the internet as a money printer.

      I am more excited about the web today, with the fediverse for example, than I have been in a long time. Maybe since those days, when the future of any browser but IE was in doubt.

    • smileyhead@discuss.tchncs.deOP

      Because Flash and Java sucked. But I think, as a truly unpopular opinion, that their idea, a custom runtime to run an app, is better than hacking on top of a purely document format. HTML is not PDF; it was not created to always look the same, it should be immutable and work even if some part is missing (not implemented).

  • Dojan@lemmy.world

    It makes websites a mess and relies on everyone using the same monoculture of browsers (like we now have Chromium, WebKit and Gecko, all nearly identical).

    Flash did too, though? Never mind the gaping hole that is security.

    Browser compatibility is generally not an issue since most people target Chromium. There are polyfills, preprocessing, and whatnot to ensure maximum compatibility with the minimum amount of effort, but in the end, if a webapp doesn’t behave the way you want it to on Konqueror, maybe hop over and use it on your preferred flavour of Chromium, or just don’t use the app.
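
    As a minimal sketch of what a polyfill does (Array.prototype.at is just a convenient example of a late-arriving method; this is simplified compared to the spec):

        // Polyfill sketch: patch a method only if the browser doesn't ship it yet.
        if (!Array.prototype.at) {
          Array.prototype.at = function (index) {
            const i = Math.trunc(index) || 0;
            const k = i < 0 ? this.length + i : i;
            return k >= 0 && k < this.length ? this[k] : undefined;
          };
        }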

    Definitely an unpopular opinion but I can one-up you; I think Mozilla and Apple should give up on their respective platforms and move to Chromium. A unified web would be better, so long as no corporation has complete control over it. Mozilla hopping on and commandeering a part of Chromium would go a ways to safeguard that.

    • 4am@lemm.ee

      Now THAT is an unpopular opinion.

      Google is trying to lock down the web with WEI, and they might get away with it since most browsers are Chromium.

      We need more browser engines, not fewer. There are standards, there should be NO differences. Google is doing what Microsoft did with Internet Explorer, waving around their big dick of a monopoly over the user base to create breaking changes that push competition out.

      Fuck Google and fuck Chrome.

      • smileyhead@discuss.tchncs.deOP

        There are standards, there should be NO differences.

        Yes, but this is not possible when you have multiple browsers :). So this is the point: HTML, CSS, JS… all should obey the standards, but websites should not expect that every part of the standard is implemented. Sites should not break if a browser is missing one function, for example.
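
        Something in that spirit, as a minimal sketch (the lazy-image use case and the selectors are just illustrative):

            // Graceful degradation sketch: check for the feature before relying on it.
            if ('IntersectionObserver' in window) {
              // Lazy-load images only when the API actually exists.
              const observer = new IntersectionObserver((entries) => {
                for (const entry of entries) {
                  if (entry.isIntersecting) {
                    entry.target.src = entry.target.dataset.src;
                    observer.unobserve(entry.target);
                  }
                }
              });
              document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
            } else {
              // Older browser: load everything eagerly instead of breaking the page.
              document.querySelectorAll('img[data-src]').forEach((img) => { img.src = img.dataset.src; });
            }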

    • severien@lemmy.world

      A unified web would be better, so long as no corporation has complete control over it.

      You know that Chromium is controlled by Google, right? Only Google gets to decide what goes in and what stays out.

      • Dojan@lemmy.world

        Which is why I think that shouldn’t be the case.

        Originally Chromium was adapted from WebKit, developed by Apple, which in turn had been adapted from KHTML, developed by the KDE project. That’s the way of open source. Chromium is huge and used not just as a browser, but as the foundation for lots of desktop applications.

        No single company should have that kind of power, hence other companies, preferably companies like Mozilla, should step in and democratise the project.

        • severien@lemmy.world

          No single company should have that kind of power, hence other companies, preferably companies like Mozilla, should step in and democratise the project.

          How do you plan to convince Google to give up exclusive control?

          • Zagorath@aussie.zone

            You don’t have to. You just fork the existing Chromium, keep your fork up-to-date with the parts you like, while removing the parts you don’t (like WEI). It’s a job that would be much easier if companies like Mozilla and Apple were doing it, instead of just the much, much smaller companies behind Vivaldi and Brave.

            It would be mutually beneficial, because frankly Firefox has been struggling to keep up with its own development. They were years behind Chrome in implementing the column-span CSS property (April 2016 vs December 2019), and they still today have not, on their iPad OS version, implemented the multiple windows feature introduced in 2019. Every time there’s a new web standard, or a change to an existing standard, Mozilla has to spend time implementing it, along with all the usual time fixing bugs and implementing any new features. Forking Chromium would reduce the amount of work they need to do by sharing that work with Google, Microsoft, Brave, and Vivaldi, leaving more time for their own new features.

            • severien@lemmy.world

              You just fork the existing Chromium, keep your fork up-to-date with the parts you like, while removing the parts you don’t (like WEI).

              You mean like WebKit and Blink kept up to date with each other?

              You have basically two options:

              1. You keep your fork extremely close to the original, so you can keep it up to date. But that means making very few changes and, as a consequence, still leaving Google in control.
              2. You make more changes, but then your fork diverges pretty soon, and you lose the benefit of the cooperation. In the end you end up in a similar position to the one e.g. Mozilla is in now.
                • If Google doesn’t like what you’re doing, they can speed up the divergence by introducing refactorings in the interfaces of the code you modified, which will make keeping your fork up to date with upstream very difficult.
              • Zagorath@aussie.zone

                Google forked WebKit specifically because they didn’t want to remain too similar to it. If either of them had wanted to, they could have kept it close.

                We already have multiple browsers forking Chromium with the features they want and not the ones they don’t. Edge is this. Brave is this. Mozilla would just be the largest non-commercial option for a Chromium fork, neither beholden to an advertising giant nor laden with bloat, which would benefit Brave, Vivaldi, Opera, etc. as well as themselves. It’s a model that works, and works well. The only effect it would have is letting them spend less effort maintaining the basic functionality of the browser.

                • severien@lemmy.world

                  If either of them had wanted to, they could have kept it close.

                  I don’t know if you regularly work with large code bases, but that’s not true. It’s very easy to diverge significantly even if you don’t want to. That’s why there’s so much focus on short-lived branches; long-lived branches cause a lot of pain.

                  Now, if you have a hostile upstream which intentionally tries to make that difficult - that’s a whole other story.

                  We already have multiple browsers forking Chromium with the features they want and not the ones they don’t. Edge is this. Brave is this

                  So, which core web platform features (things like HTML, CSS, JS, DOM, the network stack, WebGL, WASM, the File API, WebVR, WebXR…) do Brave/Edge add or remove? Brave/Edge go with the first option outlined above; they’re more like shells (or skins, if you will) around a largely untouched Blink.

    • Bipta@kbin.social

      Google is currently trying to kill the open web via Chromium. I’m not at all convinced Mozilla could change that, and giving up their foundation in favor of Chromium would only give Google more leverage.

    • smileyhead@discuss.tchncs.deOP

      One Fediverse software, one notepad software, one SMS-capable phone, one HTTP server…

      One standard - nice. One implementation - a killer for any innovation, creating rigid systems impossible to build upon when they start to rot (look at sites still targeting IE11 compatibility to this day).