As someone who spends time programming, I of course find myself in conversations with people who aren’t as familiar with it. It doesn’t happen all the time, but these discussions can surface some pretty wild misconceptions about what programming is and what programmers do.

  • I’m sure many of you have had similar experiences. So, I thought it would be interesting to ask.
  • NeonKnight52@lemmy.ca

    That there’s something inherently special about me that makes me able to program…

    … Yes…patience and interest.

    • stoly@lemmy.world

      Don’t underestimate what having the intuitions needed to engage with mathematics does for you. A significant portion of the population is incapable of that, mostly because we have a very poor way of teaching it as a subject.

      • kaffiene@lemmy.world

        Funny you should say that, as I was thinking that the idea that math has anything to do with programming is the biggest misconception I encounter.

        • datelmd5sum@lemmy.world

          Hey, we did all sorts of crazy shit with linear algebra, vectors, matrices and shit in college programming. Now I sometimes do some basic arithmetic in work life. E.g.:

          n = n + 1

  • fruitycoder@sh.itjust.works

    That the business idea, the design, the architecture, and the code for the next multimillion-dollar app are just sitting in my head, waiting for the next guy with enough motivation to extract them from me.

  • atheken@programming.dev

    Programming and Software Engineering are related, but distinct fields. Programming is relatively easy, Software Engineering is a bit harder and requires more discipline in my opinion.

  • stoly@lemmy.world

    I think that non-tech people think that tech just goes. Like you pull it out of a box and turn it on and it just works. They have no idea how much jank is in everything and how much jank was eradicated before a user ever went anywhere near it.

    • z00s@lemmy.world

      I’m not in IT but used to work with a very old terminal based data storage and retrieval system.

      If the original programmers had implemented a particular feature, it was very easy to enter a command and have it spit out the relevant info.

      But as times changed, the product outgrew its original boundaries, and on a regular basis clients would ask for specific info that would require printing out decades worth of data before searching and editing it to get what the client wanted.

      I can not tell you how many times I heard the phrase, “Can’t you just push a button or something and get the information now??”

      The thing that infuriated me the most was the idea that somehow we could do that, but didn’t want to, as if there was some secret button under the desk that we could push for our favourite clients. Ugh.

  • hawgietonight@lemmy.world

    Doesn’t happen as much anymore, but family and non-tech friends would introduce me to other people who “worked with computers”, thinking I could take new job opportunities. They were always wildly unrelated to my field.

    I know, I know… they acted in good faith, and I probably could have adapted a bit, but 30 years ago there was a lot of overlap and systems were somewhat similar; now somebody trained in Linux kernel maintenance isn’t going to learn how to create SharePoint SPFx webparts. Development is very specific now!

    • Lojcs@lemm.ee

      I’ve been listening to the Stuff You Missed in History Class podcast from the beginning, and whenever something about computers, science or tech comes up they go all hush-hush, “don’t worry, we won’t actually talk about it”, as if the mere mention would scare away listeners.

    • AwkwardLookMonkeyPuppet@lemmy.world

      The best programmers I know are all super creative. You can’t solve real world problems with the limited tools available to us without creativity.

  • Fudoshin ️🏳️‍🌈@feddit.uk

    “Just”

    That one word has done a fuck ton of lifting over my career.

    “Can’t you just make it do this”

    I can’t “just” do anything you fuck head! It takes time and lots of effort!

    • CaptDust@sh.itjust.works

      “Just” is a keyword that means I’m going to triple my estimates. “Just” signifies the product owner has no idea what they are requesting, and it always becomes a dance of explaining why they are wrong.

    • AwkwardLookMonkeyPuppet@lemmy.world

      I get that from our product owners a lot, and I usually say “yes!”, followed by an explanation of how much time it will take and why it’s not the path we want to take. People respond well to you agreeing with them, and then explaining why it’s probably not the best approach.

    • spartanatreyu@programming.dev

      I like to say:

      We have a half-finished skyscraper, and you’re asking me to “just” add a new basement between the second and third floors. Do you see how that might be difficult? If we want to do it, we have to tear down the entire building floor by floor, then build up again from the second floor. Are you prepared to spend the money and push back the release date for that new feature?

    • pinchcramp@lemmy.dbzer0.com

      I would have written that comment if you hadn’t already done it.

      I don’t know exactly why people think that we can “just” do whatever they ask for.

      Maybe it has something to do with how invisible software is to the tech-illiterate person but I’m not convinced. I’m sure there are other professions that get similar treatment.

      • CaptDust@sh.itjust.works

        I know you built the bridge to support 40 ton vehicles, but I think if we just add a beam across the middle here, we should be able to get 200 tons across this no problem? Seems simple, please have it done by Monday!

        • Daedskin@lemm.ee

          I used to work on printer firmware; we were implementing a feature where a text box appeared if you scanned a certain number of pages on a collated, multi-page copy job. The text box told you it would print the pages it had stored to free up memory for more pages; after those pages had printed, another text box would come up asking whether you wanted to keep scanning pages or just finish the job.
          The consensus was that it would be a relatively simple change; three months and 80 changed files, somewhere in the ballpark of 10,000-20,000 lines, proved that wrong.

          • Feathercrown@lemmy.world

            printer firmware is tens of thousands of lines long

            I’m starting to understand why printers are so horrible

            • Daedskin@lemm.ee

              Just what was in the main repo (at least one other repo was used for the more secure parts of the code) was a little over 4 million lines. But yeah, there’s a lot of complexity behind printers that I didn’t think about until I had worked on them. Of course that doesn’t mean they have to be terrible; it’s just easy to end up there without a good plan (spoiler alert: the specific firmware I was working on didn’t have a good plan).

              • Feathercrown@lemmy.world

                Out of curiosity do you have any good examples of this hidden complexity? I’ve always kinda wondered how printers work behind the scenes.

                • Daedskin@lemm.ee

                  A lot of the complexity came from the various scenarios you could be in; my go-to answer, whenever people would ask me “Why can’t someone just make printer firmware simple?”, is that you could, if you only wanted to copy in one size with one paper type, no margin changes, and never do anything else.

                  There are just so many different control paths that need to act differently; many of the bugs I worked on involved scaling and margins. Much of it was trying to make sure the image ended up in a proper form before it made it to hardware (which, adding more complexity, ran on a different processor and OS than the backend so that it could run in real time), while dealing with different input types (flatbed scanner vs. a document feeder, which could be an everyday size or like 3 feet long), different paper sizes, scaling, and output paper. I mainly worked on the copy pipeline, which was also very complex, involving up to something like 7 different pieces in the pipe to transform the image.

                  Each piece in the pipeline was decently complex, with a few having their own team dedicated to them. In theory, any piece that wasn’t an image provider or consumer could go in any order — although in practice that didn’t happen — so it had to be designed around different types of image containers that could come in.

                  All of that also had to work alongside the job framework, which communicated with the hardware, kept track of what state jobs were in, decided when different pieces of the pipeline could be available to different jobs, locked out jobs when someone was using the UI in certain states so that they wouldn’t think what was printing was their job, and handled jobs coming in through any other interface (like network or web).

                  That’s the big stuff that I touched; but there was also localization; the UI and web interfaces as a whole; the more OS side of the printer like logging in, networking, or configuration; and internal pages — any page that the printer generates itself, like a report or test page. I’m sure there’s a lot more than that, and this is just what I’m aware of.
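
                  If it helps to picture it, here’s a completely made-up toy sketch (Python, nothing like the real code) of the “stages passing image containers through a pipeline” idea; the real firmware was vastly bigger and messier:

                  from dataclasses import dataclass, field

                  @dataclass
                  class ImageContainer:
                      pixels: bytes                     # stand-in for the real raster data
                      dpi: int = 300
                      metadata: dict = field(default_factory=dict)

                  def scale(factor):
                      def stage(img):
                          img.metadata["scaled_by"] = factor    # real code would resample the raster
                          return img
                      return stage

                  def add_margins(mm):
                      def stage(img):
                          img.metadata["margin_mm"] = mm        # real code would pad the raster
                          return img
                      return stage

                  def run_pipeline(img, stages):
                      # Stages that are neither the provider nor the consumer could,
                      # in principle, be reordered.
                      for stage in stages:
                          img = stage(img)
                      return img

                  scan = ImageContainer(pixels=b"", metadata={"source": "flatbed"})
                  copy = run_pipeline(scan, [scale(0.5), add_margins(5)])
                  print(copy.metadata)   # {'source': 'flatbed', 'scaled_by': 0.5, 'margin_mm': 5}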

  • aluminium@lemmy.world

    A lot of people completely overrate the amount of math required. Like, it’s probably been a week since I used an arithmetic operator.

    • CheeseNoodle@lemmy.world

      On the other hand, in certain applications you can replace a significant amount of programming ability with a good understanding of vector maths.

    • MrScottyTay@sh.itjust.works

      Sometimes when people see me struggle with a bit of mental maths or use a calculator for something that is usually easy to do mentally, they remark “aren’t you a programmer?”

      I always respond with “I tell computers how to do maths, I don’t do the maths”

      • lemmyvore@feddit.nl

        Which leads to the other old saying, “computers do what you tell them to do, not what you want them to do”.

        As long as you don’t turn that around and let the computer dictate how you think.

        I think it was Dijkstra who complained in one of his essays about naming uni departments “Computer Science” rather than “Computing Science”. He said it’s a symptom of a dangerous slope where we build our work as programmers around specific computer features or even specific computers, instead of using them as tools that enable our minds to ask and verify more and more interesting questions.

        • huginn@feddit.it

          The scholastic discipline deserves that kind of nuance and Dijkstra was one of the greatest.

          The practical discipline requires you to build your work around specific computers. Much of the hard-earned domain knowledge I’ve gained as a staff software engineer would be useless if I changed the specific computer it’s built around - Android OS. An Android phone has very specific APIs, code patterns and requirements. Being ARM, even its underlying architecture is fundamentally different from the majority of computers (for now; we’ll see how much the M1-style ARM arch becomes the standard for anyone other than Mac).

          If you took a web dev with 10 YOE and dropped them into my Android code base and said “ok, write”, they should get the structure and basics, but I would expect them to make mistakes common to a beginner in Android, just as I, stuck in a web dev environment and told to write, would make mistakes common to a junior web dev.

          It’s all very well and good to learn the core of CS: the structures used and why they work. Classic algorithms and when they’re appropriate. Big O and algorithmic complexity.

          But work in the practical field will always require domain knowledge around specific computer features or even specific computers.

          • lemmyvore@feddit.nl

            I think Dijkstra’s point was specifically about uni programs. A CS curriculum is supposed to make you train your mind for the theory of computation not for using specific computers (or specific programming languages).

            Later during your career you will of course inevitably get bogged down into specific platforms, as you’ve rightly noted. And that’s normal because CS needs practical applications, we can’t all do research and “pure” science.

            But I think it’s still important to keep it in mind even when you’re 10 or 20 or 30 years into your career and deeply entrenched into this and that technology. You have to always think “what am I doing this for” and “where is this piece of tech going”, because IT keeps changing and entire sections of it get discarded periodically and if you don’t ask those questions you risk getting caught in a dead-end.

            • Miaou@jlai.lu

              He has a rant where he calls software engineers basically idiots who don’t know what they’re doing, saying the need for unit tests is proof of failure. The rest of the rant is just as nonsensical, basically waving away all problems as trivial exercises left to the mentally challenged practitioner.

              I have not read anything from/about him besides this piece, but he reeks of that all-too-common, insufferable academic condescension.

              He does have a point about the theoretical aspect being often overlooked, but I generally don’t think his opinion on education is worth more than anyone else’s.

              Article in question: https://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD1036.html

              • didnt_readit@lemmy.world

                Sounds about right for an academic computer scientist; they are usually terrible software engineers.

                At least that’s what I saw from the terrible coding practices my brother learned during his CS degree (and what I’ve seen from basically every other recent CS grad entering the workforce who didn’t do extensive side projects and self-teaching), which I then had to spend years helping him unlearn when we worked together on a startup idea, writing lots of code.

    • lemmyvore@feddit.nl

      At the same time, I find it amazing how many programmers never make the cognitive jump from the “playing with legos” mental model to “software is math”.

      They’re both useful, but to never understand the latter is a bit worrying. It’s not about using math, it’s about thinking about code and data in terms of mapping arbitrary data domains. It’s a much more powerful abstraction than the legos and enables you to do a lot more with it.
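
      To make that concrete, here’s a tiny made-up Python example: the whole “program” is nothing but a mapping from one data domain (free-form survey answers) to another (answer frequencies), and you reason about the mapping rather than the individual steps:

      from collections import Counter

      def summarize(answers):
          # domain A: a list of free-form answers
          # domain B: answer -> fraction of respondents who gave it
          counts = Counter(a.strip().lower() for a in answers)
          total = sum(counts.values())
          return {answer: n / total for answer, n in counts.items()}

      print(summarize(["Yes", "yes", "No", "YES "]))   # {'yes': 0.75, 'no': 0.25}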

      For anybody who finds themselves in this situation I recommend an absolute classic: Defmacro’s “The Nature of Lisp”. You don’t have to make it through the whole thing and you don’t have to know Lisp; hopefully it will click before the end.

      • Lojcs@lemm.ee

        Read that knowing nothing of lisp before and nothing clicked tbh.

        When talking about tools that simplify writing boilerplate, it only makes sense to me to call them code generators if they generate code for another language. Within a single language, a tool that simplifies complex tasks is just a library, or could be implemented as one. I don’t see the point about programmers not utilizing “code generation” because it requires external tools. They say that if such tools existed in the language natively:

        we could save tremendous amounts of time by creating simple bits of code that do mundane code generation for us!

        If code is to be reused you can just put it in a function, and doing that doesn’t take more effort than putting it in a code-generation thingy. They preach about how the XML script (and Lisp, I guess) lets you introduce new operators and change the syntax tree to make things easier, but don’t acknowledge that functions, operator overloading, etc. accomplish the same thing, only with different syntax, then go on to say this:

        We can add packages, classes, methods, but we cannot extend Java to make addition of new operators possible. Yet we can do it to our heart’s content in XML - its syntax tree isn’t restricted by anything except our interpreter!

        What difference does it make whether the syntax tree changes depending on your code or the call stack changes depending on your code? Of course, if you define an operator (apparently also called a function in Lisp) somewhere else, it’ll look better than doing each step one by one as in the Java example. Treating functions as keywords feels like a completely arbitrary decision. Honestly, they could claim Lisp has no keywords/operators and it would be more believable. If there is to be a syntax tree, the parentheses seem to be a better choice for what changes it than the functions, which just determine what happens at each step like any other function. And even going by their definition, I like having a syntax that does a limited number of things in a more visually distinct way more than a syntax that does limitless things all in the same monotonous way.

        Lisp comes with a very compact set of built in functions - the necessary minimum. The rest of the language is implemented as a standard library in Lisp itself.

        Isn’t that how every programming language works? It feels unfair to raise this as an advantage against a markup language.

        Data being code and code being data sounded like it was leading to something interesting, until it was revealed that functions are a separate type and that you need to mark non-function lists with an operator for them not to get interpreted as functions. Apart from the visual similarity in how it’s written, due to the syntax limitations of the language, data doesn’t seem any more like code in Lisp than evaluating strings in Python does. If the data is valid code it’ll work; otherwise it won’t.

        The only compelling part was where the same compiler for the code is used to parse incoming data and perform operations on it, but even that doesn’t feel like a game changer unless you’re forbidden from using libraries for parsing.

        Finally, I’m not sure how the article relates to code being math either. It just felt like inventing new words for existing things and insisting that they’re different. Or maybe I just didn’t get it at all. Sorry if this was uncalled for; it’s just that I had expected more after being promised enlightenment by the article.

        • didnt_readit@lemmy.world

          This is a person that appears to actually think XML is great, so I wouldn’t expect them to have valid opinions on anything really lol

        • nottelling@lemmy.world

          Function/class/variables are bricks, you stack those bricks together and you are a programmer.

          I just hired a team to work on a bunch of Power platform stuff, and this “low/no-code” SaaS platform paradigm has made the mentality almost literal.

          • amio@kbin.social

            I think I misunderstood lemmyvore a bit, reading some criticism into the Lego metaphor that might not be there.

            To me, “playing with bricks” is exactly how I want a lot of my coding to look. It means you can design and implement the bricks, connectors and overall architecture, and end up with something that makes sense. If running with the metaphor, that ain’t bad, in a world full of random bullshit cobbled together with broken bricks, chewing gum and exposed electrical wire.

            If the whole set is wonky, or people start eating the bricks instead, I suppose there’s bigger worries.

            (Definitely agree on “low code” being one of those worries, though - turns into “please, Jesus Christ, just let me write the actual code instead” remarkably often. I’m a BizTalk survivor and I’m not even sure that was the worst.)

      • ChubakPDP11+TakeWithGrainOfSalt@programming.dev

        I think you are being irresponsible towards your future if you are a gainfully employed self-taught programmer and don’t invest in formal education. If you say “I don’t have time!”, well, consider this: even night classes at junior colleges teach you stuff you don’t know. Go to them, get your associate’s. I am in the process of getting into a contract where I do some thankless jobs for someone I know, and he in exchange pays me to go to college. I am 31 – as I said in the other thread, THERE IS NOTHING WRONG WITH BEING A LATE-COLLEGER!

        I have been to college; I studied 3 subjects for a total of 9 semesters and have no degree to show for any of them :( I quit English lit, French lit and “Programming”, each after 3 semesters. But when I was studying French lit, there was a guy in our class who was SIXTY-FIVE YEARS OLD! He wanted to learn French to open up a commerce consulting office, focusing on import/export from France.

        What I wanted to do was to ‘write’ (keep in mind, ‘write’, not ‘draw’) bande dessinée! But now that I am older and hopefully wiser, I have a set goal in mind. I am going to go to this ‘boutique’ college near our home to study Electronics Engineering and, when push comes to shove and China makes its move, start a chipset engineering firm with a production wing.

        Just like how electronics is math with physics, programming is the virtual aspect of it: it’s ‘applied math’. I understand enough discrete math because I studied enough of it both in college and in high school (since I was on the math/physics elective track), so I have managed to understand some very rough and old papers.

        You can always self-study if you haven’t got the time. Here’s a book which is kind of a meme, but it’s still very good: https://fuuu.be/polytech/INFOF408/Introduction-To-The-Theory-Of-Computation-Michael-Sipser.pdf

        This is the 2nd edition though, 3rd is out — I think 4th is coming. The best goddamn book, regardless of its meme status.

  • blazeknave@lemmy.world

    As a non-dev (tinker for fun) observer- it sounds like your friends and family think you’re working in IT, but their assumptions thereafter are fair. Is that accurate? That the misconception is software dev does not equal IT?

    • Buddahriffic@lemmy.world

      It goes a bit farther than that, even: IT work doesn’t always equal IT work. Someone can be an expert in managing Linux-based load sharing servers and have no idea how to help a family member troubleshoot why their windows install is slow. Sure, they might have a better idea about how to start, but they’d be essentially starting from scratch for that specific problem rather than being able to apply any of their expertise to it.

      Think of it like a programmer is a car builder, some IT people drive them for a living, others are mechanics. Someone who specializes in driving F1 cars might not have any idea why your car is rattling. The programmer might be able to figure it out if they built that car or the cause is something similar to what they see in the ones they have built. But if they build semis, odds are that isn’t the case. But they might have a better idea than say a doctor.

      • Neofox@lemmy.world

        I use the medicine analogy: you wouldn’t ask your dentist or even your GP to operate on your brain; that doesn’t mean they are not good at what they do, though.

  • mox@lemmy.sdf.org

    The notion that creating a half-decent application is quick and easy enough that I would be willing to transform their idea into reality for free.

    • Lung@lemmy.world

      I’m pretty sure that government software always blows because they think software can be written according to a fixed schedule and budget

      It’s tempting to think it’s like building a house, and if you have the blueprints & wood, it’ll just be fast and easy. Everything will go on schedule

      But no, in software, the “wood” is always shape shifting, the land you’re building on is shape shifting, some dude in Romania is tryna break in, and the blueprints forgot that you also need plumbing and electric lines

      • mathemachristian@lemm.ee

        It’s tempting to think it’s like building a house, and if you have the blueprints & wood, it’ll just be fast and easy. Everything will go on schedule

        it never goes according to schedule even if there are blueprints & wood

  • treechicken@lemmy.world

    Not programming per se but my sister thinks it’s okay to have 300+ Chrome tabs open and just memorize the relative locations of them whenever she needs something. She’s lucky she has a beefy computer.

  • AwkwardLookMonkeyPuppet@lemmy.world

    It has been a long time since I’ve interacted with people who are largely tech ignorant, but back in the day people always assumed I could hack anything since I’m a website developer. It wasn’t uncommon for people to ask me if I can hack Facebook. I mean the answer is “probably not, but maybe”, but they think that means furiously typing for 20 seconds and yelling “I’m in!”, when the reality would be months worth of snooping and social engineering.

    • darkpanda@lemmy.ca

      Why wouldn’t you just create a GUI interface in Visual Basic to track their IP addresses tho?

      • coloredgrayscale@programming.dev

        That was a decade or two ago. Now you need a react SPA webapp using angular and Rust and utilize the bandwidth of the Cloud with machine learning. To find the IP.

    • scorpionix@feddit.de

      A friend asked me to attempt data recovery on some photos which ‘vanished’ off a USB stick.

      Plugged it in, checked for potential hidden trash folders, then called it a day. Firstly, I have never done data forensics, and secondly: No backup? No mercy.

        • scorpionix@feddit.de

          Well, here’s the important part:

          I have never done data forensics

          So yeah, I didn’t know that at the time. Anyway: Which tools are you talking about in particular?

          • korok@possumpat.io

            Someone else already named some tools, so I won’t repeat them. But the reason this works is that even once you clear out those trash folders, the OS usually only removes the pointer to where the data lives on the disk, and the disk space itself isn’t overwritten until it’s needed to save another file. This is why these tools have a much higher chance of success the sooner they’re run after file deletion.
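
            If you’re curious what those tools do under the hood, roughly: they scan the raw device for known file signatures (“file carving”). A very rough Python sketch of the idea, assuming you’ve already made a raw image of the stick (e.g. with dd); real tools like PhotoRec are far more robust, and all names here are made up:

            SOI = b"\xff\xd8\xff"   # JPEG start-of-image marker
            EOI = b"\xff\xd9"       # JPEG end-of-image marker

            def carve_jpegs(image_path, out_prefix="recovered"):
                data = open(image_path, "rb").read()
                count = 0
                start = data.find(SOI)
                while start != -1:
                    end = data.find(EOI, start)
                    if end == -1:
                        break
                    with open(f"{out_prefix}_{count}.jpg", "wb") as f:
                        f.write(data[start:end + 2])   # include the EOI marker
                    count += 1
                    start = data.find(SOI, end)
                return count

            print(carve_jpegs("stick.img"), "candidate JPEGs carved")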