Archived link

Polyfill.js is a popular open-source library for supporting older browsers, and more than 100,000 sites embed it via the cdn.polyfill.io domain; notable users include JSTOR, Intuit, and the World Economic Forum. However, in February this year a Chinese company bought the domain and the GitHub account. Since then, the domain has been caught injecting malware on mobile devices via any site that embeds cdn.polyfill.io. Complaints were quickly removed (archive here) from the GitHub repository.
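
For context, the typical integration was a single script tag pointing straight at the third-party CDN, which decided per user agent which polyfills to serve. The sketch below is illustrative only; the exact path and query string varied by site.

```html
<!-- Illustrative sketch of the common embed pattern (path/query vary per site) -->
<script src="https://cdn.polyfill.io/v3/polyfill.min.js?features=default"></script>
```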

  • originalucifer@moist.catsweat.com (+167/-26) · 2 years ago

    nah. over 100k sites ignored dependency risks, even after the original owners warned them this exact thing would happen.

    the real story is 100k sites not being run appropriately.

    • douglasg14b@lemmy.world (+79/-4) · 2 years ago

      That’s not how systemic problems work.

      This is probably one of the most security ignorant takes on here.

      People will ALWAYS fuck up. The world we craft for ourselves must take the “human factor” into account, otherwise we amplify the consequences of what are predictable outcomes. And ignoring predictable outcomes to take some high ground doesn’t carry far.

      The majority of industries that actually have immediate and potentially fatal consequences do exactly this, and have been for more than a generation now.

      Damn near everything you interact with on a regular basis has been designed at some point in time with human psychology in mind. Built on the shoulders of decades of research and study results, that have matured to the point of becoming “standard practices”.

      • oce 🐆@jlai.lu (+4/-2) · 2 years ago

        Ok, people will always fuck up, so what do you do?

        The majority of industries that actually have immediate and potentially fatal consequences do exactly this, and have been for more than a generation now.

        All the organizations (including public ones) getting hit with ransomware and having their data stolen: is that because the consequences are not that bad? Is that not gross negligence?

        • douglasg14b@lemmy.world (+1) · 2 years ago

          I’m not sure if this is just a rhetorical question or a real one?

          Because I didn’t claim it isn’t negligence. It is negligent; however, it is not a problem solvable by just pointing fingers. It’s a problem that’s solvable through stricter regulation and compliance.

          Cyber security is almost exactly the same as safety in other industries. It takes the same mindset, it manifests in the same ways under the same conditions, it tends to only be resolved and enforced through regulations…etc

          And we all know that safety is not something solvable by pointing fingers and saying “Well, Joe Schmo shouldn’t have had his hand in there then.” You develop processes to avoid predictable outcomes.

          That’s the key word here, predictable outcomes, these are predictable situations with predictable consequences.


          The comment above mine is effectively victim blaming, it’s just dismissing the problem entirely instead of looking at solutions for it. Just like an industry worker being harmed on the job because of the negligence of their job site, there are an incredibly large number of websites compromised due to the negligence of our industry.

          Just like the job site worker who doesn’t understand the complex mechanics of the machine they are using to perform their work, the website owner or maintainer does not understand the complex mechanics of the dependency chains their services or sites rely on.

          Just like a job site worker may not have a good understanding of risk and risk mitigation, a software engineer does not have a good understanding of cybersecurity risk and risk mitigation.

          In a job site this is up to a regulatory body to define, utilizing the expertise of many, and to enforce this in job sites. On job sites workers will go through regular training and exercises that educate them about safety on their site. For software engineers there is no regulatory body that performs enforcement. And for the most part software engineers do not go through regular training that informs them of cybersecurity safety.

    • ShaunaTheDead@fedia.io (+47) · 2 years ago

      One place I worked at recently was still using Node version 8. Running npm install would give me a mini heart attack… Like 400+ critical vulnerabilities, and several thousand vulnerabilities overall.

      • corsicanguppy@lemmy.ca (+20/-1) · 2 years ago

        Running npm install would give me a mini heart attack

        It should; but more because it installs things right off the net with no validation. Consistency of code product is not the only thing you’re tossing.
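
        A minimal sketch of making that install less of a leap of faith, assuming a committed package-lock.json (standard npm commands, shown here only as an illustration):

        ```
        # Install exactly what the lockfile pins (including integrity hashes),
        # instead of whatever the registry happens to resolve today.
        npm ci

        # Then surface known vulnerabilities before shipping.
        npm audit --audit-level=high
        ```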

        • LordCrom@lemmy.world (+9) · 2 years ago

          How else would you get LPAD ? Expect me to write 2 lines of code when I could just import a 100 Mb library to do it for me?
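
          (For the record, a sketch of the two lines in question, assuming a runtime with String.prototype.padStart:)

          ```js
          // Left-pad without pulling in a dependency.
          const lpad = (str, len, ch = ' ') => String(str).padStart(len, ch);
          console.log(lpad('42', 5, '0')); // "00042"
          ```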

        • AA5B@lemmy.world (+3) · 2 years ago

          You need to get up to date from three years ago. NodeJS 16.20, or thereabouts, enabled dependency auditing by default.

          I’m still fighting my engineers to get current enough to use this (but we do have a proxy artifact server that also attempts to keep downloads clean, and a dependency scanner).

      • unalivejoy@lemm.ee (+5) · 2 years ago

        If you’re on RHEL 8+, you can install the latest version of node with dnf.

        dnf install nodejs will likely install node 8 :(. Use dnf module install nodejs:20 to install the latest version.

    • Optional@lemmy.world (+33/-3) · 2 years ago

      the real story is 100k sites not being run appropriately.

      Same as it ever was. Same as it ever was. Same as it ever was.

      • Warl0k3@lemmy.world (+56/-1) · 2 years ago

        I don’t think we have to choose. “Maintain your websites so you don’t get taken advantage of” and “Here’s an example of a major-world-power-affiliated group exploiting that thing you didn’t do” are both pretty important stories.

      • themurphy@lemmy.ml (+27/-1) · 2 years ago

        The malware thing still deserves a headline. They just argue it’s stupid that so many sites even have to use the library to begin with.

    • letsgo@lemm.ee (+10) · 2 years ago

      What rules can we add that solve this problem? (I’ve tried DDG but didn’t find any results)

      • Supermariofan67@programming.dev (+31) · 2 years ago

        This one is already in the default uBlock filters - Badware risks

        I also strongly suggest adding https://big.oisd.nl/ as a filter list. It’s a large and well maintained domain blocklist (sourced from combining lots of other blocklists) that usually adds lots of these sorts of domains quickly and has very few false positives.

        If you want to take it even further, check out the Pro list and Threat Intelligence Feeds list here https://github.com/hagezi/dns-blocklists

        These can all be added to a pihole too if you use one.
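
        For illustration, a hand-written filter entry targeting this incident would look something like the sketch below (the stock "Badware risks" list should already cover it, so this is belt-and-suspenders):

        ```
        ! uBlock Origin "My filters" sketch: block the compromised CDN outright
        ||cdn.polyfill.io^
        ```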

      • ChilledPeppers@lemmy.world (+3) · 2 years ago

        cdn.polyfill.io^ ? By now it was probably already added to the default lists tho…

        (I don’t really understand these things tho, so correct me if I’m wrong)

    • cheese_greater@lemmy.world (+23/-5) · 2 years ago

      Is there a way to ground them for a month while they think about what they did?

      Take away his honey. No more honey on school nights.

      Edit: could the US/NATO ground China in some debilitating/deterrent way? Like geopolitically/economically spank them and send them to bed without dinner until they come to Jesus?

      • unalivejoy@lemm.ee (+18/-4) · 2 years ago

        All Chinese businesses are owned by the CCP, except the ones that get caught being naughty. Suddenly those are a private business with no ties to the party.

      • Strykker@programming.dev (+7) · 2 years ago

        All it would really take is internet providers black-holing the Chinese AS numbers in their BGP configs. Then, boom, China basically can’t talk to the rest of the world.

        • cheese_greater@lemmy.world (+7/-2) · 2 years ago

          This should be done with the new axis of evil and let them see how much they truly hate and “need” the destruction of the decadent West. It’s insane their shenanigans are still being tolerated at all; cut ’em off, let them build their own self-sustaining economies, and force the West to eliminate its dependence on mercurial and malicious actors on the world stage.

          • Allero@lemmy.today (+4/-1) · 2 years ago

            Let’s not make the splinternet a reality, pretty please.

            Chinese scaling and manufacturing, Russian IT expertise, Iranian experience of sanctions evasion and North Korean hacking and remote operations mastery are not the combo you want to bet against.

            They would absolutely build the self-sustaining economy and rival networks, but in the process it would destroy the Internet as we know it, and break communication channels that are vital for democracy and international peace, while also breaking communications between relatives and friends on the two sides.

  • dan@upvote.au (+67/-1) · 2 years ago

    My favourite part is that the developers that currently own it said:

    Someone has maliciously defamed us. We have no supply chain risks because all content is statically cached

    https://github.com/polyfillpolyfill/polyfill-service/issues/2890#issuecomment-2191461961

    Completely missing the point that they are the supply chain risk, and the fact that malicious code was already detected in their system (to the point where Google started blocking ads for sites that loaded polyfill.io scripts).

    We don’t even know who they are - the repo is owned by an anonymous account called “polyfillpolyfill”, and that comment comes from another anonymous account “polyfillcust”.

  • dan@upvote.au (+49) · 2 years ago

    Reposting my comment from Github:

    A good reminder to be extremely careful loading scripts from a third-party CDN unless you trust the owner 100% (and even then, ownership can change over time, as shown here). You’re essentially giving the maintainer of that CDN full control of your site. Ideally, never do it, as it’s just begging for a supply chain attack. If you need polyfills for older browsers, host the JS yourself. :)

    If you really must load scripts from a third-party, use subresource integrity so that the browser refuses to load it if the hash changes. A broken site is better than a hacked one.
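
    A minimal sketch of what that looks like; the URL is hypothetical and the hash is a placeholder you’d generate from the exact file you expect (e.g. with openssl dgst -sha384 -binary file.js | openssl base64 -A):

    ```html
    <!-- The browser refuses to run the script if its bytes no longer match the pinned hash. -->
    <script
      src="https://third-party.example.com/lib/widget.min.js"
      integrity="sha384-REPLACE_WITH_BASE64_SHA384_OF_THE_FILE"
      crossorigin="anonymous"></script>
    ```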


    And on the value of dynamic polyfills (which is what this service provides):

    Often it’s sufficient to just have two variants of your JS bundles, for example “very old browsers” (all the polyfills required by the oldest browser versions your product supports) and “somewhat new browsers” (just polyfills required for browsers released in the last year or so), which you can do with browserslist and caniuse-lite data.
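
    As a sketch, that kind of split can live in package.json via browserslist environments (the environment names and queries below are illustrative, not a recommendation):

    ```json
    {
      "browserslist": {
        "modern": ["last 2 versions", "not dead"],
        "legacy": ["> 0.5%", "IE 11"]
      }
    }
    ```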

      • dan@upvote.au (+1) · 2 years ago

        You’d be surprised how much code people blindly reuse without even looking at it, especially in JavaScript. The JS standard library is ridiculously small, so nearly all JS apps import third-party code of some sort. One JS framework can pull in hundreds of third-party modules.

    • Echo Dot@feddit.uk (+0) · 2 years ago

      Yeah, I used to be guilty of this. Although, in slight defense of myself, I never used random sites like that; I always pulled everything from Google’s CDN, since I can’t see that changing hands.

      They may very well shut it down without warning, but they’re probably not going to sell it to anyone.

      • dan@upvote.au (+1) · 2 years ago

        Yeah, it really depends on how much you trust the vendor.

        Google? Say what you want about the company, but they’ll never intentionally serve malware.

        Random company with no track record, where we don’t even know who is maintaining the code? Much less trustworthy. The polyfill.io repo is currently owned by a GitHub user called “polyfillpolyfill” with no identifying information.

        Third-party CDNs make less sense these days though. A lot of hosting services have a CDN of some sort. Most sites have some sort of build process, and you usually bundle all your JS and CSS (both your code and third-party code, often as separate bundles) as part of that.
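
        For example, rather than hotlinking a polyfill CDN, the build can pull the polyfills into your own bundle; a rough sketch using core-js (one common choice, shown only as an illustration):

        ```js
        // Entry point of a hypothetical bundle: ship the polyfills as part of your own JS,
        // selected against your browserslist targets at build time.
        import 'core-js/stable';
        import 'regenerator-runtime/runtime';
        ```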

  • QuadratureSurfer@lemmy.world (+20) · 2 years ago

    That GitHub “archive here” link leads to a page where it hasn’t been archived… (or was the archive removed??).

  • Bertuccio@lemmy.world (+21/-2) · 2 years ago

    Whichever editor let them post “100 thousand” should be spanked one 100 times with the severed hand of whatever asshole wrote it in the first place.

  • NutWrench@lemmy.world (+17/-6) · 2 years ago

    This is probably connected to China cloning the entire GitHub website to their own servers.

    • UnderpantsWeevil@lemmy.world (+1) · 2 years ago

      Frustrating that the article doesn’t specify and simply links to a different Github page which doesn’t clearly specify the problem either.

      I have to assume the site’s article was dynamically generated, without any actual tech journalist doing the reporting. The byline is “Sansec Forensics Team”, which doesn’t even link out to the group. Also, the “Chinese company” isn’t named either in the article or the references, which is incredibly shoddy reporting. The archive link is dead.

      This whole page is indicative of the failed state of tech journalism. A genuinely explosive story, but it’s so threadbare and vague that it becomes meaningless.

  • sunzu@kbin.run (+19/-36) · 2 years ago

    Noscript would fix this issue… Deny most of that shit and internet still works… Mostly

    • 9point6@lemmy.world (+66/-6) · 2 years ago

      Not a solution. Much of the modern web is reliant on JavaScript to function.

      Noscript made sense when the web was pages with superfluous scripts that enhanced what was already there.

      Much of the modern web is web apps that fundamentally break without JS. And picking and choosing unfortunately won’t generally protect from this because it’s common practice to use a bundler such as webpack to keep your page weight down. This will have been pulled in as a dependency in many projects and the site either works or does not based on the presence of the bundle.

      Not saying this is a great situation or anything, but suggesting noscript as a solution is increasingly anachronistic.

      • dan@upvote.au (+4) · 2 years ago

        This will have been pulled in as a dependency in many projects and the site either works or does not based on the presence of the bundle.

        This wasn’t bundled. People inserted a script tag pointing to a third-party CDN onto their sites. The output changes depending on the browser (it only loads the polyfills needed for the current browser) so you can’t even use a subresource integrity hash.
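
        If you do want conditional loading without handing that control to a third party, a rough sketch is to feature-detect in the page and fall back to a self-hosted bundle (the path and feature checks below are made up for illustration):

        ```js
        // Load a legacy polyfill bundle you build and host yourself, only when needed,
        // instead of letting a remote CDN vary the payload per user agent.
        if (!('fetch' in window) || !Array.prototype.includes) {
          var s = document.createElement('script');
          s.src = '/static/polyfills-legacy.js'; // hypothetical self-hosted bundle
          document.head.appendChild(s);
        }
        ```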

      • Optional@lemmy.world (+15/-11) · 2 years ago

        Much of the modern web is reliant on JavaScript to function.

        “function” is doing a lot of lifting there. Trackers, ads, and assorted other bullshit is not the kind of functioning anyone needs.

        It’s true the average user gets flummoxed quickly when the scripts are blocked, but they can either sink (eat ads and trackers) or swim (learn what scripts to allow). (Spoiler: they almost always sink)

      • btaf45@lemmy.world (+3) · 2 years ago

        Not a solution. Much of the modern web is reliant on JavaScript to function.

        And much of it works better and faster without JavaScript. Some sites don’t work in Noscript, but most sites run faster and work well enough.

          • dan@upvote.au (+1) · 2 years ago

            In this case the script wasn’t bundled at all - it was hotlinked from a third party CDN. Adding malicious code instantly affects all the sites that load it.

            The output differs depending on browser (it only loads the polyfills your browser needs) so it’s incompatible with subresource integrity.

        • PopOfAfrica@lemmy.world (+5) · 2 years ago

          Imo, computing, like all other things, requires a little trust and risk. The problem is most people are wayyy too trusting in general.

      • parpol@programming.dev (+8/-14) · 2 years ago

        I definitely prefer using no-script enabled pages. If it were me, I would prefer a fully non-JavaScript internet with static pages.

        JavaScript introduces so many vulnerabilities that it makes Adobe Flash Player look like a security suite. JavaScript also breaks all accessibility features like speech recognition and font size and color control.

        • 9point6@lemmy.world (+19/-2) · 2 years ago

          Flash was orders of magnitude worse than the risk of JS today; it’s not even close.

          Accessibility is orthogonal to JavaScript if the site is being built to modern standards.

          Unfortunately, preference is not reality: the modern web uses JavaScript, and NoScript is not an effective enough solution.

            • 9point6@lemmy.world (+2) · 2 years ago

              Well, by that measure, you don’t need JavaScript to make inaccessible sites, there are plenty of sites out there that ruin accessibility with just HTML and CSS alone.

              It’s always up to the developer to make sure the site is accessible. At least now it seems to be something that increasingly matters to search result rankings

              • parpol@programming.dev (+2) · 2 years ago

                You really can’t. If it was only HTML and CSS, any accessibility program would be able to select any part of the page, and easily alter the CSS and HTML. That is next to impossible now because of JavaScript.

                It shouldn’t be up to the website developer. It should be up to the browser developer. You don’t blame a lemmy instance for poor accessibility with Jerboa.

          • parpol@programming.dev (+1) · 2 years ago

            Flash was containerized, and completely safe until adobe just stopped supporting it. A million times better than what JavaScript has become in terms of privacy. There is a reason noscript is bundled with Tor.

            And preference is definitely a reality. It is niche at the moment but I see a future where more and more people see JavaScript for what it is. Bloat.

            • 9point6@lemmy.world (+2) · 2 years ago

              Flash ran as a browser plugin (as in not an extension, but a native binary that is installed into the OS and runs beside the browser, we basically don’t do this for anything now)

              Flash was pretty much on weekly security bulletins in its final years; arbitrary code execution and privilege escalation exploits were common. That’s why Adobe killed it.

              Flash was never safe and comparing JavaScript to it as a greater risk shows you’ve not fully understood the threat model of at least one of the two.

              • parpol@programming.dev (+1) · 2 years ago

                We still use plugins. In fact you most likely have one installed right now for video encoding. JavaScript not being a plugin is the reason we only have two major browser cores: Chromium and Gecko. JavaScript prevents new browsers from entering the ecosystem because of how hard it is to implement, unlike how easy it would have been as a plugin.

                Flash had vulnerabilities because of neglect from Adobe. The core design of Flash, and its earlier stages made by Macromedia, was great. It had a sandboxed environment, and later it was even integrated into a browser sandbox just like JavaScript, eliminating most vulnerabilities.

                Flash was very limited in the malicious code it could run, as opposed to JavaScript, which can automatically redirect you to malicious websites, install tracking cookies, access the browser canvas to install tracking pixels, freeze your entire browser, take control of your cursor, look at your entire clipboard history, and collect enough information about you to completely identify and track your footprint over the entire internet.

                Flash couldn’t access your clipboard or files unless you clicked allow every time, and it couldn’t access anything outside of its little window; if it froze, the browser was mostly unaffected. Flash also had almost no ability to collect any data about your browser.

    • dactylotheca@suppo.fi (+30/-4) · 2 years ago

      and internet still works… Mostly

      That load-bearing “mostly” is doing a lot of work here.

      I invite everybody to find out how everything “mostly” works if you disable “most of” javascript – also have fun deciding which parts to enable because you think they’re trustworthy

      • valaramech@fedia.io (+11/-1) · 2 years ago

        I actively do this with uMatrix - granted, I only block non-first-party JavaScript. Most sites I visit only require a few domains to be enabled to function. The ones that don’t are mostly ad-riddled news sites.

        There are a few exceptions to this - AWS and Atlassian come to mind - but the majority of what I see on the internet does actually work more or less fine when you block non-first-party JavaScript, and some of it even works with scripts blocked entirely. uMatrix also has handy bundles built-in for certain things like sites that embed YouTube, for example, that make this much easier.

        Blocking non-first-party scripts like I do does actually solve this issue for the most part, since, according to the article, only bundles that came from the cdn.polyfill.io domain itself were the problem.

        • dactylotheca@suppo.fi (+8) · 2 years ago

          You’re still trusting that the 1st party javascript won’t be vulnerable to supply chain attacks, though

          • valaramech@fedia.io (+3/-1) · 2 years ago

            In my experience, first-party JavaScript is more likely to be updated so rarely that bugs and exploits are more likely than supply chain attacks. If I heard about NPM getting attacked as often as I hear about CDNs getting attacked, I’d be more concerned.

            • vxx@lemmy.world (+1/-2) · 2 years ago

              Funny that they want you to allow all JavaScript but then criticise first-party scripts for being unsafe.

              I bet [insert random autocrat here] would approve of that message.

      • Optional@lemmy.world (+6/-2) · 2 years ago

        I invite everybody to find out how everything “mostly” works if you disable “most of” javascript – also have fun deciding which parts to enable because you think they’re trustworthy

        Having done this for many many years, I can tell you: if you allow the site scripts (which is an acknowledgement of js at least), and a few “big” ones like ajax.google.com, jquery.com, and ytimg.com, etc., you then find a smaller subset of annoying-but-necessary-for-individual-websites that you can enable as needed or just add them as trusted if you’re into that kind of thing.

        After that you have the utter garbage sites with 30 scripts of tracking data-sucking bullshit (CNN, looking at you) and for those sites I have said “Thou shalt bite my shiny metal ass” and i just don’t go there.

        It’s a concession to js, yes, but it’s also not free rein to trample all over the surfing experience. Totally worth the time to work out.

      • parpol@programming.dev (+7/-3) · 2 years ago

        I’ve been using noscript for years. I don’t even have to open up the blocklist anymore because I’ve successfully unblocked only the necessary scripts on all sites I ever visit. I get no trackers, no bloat, no google analytics, no Facebook, no microsoft, no ads, and no adblocker notifications.

        • DaGeek247@fedia.io (+3/-2) · 2 years ago

          I’ve been using noscript for years.

          Yeah, it took me about that long to get my regular websites working right too. And then i had to reinstall for unrelated reasons and all that customisation was gone.

          • Optional@lemmy.world (+6) · 2 years ago

            While you can back it up, once you’ve suffered the loss a few times you can at least get it 90% of the way back on your first re-visit after a reinstall.

          • parpol@programming.dev (+1) · 2 years ago

            It takes 2 clicks to get a website to work. It took a few minutes for me to get all my most commonly visited websites to work. And you can back up and restore, so it takes a few minutes to sync the customization to all devices.