More specifically, are we seeing companies breached due to their (obvious?) security flaws, hackers getting better at what they do, or a combination of both?

What is the future of security for these large companies that we trust to keep our data safe?

  • Tanoh@lemmy.world · 8 months ago

    Security is hard, especially at the scale of those companies. Because they are big, they attract far more hacking attempts: all else being equal, it makes more sense for a bad actor to attack someone with millions of customers than a mom & pop store that might have hundreds.

    More and more people and companies want to store things “in the cloud” (read: someone else’s server). For the most part that is a good thing, since it makes data easier to access, but it also opens up bigger and different attack vectors.

    So I think the number of breaches will only increase. Not always because the companies have bad security (though sometimes it is 100% that), but also because the attack vectors keep growing as business decisions and user preferences change.

    • saltesc@lemmy.world · 8 months ago

      Also, data governance is atrocious in most places. Some of the things I’ve seen ICT do with PII are mind-blowing. I’ve been a part of three large breaches (two ransomware and one data theft/sale), and ironically it has always been because of ICT managers.

      I’ve caught a senior manager storing employee and device information for 17K staff in a Google Sheet on their personal account so they could distribute it to an external consultancy. I stumbled across the URL in an email chain, confirmed it was fully publicly accessible—anyone in the world could see it if they had the URL—and had been live for two months. This was apparently the safe workaround for emailing it as a file… They didn’t understand what was so wrong until I declared a formal breach internally. I can only assume that info got out but there was obviously no way of knowing. Names, addresses, genders, DOBs, etc. for employees. Then MAC addresses, IMEIs, network locations, serials, etc. for devices. Just sitting there…

  • BrikoX@lemmy.zip · 8 months ago

    It was always happening, and at a large scale; there are just new reporting requirements now, which leads nonspecialized publications to cover it more often.

  • The Bard in Green@lemmy.starlightkel.xyz · 8 months ago

    As someone who does cybersecurity consulting for govt. contractors, I can tell you: companies invest in security when some external force makes them, and then they spend the bare minimum to meet whatever that force requires (and they try to get away with less at every opportunity).

    Right now in government contracting we’re experiencing a paradigm shift: the NIST-800-171 standard (which everyone was required to follow, but kind of on the honor system) is going to be replaced in 21 months or so by something called CMMC (Cybersecurity Maturity Model Certification). But CMMC is basically just the same requirements as NIST-800-171, so why bother?

    BECAUSE everyone just SAYS they’re NIST-800-171 compliant on all their contracts. Everyone self-scores, gets a WAY higher score than they do when scored by a third party, and then reports that self-score up the chain. The way this works in both DoD and NASA projects (which is what I’m familiar with) is that the big players like Boeing, Northrop Grumman, Raytheon, etc. have thousands of smaller suppliers, and those suppliers have smaller suppliers. The requirements flow down from the govt to the big contractors to the small subcontractors, and each link in the chain is responsible for making sure its upstream links are compliant… which they NEVER are, but they all say they are!
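
    For anyone unfamiliar with the self-scoring being gamed here: the DoD assessment methodology starts every contractor at the maximum score and deducts a weighted penalty for each NIST-800-171 control that is not implemented, which is exactly what makes an unverified self-report easy to inflate. A minimal sketch of that idea follows; the control IDs and weights are an illustrative subset, not the official table.

```python
# Toy sketch of the NIST-800-171 self-assessment scoring idea: start from
# the maximum of 110 (one point per control) and deduct a weight of 1, 3,
# or 5 for each control that is not implemented. The weights below are
# illustrative placeholders, not the official assessment table.

MAX_SCORE = 110

# control_id -> deduction weight if NOT implemented (illustrative subset)
WEIGHTS = {
    "3.1.1": 5,   # limit system access to authorized users
    "3.4.1": 5,   # maintain baseline configurations
    "3.5.3": 5,   # multifactor authentication
    "3.8.9": 1,   # protect backups at storage locations
    "3.11.2": 3,  # vulnerability scanning
}

def self_score(implemented: set[str]) -> int:
    """Deduct the weight of every control that is not implemented."""
    deductions = sum(w for cid, w in WEIGHTS.items() if cid not in implemented)
    return MAX_SCORE - deductions

# A vendor claiming everything is done reports 110; an honest assessment
# that finds MFA and scanning missing produces a lower number.
print(self_score(set(WEIGHTS)))                 # 110
print(self_score({"3.1.1", "3.4.1", "3.8.9"}))  # 110 - (5 + 3) = 102
```

    Nothing in the process stops a contractor from reporting the first number; only a third-party assessment reliably produces the second.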

    Of course, the government KNOWS this is happening, but lacks the resources to do anything about it. So the solution is to make everyone get third-party certified as compliant. Half the industry is setting itself up to miss that deadline (which, of course, has already been pushed back multiple times), and I have a feeling that when small companies start failing their CMMC certs, they’re going to get stern warnings instead of losing their contracts, because the government has to buy shit from someone.

    When I talk to the money / business people at my clients, this goes in one ear and out the other.

    There are widespread (willful) misconceptions among those folks that cybersecurity is something IT people do, and everyone else just does their jobs without having to think about it. I’ve had CEOs say things like “No, we’re not doing that, we can’t work that way” when I educate them about their requirements… and then look to me to provide a solution where they don’t have to change anything about the way they work. When I can’t, they get frustrated with me and my team.

    I’ve had them ask me “Well, what do the big companies do?” and I say “Look, they actually TRY to do all these things they require you to do, and they fail at it ALL the time. But I’ve heard you complain about how their bureaucracy and rules slow everything down and make working with them difficult. A bunch of that stuff IS what they do to deal with this.” And they just don’t believe me.

    I’ve had CFOs say “We don’t have the budget to do all of this, so which parts are the most important?” and I’ve said “This is the LAW. You’re supposed to do all of it!” But they know and I know that, for the time being, no one will hold them accountable.

    Right now, tons of companies just say “We’re NIST-800-171 compliant” or “We’re working towards NIST-800-171 compliance,” their contracts go forward, they hire someone like me to tell them what to do, and then they don’t do 60% of it and delay doing another 20%.

    This is in an industry that is required by law to try extra hard on security. In industries with no such requirements, or fewer of them… good luck.

  • yeehaw@lemmy.ca · 8 months ago

    It won’t become the norm; it already is the norm. Security is incredibly hard and nothing is truly secure. If someone wants in badly enough, they’ll get in.

    Keep your sensitive data in house, encrypt it, and back it up off site.
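
    That advice can be sketched as a tiny backup routine. This is a hedged illustration, not a hardened tool: the encryption step is only indicated by a comment (a real pipeline would encrypt with something like gpg or age before the copy leaves the building), and the file names are made up.

```python
# Minimal sketch of "encrypt it and back it up off site": copy a file to a
# second location and verify the copy with a SHA-256 checksum before
# trusting it. The encryption step is only indicated by a comment; a real
# pipeline would encrypt first (e.g. with gpg or age) and ship ciphertext.
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup(src: Path, offsite_dir: Path) -> Path:
    # 1. (Assumed, not shown) encrypt src before it leaves the building.
    # 2. Copy the (would-be) ciphertext to the off-site location.
    offsite_dir.mkdir(parents=True, exist_ok=True)
    dest = offsite_dir / src.name
    shutil.copy2(src, dest)
    # 3. Refuse to call it a backup until the copy verifies byte-for-byte.
    if sha256(src) != sha256(dest):
        raise IOError(f"backup of {src} is corrupt")
    return dest

# Demo with throwaway paths (the file name is hypothetical).
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    src = root / "customers.db"
    src.write_bytes(b"sensitive records")
    copy = backup(src, root / "offsite")
    print(copy.read_bytes() == src.read_bytes())  # True
```

    The verify-before-trust step matters as much as the copy: a backup you have never restored or checksummed is a hope, not a backup.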

    • funkyfarmington@lemmy.world · 8 months ago

      And don’t let Ethel in Finance have administrator access and cat screensavers, that helps too.

      I so much wish I was joking.

    • wellDuuh@lemmy.world · 8 months ago

      it is the norm.

      It always has been. I mean, super yachts are expensive; someone’s gotta pay for that one way or another. There’s never been ‘enough’ for these people. They hide their greed behind things like ‘new opportunities’.

      Even now, the lawyers are working their butts off finding new loopholes so they can sell your data even more without raising red flags.

      This will keep on happening, just as how they try to shove ads to you in any way possible.

      Alert: they are invading the HDMI protocol.
    • AggressivelyPassive@feddit.de · 8 months ago

      The reality is: security is often non-existent in larger corporations. It’s all about optics and insurance. Hardly any project I’ve been involved with actually did something for security. It’s a cobbled together mess with just enough security theater to not be legally liable. That’s it.

      Case in point: I know of a database that holds data on pretty much every adult in Germany, Austria, and Switzerland, plus some people from surrounding countries. The root password is the company’s name plus the year the DB was initially set up.
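
      To make concrete why a company-name-plus-year password is effectively no password at all: the entire candidate space an attacker would bother trying fits in a few hundred guesses. The company name and the “real” password below are hypothetical stand-ins for the pattern described above.

```python
# Toy illustration of why "<CompanyName><Year>" passwords are effectively
# no password: the whole candidate space an attacker would try is tiny.
# The company name and "real" password below are hypothetical.
from itertools import product

def candidates(company: str, years=range(1990, 2031)):
    """Generate the obvious company-name + year guesses."""
    stems = {company, company.lower(), company.capitalize()}
    for stem, year in product(stems, years):
        yield f"{stem}{year}"
        yield f"{stem}_{year}"
        yield f"{stem}{year}!"

real_password = "Acme2009"         # hypothetical, matches the pattern
guesses = list(candidates("acme"))
print(len(guesses))                # a few hundred guesses in total
print(real_password in guesses)    # True: falls on the first pass
```

      Any attacker who knows the company name (i.e., everyone) can enumerate that list in microseconds, which is why targeted guessing beats brute force against passwords built from public facts.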

      • Scrubbles@poptalk.scrubbles.tech · 8 months ago

        Most of the time it’s not even nefarious; I can tell you’ve worked at big corpos. There are just too many intertwined things and no one overseeing the whole system. Every large company probably has hundreds of security flaws, but no one can see the giant entrance sign to the forest when all they can see are trees.

        • AggressivelyPassive@feddit.de · 8 months ago

          Well, I would say oversight absolutely is possible, but it costs money directly, up front, and in an accountable way. Security incidents, by contrast, vanish in the fog of diffused responsibility, where nobody specific can be blamed. That means for each individual responsible party, the rational choice is to do just enough not to be blamed, pull off theater to seem engaged, but avoid anything that would actually cost money.

          So you’re kind of right, but for the wrong reasons. It’s a systemic issue that almost inevitably happens in large organizations, but the root cause is not inherent complexity; it is a perverse incentive structure.

  • Artyom@lemm.ee · 8 months ago

    This will keep happening; no one knows how to build hardened IT infrastructure while also letting 65-year-old Suzie in HR stay productive, so we’ll always have loopholes. The best thing you can do to protect yourself is to use fewer cloud services, but obviously that has limits: you can’t cancel your phone plan just because your carrier might get hacked. You can also use more encrypted services like Signal, where a hacker wouldn’t get anything useful even if they broke in.
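
    The point about encrypted services can be shown with a toy model: if the key never leaves the client, a breach of the server yields only noise. The XOR one-time pad below is purely a teaching sketch under that assumption; it is NOT the Signal protocol or any real-world scheme.

```python
# Toy illustration of the end-to-end idea: if the key never leaves your
# device, a server breach exposes only ciphertext. This XOR one-time pad
# is a teaching toy, NOT the Signal protocol or a production cipher.
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    assert len(key) == len(plaintext), "one-time pad: key must match message length"
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"                # lives on your device
key = secrets.token_bytes(len(message))  # also never leaves your device

server_copy = encrypt(message, key)      # all the provider ever stores
print(decrypt(server_copy, key) == message)  # True: only the key holder recovers it
```

    The design point is where the key lives: a hacker who dumps the server gets `server_copy` but not `key`, so the stolen data is useless to them.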

  • belated_frog_pants@beehaw.org · 8 months ago

    It already is. The fines, if companies ever get them, become the cost of doing business, and customers are basically helpless. There is no recourse for shitty security with people’s data.