Podcast: Cybersecurity landscape and SEC rules for 2024
Manufacturers and tech companies have suffered severe cyberattacks in recent months, timing that coincides with new Securities and Exchange Commission rules requiring publicly traded companies to disclose when they've been hit by hackers, so long as the damage is "material" to the company's performance.
Smart Industry Managing Editor Scott Achelpohl discusses cybersecurity trends and the new requirements with Dennis Scimeca, senior technology editor at IndustryWeek.
Transcript
The following is an edited version of the transcript from this podcast:
Scott Achelpohl: Hello and welcome to another episode of “Great Question: A Manufacturing Podcast.”
The title of our program today, presented by Smart Industry from the Manufacturing Group of Endeavor Business Media, is: “Cyber Incidents and the New SEC Rules.”
I’m Scott Achelpohl, and I’m managing editor of Smart Industry.
Manufacturers and tech companies have suffered some costly and debilitating cyberattacks of late. At the same time, U.S. Securities and Exchange Commission rules, which took effect last month, require many firms to comply with new preparedness and reporting standards.
See also: Microsoft hack tests new SEC disclosure rules
The discussion today is all about cybersecurity and cyber-incident preparedness and what the brand-new federal government rules have to do with how manufacturing companies structure their preparedness and respond when these attacks happen to any size business, small or large, as we saw in the case of software and internet services giant Microsoft Corp., which reported a serious “password spray attack” just last week.
This all will be the focus of another Smart Industry program, “New SEC Reporting Requirements and Your Cyber Defenses,” on Feb. 15 with two luminaries in the area of cybersecurity, Michael Daniel, who now leads the Cyber Threat Alliance and was President Obama’s cybersecurity coordinator at the White House, and Richard Bird, the chief information security officer at Traceable AI.
Today, our guest is Dennis Scimeca, senior editor for technology at IndustryWeek, which is SI’s sister brand and part of the Manufacturing Group here at EBM.
Dennis is our in-house go-to on most matters related to cybersecurity and has led IW’s reporting on some high-profile cyberattacks the last few months, including the multimillion-dollar breach at Clorox Co. last summer that slowed production and generated a lot of headlines.
See also: Clorox begins recovery after severe cyberattack
His latest story for both IW and Smart Industry focused on the Microsoft attack, the details of which are still coming into focus but are concerning, given Microsoft’s status as a software and internet services behemoth.
Welcome to the Great Question podcast, Dennis!
Dennis, you’re pretty in the weeds about the details of the notable cyber incidents the last six months, including Clorox and Microsoft. I’m going to tap your knowledge and insights here. Don’t hesitate to let loose. Our listeners really want candor on this subject.
Scott Achelpohl: What precisely do the new SEC rules say?
Dennis Scimeca: The new rules, adopted in July 2023 and in effect for publicly traded U.S. companies as of mid-December, state two requirements: First, in their annual 10-K filing, companies have to report on cybersecurity risk management, strategy, and governance.
If you’re not familiar, where an annual report to shareholders might focus almost entirely on financials, the 10-K form is much more comprehensive, with information about company history, organizational structure, facilities owned, etc. It’s all the info an investor needs to really understand how the company is doing.
See also: Cybersecurity threats and lessons to be learned by U.S. energy infrastructure
So, companies now must describe how they identify and manage material cybersecurity threats, the material damage a cyberattack might do, past cybersecurity incidents, how much oversight the Board of Directors has, and how management assesses and manages material risks from cybersecurity threats.
If a company isn’t paying attention to cybersecurity, investors are going to know about it, now.
Second, unless the U.S. Attorney General determines that the disclosure poses a substantial risk to national security or public safety, companies must disclose cybersecurity incidents they determine are material within four business days, using a new item on Form 8-K. That’s the form companies use to report major events shareholders ought to know about.
Now, that includes material cybersecurity incidents, with “material” defined as an incident “to which there is a substantial likelihood that a reasonable investor would attach importance.”
Scott Achelpohl: They sound like very general guidelines, yes?
Dennis Scimeca: OK, so here’s where things get murky.
What’s “substantial likelihood”? In some cases, like the Clorox breach IndustryWeek reported on last September, there’s no arguing the incident wasn’t material, because the breach actually affected production. There were product shortages on shelves. There’s no way profits wouldn’t be affected. And the stock price actually dropped when the news broke.
See also: What’s in store in 2024 for cybersecurity, AI, and securely bridging the IT/OT gap
Last February, Dole reported a cybersecurity incident after customers in New Mexico and Texas noticed an absence of pre-cut and mixed salad kits on shelves. Dole reported it was in the midst of a cyberattack and had shut down its systems across North America, but also said the effect on operations was minimal. They put manual operations in place and got things moving again.
So, was that actually a “material” incident if all that happened was customers not being able to purchase a specific product line for a few weeks?
How much money does that cost?
Maybe not a lot, in the grand scheme of things?
Does data scraping, stealing names and real-life addresses of employees off a website, have a material effect on anything? Probably not?
That kind of data breach shows a company has holes in its cybersecurity net that need to be plugged, but the only financial ramifications might be spending some money on a cybersecurity firm to plug said holes.
So, while I expect most companies would play it safe when determining what cybersecurity events investors need to know about, I also expect some company, at some point, to dance around the verbiage. And that’ll be interesting to watch.
Scott Achelpohl: Sounds like, Dennis, that some of these rules are subject to interpretation. Do you think it’s best for companies to at least consult expert counsel before they go about putting their reporting procedures in place?
Dennis Scimeca: I honestly don’t know what those conversations would look like. If I know lawyers, it’d be a discussion about how to cover behinds, like maybe what verbiage to use to defend whether an incident needed to be reported on an 8-K and when. I think the wiggle room comes from the basic nature of reporting cybersecurity incidents.
The new regulations state that the disclosure of a material breach must take place within four business days of determining the incident is material, but not within four days of when the incident first took place.
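To illustrate how that clock works, here’s a minimal sketch in Python that counts four business days forward from the date a company determines an incident is material. The date is hypothetical, only weekends are skipped, and a real deadline calculation would also have to account for federal holidays:

```python
from datetime import date, timedelta

def form_8k_deadline(determination_date: date, business_days: int = 4) -> date:
    """Count forward the given number of business days (weekends skipped)
    from the date the company determines an incident is material."""
    deadline = determination_date
    remaining = business_days
    while remaining > 0:
        deadline += timedelta(days=1)
        if deadline.weekday() < 5:  # Monday=0 ... Friday=4
            remaining -= 1
    return deadline

# Hypothetical example: a breach that began months earlier but is only
# determined material on Friday, Jan. 19 would be due the following Thursday.
print(form_8k_deadline(date(2024, 1, 19)))  # 2024-01-25
```

The point the code makes is the same one in the rule: the deadline runs from the materiality determination, not from the day the hackers got in.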
See also: The Crystal Ball Report 2024: A preview podcast
See also: eHandbook: The Smart Industry 2024 Crystal Ball Report
Take, for example, the incident that Microsoft reported on Friday, Jan. 19. That incident was detected a week earlier, on Jan. 12, but the systems breach took place back in November. It took Microsoft two months to discover they’d even been hacked. That’s not unusual.
Hewlett Packard Enterprise on Jan. 24 filed a Form 8-K disclosure that hackers had gained access to the company’s Microsoft Office 365 email environment. HPE was notified on Dec. 12 that the hack had taken place in May.
When a company detects a breach, they have to backtrack and figure out when it took place and what was affected. Companies need to know how the hackers got in, what the hackers had access to and what, if anything, they stole. Without knowing all those things the company cannot repair the damage and put new defenses in place to avoid a repeat occurrence.
That can take a while, and it’s reasonable that any company would want to have its ducks in a row before reporting any kind of systems breach, especially if they have to do so now to the SEC and need to assuage any investor concerns when they report the incident officially.
A company could argue that it has no idea whether a breach is material until it has determined the extent of the systems intrusion and exactly what, if anything, was taken. One could argue that an investor can’t know what to worry about, or what not to worry about, until the company has that information.
One imagines that companies may follow the spirit of the new regulations to avoid being hassled by the SEC but still be in no rush to report cybersecurity incidents any faster than is absolutely necessary.
Scott Achelpohl: Who at a manufacturing company is responsible for gathering these answers? Is it, or should it be, the responsibility of someone like a CISO? Or do you think these answers are better gathered by the legal department with consultation from IT?
Dennis Scimeca: I don’t know how companies put together their Form 10-Ks, but I’d hope a chief information security officer would be able to provide whatever’s needed for the form.
See also: Cybersecurity in the spotlight: What recent attacks show about industry vulnerabilities, defenses
Scott Achelpohl: And does this mean, after all this time and all these high-profile incidents, even the largest companies—Microsoft is as big as it gets—don’t take cybersecurity as seriously as they should? Or they should at least care more than they did before the SEC rules took effect?
Dennis Scimeca: I’m certainly not going to indict a tech giant for their cybersecurity awareness. I have no idea what their training programs look like, but a company like Microsoft certainly doesn’t run short on people who understand cybersecurity intimately.
And I think one has to be fair—in the Microsoft incident, the breached account was a test account, not an actual account with an employee attached who was sending email messages all the time. With a company the size of Microsoft, with such a dense IT structure to manage, I don’t think that forgetting to shut down a test account or monitor it says anything about the company’s cybersecurity hygiene overall. Mistakes happen and who knows how long ago that test account was made.
I think most cybersecurity incidents probably don’t fall under the “material” category and therefore won’t need to be reported. The new SEC regulations on reporting might not change anything from that point of view. But the need to annually disclose preparations and plans does or should force companies to take cybersecurity more seriously.
My understanding, based on conversations with many cybersecurity specialists, is that the problem lies with companies wanting to run IT on efficiency models. Cybersecurity hygiene isn’t simple.
At the most basic level it means training all your employees about phishing and social engineering, like making sure they know how to identify fake emails and ideally getting them to report the attempts to the IT department, who then can take steps to address the problem.
Companies should be testing their networks for vulnerabilities, figuring out the size of their attack surface—the sum total of all potential entry points into their networks. That probably means hiring a cybersecurity firm because how many companies have the sort of IT team that can run those tests in-house? And then, of course, there’s the cost of taking action to fill those security gaps once they’re discovered.
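As a very rough illustration of what mapping an attack surface means at the most basic level, the sketch below checks a single host for a handful of open TCP ports. The address and port list are placeholders, it should only ever be pointed at systems you own or are authorized to test, and it is no substitute for a professional assessment:

```python
import socket

# Placeholder host (TEST-NET-1 range) and a handful of common ports;
# a real assessment covers every reachable device and service.
HOST = "192.0.2.10"
PORTS = [21, 22, 80, 443, 3389, 502]  # FTP, SSH, HTTP, HTTPS, RDP, Modbus

def open_ports(host: str, ports: list[int], timeout: float = 1.0) -> list[int]:
    """Return the subset of ports that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                found.append(port)
    return found

if __name__ == "__main__":
    print(open_ports(HOST, PORTS))
```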
See also: Security implications of hastily implemented AI and understanding what to do
All of that takes a lot more time and might cost more than just, for example, paying a ransomware demand to get your data back if hackers breach your network and hold your data hostage. I’ve heard stories about IT departments being asked by leadership what the plan is, if the company’s data ever gets hijacked, and the plan is literally just to pay the hackers off.
Which, to me, is funny, and also not something a company is going to report to the SEC in their annual 10-K filing. The good old days of a company rolling the dice on whether they will ever suffer a cyberattack and deciding accordingly how much money to throw at cybersecurity should be over.
Because now, if a company gets hacked and the incident is material, it may come to light that the company took no precautions and had no plan.
Scott Achelpohl: These new SEC rules aren’t specific to manufacturers, but what specific cybersecurity threats do manufacturers need to be aware of? What might they have to talk about on the Form 10-K annual filing?
Dennis Scimeca: First, there’s OT. A lot of OT is air-gapped, meaning it doesn’t connect to any kind of network. But is there a way to lift data from the machine, even if it’s just as simple as plugging in a thumb drive and downloading a report?
If a machine is networked, it’s either running cable, in which case you again need to worry about who physically has access to the machine, or it’s on WiFi. If a manufacturer uses a WiFi network to connect OT to a central system, you still need to worry about physical access, and now you also need to worry about someone hacking into the WiFi.
In some cases, there can be cybersecurity threats targeted at specific industrial control systems (ICS), supervisory control and data acquisition (SCADA) devices, and specific programmable logic controllers (PLCs). The U.S. government in 2022 reported on threats aimed very specifically at PLCs manufactured by Schneider Electric and OMRON.
Then there’s IoT cybersecurity: all those sensors you’re installing on equipment spread across the floor. Do you understand all the relevant security considerations? Do you have security and privacy service-level agreements (SLAs) with all your vendors? Do you have behavioral analytics, so you can see how IoT devices are behaving and catch anything abnormal going on? Do the IoT devices have built-in diagnostics?
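To make the behavioral-analytics idea concrete, here’s a toy example that flags an IoT device whose message rate suddenly departs from its own recent baseline. The numbers and threshold are invented for illustration; real monitoring products are far more sophisticated:

```python
from statistics import mean, stdev

def is_anomalous(history: list[float], current: float, z_threshold: float = 3.0) -> bool:
    """Flag a reading more than z_threshold standard deviations
    away from the device's own recent baseline."""
    if len(history) < 2:
        return False  # not enough baseline data to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > z_threshold

# Hypothetical sensor: normally ~60 messages per minute, suddenly 400.
baseline = [58, 61, 59, 60, 62, 57, 60, 61]
print(is_anomalous(baseline, 400))  # True -> worth investigating
```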
How many users have access to the IoT devices and how well are they trained on cybersecurity? Because every single employee with access to networked technology is a cybersecurity risk.
See also: New research sees buy-in for digital transformation growing among manufacturing stakeholders
And that’s just the abridged version of a list of necessary security measures an expert shared with IndustryWeek this past October.
Third-party vendors—OK, manufacturers aren’t the only ones who have to worry about vendor access to their networks, but each and every supplier or seller with access to your networks presents a security risk.
Do you have any idea what kind of security they have on their end? Do they train their employees about phishing? How carefully do they guard access to their OT?
Nissan in January 2023 reported a data breach that came from a third-party software development vendor that had access to Nissan’s customer data stored on the cloud.
Scott Achelpohl: Is there general cybersecurity hygiene that all companies, including manufacturers, ought to implement sooner rather than later?
Dennis Scimeca: This is why I cover cybersecurity at IndustryWeek, to be honest. I try to focus on manufacturer-specific hacks, like the aforementioned Nissan and Dole incidents, but I also report on “big” hacks like Microsoft, HPE, and Boeing last year as a way of getting our readers to understand: if these big companies, which probably throw a lot of resources at cybersecurity, are vulnerable, then what is your small- or medium-size business, which may not throw a lot of resources at cybersecurity, vulnerable to?
I worry about what cyberattacks can do to any business, but what about an SMB with only one plant and maybe only a few lines at that single plant?
What happens to them if a cyberattacker shuts production down for a few weeks? How bad is that damage comparatively?
These probably aren’t publicly traded companies and so we’re not talking about compliance with SEC regulations at all, but I hope that’s not the only reason a manufacturer would care about all this.
And if someone said I was trying to use scare tactics to make them care, I’m perfectly fine with that. I care about my readers and their industry, and everyone needs to pay attention to this, SEC regulations or otherwise.
So, the quickest and easiest step to take is to train your employees about phishing attempts, how bad actors trick people into giving away their login credentials. Fake emails about signing in to receive a document from the HR department or receive a package waiting for them. Use two-factor authentication whenever possible—that’s when you log in online and then get a text on your phone usually with a 6-digit number you enter to complete your login. And even that isn’t perfect protection.
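For anyone curious what sits behind that six-digit code, most authenticator apps implement TOTP (RFC 6238), which derives the code from a shared secret and the current time. Here’s a minimal sketch using only the Python standard library; the secret below is a made-up example:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password per RFC 6238 (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // step          # 30-second time window
    msg = struct.pack(">Q", counter)            # counter as 8 big-endian bytes
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                  # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Made-up shared secret; the server and the authenticator app both hold
# this secret and therefore compute the same code for the same time window.
print(totp("JBSWY3DPEHPK3PXP"))
```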
Employees are the largest attack surface and if they’re distracted or busy it’s way too easy to not recognize when someone is trying to get your login credentials. Introducing a little paranoia into the workforce when it comes to cybersecurity might not be the worst thing. And that’s something you can get started with immediately.
See also: Cybersecurity stakeholders praise AI executive order—but say it’s just a start
And while I don’t want to be accused of throwing business to cybersecurity companies generally, you need to understand the nature of your attack surface, all the places your network might be vulnerable.
Honestly, this is the kind of thing that I’m sure made a lot of businesses just bet on not being hacked and/or plan on just paying ransomware demands rather than sink the time and money into a proper threat assessment, because unless you have the experts in-house you’re going to need to hire someone, right?
But maybe that’s less expensive than paying ransomware? I don’t know. I don’t know what the cybersecurity firms charge per hour or how much money manufacturers pay in ransomware demands but maybe the math works out in favor of knowing what you’re up against rather than ignoring it and paying out if you get hacked.
Scott Achelpohl: OK, so the lesson should be: At the very least, CISOs at manufacturing companies (and possibly their legal departments) should learn the brand-new SEC rules and inform their bosses about which other departments to involve in the preparations, just in case of an incident.
Like a “break glass in case of” plan for a cyber breach.
But maybe the worst, and most avoidable, mistake is not to be prepared at all.
All great food for thought.
That concludes Smart Industry’s contribution today for the Great Question podcast.
Everyone have a great day.