My Reaction to the Bloomberg Article on Me and the 100 Day Action Plan

June 10, 2022

Hi all,

Well, candidly, here’s a blog I hate to have to write, but I appreciate the show of support from the cybersecurity community; across the social media posts, tweets, etc., any negative comment was outweighed 20:1 by comments noting my actions didn’t sound wrong at all. On a very difficult day I just want to say I love you all and appreciate the kindness and trust. I struggled with the decision to say anything because I didn’t want to amplify the article, but enough people have asked for my point of view that I feel I owe it. I often joke that no one has ever had to wonder what my opinion is. I am perfectly fine with anyone thinking anything they want about me, except for challenging my integrity. I appreciate your time in reading this post.

For those that missed it here’s the Bloomberg article in question.

The article notes I was called into the White House to provide ICS cybersecurity insights. It states that I used those interactions to position specific technology requirements that benefited my firm Dragos. As many on social media noted, the requirements of “high-fidelity sensor-based continuous network cybersecurity monitoring” and anonymizing of data shared between participants are not Dragos-specific; they are exactly what is required to gain visibility into the critical parts of our critical infrastructure to detect threats.

However, what I found misleading is that I never actually did what was alleged. (Not intentionally misleading, I should say; even though I didn’t agree with the basis of the article, Jack, the journalist, was extremely professional throughout, had numerous calls with me to hear me out, and included many of my quotes in the article.)

I did not use those meetings and interactions to position any technology requirements. What I told Anne was that I would support her and her staff fully and provide general ICS cybersecurity advice. That’s what I did. When the conversations started there were rumblings in Congress, post-SolarWinds, that the country needed to significantly enhance regulation on sectors including the electric sector. I used my meetings to note that industrial asset owners and operators sincerely listen to the government when it talks, and the problem is they often get very conflicting guidance from different government agencies. That the electric sector is already heavily regulated. That industry groups supported by DHS and DOE, like the ESCC, already exist. And that if there was any work to be done, it should be done in a way that doesn’t reinvent the wheel, speaks with one voice of the government to the community, and uses existing government-sponsored bodies like the ESCC. The White House personnel I talked to agreed and were very focused on national security and protecting people.

I did offer to help contribute to any plan the White House wanted to author, as I am deeply committed to the national security of our infrastructure. But I noted that if I was seen as being involved, no matter how good the recommendations were, it would create distracting noise around good efforts (which ended up happening), so I suggested that if I was involved, my name not be put on it. The White House turned down the offer after discussions with ethics professionals, as the optics could be bad, which I agreed with. I was never involved in authoring or editing the 100 day action plan or any other government document or plan. Nor did I get to see it until after it was published.

What the article conflates is that there was an independent contractor (not a government representative and not employed by the government) who later authored a whitepaper offering recommendations for ICS cybersecurity to submit to the government for consideration in its future efforts. This whitepaper was floated around to dozens of industry experts for comments. At one point there were Dragos Neighborhood Keeper-specific requirements in there. The edits I made, which the Bloomberg article mentions, were to edit those out and instead offer solution agnostic suggestions, one of which was the high-fidelity sensors requirement that every single competitor in this space complies with. There would be nothing wrong with me, as the CEO of the firm, advocating to another private citizen for our technology, but I didn’t want that because I want the community to have options and choose what they want. I’ve always believed we’ll win on our own merits. The solution agnostic terms I used were directly from the Cyberspace Solarium Commission (a congressionally appointed commission) and from public comments and requirements from the Department of Energy. In essence, all I did was join numerous industry experts on a whitepaper that informed the government of existing government requirements and language. Some of the whitepaper’s content made it into the 100 day action plan, but I had no insight into that, and many of my comments (like making it easier to get security clearances across the sector, with incentives for the private sector to participate) did not.

The article further points out that Dragos’ technology mirrors the language I used. That’s actually true. But the important context that I think is missing is that the language originated from the DOE, was made public, and then I added it into the datasheet of Neighborhood Keeper. Why would I do that? Because, what’s also not stated in the article, is Neighborhood Keeper was developed jointly with the DOE. Years ago the DOE opened up bids publicly and competitively asking firms to create technology that was needed but didn’t exist against the DOE roadmap for cybersecurity for the electricity sector. Dragos won one of the grants and developed Neighborhood Keeper. Of course a joint-DOE technology should use and meet DOE requirements – that’s the entire point of the grant program.

Also, I want to thank Jack, the journalist, and his editor. I made it clear throughout our conversations that I never provided requirements to the White House. Though I didn’t love that the story got published anyway, post-publication Jack continued to take my phone calls and offered to put me in touch with his editor, and after hours of explaining all that happened on a Friday night, they agreed to add a clarification to the article confirming that I did not provide requirements to the White House:

Bloomberg Clarification

Jack treated me professionally throughout the process and I hold no grudge that he published a story based on complaints from our competitors; that’s not an easy position to be in for a journalist.

So in short:

  • Anne bringing me in was for general ICS cybersecurity advice, not anything technology specific
  • I edited an industry whitepaper to remove Dragos specific positioning and make it solution agnostic with existing government requirements
  • The government ended up reading the whitepaper and adopted some of its own pre-existing requirements

I did not have insight into the ESCC evaluation, but I know the electric industry showed amazing leadership in quickly evaluating over 18 technologies against a rising Russian cyber threat, and *some* of them endorsed and used Neighborhood Keeper. Many others used competitor technologies instead. And over a year later, the only technology that is up and running, actively sharing insights across the electric sector in real time to enhance our critical infrastructure, is Neighborhood Keeper. So before the ESCC is critiqued, I think it’s fair to say they made a fair choice in an effort to enhance national security voluntarily and at their own cost. And now they and the government are working to help the other vendors create solution agnostic sharing programs, which makes the marketplace more competitive and advances the state of the industry.

I’m proud of what happened, I deny that I operated unethically, and I am forever grateful that our electric community is always willing to rise to the occasion to provide safe, affordable, and reliable electric power to our communities even in the face of strategic and well funded state adversaries.

Thank you for hearing my side of the story and I wish you all the best.

Structuring Cyber Threat Intelligence Assessments: Musings and Recommendations

January 15, 2022

Tomorrow (Jan 16th 2022) I’ll be speaking at the free virtual conference PancakesCon 3 on “Structuring Intelligence Assessments and Gin Cocktails.” The conference’s format is to introduce a topic to students or folks new to it in the first half of the presentation and do something completely non-work-related in the second half; my second half will be non-intimidating, easy-to-make, yet delicious gin cocktails (normally I’m more of a bourbon/rye guy, but when it comes to cocktails I’m a gin fanatic).

As a result, I decided to sit down today and write about the first half of my talk while practicing the second half. Therefore this blog will likely be more musings and recommendations than wildly coherent and well-written thoughts. The talk is only 15 minutes long, so I want to condense as much as possible into that timeframe and map the blog accordingly.

If this blog gets your insights going and you find the topic fun, you can check out my SANS FOR578 – Cyber Threat Intelligence class, where we talk about these topics and more. I’ll also include some reading materials and references at the end of the blog. Unfortunately and fortunately, this one topic could fill (and throughout history has filled) a complete manuscript or book, but I’ll try to give my abbreviated version.

First Key Point

There are a lot of opinions as it relates to cyber threat intelligence. I often semi-joke that the only thing Analyst 1 and Analyst 2 can agree on is that Analyst 3 is wrong. Intelligence doesn’t specialize in the area of facts. If everything was a fact or a simple yes or no answer, you likely wouldn’t need intelligence analysts. Intelligence analysts specialize in going into the unknown, synthesizing the available information, and coming up with an analytical judgement that often requires going beyond the evidence (an analytical leap).

There are some things you shouldn’t do, but generally speaking my retort to people is: “you do you.” There is no one right or wrong answer. I’ll give you my thoughts around intelligence assessments in this blog to help you, but they should not be construed as the only answer; they are simply considerations built on experience. The first rule of cyber threat intelligence, to me, is that if you’re being an honest broker, then so long as you’re satisfying the requirement of your customer, everyone else’s opinion is irrelevant. Or said differently: everyone’s a critic, but if you are delivering intelligence to the best of your ability to a consumer against their requirement, and your intelligence helps them achieve their outcome, then all the nitpicking in the world about how you got there is irrelevant to me.

The key word is honest, though. We deal in the world of trust because a lot of what cyber threat intelligence analysts do looks like magic to those not in our field. If you betray that trust, especially on topics that people aren’t well versed in and thus need to rely on trust more, then just pack up your bags and go home – you’re done. (As an aside, that’s why I hate those stupid cyber attack pew pew maps so much. I don’t care how useful they are to your budget or not – it’s dishonest. The visualization is pointless, and if a consumer digs a layer deeper and finds out it’s just for show, then you come off as someone willing to mislead them for a good outcome. I.e. you lose trust.)

Being Precise In Word Usage

Much has been written about intelligence assessments. One of the required readings is by Sherman Kent of the Central Intelligence Agency, titled “Words of Estimative Probability” (first published in 1964). One of the beautiful things that Kent puts forward is the necessity to be measured and consistent in what we do. That is, it’s not paramount to pick one direction over another, but it is paramount to be transparent and repeatable in the direction you choose and consistent in its application. If you decide that you like the term “even chance,” then you should know what that means to you (a 50/50 probability, subjectively but as appropriately applied as you can) and consistently use that language in those scenarios. You are more than welcome to use other people’s standards (like Kent’s) or create your own. It’s likely better to use what exists, but if you need to adapt or create your own you can; just make sure it’s something you can make available to your team and consumers so it’s transparent and has definable meaning. I would recommend creating a style guide for you/your team, defining the words you do and do not want to use, and, as much as possible, tying each word to a number to help consumers understand how those words relate to each other. Defining words you do not use is also helpful. As an example, I cannot stand the word “believe” in intelligence assessments. As my friend Sergio Caltagirone would say, belief is for religion, not intelligence.


Figure 1: Example from Kent’s Words of Estimative Probability
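To make the style guide idea concrete, here is a minimal sketch of how a team might encode its estimative words and banned terms in Python. The probability ranges are loosely adapted from Kent’s chart, and the specific numbers, names, and banned words are illustrative assumptions, not a standard; the point is only that the mapping is written down, shared, and applied consistently.

```python
# Illustrative style guide: estimative words mapped to probability ranges
# (loosely adapted from Kent's chart -- pick your own and stay consistent).
STYLE_GUIDE = {
    "almost certain": (0.87, 0.99),
    "probable": (0.63, 0.87),
    "even chance": (0.40, 0.60),
    "probably not": (0.20, 0.40),
    "almost certainly not": (0.02, 0.12),
}

# Words the team agrees never to use in assessments (e.g. "believe").
BANNED_WORDS = {"believe", "belief"}

def check_wording(text: str) -> list[str]:
    """Return any banned estimative words found in an assessment's text."""
    lowered = text.lower()
    return sorted(word for word in BANNED_WORDS if word in lowered)
```

A check like this could run in a report review pipeline so the style guide is enforced rather than just documented.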

One of the most important things intelligence analysts can do is reduce the barriers and friction between a consumer consuming the intelligence and leveraging it. Write briefly. Don’t use more words than required to make the point. Remove ambiguity wherever possible. On this topic, that means conveying what we do and don’t know as quickly as possible, in a way that doesn’t require the consumer to learn our language over and over. The consumer should become familiar over time with the language we use and what it means, which helps inform them how and when they can leverage it against their requirement.

In a perfect world I want intelligence reports to be subjected to the Pepsi Challenge. I should be able to remove the logos, branding, color schema, etc. off of your intelligence product/report and put it in front of a consumer that you deal with and place it randomly next to other teams’ intelligence reports. The consumer should be able to pick out which report is yours. Language consistency, structure of the report, where the assessments are, where the suggested actions are, etc. all play into that.

Following that logic then our intelligence assessments should also be as consistent as possible.

Intelligence Assessments

If something is a yes or no answer then it is a factual statement. “Was this malware on this system?” can be answered as a yes or no question because the malware is either there or not and it is something we can prove. There should not be intelligence assessments for factual statements even if the factual statement is part of the threat intelligence. I.e. I would not expect to see someone say “We assess with moderate confidence the malware is on that system.” You can go and prove it or you cannot, but it’s not an assessment. The world of intelligence is for one step beyond the evidence. It’s the analysis and synthesis of the data and information to create new insights. However, we might say something like “We assess with low confidence, given the presence of the malware on the system, that our company is a target of their operation.” We do not know for sure whether or not the adversary intended to target us or if we were just a random victim. We can synthesize the available information and make an assessment based on our experience and what we’re seeing to reach an intelligence assessment.

I see most organizations leveraging Kent’s estimative language (or something close to it) in their wording and then pairing it with the confidence levels you’d find in government intelligence agencies. Typically that is:

  • Low Confidence
  • Moderate Confidence
  • High Confidence

You can create middle grounds like Low-Moderate or Moderate-High but I try to avoid that myself as it often is more confusing to the consumer. Keeping it to three confidence levels in your intelligence assessment and having some rules you set out for yourself and your team on what that means often suffices.

General Rules for Confidence Levels

Again, you do you. But I generally follow the guidance below as it relates to designating confidence levels. I do think there’s a difference between cyber threat intelligence and the intelligence you might deliver to the President of the United States to decide whether to go to war, but these generally hold.

  • Low Confidence: A hypothesis that is supported with available information. The information is likely single sourced and there are known collection/information gaps. However, this is a good assessment that is supported. It may not be finished intelligence though and may not be appropriate to be the only factor in making a decision.
  • Moderate Confidence: A hypothesis that is supported with multiple pieces of available information, and collection gaps are significantly reduced. The information may still be single sourced, but there are multiple pieces of data or information supporting the hypothesis. We have accounted for the collection/information gaps even if we haven’t been able to address all of them.
  • High Confidence: A hypothesis is supported by a predominant amount of the available data and information, it is supported through multiple sources, and the risk of collection gaps is all but eliminated. High confidence assessments are almost never single sourced. There will likely always be a collection gap even if we do not know what it is, but we have accounted for everything possible and reduced the risk of that collection gap; i.e. even if we cannot get collection/information in a certain area, it’s all but certain not to change the outcome of the assessment.

For purposes of clarification, “single sourced” to me relates to where we are getting the information. As an example, if you are operating only on netflow or on malware repositories like VirusTotal, it is highly unlikely you will get to a high confidence assessment out of any one of those alone. It is possible that you have commercial access to netflow, and to malware repositories, and information shared from 3rd parties that together help you get to a high confidence assessment. In a perfect world you’d have non-intrusion related information as well, depending on the assessment. (E.g. if you’re trying to attribute an operation to a specific government or government agency, I’d prefer to see a lot of different types of 1st party data as well as 2nd or 3rd party data supporting the assessment, and in an ideal scenario more than just intrusion data. This is an area where I often conflict with private sector intelligence reporting; while I am a huge fan of private sector intelligence and think it runs circles around some types of government intelligence, I generally hold a higher standard for confidence assessments on the topic of attribution than I see represented in *some* public reporting. That’s not a knock on every team out there, but I see “high confidence” for attribution thrown out quite a bit for data that just came out of an incident response case or malware repositories, and in my opinion you really need a lot more than that.)

Collection/Information gaps are anywhere there’s useful information or data that you don’t have access to. As an example, if an adversary targets an electric power plant I may have access to their malware but maybe I don’t have the initial infection vector. The missing data/information on the initial infection vector would be a collection gap. Additionally, maybe I see what their command and control server is but do not have access to what’s on the command and control server, who is accessing it when, etc. and those would be collection gaps. Some collection gaps you can solve for, some you cannot. But you must think through as many of them as possible and what that might mean to the assessment.

Structure of an Assessment

Again it really depends on you and your use-cases but generally speaking I like intelligence assessments’ structure to also be consistent and follow a repeatable and understandable pattern.

I often like my assessments to follow the following pattern:

Confidence + Analysis + Evidence + Sources

As an example:

“We assess with moderate confidence that ELECTRUM is targeting the Ukrainian electric sector based on intrusions observed at the Kyiv electric transmission substation by our incident response team as well as publicly available data from the Ukrainian SSU.”

The depth you go into analysis, evidence, and sources will vary entirely based on what your consumer needs and finds relevant.
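As a rough illustration, the Confidence + Analysis + Evidence + Sources pattern can be expressed as a fill-in template. The function and parameter names here are hypothetical, not any established format; the ELECTRUM example is the one from above.

```python
# A sketch of the Confidence + Analysis + Evidence + Sources pattern as a
# simple template. Names and phrasing are illustrative assumptions.
def build_assessment(confidence: str, analysis: str, evidence: str, sources: str) -> str:
    """Assemble an assessment sentence in a consistent, repeatable shape."""
    return (f"We assess with {confidence} confidence that {analysis} "
            f"based on {evidence} from {sources}.")

example = build_assessment(
    confidence="moderate",
    analysis="ELECTRUM is targeting the Ukrainian electric sector",
    evidence="intrusions observed at the Kyiv electric transmission substation",
    sources="our incident response team and publicly available data from the Ukrainian SSU",
)
```

The value is not the code itself but the discipline: every assessment carries the same parts in the same order, so consumers always know where to look.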

Everyone Loves to be High

One of the big flaws I see in cyber threat intelligence teams is a well intentioned but misplaced effort to have as many assessments as possible be High Confidence. Your consumer may expect high confidence assessments. You may have accidentally trained them to expect that those are the ones to operate off of. But in reality, a low confidence assessment is a perfectly valid and well supported assessment. If you aren’t able to make an assessment with confidence levels, it’s perfectly fine to say something to the effect of “We haven’t assessed the situation yet, but here is my analytical judgement based on my experience and the available information.” In other words, not everything needs to be an intelligence assessment.

But if you do give an intelligence assessment, you need to train your consumer to understand that all three levels are appropriate to use. A low confidence assessment is a good assessment someone can have confidence in. Oftentimes, the difference between a Low Confidence assessment and a High Confidence assessment comes down to time and sourcing: do I have the time and collection to get to that level of confidence or not? While I love giving consumers Low Confidence assessments, it’s perfectly reasonable to offer them what it would take to get to a higher level if they really need it: “If we had X more time and Y more resources, we likely could raise the confidence level of our assessment or find an alternative assessment.” Be careful in how you use that, but I generally find that being transparent with consumers to empower them to make better decisions is almost always a good thing.

If we accept that low confidence assessments are good assessments, and that getting to a Moderate or High level generally requires more time and resources, then all the assessments an intelligence team produces across a given time period (e.g. quarterly or annually) should fall into an inverted pyramid pattern.


Figure 2: Rob Makes a Pyramid in PowerPoint After Drinking Lots of Gin

I would expect to find that roughly 40-60% of the assessments produced by the team are Low Confidence, 20-30% are Moderate Confidence, and 10-20% are High Confidence. That’s a general rule of thumb; many variables will impact it, including the needs of the consumers. But generally speaking, if I am trying to only release Moderate and High Confidence assessments, the impact to my consumers is usually more time and resources spent than necessary if they are comfortable making a decision on Low Confidence assessments. If the consumer truly requires a higher level of confidence – no problem! But if they don’t, don’t put barriers between them and the intelligence they need to make a decision.
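One way to sanity check that rule of thumb is to tally a period’s assessments and flag any confidence level whose share falls outside the rough ranges. The thresholds below are the rough guidance from this post, treated as an assumption, not hard requirements.

```python
from collections import Counter

# Rough inverted-pyramid rule of thumb: ~40-60% Low, ~20-30% Moderate,
# ~10-20% High. These ranges are illustrative guidance, not hard limits.
EXPECTED_SHARE = {"low": (0.40, 0.60), "moderate": (0.20, 0.30), "high": (0.10, 0.20)}

def distribution_flags(confidences: list[str]) -> dict[str, float]:
    """Return the confidence levels whose share falls outside the rule of thumb."""
    counts = Counter(level.lower() for level in confidences)
    total = sum(counts.values())
    flags = {}
    for level, (low, high) in EXPECTED_SHARE.items():
        share = counts[level] / total if total else 0.0
        if not low <= share <= high:
            flags[level] = round(share, 2)
    return flags
```

A team whose quarter is 80% High Confidence would see that flagged, which is a prompt for the conversation about time and resources, not proof something is wrong.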

“But Rob, my boss really only listens if we have High Confidence assessments.” I hear this often from my FOR578 students. You need to sit your consumer down and have a conversation about how they can leverage your lower confidence assessments. You can try to pull historical information on your assessments vs. the decisions made and showcase that Low Confidence assessments are good assessments; they have a role. There are tips/tricks for this, but it’s not really the point of the blog, so I’ll just note that we serve at the request of the consumers (intelligence does not exist for intelligence’s purposes), but we want to make sure to arm them with the best information possible for their decision making process. Sometimes that does include correcting them.

Closing Thoughts

As mentioned, this isn’t meant to be a doctoral thesis on the topic. This is largely just a resource for the people at my talk tomorrow, and hopefully useful to others as well. However, there are a lot of resources out there (some referenced in this blog) to help you. Here are some of my favorites as they relate to this topic:

The Cycle of Cyber Threat Intelligence
Presented by one of my awesome friends/peers/FOR578 instructors, Katie Nickels, who highlights some of the FOR578 material in a masterful way to condense the view of cyber threat intelligence into an hour-long webinar.

Hack the Reader: Writing Effective Threat Reports 
Presented by Lenny Zeltser, this is a great conference talk on threat reports that also touches on formatting (don’t roll your eyes; reducing friction for your consumers is part of the job).

Pen-to-Paper and the Finished Report
Presented by Christian Paredes, this is a mini master class on dealing with tough questions and translating things into usable intelligence reports.

Threat Intel for Everyone: Writing Like a Journalist to Produce Clear, Concise Reports
Presented by Selena Larson, who I had the privilege to work with for a while at Dragos. I would highly recommend any intel team look at hiring journalists at some point.

I Can Haz Requirements? Requirements and CTI Program Success
Presented by Michael Rea. At the end of the day, everything ties to the requirement. Do not do intel for intel’s sake. No one cares how smart you are or how much you learned. They care about you giving them information that meets their requirement so that they can achieve their goals/outcomes. This is a great talk on that point that wraps into assessments as well.

 

*Edit Jan 16th 2022*

By popular demand, here is my Excel spreadsheet of gin cocktails for my PancakesCon talk. I’ve changed some of the recipes a bit for my own personal taste. Feel free to adapt.

Gin Cocktails

Cybersecurity Has Much to Learn from Industrial Safety Planning

May 7, 2021

This is a blog I originally published on the World Economic Forum’s website here.

___________________________________________________________

• Safety engineering practices can be readily applied to cybersecurity.

• Developing safety ‘scenarios’ helps build a more comprehensive response to cyberthreats.

• Scenarios are also useful for communicating cybersecurity best practice to professionals outside the field.

A cybersecurity strategy informed by lessons learned in the safety engineering community will help executives and practitioners in the field reduce risks more efficiently. By considering scenarios instead of singular components of a cyberattack, as well as adopting safety engineering’s methodical approach to planning, cybersecurity professionals can put a more robust approach in place.

As a byproduct of more thoughtful and scenario-focused planning, it will also be possible to better communicate with operations staff and with other non-cybersecurity executives, including boards of directors, using cybersecurity scenarios as a storytelling mechanism. Currently, cybersecurity professionals, speaking their own professional language, are sometimes at odds with operational staff and business leaders.

Cybersecurity strategies should be based on scenarios and include the following three key recommendations:

  • Analyze scenarios instead of singular items.
  • Derive scenarios from intel-driven and consequence-driven analysis.
  • Prioritize and remove barriers for where cybersecurity and safety intersect.

By learning directly from the practices of safety engineering, the resulting insights can directly contribute to the most important functions of an organization, such as protecting human life.

1. Analyze scenarios instead of singular items

Intrusions into organizations are initiated by humans, not by malware. That is why cybersecurity analysis should not be monopolized by a singular focus on controls such as patching or anti-malware. Instead, organizations should try to gain a holistic view across the intrusion lifecycle – particularly of the steps taken by the humans behind the malware.

Take, for example, the attack on a petrochemical plant’s safety instrumented systems in Saudi Arabia in 2017, which resulted in the first cyberattack targeted directly at human life. In this scenario, a preoccupation with malware and with the final step of the adversary’s attack that caused the safety-system disruption obscured valuable insights about the deeper risks posed by the attacker’s techniques across the more than a dozen distinct steps they performed over three years. The organization focused on identifying and remedying the attack by sharing technical details about the malware; while important, these are easy for the adversary to change in any follow-up attack.

The attack had actually begun in 2014. From 2014-2017, the adversary compromised the organization and moved throughout their industrial networks learning about the operations and equipment. The team behind the attack, dubbed XENOTIME, engaged in a series of steps leading up to the deployment of their malware, called TRISIS or TRITON: over a dozen unique ones in total. In other cases involving this same adversary, many of the steps remained consistent, even though the specific malware leveraged was not observed again. This is common in cybersecurity where adversaries change capabilities, but maintain a level of consistency in the style of attack.

XENOTIME’s 2017 attack on a Saudi Arabian petrochemical plant was in preparation for three years
Image: Dragos

With each action in the chain, there are multiple compensating controls against the risk the adversary poses that would inform any organization how to prepare against such attacks – for example, monitoring for the way the adversary moves through the networked environment. Told across the full scenario, the case study presents a story of how to develop and communicate a defensive strategy that prepares organizations for any other adversary whose operations overlap with how XENOTIME operates. Reusing tradecraft across attacks is a common practice for these adversaries, and that consistency gives defenders an upper hand in responding.

A scenario-based analysis makes it easier to understand the risk without a high degree of technical jargon or acumen. The longstanding practices of safety engineers can provide an excellent template for this kind of analysis – for instance, a hazard and operability (HAZOP) analysis, a process that examines and manages risk as it relates to the design and operation of industrial systems. One common method for performing HAZOPs is a process hazards analysis (PHA), which uses specialized personnel to develop scenarios that would result in an unsafe or hazardous condition. It is not a risk reduction strategy that simply looks at individual controls; it considers more broadly how the system works in unison and the different scenarios that could impact it.

2. Derive scenarios from intel-driven and consequence-driven analysis

Cybersecurity threats are the work of deliberate and thoughtful adversaries, whereas safety scenarios often result from human or system error and failures. As a result, a safety integrity level can be measured with some confidence by failure rates, such as one every 10 years or 100 years. In contrast, trying to take frequency or likelihood into account for cybersecurity scenarios is a highly unpredictable and failing practice. Instead, organizations should view protection from these risk scenarios as a binary, yes-or-no decision. Either an organization wants to be prepared for that type of incident or not.

To create scenarios that maximize the commonalities between safety and cyber-risks, organizations should consider a two-pronged approach:

• Intelligence-driven scenarios – those based on real attacks – have the benefit of being a documented case of precisely what happened to other organizations that led to incidents. The study of previous cyberthreats and the methods utilized is an excellent teacher.

• Consequence analysis is more akin to the art-of-the-possible (i.e. thinking through a near-limitless range of possibilities) and should be conducted by a diverse team ranging in skill sets from cybersecurity to plant engineering. Understanding what consequences would be most impactful to the organization or plant site can then be thought through in terms of how they could be influenced or conducted through cyber means.

The combination of ground-truth reality and impactful art-of-the-possible scenarios will create overlapping layers of security and risk reduction that form the basis for meaningful cybersecurity strategies.
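As a rough illustration of the two-pronged approach (the scenario names below are hypothetical examples, not a recommended list), the overlap between the two sets is where ground truth and impact meet:

```python
# Hypothetical scenario names for illustration -- not from any real assessment.
intel_driven = {                  # grounded in documented attacks
    "TRISIS-style safety system compromise",
    "CRASHOVERRIDE-style breaker manipulation",
}
consequence_driven = {            # art-of-the-possible, worked back from impact
    "TRISIS-style safety system compromise",
    "loss of view at the control center",
}

# Scenarios on both lists are grounded in reality AND map to a high-impact
# consequence -- the strongest candidates to resource first.
priority = intel_driven & consequence_driven
coverage = intel_driven | consequence_driven  # the overlapping layers overall
print(sorted(priority))
```

Scenarios in the intersection are both documented reality and high consequence, so they are natural first candidates; the union describes the overall set of overlapping layers the strategy should cover.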

3. Prioritize and remove barriers for where cybersecurity and safety intersect

Cybersecurity efforts that can be tied directly to safety should be prioritized and resourced in the interest of the overall organization, the safety of plant personnel, and the safety of people and environments around our plants.

In many organizations, cybersecurity is billed as an IT service provided to business units or individual plants. However, most organizations have consistently deemed safety-related expenses a company-level cost, one that does not negatively impact plant budgets, performance bonuses, and key metrics. Not all cybersecurity efforts contribute to safety, but those that do should be prioritized and fully resourced at the corporate level, not expensed to individual plants.

Through understanding broader cyberattack scenarios, and not focusing overly on any one step, preventive, detective, and responsive controls can be crafted as part of an overall cybersecurity strategy. Scenarios that consider cybersecurity risks that can impact safety directly should be prime candidates for prioritization and resourcing.
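One way to sketch that prioritization (a toy example with made-up scenario names and risk scores, not a scoring methodology):

```python
# Illustrative records -- names and scores are invented for the sketch.
scenarios = [
    {"name": "ransomware halts the data historian",  "safety_impact": False, "risk": 6},
    {"name": "safety instrumented system tampering", "safety_impact": True,  "risk": 9},
    {"name": "engineering workstation compromise",   "safety_impact": True,  "risk": 7},
]

# Safety-impacting scenarios are resourced first, at the corporate level,
# ordered by risk; the rest compete for plant-level budgets.
corporate_funded = sorted(
    (s for s in scenarios if s["safety_impact"]),
    key=lambda s: s["risk"],
    reverse=True,
)
for s in corporate_funded:
    print(s["name"])
```

The design point is the filter, not the scoring: anything tied directly to safety is pulled out of the plant-budget queue entirely, which mirrors how safety expenses are already treated as a company-level cost.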

What a Record Setting Investment into the ICS/OT Cybersecurity Market Means to Me

December 8, 2020

“ICS cybersecurity? What’s that? Is it worth doing? Can it be done? No it cannot, I heard…Even if you did the market isn’t large enough to support it long term”

Dragos, Inc. announced today its Series C financing, the largest investment ever into an industrial control system (ICS) / operational technology (OT) cybersecurity company. The investment is $110M, for a total raised of more than $158M over the four years the company has been around. As the co-founder and CEO, I am filled with great pride because of the team Dragos has assembled and our amazing customers who have truly partnered with us on our collective journey. Seeing them leverage our technology, services, and intelligence to make their companies more secure and further their maturity is something amazing to behold. Most citizens never understand or gain insight into how hard their infrastructure companies work to provide safe and reliable services and goods; I can tell you firsthand this community works amazingly hard. There are a lot of unknown passionate professionals running proofs of concept, implementing projects, advocating internally in their orgs, getting trained, working long hours, etc., all of which allows companies like Dragos to exist to serve this community. Thank you.

I’ve written before on what it’s like to raise venture capital; you can view that blog here. In this post I want to walk through some of the challenges I’ve faced for Dragos from an investment perspective, and the path along the way, explicitly to help explain what I think this investment means for the broader OT/ICS cybersecurity market and community. I’ll speak a lot about our journey so far, but the point isn’t Dragos’ financing; it’s the amazing realization that OT cybersecurity is worth doing, that the market is large enough to do it in, and that it can be done.

I will say without any intent to hype it up that I do believe this is a watershed moment and I hope to share that perspective with you.


I started Dragos a little over four years ago with my co-founders Jon Lavender and Justin Cavinee, who had worked with me at the National Security Agency on our mission of identifying and responding to threats to ICS worldwide. We did not start the company out of a desire to create a company or technology. To be candid, we all abhorred the idea of becoming a software vendor after careers as practitioners and community members in this space. But we did so out of a stark realization that the industry was changing and the threats were becoming more numerous and aggressive. What we were seeing as “answers” were copy/pastes of IT security best practices into ICS networks with little regard for the unique mission and threats those systems faced. I had authored the SANS ICS515 class on ICS incident response and network monitoring to help educate and train the workforce, but realized that the only way to scale human knowledge fast enough in the face of what we were seeing was to also ensure those practitioners had ICS-specific cybersecurity technology. We needed to do this in a company that would refuse to get acquired and would be a long-term player to put a dent in the problem. It’s my view that to make the best technology you need the best people, and you need to be hyper-informed on the changing risk landscape if you’re going to counter it. So we built Dragos focused on our visibility, monitoring, and response technology, but also with a professional services team of ICS cybersecurity experts to do everything from threat hunting and pentesting to incident response and architecture reviews while being trusted advisors to our customers. To inform everything we did and to help educate our community, we built an intelligence team to identify and track threats specifically focused on ICS; to date we track 14 state adversaries explicitly targeting ICS. I say all that in the context of this fundraising to say: most investors hated our approach.

We got off the ground with a seed investment from DataTribe. The only reason they invested was that they had a background in the intelligence community and military and understood that we were mission focused. I’m sure they didn’t know much about what we wanted to do, but they knew the problem was important and that we were the team that would stop at nothing to satisfy the mission. When I went to raise the Series A round of $10M to finance our operations, I met with and pitched well over 100 investors. Many of them sought us out to learn more about ICS/OT cybersecurity. The broader OT security market, which encompasses ICS and the industrial internet of things (IIoT) (not to be confused with IoT; Alexa and a gas turbine have little in common), was very interesting to investors, but none of them seemed to believe it was worth focusing on. The pushback I received from investors fell into three distinct camps; these were challenges I had heard from plenty of non-investors over the years as well:

  • Companies have tried to do ICS security before and failed. It’s not doable. People don’t care past regulation or fear. These companies won’t change. OT-specific cybersecurity will never be successful.
  • The market is too small. If you’re interested in getting quickly acquired, we’ll invest, but if you’re interested in going the distance, we’re not along for the ride. The OT market is so niche.
  • IT and OT are converging. There won’t be an OT network in the years to come. IT? OT? It’s all just T. Enterprise cybersecurity will be rolled into the plants; there’s no need for OT-specific cybersecurity. OT-specific security isn’t worth doing.

Finding people in the broader market who didn’t agree with the three points above was hard. In reality, very few of the practitioners in our small ICS security community believed such things. I think many people in our community have wondered if it’ll take some giant cyberattack on ICS to get people to take it seriously, but my view was “we’ve had all the attacks we need.” Every industry has representative attacks and stories. That wasn’t the issue or the need, and no one should hope for it. The perceived reality was that there wasn’t a large investable market, which meant there was no obvious business case for addressing this risk. I viewed building a technology company and staying around long term as necessary to getting these companies resources for workforce development, training, etc. as much as anything else.

On the three points here’s where I disagree in order:

  • Just because a few companies have failed on this path doesn’t mean it won’t be successful. More importantly, the failed efforts I’ve seen were largely re-skinned IT security efforts with some ICS marketing; it was obvious they were going to fail. This community does care about its infrastructure, but we are a community of people who understand what does and doesn’t work. Our infrastructure members will invest beyond regulation and fear, but not in things they don’t believe will work. Also, undeniably, over the last decade there has been a larger and more proactive community advocating across companies, vendors, conferences, etc. on what does and doesn’t work.
  • The OT market is huge. It’s hard to put a real number on it; some orgs claim $20-30B, but however you size it, it’s huge. Most people associate it with electric utilities and oil and gas, but manufacturing, rail, water, mining, transportation, etc. should come to mind. And the physical systems in the data center. And building automation systems. And airports. And and and. It’s actually harder to find companies that don’t have OT than those that do. These businesses aren’t in the business of selling emails; they produce goods, interact with the physical world, and provide services, all powered by OT. The major risk is in the OT, and when executives are aware of that and have an answer to address it, they will invest, in a way a majority of the investors I’ve met have misunderstood.
  • IT and OT convergence happened a decade ago. I’m nearly tired of hearing how it’s “coming.” We have had Windows in ICS/SCADA/DCS/OT/etc. networks for more than a decade. The convergence is actually the digital transformation of these organizations coming at the same time as ICS-specific adversaries. But no matter what the underlying operating system is, that’s not the point. The point of OT cybersecurity is that the mission is different. The threats are different. The risks are different. The culture to get the job done is different. The challenges to succeed are different. Therefore the way you secure it will be different. I’m not saying all IT security is useless in the plants; there’s plenty we can learn from and adopt. What I’m saying is that the unique and most critical part of these businesses deserves a specific focus that understands and accounts for the people, culture, process, technology, mission, risks, threats, etc. of that side of the business. To not accept that is naïve.

When I went to investors saying we wanted to focus exclusively on OT cybersecurity, and that we wanted to partner with our customers not just by providing technology but by having smart people and actual insights to provide as well, it didn’t go so well with most of them. You cannot describe all VCs in one broad stroke, just as you cannot describe any group that way, and I’ve met and really enjoyed getting to know plenty of VCs. But to say the vast majority didn’t understand this market is an understatement.

Not only were the pushbacks above tangible, but there was also: “And you want to hire experts to do professional services? Won’t that lower the margins on the software sales? I don’t think that’s a good idea.” But even extremely mature companies are relatively immature in their OT cybersecurity journey and need a partner, not just a technology. That’s also how we get better. So it was non-negotiable. Our team’s people are and were our secret weapon. For all the words like innovation and disruption that get flaunted in Silicon Valley, it was interesting how many investors we scared away simply by being different from what they had seen before. The reason we were successful in our A round was largely due to Energy Impact Partners and AllegisCyber. AllegisCyber is a VC built by former operators (people who had run companies before), which helped them see what we were doing beyond a spreadsheet. Energy Impact Partners, though, deserves the lion’s share of the credit, as they are a VC built by the electric companies: Southern Company, National Grid, Xcel, Oklahoma Gas and Electric, etc. Those companies knew firsthand how important OT was and the necessity of a full solution.


By the time the B round came about, a $37M investment, a lot of the naysayers who said OT cybersecurity couldn’t be done had fallen to the side. We were flooded with investors who wanted to invest. But most of them were talking and thinking about acquisitions. My view was and is that the OT cybersecurity market is so large that it can support not just one company IPO’ing or being of that size but multiple. This was not a widely shared view, to say the least. Most of the 70+ investors calling, interested in us because of the importance of ICS, quickly had the wind taken out of their sails, and the conversations would noticeably shift when I mentioned our vision was to be a long-term company and not to build to be an acquisition target. To them it was clear now that OT cybersecurity could be done. They agreed it should be done. They did not believe it was a large market. Luckily, this time around we had Canaan, a well-respected Silicon Valley VC that added that type of credibility to our name in those circles while believing in the mission and in the market size. They saw what many at the time didn’t, and I think a big reason for that is how involved they had been with pharmaceutical companies and others, realizing that maybe there was something to this OT market. Vision is an easy word to say and hard in practice. (Hear their perspective in this blog here.) We were fortunate to also be joined by a direct strategic investment from National Grid, Emerson, and Schweitzer Engineering Labs. Obviously those three understood OT and have continued to be great partners.


To claim that the C round is some sort of finish line is obviously silly. It’s really just the starting point. But having a record-setting $110M investment isn’t about Dragos. It’s about our OT cybersecurity community and the broader market. It’s a massive signal to everyone that not only is OT cybersecurity important (most everyone gets that) and doable (people are starting to realize that), but the market is large enough to make it a worthy investment (new to most). This time, instead of taking calls from all the interested investors, we focused on letting the industry tell the story. The only thing more powerful to me than a large investment is having the asset owners and operators themselves tell their story. Thus, for the C round we had the venture arms of National Grid and Koch Industries lead the round, with investments from Saudi Aramco and HPE as well. One of the largest electric and natural gas companies in the world, with the largest manufacturer in the world, with the largest oil and gas company in the world, with one of the largest manufacturers in the supply chain in the world. That’s a powerful story. That’s a signal to everyone, including the investors, that the OT cybersecurity market is large, worthy of investment, and will be around for a long time. These are industry leaders saying not only do we believe in the technology we’re seeing, but this market and category are important to our businesses at a strategic level. That’s a powerful signal to the other companies in their space and beyond. That’s the new piece here. That’s the story. That’s what I think serves as a watershed moment: the community itself standing up and saying “we’ll get this done ourselves; it’s of strategic value.”

There are plenty of savvy investors and VCs whom I’ve had the privilege to get to know. But across the broad swath of them, the conversations have changed as they learned about our C round. And it’s not just investors. I’ve run into the naysayers every month, and sometimes every week, of my entire career. It gets tiring. And don’t even get me started on the “You’re technical? Are you sure you can be the CEO? Shouldn’t you bring in someone else?” discussions. That’s a less polite blog I’ll write some time. But I know many of you run into the same conversations about our ICS community. To all of you I can now say with great confidence that the folks telling you it “can’t be done,” “shouldn’t be done,” or “cannot be done long term” are on the wrong side of the argument. We have a lot of work left to do. But this is a community milestone.

It’s not a Dragos-only story. The work of so many firms, passionate professionals, students, practitioners, leaders, government agencies, and even competitors has been a part of getting here. And here we stand on a larger platform than ever before, as an OT/ICS cybersecurity community, to tell our story.

If you’re in our community, we at Dragos hope this provides some ammo for you to propel your ICS security journey forward. If you’re not in the ICS security community and you want to join, we hope this is a good signal that you can have a wonderful career here and that it’s worth your time. Your local power company, water utility, oil and gas, manufacturing, rail, data center, mining, etc. companies are hiring. Go check them out. Their mission is worth investing in.

The Department of Energy’s New Grid Resilience for National Security (GRNS) Subcommittee

November 30, 2020

In this blog post I want to explain what the Grid Resilience for National Security (GRNS) subcommittee is and some thoughts that I have on the role of the committee in our broader electric system community. I will try to demystify and add context to what these types of committees are as well.

The Department of Energy (DOE) serves many roles and responsibilities across the broader energy community, not only in the electric sector but also in the oil and gas communities. As a federal government agency, it is also the sector-specific agency for the energy sector, i.e., the agency on tap to be the face of the US government to the community and to help where it can. I have always been a huge fan of the Department of Energy and its numerous missions that help secure national security while just generally being good community members.

To help guide its mission, the DOE created the Electricity Advisory Committee (EAC), which is staffed by industry experts. Each is appointed by the Secretary of Energy and becomes a government employee with various focus areas. The EAC looks at everything from storm reliability to grid storage discussions and recommends to the various DOE leaders courses of action and focus that could be of use to the energy community and the broader US. The DOE then reviews and acts on those recommendations across its many mission leaders and focus areas.

I was appointed to the EAC earlier this year, and it’s been a fascinating view into timely discussions with true partnership from the DOE representatives. I am generally opposed to the creation of new government agencies, coordination groups, etc. as a panacea for problems. This is especially true on the topic of cybersecurity, where I have been fairly critical of the role and responsibilities of government when it comes to cybersecurity and of the necessity to better engage asset owners and operators, as well as the private sector, as partners. What makes the EAC stand out to me, though, is that it is not a government agency with billions of dollars of taxpayer funding that’s meant to solve everything. Instead, the positions are unpaid, everyone there is volunteering their expertise, and the conversations are all collaborative and try to help government break the groupthink that can form in any large organization.

The DOE has continually recognized cybersecurity as an important national security topic, especially for electric systems. Not just the enterprise information technology (IT) environments, but critically the industrial control system (ICS), or more broadly the operational technology (OT), parts of the electric system. That is where the real risk is and what we all need to focus more on beyond regulation.

The DOE, as a result of this sharpening focus, looked to identify critical electric infrastructure (CEI) and defense critical electric infrastructure (DCEI) as defined by the Federal Power Act and to amplify its partnership with and resourcing of those private sector and public companies that are on the list. It’s an important clarification to state: everything is critical. The small local distribution grid in your hometown is critical to the people it serves. An attack against it could have impacts far beyond what we anticipate, not just in larger cascading issues but more realistically in American citizens’ confidence and in emboldening our strategic adversaries. But the US government cannot pretend that everything is critical to it. It must focus if it is to achieve any level of success with its partners and its limited resources. The focus on CEI and DCEI asks the hard-hitting questions: what’s most critical? And what’s our role in helping those companies?

Beyond just the DCEI, the DOE also has a need to understand threats that are over the horizon and how to develop strategies that play to everyone’s strengths to protect our country. As a result, the DOE has established a new subcommittee of the EAC. This inaugural committee is staffed with some truly passionate and committed leaders from around the energy sector, including electric and oil and natural gas.

The subcommittee will be chaired by Dr. Paul Stockton. Paul is an exceptional leader, drenched in critical infrastructure security through a career working at various levels of government, from Legislative Assistant in the US Senate to Assistant Secretary of Defense for Homeland Defense. He has also consistently been involved in our energy community, holding positions on advisory boards for Idaho National Laboratory and the Center for Cyber and Homeland Security Studies at the George Washington University, and serving as a Senior Fellow at the Johns Hopkins University Applied Physics Laboratory. To say it lightly, Paul is a smart dude. More importantly to me, the late and great Mike Assante always told me Paul was someone to trust. That is all I ever needed to know about his caliber.

I am proud to announce that I have been selected as the vice-chair of the committee. My time in government was at the lower levels (I exited as a Captain in the US Air Force after spending my young career at the National Security Agency) but my entire career has been focused on those ICS and OT systems that we are all so rightfully focused on. More importantly, the ICS/OT cybersecurity community and the broader industrial community across energy, manufacturing, rail, mining, water, and more have always been gracious to allow me to be a member of the community. I say all that to stress what Paul and I are bringing to this subcommittee filled with experts and passionate folks: the community approach.

We have a big charge for the GRNS subcommittee. And we need to ensure it’s not just “yet another committee” that doesn’t achieve its goals. We are going to be laser focused on accomplishing one or two things at a time instead of boiling the ocean. But the most important focus here is on the community. It is my opinion that the US energy community in particular does not get the credit it deserves when it comes to its investment and partnership. You need look no further than the Electricity Subsector Coordinating Council and the amazing work done there as a CEO-led organization, in partnership with the Department of Homeland Security and Department of Energy, to see what all these electric companies give up in time and focus to dedicate to the national security mission.

Can we do more as a community? Yes. But does the community do more than it gets credit for? By far. I see it as our role on the GRNS to not only push the community forward by advising the DOE on cybersecurity focused topics for our electric system but to also highlight the amazing work done by our energy community. It is my goal that this subcommittee truly embraces its unique role in partnership to ensure we are talking with the electric system players, not at them.

Should Governments Actively Defend Private Sector Networks?

August 7, 2020

This is a blog I’ve wanted to write for a long time, but every time I sat down to do it I found it difficult to capture the nuance of my intent and difficult to keep it from turning into a 50k-word thesis paper. However, I keep getting the question “Should governments actively defend private sector networks?” or iterations of it, such as “Should the government be doing assessments?” and “Should the government do incident response?” Ultimately people are really asking “What is the role and responsibility of government in cybersecurity?” but that is definitely a complicated topic that I won’t pretend to be able to answer here. This blog is going to be me rambling more than normal. There are so many examples I have to back up my point of view that I simply cannot share, which is unfortunate, so I recognize this is going to have to be seen much more as “in my opinion” and accepted as that. I can also already see this ruffling many feathers, especially among my government colleagues, but please know I do see and value a role for governments in cybersecurity; I just think we have to be much more candid with ourselves so that we can get to a better place than the trajectory we’re on now. Ok, here we go…

 

I keep getting the question of whether the government should be performing the cyber defense mission for the private sector, and there have been a lot of important documents published lately on the topic, so I want to share some observations. I am not going to try to capture all the nuance; that will leave plenty of areas for disagreement and grey areas. But I think capturing some streaming thoughts is important, as the “debate” is becoming incredibly one-sided, with few voices publicly opposed while many, at least in my networks, are privately very vocal, and with good reason.

For background reading for this topic I’d suggest Joe Slowik’s “Cyber Leviathan” entry (a much more nuanced and eloquent version of my ramble), the Cyber Solarium Commission report, DHS’ CISA Strategic Intent document, and Australia’s Cyber Security Strategy 2020 document.

I will also note there’s a big difference between “can” and “should” in the question of what the government role is. The answer to “Can the government (insert any of your choice) defend private sector networks (or companies) today?” is a simple “no.” Full stop. Can the government do this mission? No, they are not equipped to do so, and the private sector has far outpaced them. Forget the laws, regulations, or complications sometimes due even to constitutional protections; could the government do the job if it were allowed to? No. It’s not even resourcing alone at this point. Governments are large bureaucratic organizations, and while they house many experts and wonderful passionate people, as a whole they are not capable of this mission today, as proven by decades of not only abdicating their role in this space but, more importantly, not even getting their own house in order. I always find it slightly laughable when people talk about the USG coming in to save the day when every report, analysis, assessment, etc. from GAO and others, as well as public high-profile breaches like OPM, showcase that there is more than enough mission to keep the USG occupied with USG cybersecurity. And they have awesome people to do it, and I believe they will be successful there, but they have their hands full. This is going to come off coarse and unfair in some instances; I say this all having been a USG cyber operator, both in the Air Force and NSA, doing defensive missions. I love the government folks I interact with, especially across the US, Canada, UK, Norway, Germany, Australia, New Zealand, Singapore, and the list goes on; tons of friends and amazing memories. Lots of love for the folks and admiration for the cybersecurity expertise that does exist in the pockets of complication and frustration. But can the government do the job of protecting the private sector from cyberattacks? Does it have the experience to do so, the expertise, the resourcing, etc.? Any way you slice it, the answer is no.

The question of “should” the government do this is a much more interesting debate to me.

At this point in time my answer is no, but I recognize there is a lively and appropriate debate on the topic. While my answer on “can” is a strong no, and I will argue with anyone that they are obtuse to think the answer is “yes,” on the topic of “should” I think it’s more of an open debate. I know which camp I am in right now, but I am open to being flexible and recognize there are many valid points of view here. I think it is extremely difficult to have this conversation while ignoring the reality of the “no” that exists today, but governments around the world are getting impatient about what they perceive to be strategic national risk due to cyber threats, expecting that the private sector will carry that mission while the governments may not even get visibility into attacks. That is the piece that scares most governments: that there can be attacks domestically that companies are dealing with and the government doesn’t see. We see these debates playing out fiercely right now in the US, and usually the canary in the coal mine is the US electric sector. As an example, under NERC CIP there are new regulations that require electric companies to report any access, or attempted access, by unauthorized users/adversaries into bulk electric system environments such as those SCADA and ICS environments I often talk about. On the surface this sounds great. If an adversary is trying to gain access to the ICS networks of a power company, the government should know about it. The problem, as I see it, is that the government should encourage that with partnership and value-added efforts that incentivize sharing; doing it via force is a great way to kill the appetite of those companies to even look for the problems in the first place. What is the incentive of a company to report such access attempts? Today there is very little. They can expect FBI/DHS/DOE/DoD/etc. all on their doorstep telling them “No, I’m the agency to talk to” and “I’m here to help” without any understanding of their systems or problems. In many cases, USG help for these companies to date in the form of tactical network defense efforts (there are plenty of other efforts that have been extremely well received and helpful) has either been full of a false sense of security (“we ran your data through our survey tool and you’re ok now because the lights showed green”) or been highly confusing and complicated for those companies. That’s not to say all government assessment teams are incompetent; there are plenty that do amazing work, across government-owned infrastructure as an example. But again, we exist in an ecosystem today where the federal agencies cannot and will not even concretely define roles and responsibilities, and they treat every case as an opportunity to go peacock and pitch their services, offerings, and sharing groups more aggressively than the most annoying vendor. Again, I recognize how broad a paintbrush I’m wielding here, but please recognize I do understand the nuance; on this topic it’s getting worse, not better, and as far as I can see there are very few public challenges to the mindset. Why? Because when the USG comes into a company and the CEO is told “you’re all good now,” they feel good. Simple as that. They feel great. And if something happens and they’re in front of Congress about a major chemical explosion or power outage due to a cyberattack? They get to say “well, we had your teams from DHS/DOD/DOE/EPA/FBI/etc. in here and they said we were all good.” It’s an extremely attractive proposition. I’m not saying people are doing any of this with malice. I don’t think anyone involved that I see is malicious or gaming the system. But a game has formed for sure, as I see it at least, where people are incentivized not to do real security but to do security theater, and the people who pay the biggest price are the day-to-day security analysts who have to watch their CEOs publicly praise government agencies that are also publicly praising themselves in front of Congress, while the security analysts in the companies were the ones who did all the work or had to pick up the pieces. This is not meant to be an overly cynical take. There are plenty of good things happening too. But I promise I’m getting to the answer of “should” in the shortest rambling way I can.

Why was all that lead-in important to the topic of “should”?

Let’s review two items: one from the US’ CISA and one from Australia’s cyber strategy, which largely touches on ASD. These were documents in the background reading above. And I say all this with as much love and admiration as I can for both organizations; they are good examples here, but neither the organizations themselves nor the people in them are the problem. I have many friends in each, and each organization has done a lot of amazing things and has great potential. But on this topic let’s be candid and transparent so we can all get to a better answer. Here we go:

  • Australia’s cyber strategy specifically calls out that they intend to invest $1.67B over the next 10 years to achieve their vision. The vision they outline covers a ton of areas, from cybersecurity advice for families, to taking a more offensive approach to Australia’s strategic adversaries, to protecting and actively defending the critical infrastructure across Australia.
  • US’ CISA Strategic Intent notes they want to partner more with the private sector, with goals of defending infrastructure today and helping strengthen critical infrastructure long term. They advocate for common themes seen across the USG over the years, including risk management, risk visibility, information sharing, capacity building, training, deployed tools and sensors, and incident management and incident response.

If you review various appropriations documents and conversations on public record between DHS and Congress, you’ll find they are positioning very heavily for programs like “CyberSentry,” where DHS wants to create and maintain its own technology to deploy directly into networks, including ICS networks, so it can pull data out of those environments and perform managed defense and hunting efforts for companies. Softer voices will note that this is really only intended for defense critical industries, but that’s an absolute falsehood; it’s intended for everyone from power companies to pharma companies. There are plenty of times, on public stages from RSA to Congressional hearings to documents such as this, that DHS has noted they will perform network defense and incident response for the private sector. “Call us,” they will say. They actively do assessments in the sectors today. And there are plenty of times, if you talk to one of the many fantastic people in DHS and ask them “do you do incident response,” the answer is “yes” or “well, it depends,” and then you slice out what they mean by incident response. Which in many cases is largely “we won’t actually do the incident response, data collection, etc. efforts, but if you do the data collection we’re happy to take a look at it for you, and if we know anything from the USG we’ll share back.” That’s not incident response. That’s a great offer of “if you’re going through an incident, communicate with us and we’ll try to help you with any insights we have.” That’s fantastic. But it’s not incident response.
I have been to a wide number of critical infrastructure sites across the US where, if you ask them whether they want to do an assessment in their networks or prepare an incident response plan for the eventual day they’re attacked, they will tell you “no, DHS was in here last year and our incident response plan is to call DHS.” If you get the right people in CISA, as an example, they’ll be explicit that that’s not what they are intending and that private sector companies should still do their own efforts. But based on the communication, like CISA’s strategic intent document, this is the result you actually get. Additionally, you can ask different people in DHS and FBI field offices and similar and get different answers. It’s a highly confusing narrative that, no matter what, sits on top of a “can the government do this?” whose answer is “no.”

Looking at Australia as an example, which I truly believe is trying to get to a good answer as well: the idea of doing everything they’ve talked about across 10 years with a $1.67B budget is simply not going to work. That’s not even a very effective budget for the broad mission they are painting. The idea that they also want to message that they will actively defend networks in critical infrastructure could very quickly lead to a misunderstanding in the sector that the “government has it covered” and will be your incident response team or even do proactive work for you. Government agencies would do well to pick one or two things they want to be good at and go nail them; trying to boil the ocean quickly loses the confidence of all parties involved.

Governments should want the private sector playing to its strengths, and governments should play to theirs. Each has numerous strengths and roles to play. But when governments choose to focus on tactical network defense, incident response, risk assessments, etc. and message that they have that space covered, or will, not only are we not playing with a full hand, but it also messages to the private sector “spend your resources elsewhere, we have this covered,” and in my opinion that will lower the overall level of security across the country.

The private sector in the US, as an example, outpaces the government in cybersecurity process, people, and technology. Training, expertise, insights, intelligence, etc. on the topic of cyber threats reign supreme in the private sector compared to what is in the USG. Many times the classified US intel report on something is a combination of three or four private sector reports they’ve bought and pieced together with a picture of an Iranian, with a TS/SCI label slapped on it. That sounds harsh. But it’s honestly that bad, and worse sometimes. I have seen my own threat intelligence reports copied and pasted in full with no citation, given a classified label, and redistributed to the private sector as original work. That’s not to say the government doesn’t have amazing finds, cool people, great expertise, etc., but again, we must play to our strengths. Private sector companies also don’t get to say “see, yeah, we can do this without you, government.” Nope; much of the reason private sector companies outpace the government today is due in large part to the investments the government has made into this space over the years, its early work in fields like incident response and threat intelligence, and its collaboration with the private sector. Undoubtedly we are better together. But by claiming to do missions that the government cannot do, you will destroy the ecosystem and the competition amongst companies that drives innovation and expertise.

Which leaves me to the “should.” In my opinion governments should not be taking up a tactical cybersecurity mission such as network defense, incident response, or deploying sensors in your networks, especially in sectors that have a community or market. Not only is this deeply rooted in the “they can’t do it effectively anyway” commentary, but it’s deeply rooted in my belief that taxpaying companies should not be competing with taxpayer-funded entities when it lowers the overall value to the community. Where the USG, as an example, is developing amazing expertise, it is also leaning on the very private sector cybersecurity firms that it runs the risk of destroying. When there is no market somewhere, there is absolutely a reason for governments to enter the discussion. If you look at what CISA did with state and local commissions around election security, as an example: grand slam home run, yes, please do more of that. But hopefully that effort, plus the ongoing private sector effort, will lead to a vibrant market and community around election security that sees the government back out of that space and play more to the strategic and amplification role and responsibility it is fantastic at. In the same way, in the ICS community, early DHS ICS-CERT did proactive work across the sector in ways that drove conversations at the executive level and tried to help showcase the issue. Now there’s a vibrant ICS security community and market, and the butterfly effect, and even unintended consequence, of poor messaging or execution by government agencies like DHS/FBI/DOD could kill it.

So again I’ll ramble and say: “should” the government take on this mission? No. I do not think they’ve shown they understand the requirements, the way to engage the community, or the way to have us all play with a full hand. Nor have they shown they can do it in a way that isn’t just taking feedback from executives in public but is actually understanding the tactical mission players they would be working with, in a way that would lead us anywhere more productive. Said simply: they haven’t shown the maturity of understanding to even enter the discussion in a serious way. So no, they shouldn’t take on the broad and complex mission of protecting the private sector from cyber attacks.

But something does have to change. There are many companies that don’t have the resources to do cybersecurity. Many cybersecurity markets get created around the top 10%, and sometimes even the top 1%, of companies wanting to invest in cybersecurity. In fairness, if the answer to “can” was “yes,” to just take the problem and solve it, I would be highly incentivized to say “yes” to the “should” regardless. I’d love for this problem to be solved so I could focus my time and talent elsewhere. But that’s not the reality we live in. Yes, we do need a broader approach to cybersecurity, and yes, the government “should” have visibility into the challenges and threats facing the private sector. A Congresswoman should be able to know that in her district a strategic foreign adversary is compromising a site that has the potential to impact her constituents. But we all need to be really thoughtful about laying out the requirements we all have instead of trying to force an answer and pretending we all agree on the requirements. Pushing an agenda of the government protecting networks or deploying technology for remote access and visibility isn’t the answer. The answer, in my opinion, will revolve around a no-kidding discussion of what the requirements are, and working with government and the private sector to understand how to achieve those requirements while being open to innovative and interesting approaches. Government can and should do work in the private sector, but it should do it in partnership with private sector companies and vendors, and in a way that helps build the ecosystem and a sustainable approach. Grant programs (DOE’s CEDS, now under CESER, as an example) that are competitive bids to incentivize innovation and work in the private sector with help from governments: yes, do more of that; that’s awesome. But “we got this” is not a real answer and shouldn’t be.

We should all explore incentives for companies while also exploring ways to hold malign or incompetent actors accountable for choices that impact the community. I’m not anti-regulation, as an example, but let’s not try to regulate ourselves into a secure state; let’s try to regulate away the things we know don’t work or can agree on as the basics. But the answer, in my opinion, should not be that governments message or attempt tactical network defense actions. I’ll end my rambling by stating unequivocally that, regardless of the “should,” they are not equipped or capable of doing it today anyway.


DISC: SANS ICS Virtual Conference and ICS CTF Event Details

April 26, 2020

On April 30th, 2020 there will be an entirely free, really exciting, industrial control system (ICS) capture the flag (CTF) event hosted by Dragos, Inc. and the SANS Institute. Following that there will be an entirely free, day-long virtual conference with speakers from SANS and Dragos, Inc. covering topics from building your own ICS range, to analyzing ICS vulnerabilities, to thinking through the easiest and lowest-cost actions you can take to better enable ICS security quickly, and more.

There are a lot of people that have signed up so I want to provide some quick details ahead of the email going out about this tomorrow. You can register for the event here.

The conference agenda is published at the link I posted and it’s pretty self-explanatory. The only thing to call out is that the times are different than normal; we did that so that folks across the US could easily access it, and it’s long enough that folks across the world can participate in different parts of it from a time zone perspective. It’s difficult to balance this, but the sessions are all being recorded; if you sign up you will get the slides and recordings after the fact. The only confusing thing is that initially the webinar was going to be run on GoToWebinar, as all SANS presentations are, but after we blew past the limit (well over 3,500 have signed up already) we switched to Zoom (yes, we evaluated the security concerns and found Zoom’s response and actions to be appropriate). So if you have a calendar invite for GoToWebinar, that’s a legacy thing. However, there’s nothing you need to do. On the day of the conference simply go to the same registration link that you used to sign up; when you sign in to your SANS portal account that link will turn into the conference link and automatically forward you to the Zoom invite (we have enabled the browser option, so you do not need to install the Zoom application if you do not want to).

On the ICS CTF I want to draw folks’ attention to a few points to help them prepare. For those of you that have participated in a NetWars before, this is an entirely new and unique DISC ICS NetWars, so you won’t see any overlap with previous questions and approaches of the other ICS NetWars run at the various SANS Summits. Additionally, the style will be different anyway since Dragos made Level 3 and Level 4. Here are the most important details for everyone (these will all be in the email that goes out on Monday, but in case you don’t get the email I wanted to write them down in the blog):

  • DISC ICS NetWars is an entirely unique ICS CTF and will only be run at this event
  • The data, questions, and answers will be made available to everyone who registers for the virtual conference; you do not need to register for the CTF to get the data
  • You should only register for the CTF if you plan to play live; it’s limited to 1,000 people and we want to ensure everyone who wants to play gets to play
  • To register for the CTF you must first register for the conference; then, starting Monday the 27th, in your SANS portal you will see a NetWars registration link; it is first come, first served
  • The style of the CTF is entirely defensive; there will be questions ranging from entry level questions that are multiple choice (e.g. what is the accurate way to describe Fieldbus protocols?), intermediate questions that have data sets (e.g. here’s some PLC ladder logic, analyze it to find the flag), and advanced questions primarily in the form of packet captures (e.g. analyze an ICS range’s data to find flags in ICS protocols, analyze attacks happening, and perform functions across asset identification, threat detection, and response with network security skills)
  • You will be playing at home on your own (teams may be enabled; we’re checking now to see if it’s doable, but plan on playing alone as a backup if you have a team)
  • You can use your own system and your own tools; no tools or VMs will be provided. I would recommend network security tools and VMs like SecurityOnion or, if you’re a SANS alum, your ICS515 VM
  • There will be prizes. It’ll at least include coins and swag but we’re seeing if we can get approval for free SANS events, training, and maybe some Amazon gift cards; we’ll know more at the event
  • Normally at ICS NetWars you can ask questions and get help; we’ll have a Slack channel for everyone and a Zoom link for everyone to join in on if they want to hear our commentary or us answering questions and announcing important information to the participants, but it will be entirely impossible for us to answer 1,000 people’s questions consistently. So plan on only asking questions that relate to technical issues and getting up and running with the data; you will not have much support in the event outside of that
  • The day of the event we’ll have all the appropriate details for everyone and a welcome brief (that will be shared over a Zoom link we’ll distribute in email) to include the Slack channel, some FAQs, and some details to get started. We’ll distribute as much of these as possible ahead of the event especially for those of you who are joining at different times instead of doing the CTF the entire time
  • There will be a leaderboard broadcast through the Zoom conference on the day of the CTF
  • Austin Scott (the lead architect of the CTF) will present the last session at the conference on May 1st to go through Level 3 and Level 4’s questions and answers. It will not be a full walkthrough, but it will give you all the answers and details that would have been helpful. Post event, all the questions and answers will be published. People are free to post their own walkthroughs
  • Be Social! The hashtag #DISCSANS is the event (CTF and Conference) hashtag; share helpful tips with people, collaborate with peers, and try to make this as social as possible given the socially distant life we are all dealing with
  • If you are intimidated by the concept of a CTF don’t worry. The event is broken into 4 levels.
    • Level 1: Q&A with multiple choice and hints to help you answer the questions
    • Level 2: Some multiple choice, some exact answer, across some technical data sets such as packet captures, but still with hints enabled and very approachable
    • Level 3 and Level 4: A single packet capture that’ll contain data from an ICS range and a wide variety of technically challenging questions with little to no hints
    • The approach means that the winners will really have to earn it but everyone can play and learn from any background including brand new folks
      • This is an exceptionally important event for people to learn from; it is very difficult to get ICS range data normally, especially with attacks and a variety of ICS protocols, so take this opportunity

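If you want to warm up on packet captures before the CTF, it can help to know just how simple the classic pcap file format is: a 24-byte global header followed by per-packet records. Here is a small, hypothetical warm-up sketch in Python; the CTF itself is meant to be played with full tooling like the SecurityOnion VM mentioned above, not hand-rolled parsers, and the bytes below are a hand-built empty capture rather than any event data:

```python
import struct

# Classic pcap global header fields: magic number, version major/minor,
# timezone offset, timestamp accuracy, snapshot length, and link type.
# Little-endian magic 0xA1B2C3D4 means microsecond timestamps;
# link type 1 is Ethernet.
def parse_pcap_header(data: bytes) -> dict:
    magic, vmaj, vmin, thiszone, sigfigs, snaplen, network = struct.unpack(
        "<IHHiIII", data[:24]
    )
    if magic != 0xA1B2C3D4:
        raise ValueError("not a little-endian classic pcap file")
    return {"version": f"{vmaj}.{vmin}", "snaplen": snaplen, "linktype": network}

# Build a minimal, empty capture in memory purely for demonstration.
header = struct.pack("<IHHiIII", 0xA1B2C3D4, 2, 4, 0, 0, 65535, 1)

print(parse_pcap_header(header))
```

Real CTF work happens in the packet payloads, of course, but being comfortable with what a capture file actually contains makes tools like Wireshark far less mysterious.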
Dragos and SANS are doing this for the community as a thank you for everyone always being so awesome but also as an opportunity to help share the world of ICS security and get you all excited about it. As always thank you for your continued learning and excitement. Take care, look for the email on Monday, check your portal Monday regardless, and have fun! If you have any questions use the hashtag #DISCSANS and I’ll try to answer as many as I can ahead of the event.


Claims of a Cyber Attack on Iran’s Abadan Oil Refinery and the Need for Root Cause Analysis

October 20, 2019

This blog was originally posted by me here.

__________________________________________________

On October 20th, 2019 the Twitter account @BabakTaghvaee posted that there was a fire at the Abadan Oil Refinery in Iran; notably, the account claimed that the fire was the result of a confirmed cyber attack. A video of the fire was posted, and the news organization Reuters had reported on the fire just prior to the tweet as well. The Reuters reporting cited Iranian state broadcaster IRIB in saying that the fire was in a canal carrying waste from the oil refinery and was at that time under control. Various posts on social media took advantage of the claim to spread the information about the cyber attack and to claim that it was “probably” a result of the alleged Iranian attacks on Saudi Aramco. A few commentators linked, as proof, to the Reuters story published on October 16th about a secret cyber attack carried out by the U.S. on Iran, falling victim to the classic post hoc ergo propter hoc fallacy of assuming that because one event followed the other, the first caused the second.

The purpose of this blog is to add context to such events in order to avoid hype, but also to clearly point out a gap the industrial cybersecurity community has around root cause analysis, and the importance of setting forth a strategy across collection, visibility, and detection to ever get to the point where response scenarios can account for such events.

Cyber attacks can absolutely cause devastating effects. Adversaries have become more aggressive in this space over the last few years and are demonstrating an increase in knowledge and sophistication with regards to causing physical effects through cyber intrusions and capabilities. In 2017, the TRISIS malware leveraged by XENOTIME was responsible for the shutdown of a Saudi Arabian petrochemical company, where the adversary, by targeting safety systems, failed in their likely actual intent to kill people at the facility. In that case though, one of the interesting details is that the adversary tried multiple times to achieve their effect. The first time TRISIS was deployed it failed, the plant shut down, and the personnel involved attempted to do root cause analysis. Root cause analysis is well understood and practiced in the engineering and operations communities. However, those practices rarely fully consider a cyber component.

In the TRISIS case, the plant engineers could not determine what went wrong; i.e., they did not identify the cyber attack during or after the event and went back into operations, giving the adversary another opportunity. It is not that the cyber attack was undetectable; it was perfectly detectable through a variety of detection approaches in the industrial networks, but the defenders at that site were not performing industrial-specific cyber detection. Because of the lack of detection capabilities, as well as of the collection capabilities feeding into them, some of the evidence was not available after the attack to properly get to root cause analysis of the event, and what evidence was available was easy to miss. This is like trying to photograph the getaway car of a robbery after the car is already gone; you can still find other evidence, such as tire tracks, but it would have been nice to have the photo of the license plate. Oftentimes there are forensic practices that can take place after the attack even without good detection capabilities, but they can be easy to miss if not prepared for properly in the incident response procedures or highlighted through threat detection and intrusion analysis.

In the Abadan case, given what we know of such incidents and of normal engineering practices around root cause analysis, it is unlikely that the personnel on site have had any opportunity at all to properly perform root cause analysis yet. Refinery fires are not rare, but they are serious events that the engineering and operations communities usually handle maturely, with safety as the number one priority. While personnel are still trying to get the fire under control, it is very unlikely that anyone is performing root cause analysis of the event, to include a cyber component. Proper root cause analysis including cyber forensics is one of the most difficult tasks to achieve in industrial control system (ICS) networks. The ICS cybersecurity community is maturing rapidly but is still very far from being able to perform this level of task reliably.

It is my estimate that only a small subset of the community is gaining visibility into their ICS networks today, though the progress we are seeing is encouraging and a hallmark of increasing maturity. A smaller subset of that community is pursuing a collection and detection strategy factored into the products, processes, and training they implement. A much smaller subset is tying this into the types of events they want to be able to respond to and gain root cause analysis on. Even if Abadan’s oil refinery were world leading in this regard, it is unlikely enough time has passed for anyone to properly analyze the information collected. For this reason, I would assess that any claims of a cyber attack are immature at this point and unlikely to be founded in proper evidence. Should cyber be considered, though? Absolutely, especially with the increasing tension and demonstrated capability of adversaries. But today the larger industry lives closer to Schrödinger’s ICS than to organizations reliably achieving root cause analysis.

It is my recommendation to the ICS cybersecurity community that events like this be used to highlight the gaps in our current defenses. We should not hype up such events but instead look inward and determine whether we could answer similar questions of “was it a cyber attack?” in our own industrial and operations networks. I often recommend that organizations start with a few scenarios they want to be able to respond to, drawn from both intelligence-driven and consequence-driven scenarios. From those, determine what types of requirements, such as root cause analysis, reliability, and safety, will be important to the organization and its stakeholders. Develop incident response plans from those events and work backwards to define the type of detection you’ll need to get to that incident response, and the type of collection you’ll need to get to that detection. That will help define your visibility requirements. Instead of starting with visibility and working forward, potentially never getting to the results you need, start with the end in mind and work backwards to ensure the visibility requirements are aligned.
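The scenario-to-visibility chain above can be sketched as a simple planning exercise. A hypothetical worked example follows; the scenario names, detections, and collection sources are illustrative assumptions rather than a prescribed catalog, and the point is the direction of the derivation (scenario, then response goals, then detection, then collection, then visibility):

```python
# Hypothetical sketch of "start with the end in mind" planning: pick response
# scenarios first, then derive detection and collection needs from them.
SCENARIOS = [
    {
        # Intelligence-driven example (illustrative)
        "name": "Safety system modification",
        "response_goals": ["root cause analysis", "safe shutdown validation"],
        "detections": ["unexpected controller logic downloads",
                       "new engineering workstation connections"],
        "collection": ["safety system network traffic",
                       "engineering workstation host logs",
                       "controller change history"],
    },
    {
        # Consequence-driven example (illustrative)
        "name": "Loss of view at a substation",
        "response_goals": ["restore operator visibility", "root cause analysis"],
        "detections": ["HMI-to-RTU communication loss", "protocol anomalies"],
        "collection": ["SCADA WAN traffic", "HMI application logs"],
    },
]

def visibility_requirements(scenarios):
    """Work backwards: the union of collection needs across all response
    scenarios defines where visibility is actually required."""
    needs = set()
    for scenario in scenarios:
        needs.update(scenario["collection"])
    return sorted(needs)

for requirement in visibility_requirements(SCENARIOS):
    print(requirement)
```

The design point is that visibility falls out of the scenarios rather than being chosen first; anything you are collecting that no scenario requires, and any scenario requirement you are not collecting, both become visible gaps in the plan.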

For more information on these topics I would recommend Dragos’ Collection Management Framework, the Four Types of Threat Detection, and Consequence-Driven ICS Cybersecurity papers, as well as the Year in Review reports, which should help you think through the challenges ahead and operate safer and more reliable infrastructure.