Tuesday, December 30, 2014

Five Reasons Attribution Matters

Attribution is the hottest word in digital security. The term refers to identifying responsibility for an incident. What does it matter, though? Here are five reasons, derived from the five levels of strategic thought. I've covered those before, namely in The Limits of Tool- and Tactics-Centric Thinking.

Note that the reasons I outline here are not the same as performing attribution based on these characteristics. Rather, I'm explaining how attribution can assist responsible actors, from defenders through policymakers.

1. Starting from the bottom, at the Tools level, attribution matters because identifying an adversary may tell defenders what software they can expect to encounter during an intrusion or campaign. It's helpful to know if the adversary uses simple tools that traditional defenses can counter, or if they can write custom code and exploits to evade most any programmatic countermeasures.

Vendors and software engineers tend to focus on this level because they may need to code different defenses based on attacker tools.

2. The benefits of attribution are similar at the Tactics level. Tactics describes how an adversary acts within an engagement or "battle": how the foe might use tools and techniques to accomplish a goal within an individual encounter.

For example, some intruders may abandon a system as soon as they detect the presence of an administrator or the pushback of a security team. Others might react differently by proliferating elsewhere, or fighting for control of a compromised asset.

Security and incident response teams tend to focus on this level because they have direct contact with the adversary on a daily basis. They must make defensive choices and prioritize security personnel attention in order to win engagements.

3. The level of Operations or Campaigns describes activities over long periods of time, from days to months, and perhaps years, over a wider theater of operations, from a department or network segment to an entire organization's environment.

Defenders who can perform attribution will better know their foe's longer-term patterns of behavior. Does the adversary prefer to conduct operations around holidays, or certain hours of the day or days of the week? Do they pause between tactical engagements, and for how long? Do they vary intrusion methods? Attribution helps defenders answer these and related questions, perhaps avoiding intrusion fatigue.

CISOs should focus on this level and some advanced IR teams incorporate this tier into their work. This is also the level where outside law enforcement and intelligence teams organize their thinking, using terms like "intrusion sets." All of these groups are trying to cope with long-term engagement with the adversary, and must balance hiring, organization, training, and other factors over budget and business cycles.

4. At the level of Strategy, attribution matters to an organization's management and leadership, as well as policymakers. These individuals must decide if they should adjust how they conduct business, based on who is attacking and damaging them. Although they might direct technical responses, they are more likely to utilize other business methods to deal with problems. For example, strategic decisions could involve legal maneuvering, acquiring or invoking insurance, starting or stopping business lines, public relations, hiring and firing, partnerships and alliances, lobbying, and other moves.

Strategy is different from planning, because strategy is a dynamic discipline derived from recognizing the interplay with intelligent, adaptive foes. One cannot think strategically without recognizing and understanding the adversary.

5. Finally, the level of Policy, or "program goals" in the diagram, is the supreme goal of government officials and top organizational management, such as CEOs and their corporate boards. These individuals generally do not fixate on technical solutions. Policymakers can apply many government tools to problems, such as law enforcement, legislation, diplomacy, sanctions, and so forth. All of these require attribution. Policymakers may choose to fund programs to reduce vulnerabilities, which in some sense is an "attribution free" approach. However, addressing the threat in a comprehensive manner demands knowing the threat. Attribution is key to any policy decision where one expects other parties to act or react to one's own moves.

Remember the five levels of strategic thought and their associated parties and responsibilities when you hear anyone (especially a techie) claim "attribution doesn't matter" or "don't do attribution."

Also, check out Attributing Cyber Attacks by my KCL professor Thomas Rid, and fellow PhD student Ben Buchanan.

Sunday, December 28, 2014

Don't Envy the Offense

Thanks to Leigh Honeywell I noticed a series of Tweets by Microsoft's John Lambert. Aside from affirming the importance of security team members over tools, I didn't have a strong reaction to the list -- until I read Tweets nine and ten. Nine said the following:


9. If you shame attack research, you misjudge its contribution. Offense and defense aren't peers. Defense is offense's child.

I don't have anything to say about "shame," but I strongly disagree with "Offense and defense aren't peers" and "Defense is offense's child." I've blogged about offense over the years, but my 2009 post Offense and Defense Inform Each Other is particularly relevant. John's statements are a condescending form of the phrase "offense informing defense." They're also a sign of "offense envy."

John's last Tweet said the following:



10. Biggest problem with network defense is that defenders think in lists. Attackers think in graphs. As long as this is true, attackers win

This Tweet definitely exhibits offense envy. It plays to the incorrect, yet too-common, idea that defenders are helpless drones, while the offense runs circles around them thanks to its advanced thinking.

The reality is that plenty of defenders practice advanced thinking, while even nation-state level attackers work through checklists.

At the high end of the offense spectrum, many of us have seen evidence of attackers running playbooks. When their checklist ends, the game may be up, or they may be able to ask their supervisor or mentor for assistance.

On the other end of the spectrum, you can enjoy watching videos of lower-skilled intruders fumble around in Kippo honeypots. I started showing these videos during breaks in my classes.

I believe several factors produce offense envy.

  1. First, many of those who envy the offense have not had contact with advanced defenders. If you've never seen advanced defenders at work, and have only seen mediocre or nonexistent defense, you're likely to mythologize the powers of the offense.
  2. Second, many offense envy sufferers do not appreciate the restrictions placed on defenders, which result in advantages for the offense. I wrote about several of these in 2007 in Threat Advantages -- namely initiative, flexibility, and asymmetry of interest and knowledge. (Please read the original post if the last two prompt you to think I have offense envy!)
  3. Third, many of those who glorify offense hold false assumptions about how the black hats operate. This often manifests in platitudes like "the bad guys share -- why don't the good guys?" The reality is that good guys share a lot, and while some bad guys "share," they more often steal, back-stab, and inform on each other.


It's time for the offensive community to pay attention to people like Tony Sager, who ran the Vulnerability Analysis and Operations (VAO) team at NSA. Initially Tony managed independent blue and red teams. The red team always penetrated the target, then dumped a report and walked away.

Tony changed the dynamic by telling the red team that their mission wasn't only to break into a victim's network. He brought the red and blue teams together under one manager (Tony). He worked with the red team to make them part of the defensive solution, not just a way to demonstrate that the offense can always compromise a target.

Network defenders have the toughest job in the technology world, and increasingly the business and societal worlds. We shouldn't glorify their opponents.

Note: Thanks to Chris Palmer for his Tweet -- "He [Lambert] reads like a defender with black hat drama envy. Kind of sad." -- which partially inspired this post.

Monday, December 22, 2014

What Does "Responsibility" Mean for Attribution?

I've written a few posts here about attribution. I'd like to take a look at the word "responsibility," as used in the FBI Update on Sony Investigation posted on 19 December:

As a result of our investigation, and in close collaboration with other U.S. government departments and agencies, the FBI now has enough information to conclude that the North Korean government is responsible for these actions. While the need to protect sensitive sources and methods precludes us from sharing all of this information, our conclusion is based, in part, on the following... (emphasis added)

I'm not in a position to comment on the FBI's basis for its conclusion, which was confirmed by the President in his year-end news conference. I want to comment on the word "responsibility," which was the topic of a February 2012 paper by Jason Healey for The Atlantic Council, titled Beyond Attribution: Seeking National Responsibility in Cyberspace.

In the paper, Jason created the excellent table at left. You can read more about it in the original document.

Using the Spectrum of State Responsibility, in my assessment, the US government's statements include a range of possibilities, from State-encouraged to State-integrated.

(Options such as State-prohibited, State-prohibited-but-inadequate, and State-ignored are outside of the US government's "responsibility" statement.)

Given the nature of the DPRK regime and other factors, it is reasonable to conclude that the FBI's statement indicates State-ordered, State-executed, or State-integrated activity.

For example, if Bureau 121 is responsible, the attack would be State-executed.

If the DPRK contracted with third party criminal hackers, the attack would be State-ordered.

If the DPRK used both Bureau 121 and third party criminal hackers, the attack would be State-integrated.

It is unlikely the attack was State-rogue-conducted, meaning "out-of-control elements" attacked a victim. The incredibly restrictive, authoritarian nature of the DPRK regime, and its tight control of Internet access, makes that scenario implausible.
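As a rough illustration of that logic, the following Python sketch maps the scenarios above onto the Spectrum categories. The function and its inputs are hypothetical simplifications for illustration only; the categories and examples come from the discussion above.

```python
# Hypothetical sketch: map the Sony scenarios discussed above onto Jason
# Healey's Spectrum of State Responsibility. Only the categories mentioned
# in this post are included; the function and inputs are illustrative.

SPECTRUM_EXAMPLES = {
    "State-executed": "Bureau 121 carried out the attack itself",
    "State-ordered": "the DPRK contracted third party criminal hackers",
    "State-integrated": "Bureau 121 and third party criminal hackers worked together",
    "State-rogue-conducted": "out-of-control elements attacked without direction (implausible here)",
}

def assess(used_state_unit: bool, used_third_party: bool) -> str:
    """Return the Spectrum category implied by who executed the attack."""
    if used_state_unit and used_third_party:
        return "State-integrated"
    if used_state_unit:
        return "State-executed"
    if used_third_party:
        return "State-ordered"
    return "Unclear from available evidence"

if __name__ == "__main__":
    category = assess(used_state_unit=True, used_third_party=True)
    print(f"{category}: {SPECTRUM_EXAMPLES[category]}")
```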

Note that, using the Spectrum, some seemingly contradictory arguments can be resolved. For example, in a State-ordered scenario, the US government could correctly assert DPRK "responsibility," although the attack could have been executed by third party criminal hackers.

I believe the debate about the nature of DPRK activity would be more fruitful if concerned parties placed themselves on the Spectrum.

I do not know where on the Spectrum the FBI or other elements of the US government would place this DPRK incident, but as I said, it is reasonable to conclude that the FBI's statement indicates State-ordered, State-executed, or State-integrated activity.

On several related notes, I highly recommend reading Did North Korea Hack Sony? by RAND's Bruce Bennett, a true DPRK expert. Bennett explained his role recently on CNN. Also listen to this interview, read this story citing Korean defector Kim Heung Kwang, and read this paper (PDF) by DPRK expert Dr Alexandre Mansourov. I also agree with the analysis here by Professor Michael Schmitt.

Finally, I suggest that critics of government attribution need to think beyond their current positions, towards the consequences of their beliefs. If they demand higher standards for attribution, they're essentially asking for less anonymity, and more identification on the Internet. That would likely lead to government identity schemes, which the critics would also detest. They should be careful what they ask for, in other words.

Friday, December 05, 2014

Nothing Is Perfectly Secure

Recently a blog reader asked to enlist my help. He said his colleagues have been arguing in favor of building perfectly secure systems. He replied that you still need the capability to detect and respond to intrusions. The reader wanted to know my thoughts.

I believe that building perfectly secure systems is impossible. No one has ever been able to do it, and no one ever will.

Preventing intrusions is a laudable goal, but I think security is only as sound as one's ability to validate that the system is trustworthy. Trusted != trustworthy.

Even if you only wanted to make sure your "secure" system remains trustworthy, you need to monitor it.

Since history has shown everything can be compromised, your monitoring will likely reveal an intrusion.

Therefore, you will need a detection and a response capability.

If you reject the notion that your "secure" system will be compromised, and thereby reject the need for incident response, you still need a detection capability to validate trustworthiness.
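As a tiny, concrete illustration of using detection to validate trustworthiness, the following Python sketch compares hashes of critical files against a known-good baseline. This is a hypothetical example of a single monitoring control, not a complete detection capability; the paths and digests shown are placeholders.

```python
# Minimal sketch: validate that "trusted" files are still trustworthy by
# comparing their current hashes to a known-good baseline. A hypothetical
# example of one detection control, not a complete monitoring capability.

import hashlib

def sha256(path):
    """Return the SHA-256 digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def check_baseline(baseline):
    """Report any file whose current hash differs from the recorded baseline."""
    for path, expected in baseline.items():
        try:
            current = sha256(path)
        except OSError as e:
            print(f"ALERT: cannot read {path}: {e}")
            continue
        if current != expected:
            print(f"ALERT: {path} changed; investigate before trusting this system.")

if __name__ == "__main__":
    # Example baseline with a placeholder digest; replace with real values.
    check_baseline({"/etc/hosts": "0" * 64})
```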

What do you think?

Tuesday, December 02, 2014

Bejtlich on Fox Business Discussing Recent Hacks

I appeared on Fox Business (video) today to discuss a wide variety of hacking topics. It's been a busy week. Liz Claman and David Asman ask for my perspective on who is responsible, why the FBI is warning about destructive malware, how the military should respond, what businesses can do about intrusions, and more. All of these subjects deserve attention, but I tried to say what I could in the time available.

For more on these and other topics, don't miss the annual Mandiant year-in-review Webinar, Wednesday at 2 pm ET. Register here. I look forward to joining Kristen Verderame and Kelly Jackson Higgins, live from Mandiant HQ in Alexandria, Virginia.

Monday, November 17, 2014

Response to "Can a CISO Serve Jail Time?"

I just read a story titled Can a CISO Serve Jail Time? Having been Chief Security Officer (CSO) of Mandiant prior to the FireEye acquisition, I thought I would share my thoughts on this question.

In brief, being a CISO or CSO is a tough job. Attempts to criminalize CSOs would destroy the profession.

Security is one of the few roles where global, distributed opponents routinely conduct criminal acts against business operations. Depending on the enterprise, the offenders could be nation state adversaries largely beyond the reach of any party, to include the nation state hosting the enterprise. Even criminal adversaries can remain largely untouchable.

I cannot think of another business function that suffers similar disadvantages. If a commercial competitor took actions against a business using predatory pricing, or via other illegal business measures, the state would investigate and possibly prosecute the offending competitor. For actions across national boundaries, one might see issues raised at the World Trade Organization (WTO), assuming the two hosting countries are WTO members.

These pressures are different from those faced by other elements of the business. When trying to hire and retain staff, human resources doesn't face off against criminals. When trying to close a deal, sales people don't compete with military hackers. (The exception might be transactions involving Chinese or Russian companies.) When creating a brand campaign, marketing people might have to worry about negative attention from hacktivists, but if the foe crosses a line the state might prosecute the offender.

The sad reality is that no organization can prevent all intrusions. The best outcome is to prevent as many intrusions as possible, and react quickly and effectively to those compromises that occur. As long as the security team contains and removes the intruder before he can accomplish his mission, the organization wins.

We will continue to see organizations fined for poor security practices. The Federal Trade Commission, Securities and Exchange Commission, and Federal Communications Commission are all very active in the digital security arena. If prosecutors seek jail time for CSOs who suffer compromises, I would expect CSOs will leave their jobs. They already face an unfair fight. We don't need to add the threat of jail time to the list of problems confronting security staff.

Monday, November 10, 2014

Thank You for the Review and Inclusion in Cybersecurity Canon

I just read The Cybersecurity Canon: The Practice of Network Security Monitoring at the Palo Alto Networks blog. Rick Howard, their CSO, wrote the post, which marks the inclusion of my fourth book in Palo Alto's Cybersecurity Canon. According to the company's description, the Canon is:

a list of must-read books where the content is timeless, genuinely represents an aspect of the community that is true and precise and that, if not read, leaves a hole in a cybersecurity professional’s education that will make the practitioner incomplete.

The Canon candidates include both fiction and nonfiction, and for a book to make it into the canon, it must accurately depict the history of the cybercrime community, characterize key places or significant milestones in the community, or precisely describe technical details that do not exaggerate the craft.

It looks like my book is only the second technical book to be included. The first appears to be the CERT Guide to Insider Threats.

I am incredibly thankful for the positive and thorough coverage of my newest book, The Practice of Network Security Monitoring (PNSM). It is clear Rick spent a lot of time reading the book and digesting the contents. Even the post headings, such as "Network Security Monitoring Is More Than Just a Set Of Tools," "Operate Like You Are Compromised: Kill Chain Analysis," "Network Security Monitoring as a Decision Tool, Not a Reaction Process," "Incident Response and Threat Intelligence Go Together," and so on communicate key themes in my book.

With his background at the Army CERT, Counterpane, and iDefense, it's clear Rick converted his experiences defending significant networks into a worldview that resonates with that in PNSM.

Rick also emphasizes one of the goals of the book, which is to get anyone started on the road to network instrumentation. I wrote the book, and teach a class -- Black Hat, 8-9 December, near DC -- for this very purpose.

I wanted to add a bit more detail to the last section of the blog for the benefit of those who have not yet read PNSM. Rick mentions some of the tools incorporated in Security Onion, but I wanted to be sure readers understand the full spectrum of SO capabilities. I captured that in Figure 6-1, reproduced below.

While I don't cover all of these tools in PNSM, as Rick wrote, I show how to leverage core SO capabilities to detect and respond to intrusions.

If you would like a copy of PNSM, consider buying from the No Starch Web site, using discount code NSM101 to save 30%. One benefit of buying from the publisher is getting the digital and print editions in a bundle.

Thank you again to Rick Howard and Palo Alto Networks for including PNSM in the Cybersecurity Canon.

Tuesday, September 16, 2014

We Need More Than Penetration Testing

Last week I read an article titled  People too trusting when it comes to their cybersecurity, experts say by Roy Wenzl of The Wichita Eagle. The following caught my eye and prompted this post:

[Connor] Brewer is a 19-year-old sophomore at Butler Community College, a self-described loner and tech geek...

Today he’s what technologists call a white-hat hacker, hacking legally for companies that pay to find their own security holes. 

When Bill Young, Butler’s chief information security officer, went looking for a white-hat hacker, he hired Brewer, though Brewer has yet to complete his associate’s degree at Butler...

Butler’s security system comes under attack several times a week, Young said...

Brewer and others like him are hired by companies to deliberately attack a company’s security network. These companies pay bounties if the white hackers find security holes. “Pen testing,” they call it, for “penetration testing.”

Young has repeatedly assigned Brewer to hack into Butler’s computer system. “He finds security problems,” Young said. “And I patch them.”

On the face of it, this sounds like a win-win story. A young white hat hacker does something he enjoys, and his community college benefits from his expertise to defend itself.

My concern with this article is the final sentence:

Young has repeatedly assigned Brewer to hack into Butler’s computer system. “He finds security problems,” Young said. “And I patch them.”

This article does not mention whether Butler's CISO spends any time looking for intruders who have already compromised his organization. Finding security problems and patching them is only one step in the security process.

I still believe that the two best words ever uttered by Bruce Schneier were "monitor first," and I worry that organizations like those in this article are patching holes while intruders maneuver around them within the compromised network.

A Brief History of Network Security Monitoring

Last week I was pleased to deliver the keynote at the first Security Onion Conference in Augusta, GA, organized and hosted by Doug Burks. This was probably my favorite security event of the year, attended by many fans of Security Onion and the network security monitoring (NSM) community.

Doug asked me to present the history of NSM. To convey some of the milestones in the development of this operational methodology, I developed these slides (pdf). They are all images, screen captures, and the like, but I promised to post them. For example, the image at left is the first slide from a Webinar that Bamm Visscher and I delivered on 4 December 2002, where we presented the formal definition of NSM for the first time. We defined network security monitoring as

the collection, analysis, and escalation of indications and warnings to detect and respond to intrusions.

You may recognize similarities with the intelligence cycle and John Boyd's Observe - Orient - Decide - Act (OODA) loop. That is not an accident.

During the presentation I noted a few key years and events:

  • 1986: The Cliff Stoll intrusions scare the government, military, and universities supporting gov and mil research.
  • 1988: Lawrence Livermore National Lab funds three security projects at UC Davis by supporting Prof Karl Levitt's computer science lab. They include AV software, a "security profile inspector," and the "network security monitor."
  • 1988-1990: Todd Heberlein and colleagues code and write about the NSM platform.
  • 1991: While instrumenting a DISA location suffering from excessive bandwidth usage, NSM discovers 80% of the clogged link is caused by intruder activity.
  • 1992: Robert Mueller, at the time an assistant attorney general (and later FBI Director), writes a letter to NIST warning that NSM might not be legal.
  • 1 October 1992: AFCERT founded.
  • 10 September 1993: AFIWC founded.
  • End of 1995: 26 Air Force sites instrumented by NSM.
  • End of 1996: 55 Air Force sites instrumented by NSM.
  • End of 1997: Over 100 Air Force sites instrumented by NSM.
  • 1999: Melissa worm prompts AFCERT to develop dedicated anti-malware team. This signaled a shift from detection of human adversaries interacting with victims to detection of mindless code interacting with victims.
  • 2001: Bamm Visscher deploys SPREG, the predecessor to Sguil, at our MSSP at Ball Aerospace.
  • 13 July 2001: Using SPREG, one of our analysts detects Code Red, 6 days prior to the public outbreak. I send a note to a mailing list on 15 July.
  • February 2003: Bamm Visscher recodes and releases Sguil as an open source NSM console.

As I noted in my presentation, the purpose of the talk was to share the fact that NSM has a long history, some of which happened when many practitioners (including myself) were still in school.

This is not a complete history, either. For more information, please see my 2007 post Network Security Monitoring History and the foreword, written by Todd Heberlein, of my newest book The Practice of Network Security Monitoring.

Finally, I wanted to emphasize that NSM is not just full packet capture or logging full content data. NSM is a process, although my latest book defines seven types of NSM data. One of those data types is full content. You can read about all of them in the first chapter of my book at the publisher's Web site.
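For readers who have not seen the book, the following Python sketch illustrates the idea of NSM as a process that collects several data types, analyzes them, and escalates indications and warnings. The data type names are intended to match the book's list; the functions are purely illustrative placeholders, not code from PNSM or Security Onion.

```python
# Minimal, illustrative sketch of NSM as a process: collect several data
# types, analyze them, and escalate indications and warnings for response.
# The functions and their behavior are hypothetical placeholders.

NSM_DATA_TYPES = [
    "full content",      # complete packet captures
    "extracted content", # files and objects carved from traffic
    "session data",      # flow records summarizing conversations
    "transaction data",  # request/response pairs (HTTP, DNS, etc.)
    "statistical data",  # aggregate descriptions of traffic
    "metadata",          # enrichment such as WHOIS or DNS lookups
    "alert data",        # judgments from detection tools
]

def collect(sensor):
    """Placeholder: gather records of each data type from a sensor."""
    return {data_type: sensor.get(data_type, []) for data_type in NSM_DATA_TYPES}

def analyze(records):
    """Placeholder: return indications worth a human analyst's attention."""
    return [r for r in records.get("alert data", []) if r.get("priority") == "high"]

def escalate(indications):
    """Placeholder: hand suspected intrusions to the response process."""
    for event in indications:
        print(f"Escalating to incident response: {event['summary']}")

if __name__ == "__main__":
    sample_sensor = {"alert data": [{"priority": "high", "summary": "possible C2 beacon"}]}
    escalate(analyze(collect(sample_sensor)))
```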

Thursday, September 04, 2014

Bejtlich Teaching at Black Hat Trainings 8-9 Dec 2014

I'm pleased to announce that I will be teaching one class at Black Hat Trainings 2014 in Potomac, MD, near DC, on 8-9 December 2014. The class is Network Security Monitoring 101. I taught this class in Las Vegas in July 2013 and 2014, and Seattle in December 2013. I posted Feedback from Network Security Monitoring 101 Classes last year as a sample of the student commentary I received.

This class is the perfect jumpstart for anyone who wants to begin a network security monitoring program at their organization. You may enter with no NSM knowledge, but when you leave you'll be able to understand, deploy, and use NSM to detect and respond to intruders, using open source software and repurposed hardware.

The first discounted registration deadline is 11:59 pm EDT October 31st. The second discounted registration deadline (more expensive than the first but cheaper than later) ends 11:59 pm EST December 5th. You can register here.

I recently topped the 1,000 student count for my cumulative years of teaching my own material at Black Hat. Since starting my current Black Hat teaching run in 2007, I've completely replaced each course every other year. In 2007-2008 I taught TCP/IP Weapons School version 1. In 2009-2010 I taught TCP/IP Weapons School version 2. In 2011-2012 I taught TCP/IP Weapons School version 3. In 2013-2014 I taught Network Security Monitoring 101.

I have no plans to design a new course for 2015 and beyond. If you want to see me teach Network Security Monitoring and related subjects, Black Hat is your best option.

Please sign up soon, for two reasons. First, if not enough people sign up early, Black Hat might cancel the class. Second, if many people sign up, you risk losing a seat. With so many classes taught at this venue, the conference lacks the large rooms necessary to support big classes.

Several students asked for a more complete class outline. So, in addition to the outline posted currently by Black Hat, I present the following that shows what sort of material I cover in my new class.

OVERVIEW

Is your network safe from intruders? Do you know how to find out? Do you know what to do when you learn the truth? If you are a beginner, and need answers to these questions, Network Security Monitoring 101 (NSM101) is the newest Black Hat course for you. This vendor-neutral, open source software-friendly, reality-driven two-day event will teach students the investigative mindset not found in classes that focus solely on tools. NSM101 is hands-on, lab-centric, and grounded in the latest strategies and tactics that work against adversaries like organized criminals, opportunistic intruders, and advanced persistent threats. Best of all, this class is designed *for beginners*: all you need is a desire to learn and a laptop ready to run a virtual machine. Instructor Richard Bejtlich has taught over 1,000 Black Hat students since 2002, and this brand new, 101-level course will guide you into the world of Network Security Monitoring.

CLASS OUTLINE

Day One

0900-1030
  • Introduction
  • Enterprise Security Cycle
  • State of South Carolina case study
  • Difference between NSM and Continuous Monitoring
  • Blocking, filtering, and denying mechanisms
  • Why does NSM work?
  • When NSM won’t work
  • Is NSM legal?
  • How does one protect privacy during NSM operations?
  • NSM data types
  • Where can I buy NSM?

1030-1045
  • Break

1045-1230
  • SPAN ports and taps
  • Making visibility decisions
  • Traffic flow
  • Lab 1: Visibility in ten sample networks
  • Security Onion introduction
  • Stand-alone vs server plus sensors
  • Core Security Onion tools
  • Lab 2: Security Onion installation

1230-1400
  • Lunch

1400-1600
  • Guided review of Capinfos, Tcpdump, Tshark, and Argus
  • Lab 3: Using Capinfos, Tcpdump, Tshark, and Argus

1600-1615
  • Break

1615-1800
  • Guided review of Wireshark, Bro, and Snort
  • Lab 4: Using Wireshark, Bro, and Snort
  • Using Tcpreplay with NSM consoles
  • Guided review of process management, key directories, and disk usage
  • Lab 5: Process management, key directories, and disk usage

Day Two

0900-1030
  • Computer incident detection and response process
  • Intrusion Kill Chain
  • Incident categories
  • CIRT roles
  • Communication
  • Containment techniques
  • Waves and campaigns
  • Remediation
  • Server-side attack pattern
  • Client-side attack pattern

1030-1045
  • Break

1045-1230
  • Guided review of Sguil
  • Lab 6: Using Sguil
  • Guided review of ELSA
  • Lab 7: Using ELSA

1230-1400
  • Lunch

1400-1600
  • Lab 8: Intrusion Part 1 Forensic Analysis
  • Lab 9: Intrusion Part 1 Console Analysis

1600-1615
  • Break

1615-1800
  • Lab 10: Intrusion Part 2 Forensic Analysis
  • Lab 11: Intrusion Part 2 Console Analysis

REQUIREMENTS

Students must be comfortable using command line tools in a non-Windows environment such as Linux or FreeBSD. Basic familiarity with TCP/IP networking and packet analysis is a plus.

WHAT STUDENTS NEED TO BRING

NSM101 is a LAB-DRIVEN course. Students MUST bring a laptop with at least 8 GB RAM and at least 20 GB free on the hard drive. The laptop MUST be able to run a virtualization product that can CREATE VMs from an .iso, such as VMware Workstation (minimum version 8, 9 or 10 is preferred); VMware Player (minimum version 5 -- older versions do not support VM creation); VMware Fusion (minimum version 5, for Mac); or Oracle VM VirtualBox (minimum version 4.2). A laptop with access to an internal or external DVD drive is preferred, but not mandatory.

Students SHOULD test the open source Security Onion (http://securityonion.blogspot.com) NSM distro prior to class. The students should try booting the latest version of the 12.04 64 bit Security Onion distribution into live mode. Students MUST ensure their laptops can run a 64 bit virtual machine. For help with this requirement, see the VMware knowledgebase article “Ensuring Virtualization Technology is enabled on your VMware host (1003944)” (http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=1003944). Students MUST have the BIOS password for their laptop in the event that they need to enable virtualization support in class. Students MUST also have administrator-level access to their laptop to install software, in the event they need to reconfigure their laptop in class.
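For students preparing a Linux laptop, here is a minimal Python sketch that checks whether the CPU advertises hardware virtualization extensions (Intel VT-x appears as the "vmx" flag, AMD-V as "svm") in /proc/cpuinfo. It assumes a Linux host; Windows and Mac users should follow the VMware knowledgebase article above instead, and a positive result still does not guarantee the feature is enabled in the BIOS.

```python
# Minimal sketch: check a Linux host for hardware virtualization support
# before class. Assumes /proc/cpuinfo exists (Linux only).

def has_virtualization_flags(cpuinfo_path="/proc/cpuinfo"):
    """Return True if the CPU reports Intel VT-x (vmx) or AMD-V (svm) flags."""
    try:
        with open(cpuinfo_path) as f:
            text = f.read()
    except FileNotFoundError:
        print("No /proc/cpuinfo found; this check only works on Linux.")
        return False
    return ("vmx" in text) or ("svm" in text)

if __name__ == "__main__":
    if has_virtualization_flags():
        print("CPU reports virtualization extensions; confirm they are enabled in the BIOS.")
    else:
        print("No vmx/svm flags found; enable virtualization in the BIOS or use another laptop.")
```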

WHAT STUDENTS WILL RECEIVE

Students will receive a paper class handbook with printed slides, a lab workbook, and the teacher’s guide for the lab questions. Students will also receive a DVD with a recent version of the Security Onion NSM distribution.

TRAINERS

Richard Bejtlich is Chief Security Strategist at FireEye, and was Mandiant's Chief Security Officer when FireEye acquired Mandiant in 2013. He is a nonresident senior fellow at the Brookings Institution, a board member at the Open Information Security Foundation, and an advisor to Threat Stack, Sqrrl, and Critical Stack. He is also a Master/Doctor of Philosophy in War Studies Researcher at King's College London. He was previously Director of Incident Response for General Electric, where he built and led the 40-member GE Computer Incident Response Team (GE-CIRT). Richard began his digital security career as a military intelligence officer in 1997 at the Air Force Computer Emergency Response Team (AFCERT), Air Force Information Warfare Center (AFIWC), and Air Intelligence Agency (AIA). Richard is a graduate of Harvard University and the United States Air Force Academy. His fourth book is "The Practice of Network Security Monitoring" (nostarch.com/nsm). He also writes for his blog (taosecurity.blogspot.com) and Twitter (@taosecurity), and teaches for Black Hat.

Thursday, August 21, 2014

Air Force Leaders Should Read This Book

I just finished reading The Icarus Syndrome: The Role of Air Power Theory in the Evolution and Fate of the U.S. Air Force by Carl Builder. He published this book in 1994 and I wish I had read it 20 years ago as a new Air Force second lieutenant. Builder makes many interesting points in the book, but in this brief post I'd like to emphasize one of his concluding points: the importance of a mission statement.

Builder offers the following when critiquing the Air Force's mission statement, or lack thereof, around the time of his study:

[Previous] Air Force Chief of Staff, General John P. McConnell, reportedly endorsed the now-familiar slogan

     The mission of the Air Force is to fly and fight. 

Sometime later, the next Chief, General John D. Ryan, took pains to put it more gruffly:

     The job of the Air Force is to fly and to fight, and don't you ever forget it. (p 266)

I remember hearing "Fly, Fight, Win" in the 1990s as well.

Builder correctly criticizes these mission statements on multiple grounds, none more compelling than this: how are non-flyers supposed to interpret this statement? It's simply a reminder and reinforcement of the second-class status of non-flyers in the Air Force. Furthermore, Builder more or less also notes that "fight" is often eclipsed by non-combat missions, such as airlift or humanitarian relief. Finally, Builder doesn't ask the question explicitly, but how does one define "winning"? Would wars in Iraq or Afghanistan be a "win"? That's a demoralizing way to think, in my opinion.

Builder offers a wonkish, but conceptually more useful, mission statement on p 284:

The mission of the Air Force is the military control and exploitation of the aerospace continuum in support of the national interests.

The author immediately notes that one Air Force officer criticized Builder's mission statement as too "academic," but I think this particular policy wonk is on target.

Curious as to what the current Air Force mission statement says, I checked the Our Mission page and read at the top:

The mission of the United States Air Force is to fly, fight and win … in air, space and cyberspace.

Wow. That's even worse than before. Not only does it still insult non-flyers, but now the mission involves "flying" in "cyberspace."

I strongly suggest Air Force leaders read Builder's book. It's as relevant today as it was 20 years ago.

Sunday, June 01, 2014

On the Twenty Years Since My USAFA Graduation


Twenty years ago today, on 1 June 1994, 1024 of us graduated from the United States Air Force Academy, commissioned as brand new second lieutenants. As of September 2012, over 600 members of the class of 1994 were still in uniform. I expect that number is roughly the same today. Reaching the 20 year mark entitles my classmates still in uniform to retire with lifetime benefits, should they choose to do so. I expect some will, but based on patterns from earlier classes I do not expect a massive exodus. The economy is still in rough shape, and transitioning from the military to the private sector after a lifetime in uniform is a jarring experience.

I remember 1994 being a fairly optimistic year, but the personnel situation was precarious for those who wanted to fly. After graduation we found ourselves in the middle of a drawdown, with no undergraduate pilot training (UPT) slots available. One jody (marching song) of the time went as follows:

Oh there are no fighter pilots in the Air Force...(repeat)
Because there is no UPT for 94 or 93
Oh there are no fighter pilots in the Air Force...

I stayed in the Air Force until early 2001, at which point I brought my military intelligence and computer network defense skills to the private sector. I've stayed in the private world since then.

I do not regret my time in uniform, from 1990 to 2001, although I would not repeat the time I spent at the Air Force Academy. Many people are surprised to hear me say that. Upon reflection I believe those four years consisted of a mental, physical, and spiritual endurance test, and I wonder if I could have found a better match for my personality and interests elsewhere.

From an academic perspective, I made the most of my "free" education, graduating 3rd in my class with degrees in history and political science, and minors in French and German. From a leadership perspective I enjoyed my roles as an element leader during my junior year and as a flight commander my senior year. I also met some of the finest young people this nation could have produced, as well as some of the most dedicated professors I've ever known.

After 20 years of consideration, however, I've begun to realize that I endured that four year experience because I thought others expected it of me. I didn't do it for myself, and coincidentally the message the Air Force ingrained into me -- "Service Before Self" -- did nothing to balance my younger personality. In my 40s, I've managed to realize that it's ok to determine and pursue personal interests, but I wish I had figured that out in my late teens.

In a matter of weeks the class of 2018 will report for basic training. Would I tell them to go home? Of course not. My hope is that they are there because they believe their personal goals match the needs of the service. I do not believe they should be there only because they expect their country needs them. The Air Force and the nation need the best this country can provide, but they should not expect those who serve to do so at the expense of their souls.

This fall is my 20 year reunion, and I plan to attend the event with my family. I hope to see some of my former classmates there, likely with their families. My wife and I attended the 10 year reunion in 2004, and it was a powerful and memorable experience. Today though, I would like to thank all of the class of 1994, especially those still in uniform, for their service. I also extend my best wishes to the brave men and women of the inbound class of 2018. You can do it, but do it only if you really want to be there.

Fly, fight, win!

Wednesday, May 14, 2014

Video of Bejtlich at Cyber Crime Conference 2014

On Tuesday the 29th of April I delivered a keynote at the US Cyber Crime Conference in Leesburg, VA.

The video is online although getting to it is more complicated than clicking on a link to YouTube.

Here's what I did to access the video.

First, visit this link for a "SabreCity" account. Fill in your "information" and click Register.

You will then see a rude message saying "Registration for this conference is now closed."

That's no problem. From the same browser now visit this link to go to the SabreCity "lobby."

Click the "On Demand" button on the right side of the screen. Now you can access all of the videos from the conference.

Mine is called "State of the Hack: 2014 M-Trends - Beyond the Breach." Click the green arrow to the left of the title to start the video.

You may be interested in several of the other speakers listed as well. Thank you to Jim Christy and his team for organizing the conference, inviting me to speak, and for providing these videos for free online.

Update: You might want to know what I discuss. For the first part of the talk I summarize three key findings from the 2014 M-Trends Report. In the second part I discuss strategic security using a Civil War example then turn to a network security monitoring example. In the final minutes I answer audience questions.

Saturday, May 03, 2014

Brainwashed by The Cult of the Quick

Faster is better! Those of us with military backgrounds learned that speed is a "weapon" unto itself, a factor which is "inherently decisive" in military conflict. The benefit of speed was so ingrained into my Air Force training that I didn't recognize I had been brainwashed by what Dr. Thomas Hughes rightly identified as The Cult of the Quick.

Dr. Hughes published his article of this title in the Winter 2001 issue of the Aerospace Power Journal. His main point is the following:

At a time when the American military has global commitments arrayed at variable threats, both real and potential, the Pentagon’s single-minded view of speed leaves the nation’s defenders poorly prepared for the range of military opposition and enemies they may face.

Although Dr. Hughes wrote his article in 2001, his prescription is as accurate as ever. I found his integration of Edward Luttwak's point very telling:

In the 1990s, the quest for swift war, replete with exit strategies and premature cease-fires, has led to less, not more, decisive war, as Edward Luttwak argues. For him, wars nowadays rarely “run their natural course” to “burn themselves out and establish the preconditions for a lasting settlement.” Instead, they “become endemic conflicts that never end because the transformative effects of both decisive victory and exhaustion are blocked.” The present struggle against terrorism may well prove an acid test for Luttwak’s point.

These points resonated with me because they reflected what I am learning about the US Civil War. Scott, Grant and Lincoln knew that a quick, early strike against Richmond, whereby the Union seized the capital of the Confederacy, would not decisively end the Civil War and bring the rebels back to the Union. Sad as it may seem, the rebels had to believe that there was no further point in fighting the war. If Richmond had fallen in 1861, only months after the attack on Fort Sumter, it's likely the Confederacy would have transferred their capital and kept fighting. Following the advice of the "cult of the quick" would have been a poor strategy during the Civil War. (That doesn't necessarily justify fighting a four year conflict, but I believe a strategy of quickly capturing Richmond to the exclusion of other objectives would have resulted in Civil War 2, and so on, similar to World War II.)

On the cyber side, the article reminded me of an area where speed is often paramount: detection and response. However, I remembered that my guidance on "fast" containment has always integrated one exception, as I noted on page 199 of my newest book, The Practice of Network Security Monitoring:

The speed with which a CIRT and constituent take containment actions is the subject of hot debate in the security world. Some argue for fast containment in order to limit risk; others argue for slower containment, providing more time to learn about an adversary. The best answer is to contain incidents as quickly as possible, as long as the CIRT can scope the incident to the best of its capability.

Scoping the incident means understanding the intruder’s reach. Is he limited to interacting with only the one computer identified thus far? Does he control more computers, or even the entire network by virtue of exploitation of the Active Directory domain controllers?

The speed with which a CIRT can make the containment decision is one of the primary ways to measure its maturity. If the CIRT regularly learns of the presence of advanced (or even routine) threats via notification by external parties, then rapid containment is less likely to be effective. A CIRT that cannot find intrusions within its own environment is not likely to be able to rapidly scope an incident. “Pulling the plug” on the first identified victim will probably leave dozens, hundreds, or thousands of other victims online and available to the adversary.

On the other hand, if the CIRT develops its own threat intelligence, maintains pervasive visibility, and quickly finds intruders on its own, it is more likely to be able to scope an incident in a minimum amount of time. CIRTs with that sort of capability should establish the intruder’s reach as rapidly as possible, and then just as quickly contain the victim(s) to limit the adversary’s options. (emphasis added)
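The decision logic in that passage can be reduced to a few lines. The following Python sketch is my paraphrase of it, not text or code from the book; the function name, inputs, and messages are illustrative assumptions.

```python
# Illustrative sketch of the containment decision described above: contain
# as quickly as possible, but only once the CIRT has scoped the intruder's
# reach. The function and its inputs are a paraphrase, not the book's code.

def containment_decision(victims_identified: int, scoping_complete: bool) -> str:
    """Recommend a next step based on how well the incident is scoped."""
    if victims_identified == 0:
        return "No known victims yet: keep monitoring and hunting."
    if not scoping_complete:
        # "Pulling the plug" on one host may leave many other victims online.
        return "Continue scoping: establish the intruder's full reach before containing."
    return "Scope established: contain all identified victims as rapidly as possible."

if __name__ == "__main__":
    print(containment_decision(victims_identified=3, scoping_complete=False))
    print(containment_decision(victims_identified=3, scoping_complete=True))
```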

I highly recommend reading The Cult of the Quick. You may find you have also been brainwashed!

Gunfight picture credits: Popular Mechanics

Thursday, April 24, 2014

Five Thoughts on New China Article

I just read a thoughtful article by Michael O'Hanlon and James Steinberg, posted at Brookings and Foreign Policy titled Don't Be a Menace to South (China Sea).

It addresses thorny questions regarding China as President Obama visits South Korea, Japan, Malaysia, and the Philippines.

I wanted to share five quick thoughts on the article, fully appreciating I don't have all the answers to this complex strategic problem.

1. "Many in China see the U.S. rebalance as ill-disguised containment, while many in the United States see Chinese military modernization and territorial assertiveness as strong indications that Beijing seeks to undermine Washington's alliances and drive the United States from the Western Pacific."

I agree that these statements capture the perceptions of both sides, but I also think they are closer to the truth than the authors believe. I recommend Dr Ashley Tellis' monograph Balancing Without Containment: An American Strategy for Managing China as the best strategy I've seen for handling this aspect of the problem.

2. "Compounding this challenge, the long-term intentions of both sides are inherently unknowable. The inclination in the face of such uncertainty is to prepare for the worst -- which all too frequently becomes a self-fulfilling prophecy."

I disagree that long-term intentions are inherently unknowable. Building on the first point, the Chinese want to project regional power without US interference, and the US wants to maintain the ability to project power globally. That means the two sides will be in conflict in the South China Sea and other regional Chinese waters.

3. "That does not mean Washington must immediately unsheathe the sword if tensions escalate over China's actions near the Senkakus or disputed islands in the South China Sea, but it must make clear that it is prepared to impose significant costs if red lines are crossed -- which is why the response to Russia's actions in Ukraine is so salient to the situation in East Asia."

I believe many commentators and policymakers cringe at the term "red lines" when applied to the current administration. The President's use of the term with respect to Syrian weapons of mass destruction has weakened his position. Perhaps more importantly, just what are the "red lines" in the South China Sea? The authors recommend meeting alliance commitments, but what does that mean?

4. "U.S. allies in Asia worry that China's ability to impose economic costs against the United States might deter Washington from acting -- a concern exacerbated by U.S. and European caution in imposing costs on Russia. The late March expansion of sanctions against Russia should help reassure U.S. allies of Washington's willingness to accept the risks of economic retaliation in order to impose costs on those who cross red lines."

There are few similarities between the US-Russia and US-China economic relationships. The risks of economic retaliation from Russia are far smaller than those that could be applied by China. US allies should worry about China's ability to impose economic costs against the US, but that is tempered somewhat by the effects those sanctions could have against China itself.

5. "The United States and its allies also have an interest in reassuring China that if Beijing acts responsibly, they will not seek to thwart its future prosperity and security... These might include "Open Skies" reconnaissance agreements, where both sides allow territorial overflights to reduce concerns about concealment...

Just as important as formal agreements is the willingness of both sides to exercise restraint in defensive actions that might appear threatening; to enhance transparency to dispel misunderstandings; and to reciprocate positive actions to stimulate a virtuous circle of enhanced confidence. This might mean Chinese willingness to slow the rate of its military buildup rather than race for parity." (emphasis added)

What does "act responsibly" mean? In US eyes, it probably means the Chinese allow the US to project power globally, including in the South China Sea. As I mentioned above, the Chinese don't want this to be the case in the medium and long term.

"Open skies" agreements and "enhanced transparency" are non-starters for China, just as they were non-starters for the Soviet Union in the 1950s. Strategic theory explains why. China is militarily weaker than the United States. They fear that the more the US learns about Chinese capabilities, the more accurately and effectively the US will be able to target and neutralize those capabilities. The Chinese follow this approach with nuclear weapons and cyber weapons, as we saw with the latter recently (see Adam Segal's What Briefing Chinese Officials On Cyber Really Accomplishes.)

I see few situations where China would slow its military buildup, with the exception of nuclear weapons. With nuclear weapons, the important feature is a first-strike-survivable retaliation capability. The Chinese don't need to match the US warhead-for-warhead if the US knows we can't get away with a first strike against China. (To learn more about this dynamic, see Strategic Stability: Contending Interpretations.)

On the conventional side, the Chinese are more likely to try to outbuild the US, because they still lack a qualitative advantage compared to US forces. Given declining US budgets, the Chinese should be able to out-spend and out-build the US Navy and Air Force, the two most critical services for a future US-China conflict.

Overall, this is a very tough problem, but I recommend reading the piece by Dr Tellis for the best answer I've read concerning strategic approaches to the US-China issue in the South China Sea.

Thursday, March 20, 2014

Are Nation States Responsible for Evil Traffic Leaving Their Networks?

During recent talks to various audiences, I've mentioned discussions within the United Nations. One point from these discussions involved certain nation states agreeing to modes of behavior in cyber space. I found the document containing these recent statements: A/68/98, Group of Governmental Experts on Developments in the Field of Information and Telecommunications in the Context of International Security (pdf). This document is hosted within the United Nations Office for Disarmament Affairs, in the developments in the field of information and telecommunications section.

Fifteen countries were involved in producing this document: Argentina, Australia, Belarus, Canada, China, Egypt, Estonia, France, Germany, India, Indonesia, Japan, the Russian Federation, the United Kingdom of Great Britain and Northern Ireland and the United States of America.

Within the section titled "Recommendations on norms, rules and principles of responsible behaviour by States," I found the following noteworthy:

19. International law, and in particular the Charter of the United Nations, is applicable and is essential to maintaining peace and stability and promoting an open, secure, peaceful and accessible ICT environment...

23. States must meet their international obligations regarding internationally wrongful acts attributable to them. States must not use proxies to commit internationally wrongful acts. States should seek to ensure that their territories are not used by non-State actors for unlawful use of ICTs.

The first statement is important because it "imports" a large body of external law and agreements into the cyber field, for good or ill.

The second statement is important because, if States obey these principles, it has interesting effects upon malicious activity leaving State networks. Collectively these sentences imply that States are responsible for their networks. States can't claim that they are only innocent intrusion victims, and that any malicious activity leaving their State isn't their fault or problem.

Whether States try to meet these obligations, and whether others call them out for not meeting them, is another matter.

Sunday, March 16, 2014

Five Thoughts from VADM Rogers Testimony

I had a chance to read Advance Questions for Vice Admiral Michael S. Rogers, USN (pdf) this weekend.

I wanted to share five thoughts based on excerpts from VADM Rogers' answers to written questions posed by the Senate Armed Services Committee.

1. The Committee asked: Can deterrence be an effective strategy in the absence of reliable attribution?

VADM Rogers responded: Yes, I believe there can be effective levels of deterrence despite the challenges of attribution. Attribution has improved, but is still not timely in many circumstances...

Cyber presence, being forward deployed in cyberspace, and garnering the indications and warnings of our most likely adversaries can help (as we do with our forces dedicated to Defend the Nation). (emphasis added)

I wonder if "cyber presence" and "being forward deployed in cyberspace" means having access to adversary systems? There's little doubt as to the source of an attack if you are resident on the system launching the attack.

2. The Committee asked: Is it advisable to develop cyberspace officers as we do other combat arms or line officers? Why or why not?

VADM Rogers responded: ...We must find a way to simultaneously ensure combat arms and line officers are better prepared to contribute, and cyberspace officers are able to enjoy a long, meaningful career with upward mobility. A meaningful career should allow them to fully develop as specialized experts, mentor those around them, and truly influence how we ought to train and fight in this mission space. 

I am especially interested in the merit of how a visible commitment to valuing cyberspace officers in our ranks will affect recruitment and retention. I believe that many of today’s youth who are uniquely prepared to contribute (e.g. formally educated or self-developed technical expertise) do not feel there is a place for them in our uniformed services

We must find a way to strengthen the message of opportunity and I believe part of the answer is to do our part to ensure cyberspace officers are viewed as equals in the eyes of line and combat arms officers; not enablers, but equals. Equals with capabilities no less valued than those delivered by professional aviators, special operators, infantry, or surface warfare. (emphasis added)

In my opinion, the best way to meet these goals is to create a separate Cyber Force. Please read the article Time for a US Cyber Force by Admiral James Stavridis (ret) and David Weinstein.

3. The Committee asked: The Unified Command Plan (UCP) establishes U.S. Cyber Command as a subunified command reporting to U.S. Strategic Command. We understand that the Administration considered modifying the UCP to establish U.S. Cyber Command as a full combatant command.
What are the best arguments for and against taking such action now?

VADM Rogers responded: ...The argument for full Unified Command status is probably best stated in terms of the threat. Cyber attacks may occur with little warning, and more than likely will allow only minutes to seconds to mount a defensive action seeking to prevent or deflect potentially significant harm to U.S critical infrastructure. 

Existing department processes and procedures for seeking authorities to act in response to such emergency actions are limited to Unified Combatant Commanders. If confirmed, as the Commander of U.S. CYBERCOM, as a Sub-unified Combatant Commander I would be required to coordinate and communicate through Commander, U.S. Strategic Command to seek Secretary of Defense or even Presidential approval to defend the nation in cyberspace. 

In a response cycle of seconds to minutes, this could come with a severe cost and could even obviate any meaningful action. As required in the current Standing Rules of Engagement, as a Combatant Commander, I would have the requisite authorities to directly engage with SECDEF or POTUS as necessary to defend the nation. (emphasis added)

I'm dismayed but not surprised by this argument. I'm dismayed because it sounds like the most important reason to establish a unified cyber command is the perception that "cyber attacks...allow only minutes to seconds to mount a defensive action." This is just not true for any strategically significant attack.

If you only have "minutes to seconds" left for defense, you are way too far down the kill chain. You need to be intercepting the adversary in the reconnaissance phase, or at least no later than the stage in which the threat explores the target searching for critical elements. I fear the "minutes to seconds" camp is a legacy of the bad old days of Internet worms from 10 years ago.

4. The Committee asked: How could the Internet be redesigned to provide greater inherent security?

VADM Rogers responded: Advancements in technology continually change the architecture of the Internet. Cloud computing, for instance, is a significant change in how industry and individuals use Internet services... 

Several major providers of Internet services are already implementing increased security in email and purchasing services by using encryption for all transmissions from the client to the server. It is possible that the service providers could be given more responsibility to protect end clients connected directly to their infrastructures. 

They are in a position to stop attacks targeted at consumers and recognize when consumer devices on their networks have been subverted. The inability of end users to verify the originator of an email, and the ability of hackers to forge email addresses, have resulted in serious compromises of end user systems... (emphasis added)

So, we see reference to cloud computing, encrypting client-to-server communications, ISPs protecting end users, and email verification. Think of all the tactical and technology options that were not mentioned here. Also notice the lack of discussion of better operations/campaigns and strategies. Finally, notice the Committee asked about redesigning the Internet, an engineering-focused approach.
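
As an aside, here is a minimal Python sketch, my own illustration rather than anything from the Q&A, of what "encryption for all transmissions from the client to the server" looks like in practice: the client opens a certificate-verified TLS session to a mail host before sending anything. The host name below is a placeholder, not a real provider recommendation.

# Illustration only: open a certificate-verified TLS connection to a mail
# server and report the negotiated protocol and certificate subject.
# The host below is a hypothetical placeholder.
import socket
import ssl

HOST = "smtp.example.com"   # placeholder; substitute a real mail host
PORT = 465                  # SMTP over implicit TLS

context = ssl.create_default_context()  # verifies the certificate chain and host name

with socket.create_connection((HOST, PORT), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        print("TLS version:", tls.version())
        print("Certificate subject:", dict(x[0] for x in tls.getpeercert()["subject"]))

If the certificate fails validation, the handshake raises an error, which is the point: the client knows whether it is really speaking to the provider it intended.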

5. I am glad to live in a country where a candidate to lead important military and intelligence agencies can be questioned in the open for public benefit. However, I am disappointed that the Unified Command Plan (UCP), referenced several times in the Q&A, remains a classified document.

The best we seem to have is The Unified Command Plan and Combatant Commands: Background and Issues for Congress (pdf), a 2013 Congressional Research Service document hosted by FAS, and History of the Unified Command Plan (pdf), hosted by dtic.mil. The 2012 CRS report is posted on a state.gov Web site. It would be helpful to read an unclassified version of the next UCP, which, it seems, is due any time now.

PHOTO CREDIT: Gary Cameron, Reuters.

Saturday, March 08, 2014

Bejtlich Teaching at Black Hat USA 2014

I'm pleased to announce that I will be teaching one class at Black Hat USA 2014, on 2-3 and 4-5 August 2014, in Las Vegas, Nevada. The class is Network Security Monitoring 101. I taught this class in Las Vegas in July 2013 and in Seattle in December 2013. I posted Feedback from Network Security Monitoring 101 Classes last year as a sample of the student commentary I received.

This class is the perfect jumpstart for anyone who wants to begin a network security monitoring program at their organization. You may enter with no NSM knowledge, but when you leave you'll be able to understand, deploy, and use NSM to detect and respond to intruders, using open source software and repurposed hardware.

The first discounted registration deadline is 11:59 pm EDT June 2nd. The second discounted registration deadline (more expensive than the first but cheaper than later) ends 11:59 pm EDT July 26th. You can register here.

Please note: I have no plans to teach this class again in the United States after Black Hat USA 2014. I haven't decided yet whether I will teach the class at Black Hat Europe 2014 in Amsterdam in October.

Since starting my current Black Hat teaching run in 2007, I've completely replaced each course every other year. In 2007-2008 I taught TCP/IP Weapons School version 1. In 2009-2010 I taught TCP/IP Weapons School version 2. In 2011-2012 I taught TCP/IP Weapons School version 3. In 2013-2014 I taught Network Security Monitoring 101. This fall I would need to design a brand new course to continue this trend.

I have no plans to design a new course for 2015 and beyond. If you want to see me teach Network Security Monitoring and related subjects, Black Hat USA is your best option.

Please sign up soon, for two reasons. First, if not enough people register early, Black Hat might cancel the class. Second, if many people sign up before you, you risk losing a seat. With so many classes taught in Las Vegas, the conference lacks the large rooms necessary to support big classes.

Several students asked for a more complete class outline. So, in addition to the outline currently posted by Black Hat, I present the following, which shows the sort of material I cover in the class.

OVERVIEW

Is your network safe from intruders? Do you know how to find out? Do you know what to do when you learn the truth? If you are a beginner, and need answers to these questions, Network Security Monitoring 101 (NSM101) is the newest Black Hat course for you. This vendor-neutral, open source software-friendly, reality-driven two-day event will teach students the investigative mindset not found in classes that focus solely on tools. NSM101 is hands-on, lab-centric, and grounded in the latest strategies and tactics that work against adversaries like organized criminals, opportunistic intruders, and advanced persistent threats. Best of all, this class is designed *for beginners*: all you need is a desire to learn and a laptop ready to run a virtual machine. Instructor Richard Bejtlich has taught over 1,000 Black Hat students since 2002, and this brand new, 101-level course will guide you into the world of Network Security Monitoring.

CLASS OUTLINE

Day One

0900-1030
· Introduction
· Enterprise Security Cycle
· State of South Carolina case study
· Difference between NSM and Continuous Monitoring
· Blocking, filtering, and denying mechanisms
· Why does NSM work?
· When NSM won’t work
· Is NSM legal?
· How does one protect privacy during NSM operations?
· NSM data types
· Where can I buy NSM?

1030-1045
· Break

1045-1230
· SPAN ports and taps
· Making visibility decisions
· Traffic flow
· Lab 1: Visibility in ten sample networks
· Security Onion introduction
· Stand-alone vs. server plus sensors
· Core Security Onion tools
· Lab 2: Security Onion installation

1230-1400
· Lunch

1400-1600
· Guided review of Capinfos, Tcpdump, Tshark, and Argus
· Lab 3: Using Capinfos, Tcpdump, Tshark, and Argus

1600-1615
· Break

1615-1800
· Guided review of Wireshark, Bro, and Snort
· Lab 4: Using Wireshark, Bro, and Snort
· Using Tcpreplay with NSM consoles
· Guided review of process management, key directories, and disk usage
· Lab 5: Process management, key directories, and disk usage

Day Two

0900-1030
· Computer incident detection and response process
· Intrusion Kill Chain
· Incident categories
· CIRT roles
· Communication
· Containment techniques
· Waves and campaigns
· Remediation
· Server-side attack pattern
· Client-side attack pattern

1030-1045
· Break

1045-1230
· Guided review of Sguil
· Lab 6: Using Sguil
· Guided review of ELSA
· Lab 7: Using ELSA

1230-1400
· Lunch

1400-1600
· Lab 8: Intrusion Part 1 Forensic Analysis
· Lab 9: Intrusion Part 1 Console Analysis

1600-1615
· Break

1615-1800
· Lab 10: Intrusion Part 2 Forensic Analysis
· Lab 11: Intrusion Part 2 Console Analysis

REQUIREMENTS

Students must be comfortable using command line tools in a non-Windows environment such as Linux or FreeBSD. Basic familiarity with TCP/IP networking and packet analysis is a plus.

WHAT STUDENTS NEED TO BRING

NSM101 is a LAB-DRIVEN course. Students MUST bring a laptop with at least 8 GB RAM and at least 20 GB free on the hard drive. The laptop MUST be able to run a virtualization product that can CREATE VMs from an .iso, such as VMware Workstation (minimum version 8; version 9 or 10 preferred); VMware Player (minimum version 5 -- older versions do not support VM creation); VMware Fusion (minimum version 5, for Mac); or Oracle VM VirtualBox (minimum version 4.2). A laptop with access to an internal or external DVD drive is preferred, but not mandatory.

Students SHOULD test the open source Security Onion (http://securityonion.blogspot.com) NSM distro prior to class by booting the latest 64-bit version of the 12.04 Security Onion distribution into live mode. Students MUST ensure their laptops can run a 64-bit virtual machine. For help with this requirement, see the VMware knowledge base article “Ensuring Virtualization Technology is enabled on your VMware host (1003944)” (http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=1003944). Students MUST have the BIOS password for their laptop in the event that they need to enable virtualization support in class. Students MUST also have administrator-level access to their laptop to install software, in case they need to reconfigure their laptop in class.
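
If you want a quick way to sanity-check a Linux laptop before class, the short Python sketch below is my own illustration, not an official requirement. It assumes a Linux host with the standard /proc layout and Python 3.3 or later, and it reports whether the CPU advertises hardware virtualization and whether RAM and free disk meet the figures above.

# Illustrative pre-class check for a Linux laptop (assumes /proc is available).
# Windows and Mac users should rely on the VMware KB article referenced above.
# Requires Python 3.3+ for shutil.disk_usage.
import re
import shutil

def has_virtualization_flags(path="/proc/cpuinfo"):
    """True if the CPU advertises Intel VT-x (vmx) or AMD-V (svm)."""
    with open(path) as f:
        return bool(re.search(r"\b(vmx|svm)\b", f.read()))

def total_ram_gb(path="/proc/meminfo"):
    """Total physical memory in GB, parsed from MemTotal (reported in kB)."""
    with open(path) as f:
        kb = int(re.search(r"MemTotal:\s+(\d+)\s+kB", f.read()).group(1))
    return kb / (1024.0 ** 2)

free_disk_gb = shutil.disk_usage("/").free / (1024.0 ** 3)

print("Virtualization flags present: %s" % has_virtualization_flags())
print("Total RAM: %.1f GB (want 8+)" % total_ram_gb())
print("Free disk on /: %.1f GB (want 20+)" % free_disk_gb)

If the virtualization flags are missing, check the BIOS setting mentioned above before arriving in class.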

WHAT STUDENTS WILL RECEIVE

Students will receive a paper class handbook with printed slides, a lab workbook, and the teacher’s guide for the lab questions. Students will also receive a DVD with a recent version of the Security Onion NSM distribution.

TRAINERS

Richard Bejtlich is Chief Security Strategist at FireEye, and was Mandiant's Chief Security Officer when FireEye acquired Mandiant in 2013. He is a nonresident senior fellow at the Brookings Institution, a board member at the Open Information Security Foundation, and an advisor to Threat Stack. He was previously Director of Incident Response for General Electric, where he built and led the 40-member GE Computer Incident Response Team (GE-CIRT). Richard began his digital security career as a military intelligence officer in 1997 at the Air Force Computer Emergency Response Team (AFCERT), Air Force Information Warfare Center (AFIWC), and Air Intelligence Agency (AIA). Richard is a graduate of Harvard University and the United States Air Force Academy. His fourth book is "The Practice of Network Security Monitoring" (nostarch.com/nsm). He also writes for his blog (taosecurity.blogspot.com) and Twitter (@taosecurity), and teaches for Black Hat.