Reportage from the 7th IT Security Automation Conference (ITSAC)

Three weeks ago (October 31st - November 2nd) I had the privilege to attend the 7th ITSAC in Arlington (Crystal City, Virginia), hosted by the National Institute of Standards and Technology (NIST) in conjunction with the Department of Homeland Security (DHS), the National Security Agency (NSA) and the Defense Information Systems Agency (DISA). The conference was about security automation through the development of the Security Content Automation Protocol (SCAP). “Security automation leverages standards and specifications to reduce the complexity and time necessary to manage vulnerabilities, measure security and ensure compliance, freeing resources to focus on other areas of the IT infrastructure” - http://www.nist.gov/itl/csd/7th-annual-scap-conference.cfm

Although the primary intended “clients” of these efforts are US Federal Agencies and their IT providers (as required by the White House Office of Management and Budget), this set of standards and procedures can be used in both the public and private sector, as well as by other governments and their associated infrastructures (with some “adjustments” to meet national or local laws and regulations), becoming a significant component of large information security management and governance programs. Indeed, NIST encourages widespread support and adoption of SCAP.


In brief, what is SCAP?

“SCAP is a suite of specifications that standardize the format and nomenclature by which software flaw and security configuration information is communicated, both to machines and humans. SCAP is a multi-purpose framework of specifications that support automated configuration, vulnerability and patch checking, technical control compliance activities, and security measurement” - from NIST SP800-126r2.

Why adopt SCAP? “Organizations need to conduct continuous monitoring of the security configuration of each system... at any given time... to demonstrate compliance with various sets of security requirements... these tasks are extremely time-consuming and error-prone because there has been no standardized, automated way of performing them... the lack of interoperability across security tools... can cause delays in security assessment, decision-making, and vulnerability remediation” - from NIST SP800-117.

The Conference

It was definitely an amazing experience and a real honor for me to meet such great people and organizations, getting closer to (and exploring) the real world of SCAP one year after I started my own research on this subject here at eMaze's R&D Center in Trieste, Italy (where SCAP is still widely unknown and unsupported at the moment). The most impressive thing for me was seeing how actively and deeply the US Government, as well as other well-known agencies, is involved in the development and improvement of SCAP, how much they believe in it and how broad the consensus is, even within the military/intelligence community. Among the attendees I saw people from: Department of Defense (DoD), Space and Naval Warfare Systems Command (SPAWAR), US Army, US Air Force, Department of State, USCYBERCOM, USSTRATCOM, Office of Naval Intelligence, National Nuclear Security Administration (NNSA), Department of Justice (DoJ), Department of Energy (DoE), NASA and so on...

I also found it remarkable to see, at an event like this, the presence and interest of some universities, and not only US ones as far as I know. For example, Ms. Angela Orebaugh (Booz Allen Hamilton), whom I had the pleasure to meet, wrote an article about the University of North Carolina at Charlotte (UNCC), which houses the Cyber Defense & Network Assurability (CyberDNA) Center (http://www.arc.uncc.edu). Her article is featured in the latest issue of the IAnewsletter (freely available here: http://iac.dtic.mil/iatac/download/Vol14_No4.pdf), which is focused on Security Automation. Many vendors, such as nCircle, McAfee, Symantec, Juniper, Cisco, eEye, Tenable, Microsoft, Red Hat, EMC and so on, were at the conference as well.

The conference stressed how much money the adoption of SCAP has allowed the US Government to save over these years (“reducing time AND money”), along with the concept of “near real-time” Continuous Monitoring (NIST SP800-137) as opposed to the “snapshot-in-time” model, for example (“you must track everything”). To paraphrase a famous Lord Kelvin quote: “We cannot improve what we cannot measure”; but, as you know, it is not always so simple.


Measure Software Security

Sean Barnum (MITRE), in his presentation “Measure Software Security”, made the point about the great difference between “measuring” and “being measurable”, talking about the problem of integrating a new security solution with existing solutions and, consequently, why we need “Standardized Approaches and the application of Architecting Principles”:
“It's not the standards but how you use them”
For all the readers unfamiliar with MITRE's view of security and measurement, you should know that they proudly sponsor and maintain the “Making Security Measurable (MSM) initiatives to provide the foundation for answering today’s increased demands for accountability, efficiency, resiliency, and interoperability without artificially constraining an organization’s solution options” - http://measurablesecurity.mitre.org



Risk Analysis and Measurement

“Risk Analysis and Measurement with CWRAF” by Richard Struse (DHS) and Steve Christey (MITRE) was another presentation that I appreciated very much. They made a clear and important distinction between “weaknesses” and “vulnerabilities” in the context of security automation and software assurance:
A (software) weakness is a property of software/systems that, under the right conditions, may permit unintended / unauthorized behavior
while
A (software) vulnerability is a collection of one or more weaknesses that contains the right conditions to permit unauthorized parties to force the software to perform unintended behavior (a.k.a. “is exploitable”)

Please note the absence of the word “mistake” in these definitions...
Nowadays we are able to identify vulnerabilities with CVE identifiers, to score their impact with CVSS metrics and to classify weaknesses thanks to the Common Weakness Enumeration (CWE). Some examples of CWE are:
  • CWE-89: Improper Neutralization of Special Elements used in an SQL Command ('SQL Injection')
  • CWE-119: Improper Restriction of Operations within the Bounds of a Memory Buffer
However, we would like a way to specify priorities based on business/mission risk. So, “How do I identify which of the 800+ CWEs are most important for my specific business domain, technologies and environment?”. To answer this question, MITRE provides the Common Weakness Risk Analysis Framework (CWRAF) - http://cwe.mitre.org/cwraf/index.html. The Common Weakness Scoring System (CWSS), in turn, allows us to answer the question: “How do I rank the CWEs I care about according to my specific business domain, technologies and environment?” - http://cwe.mitre.org/cwss/index.html.
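Just to make the prioritization idea more concrete, here is a toy sketch in Python. It is deliberately not the official CWSS formula (whose metric groups and weights are defined in the CWSS specification); the weights, impact mappings and scoring rule below are invented purely for illustration of how domain-specific weights, in the spirit of CWRAF vignettes, could be used to rank a handful of CWEs.

# Toy illustration of CWRAF/CWSS-style prioritization.
# NOTE: weights, impact mappings and the scoring rule are invented for
# illustration only; the real metrics live at http://cwe.mitre.org/cwss/.

# Hypothetical "vignette": how much each technical impact matters
# for, say, an e-commerce web application.
vignette_weights = {
    "read_application_data": 1.0,     # customer data exposure is critical
    "modify_application_data": 0.9,
    "execute_unauthorized_code": 0.8,
    "denial_of_service": 0.4,
}

# Hypothetical mapping of a few CWEs to their dominant technical impacts.
cwe_impacts = {
    "CWE-89 (SQL Injection)": ["read_application_data", "modify_application_data"],
    "CWE-119 (Buffer Errors)": ["execute_unauthorized_code", "denial_of_service"],
    "CWE-79 (Cross-site Scripting)": ["read_application_data"],
}

def toy_score(impacts):
    """Score a weakness as the highest weight among its technical impacts."""
    return max(vignette_weights.get(i, 0.0) for i in impacts)

ranking = sorted(cwe_impacts.items(), key=lambda kv: toy_score(kv[1]), reverse=True)
for cwe, impacts in ranking:
    print(f"{toy_score(impacts):.2f}  {cwe}")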

Getting the Network Security Basics Right

In “Getting the Network Security Basics Right” by Paul Bartock (NSA) and Steve Hanna (Juniper), Mr. Bartock shared with us some lessons learned by the Agency (I guess they learned many, many others), while Mr. Hanna talked about “Trusted Network Connect & SCAP Use Cases”. These lessons, although expressed in short sentences (unsurprisingly), have deep implications in our world:
  • The optimal place to solve a security problem is ... never where you found it.
    Corollary: the information for the solution is never in the right form for the solution
  • If it is happening to you today, then ... something very much like it happened to someone else yesterday, and will happen to someone else tomorrow.
    Corollary: and you probably don’t know them
  • After you figure out what happened, there were ... plenty of signs that *could* have helped us prevent or manage this.
    Corollary: but not all the signs are in “cyberspace” or available to “cyber defenders” (this is the best one and the most subtly suggestive in my opinion. Can you “read between the lines”?)
  • Information Sharing is ... Over-rated!
    Corollary: until you think about Purpose, Content, Plumbing, and the Framework

NVD CPE Dictionary

“NVD CPE Dictionary Management Practices” by Christopher McCormick (Booz Allen Hamilton) was, for me, a very important presentation (due to part of my job here at eMaze). We have exchanged many emails over these last months, working hard together on the correctness of many CPE names contained within the National Vulnerability Database (NVD) data feeds and the Dictionary itself (http://nvd.nist.gov). I wish to thank Chris for his professionalism, helpfulness and patience. His presentation started by describing the history of NVD's transition to CPE, from proprietary product naming to the implementation of initial CPE support with CPE 2.1, up until today, with NVD becoming the primary source of CPE names based upon the Dictionary (which contains about 35,000 CPE names at the moment). After describing the CPE Dictionary management process, Chris introduced a demo of the new CPE submission interface on the NVD web site, which greatly improves the management of names within the Dictionary.
"Common Platform Enumeration (CPE) is a standardized method of describing and identifying classes of applications, operating systems, and hardware devices present among an enterprise's computing assets" from NIST IR-7695. For example, the following is a formal and standardized way to describe the ubiquitous Microsoft Windows XP SP3 (try to think how many times we have seen different documents/reports/advisories referring to the same platform using strings like "Windows XP Service Pack 3" or "Win XP SP3") :
cpe:/o:microsoft:windows_xp::sp3
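As a small sketch of how such a name can be consumed programmatically, the Python snippet below splits a CPE 2.1 URI (the colon-separated form shown above, whose components are part, vendor, product, version, update, edition and language) into its named components. It is only an illustration, not a replacement for a full CPE parser.

# Minimal sketch: split a CPE 2.1 URI into its named components.
# CPE 2.1 component order: part, vendor, product, version, update,
# edition, language (empty fields are simply left blank).
FIELDS = ("part", "vendor", "product", "version", "update", "edition", "language")
PART_NAMES = {"o": "operating system", "a": "application", "h": "hardware"}

def parse_cpe_uri(cpe: str) -> dict:
    if not cpe.startswith("cpe:/"):
        raise ValueError("not a CPE 2.1 URI")
    values = cpe[len("cpe:/"):].split(":")
    # Pad missing trailing components with empty strings.
    values += [""] * (len(FIELDS) - len(values))
    return dict(zip(FIELDS, values))

name = parse_cpe_uri("cpe:/o:microsoft:windows_xp::sp3")
print(name)                          # {'part': 'o', 'vendor': 'microsoft', ...}
print(PART_NAMES.get(name["part"]))  # operating system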


OVAL 5.10

Jon Baker (MITRE) in his presentation “OVAL 5.10 Update” talked about the Open Vulnerability and Assessment Language, a community-developed open standard that enables automated assessment and compliance checking. The OVAL Language is an XML-based framework for making logical assertions about a system (further information: http://oval.mitre.org). “OVAL provides the low level system assessment capability” and can be used to perform vulnerability assessment (e.g. credentialed scans) and configuration management (defining the desired configuration and monitoring systems against it), just to mention a couple of OVAL's use cases.
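Since OVAL content is plain XML, even a few lines of code can start to work with it. The following sketch assumes a local OVAL definitions file named definitions.xml and uses the OVAL 5.x definitions namespace; it simply lists each definition's id, class and title. A real assessment engine would of course also evaluate the tests, objects and states referenced by each definition.

# Sketch: list the definitions contained in an OVAL definitions document.
# Assumes a local file "definitions.xml" using the OVAL 5.x definitions schema.
import xml.etree.ElementTree as ET

NS = {"d": "http://oval.mitre.org/XMLSchema/oval-definitions-5"}

tree = ET.parse("definitions.xml")
for definition in tree.findall(".//d:definitions/d:definition", NS):
    title = definition.findtext("d:metadata/d:title", default="(no title)", namespaces=NS)
    print(definition.get("id"), definition.get("class"), "-", title)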
 
Common Configuration Scoring System (CCSS)

Not all “unauthorized accesses/actions” necessarily derive from bad pieces of code. Often attackers can take advantage of improperly configured settings or misconfigurations (e.g. directory listing enabled on a web server).
Before taking any further step towards thinking (or talking) about the automation of security checks on these settings, we need to be able to uniquely identify security settings and configurations. Similar to the CVE effort, the Common Configuration Enumeration (CCE) assigns a unique, common identifier to a particular security-related configuration issue. “CCE identifiers are associated with configuration statements that express the way humans name and discuss their intentions when configuring computer systems. In this way, the use of CCE Identifiers (CCE-IDs) as tags provides a bridge between natural language, prose-based configuration guidance documents and machine-readable or executable capabilities such as configuration audit tools” - for further information see http://cce.mitre.org.

The following is an example of a standard CCE identifier which represents a configuration requirement:
CCE-4191-3 (RHEL 5): “The dhcp client service should be enabled or disabled as appropriate for each interface”.
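To give a feel for what automating such a statement could look like, here is a rough Python sketch, under the assumption that on RHEL 5 the DHCP client behaviour of each interface is reflected by the BOOTPROTO setting in its /etc/sysconfig/network-scripts/ifcfg-* file. The authoritative, machine-readable check for a CCE entry is normally expressed in OVAL/XCCDF content rather than in ad-hoc scripts like this one.

# Rough sketch: report, per interface, whether DHCP is configured, as one
# plausible way to inspect the setting behind CCE-4191-3 on RHEL 5.
# Assumption: BOOTPROTO in /etc/sysconfig/network-scripts/ifcfg-* reflects
# the DHCP client behaviour of each interface.
import glob
import re

for path in sorted(glob.glob("/etc/sysconfig/network-scripts/ifcfg-*")):
    interface = path.rsplit("ifcfg-", 1)[-1]
    bootproto = "unset"
    with open(path) as f:
        for line in f:
            match = re.match(r'\s*BOOTPROTO\s*=\s*"?(\w+)"?', line)
            if match:
                bootproto = match.group(1).lower()
    print(f"{interface}: BOOTPROTO={bootproto} "
          f"({'DHCP enabled' if bootproto == 'dhcp' else 'DHCP not enabled'})")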

Having a way to identify security configuration settings, the next step is to be able to “rank” the security impact of configuration choices in a system, just as CVSS scores vulnerabilities (CVE), for example. With regard to this, Karen Scarfone (Scarfone Cybersecurity) introduced the Common Configuration Scoring System (CCSS), “A universal way to convey the relative severity of security configuration choices”, based on CVSS version 2 and therefore built on a set of metrics and formulas. Ms. Scarfone highlighted the fact that CCSS is “Not a risk assessment solution”. Why should we use CCSS? Basically because “Understanding security implications of each configuration option allows better risk assessment and sound decision-making”.
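Since CCSS builds on the CVSS version 2 metrics and equations (adapting their definitions to configuration issues, see NIST IR-7502), the following Python sketch of the standard CVSS v2 base equation gives a feel for the kind of arithmetic involved; the metric values are the usual CVSS v2 ones, while their interpretation under CCSS differs.

# Sketch of the CVSS v2 base score equation, which CCSS builds upon
# (CCSS adapts the metric definitions to configuration issues; see NISTIR 7502).
ACCESS_VECTOR  = {"L": 0.395, "A": 0.646, "N": 1.0}
ACCESS_COMPLEX = {"H": 0.35, "M": 0.61, "L": 0.71}
AUTHENTICATION = {"M": 0.45, "S": 0.56, "N": 0.704}
CIA_IMPACT     = {"N": 0.0, "P": 0.275, "C": 0.660}

def cvss2_base(av, ac, au, c, i, a):
    impact = 10.41 * (1 - (1 - CIA_IMPACT[c]) * (1 - CIA_IMPACT[i]) * (1 - CIA_IMPACT[a]))
    exploitability = 20 * ACCESS_VECTOR[av] * ACCESS_COMPLEX[ac] * AUTHENTICATION[au]
    f = 0.0 if impact == 0 else 1.176
    return round(((0.6 * impact) + (0.4 * exploitability) - 1.5) * f, 1)

# Example vector AV:N/AC:L/Au:N/C:P/I:P/A:P -> 7.5
print(cvss2_base("N", "L", "N", "P", "P", "P"))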
Many other interesting subjects that I would like to share with you were presented and discussed at the conference, like standards for Asset Identification, IF-MAP, XCCDF, CEE and EMAP, MAEC... but a blog post is not enough to cover and explain them all. Hopefully this post will be only the first one on the subject.

