CWE

Common Weakness Enumeration

A community-developed list of software and hardware weaknesses that can become vulnerabilities


Detection Methods

The "Detection Methods" field within many CWE entries describes the types of assessment activities that can find that weakness. The number of CWE entries with this field filled in will increase over time. The recent Institute for Defense Analyses (IDA) State-of-the-Art Resources (SOAR) report, conducted for the DoD, provides additional information for use across CWE in this area. The Detection Method labels used within CWE are:

  • Automated Analysis
  • Automated Dynamic Analysis
  • Automated Static Analysis
  • Black Box
  • Fuzzing
  • Manual Analysis
  • Manual Dynamic Analysis
  • Manual Static Analysis
  • White Box
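
To make these labels concrete, here is a minimal sketch in Python of filtering weaknesses by the detection methods a team has available. The CWE-to-method mapping below is a hand-picked illustrative sample, not a transcription of any CWE entry; the authoritative data lives in each entry's "Detection Methods" field.

```python
# Illustrative mapping of a few CWE IDs to detection-method labels.
# Sample data for demonstration only; consult the CWE entries for the
# authoritative "Detection Methods" assignments.
detection_methods = {
    "CWE-78":  {"Automated Static Analysis", "Manual Static Analysis"},
    "CWE-89":  {"Automated Static Analysis", "Automated Dynamic Analysis"},
    "CWE-120": {"Automated Static Analysis", "Fuzzing"},
    "CWE-798": {"Automated Static Analysis", "Black Box", "Manual Static Analysis"},
}

def findable_with(available_methods, mapping):
    """Return the CWE IDs detectable by at least one available method."""
    return sorted(
        cwe for cwe, methods in mapping.items()
        if methods & set(available_methods)
    )

print(findable_with({"Fuzzing", "Black Box"}, detection_methods))
# with the sample data above: ['CWE-120', 'CWE-798']
```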

With this type of information (shown in the table below), we can see which of the CWEs leading to a given technical impact are detectable by dynamic analysis, static analysis, or fuzzing, and which are not.

This table is incomplete, because many CWE entries do not have a detection method listed.

Technical Impact Automated Analysis Automated Dynamic Analysis Automated Static Analysis Black Box Fuzzing Manual Analysis Manual Dynamic Analysis Manual Static Analysis White Box
Execute unauthorized code or commands 78, 120, 129, 131, 476, 805 78, 79, 98, 120, 129, 131, 134, 190, 426, 798, 805 79, 129, 134, 190, 426, 494, 698, 798 98, 120, 131, 190, 426, 494, 805 476, 798 78, 798
Gain privileges / assume identity 601 306, 352, 426, 601, 798 259, 426, 798 259, 306, 352, 426 798 601, 798, 807
Read data 209, 311, 327 78, 89, 129, 131, 209, 404, 665 78, 79, 89, 129, 131, 134, 352, 426, 798 14, 79, 129, 134, 319, 426, 798 89, 131, 209, 311, 327, 352, 426 209, 404, 665, 798 78, 798 14
Modify data 311, 327 78, 89, 129, 131 78, 89, 129, 131, 190, 352 129, 190, 319 89, 131, 190, 311, 327, 352 78
DoS: unreliable execution 78, 120, 129, 131, 400, 476, 665, 805 78, 120, 129, 131, 190, 352, 400, 426, 805 129, 190, 426, 690 400 120, 131, 190, 352, 426, 805 476, 665 78
DoS: resource consumption 120, 400, 404, 770, 805 120, 190, 400, 770, 805 190 400, 770 120, 190, 805 404 770 412
Bypass protection mechanism 89, 400, 601, 665 79, 89, 190, 352, 400, 601, 798 14, 79, 184, 190, 733, 798 400 89, 190, 352 665, 798 601, 798, 807 14, 733
Hide activities 327 78 78 327 78
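
One way to use the table is to treat it as a mapping from technical impact to detection method to CWE set, then check which of your prioritized weaknesses the methods you have on hand actually cover. A minimal sketch in Python, seeded with a small illustrative slice of data (the cell boundaries in the rendered table are ambiguous, so treat these sets as examples rather than a transcription):

```python
# Illustrative slice of the technical-impact vs. detection-method table:
# impact -> method -> set of CWE IDs (example data only).
table = {
    "Read data": {
        "Automated Dynamic Analysis": {78, 89, 129, 131, 209},
        "Fuzzing": {89, 131, 209, 311, 327},
        "Manual Static Analysis": {78, 798},
    },
    "DoS: resource consumption": {
        "Fuzzing": {120, 190, 805},
        "Manual Static Analysis": {770},
    },
}

def coverage(impact, prioritized, available_methods):
    """Split prioritized CWEs into (covered, uncovered) for one impact."""
    covered = set()
    for method in available_methods:
        covered |= table.get(impact, {}).get(method, set())
    covered &= prioritized
    return sorted(covered), sorted(prioritized - covered)

covered, gaps = coverage("Read data", {78, 209, 311, 798}, {"Fuzzing"})
print(covered, gaps)  # with this sample data: [209, 311] [78, 798]
```

The uncovered list is the planning signal: it names the prioritized weaknesses that none of your current tools or services claims to find, for that technical impact.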

Understanding the relationship between the various assessment/detection methods and the artifacts available across the life-cycle will enable you and your decision-makers to plan:

  • which specific issue(s) to review,
  • at what point(s) in the effort,
  • using which method(s), and
  • which capabilities to leverage, based on the coverage claims of the various tools and services you have available to find the weaknesses.

Matching Claims to Needs

As shown above, matching coverage claims with the weaknesses your organization has prioritized can assist you in planning assurance activities. This makes it possible to combine the groupings of weaknesses that lead to specific technical impacts with the detection methods needed to gain insight into whether the dangerous issues have been addressed.

In the future, the same type of information in the technical impact vs. detection method table could be used to produce an assurance tag attached to an executable code bundle, leveraging ISO/IEC 19770-2:2009 as implemented for Software Identification (SWID) Tags. This would allow others to gain insight into the assurance efforts made on a piece of software created by someone else. The basic idea is that a SWID Tag would carry assurance information conveying the types of assurance activities undertaken and the failure modes they targeted. The receiving enterprise could then review this tag, match that information against how they intend to use the software and which failure modes concern them most, and assess whether the appropriate level of attention was paid to the technical impacts they are trying to avoid. The same table of information can also support an ISO/IEC 15026 assurance case.
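
A hypothetical sketch of how such assurance information might ride along in a SWID tag: ISO/IEC 19770-2 tags are XML, so one could attach the impact-vs-method evidence as an extension element. The `AssuranceEvidence` and `Activity` names below are invented for illustration; the standard does not define an assurance payload, and a real tag would also carry the namespaces and identity attributes the standard requires.

```python
import xml.etree.ElementTree as ET

# Hypothetical assurance records: (technical impact, detection method used).
# Element and attribute names are illustrative; ISO/IEC 19770-2 does not
# define this payload, and required SWID attributes are omitted for brevity.
assurance = [
    ("Execute unauthorized code or commands", "Automated Static Analysis"),
    ("DoS: resource consumption", "Fuzzing"),
]

tag = ET.Element("SoftwareIdentity", name="example-app", version="1.0")
ext = ET.SubElement(tag, "AssuranceEvidence")  # invented extension element
for impact, method in assurance:
    ET.SubElement(ext, "Activity", impact=impact, method=method)

xml_text = ET.tostring(tag, encoding="unicode")
print(xml_text)
```

A receiving enterprise could parse this element out of the tag and run it through the same kind of coverage check described above, comparing the claimed activities against the technical impacts it cares about.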

Page Last Updated: April 02, 2018