CWE-359: Exposure of Private Information ('Privacy Violation')

 
Weakness ID: 359 (Weakness Class)
Status: Incomplete
+ Description

Description Summary

The software does not properly prevent private data (such as credit card numbers) from being accessed by actors who either (1) are not explicitly authorized to access the data or (2) do not have the implicit consent of the people to whom the data relates.

Extended Description

Mishandling private information, such as customer passwords or Social Security numbers, can compromise user privacy and is often illegal. An exposure of private information does not necessarily prevent the software from working properly, and in fact it might be intended by the developer, but it can still be undesirable (or explicitly prohibited by law) for the people who are associated with this private information.

Privacy violations may occur when:

  1. Private user information enters the program.

  2. The data is written to an external location, such as the console, file system, or network.

Private data can enter a program in a variety of ways:

  1. Directly from the user in the form of a password or personal information

  2. Accessed from a database or other data store by the application

  3. Indirectly from a partner or other third party

Some types of private information include:

  • Government identifiers, such as Social Security Numbers

  • Contact information, such as home addresses and telephone numbers

  • Geographic location - where the user is (or was)

  • Employment history

  • Financial data - such as credit card numbers, salary, bank accounts, and debts

  • Pictures, video, or audio

  • Behavioral patterns - such as web surfing history, when certain activities are performed, etc.

  • Relationships (and types of relationships) with others - family, friends, contacts, etc.

  • Communications - e-mail addresses, private e-mail messages, SMS text messages, chat logs, etc.

  • Health - medical conditions, insurance status, prescription records

  • Credentials, such as passwords, which can be used to access other information.

Some of this information may be characterized as PII (Personally Identifiable Information), Protected Health Information (PHI), etc. Categories of private information may overlap or vary based on the intended usage or the policies and practices of a particular industry.

Depending on its location, the type of business it conducts, and the nature of any private data it handles, an organization may be required to comply with one or more of the following federal and state regulations:

  • Safe Harbor Privacy Framework [R.359.2]

  • Gramm-Leach-Bliley Act (GLBA) [R.359.3]

  • Health Insurance Portability and Accountability Act (HIPAA) [R.359.4]

  • California SB-1386 [R.359.5]

Sometimes data that is not labeled as private can have privacy implications in a different context. For example, student identification numbers are usually not considered private because there is no explicit, publicly available mapping to an individual student's personal information. However, if a school generates identification numbers based on student Social Security numbers, then the identification numbers should be considered private.
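As a minimal, hypothetical sketch of this point (the method names and ID scheme below are invented for illustration, not taken from any real system):

(Bad Code)
Example Language: Java 
// Hypothetical scheme: the "public" identifier embeds part of the SSN,
// so it inherits the SSN's privacy sensitivity.
String idFromSsn(String ssn) {
    return "S" + ssn.substring(ssn.length() - 4);
}

(Good Code)
Example Language: Java 
// A randomly generated identifier carries no information about the student.
String randomStudentId() {
    return java.util.UUID.randomUUID().toString();
}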

Security and privacy concerns often seem to compete with each other. From a security perspective, all important operations should be recorded so that any anomalous activity can later be identified. However, when private data is involved, this practice can in fact create risk. Although there are many ways in which private data can be handled unsafely, a common risk stems from misplaced trust. Programmers often trust the operating environment in which a program runs, and therefore believe that it is acceptable to store private information on the file system, in the registry, or in other locally-controlled resources. However, even if access to certain resources is restricted, this does not guarantee that the individuals who do have access can be trusted.
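When a credential must be persisted at all, one mitigation is to store a value from which the original cannot be recovered, so that even readers of a "restricted" resource learn nothing. The following is a minimal sketch using PBKDF2 from the standard Java library; the class and method names are illustrative only.

(Good Code)
Example Language: Java 
import java.security.SecureRandom;
import java.util.Base64;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

public class PasswordStore {
    // Persist a salted, slow hash rather than the password itself, so a
    // party with read access to the store cannot recover the original value.
    static String hashForStorage(char[] password) throws Exception {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt); // fresh random salt per record
        PBEKeySpec spec = new PBEKeySpec(password, salt, 100_000, 256);
        byte[] hash = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
                .generateSecret(spec).getEncoded();
        // Store salt and hash together; neither reveals the password.
        return Base64.getEncoder().encodeToString(salt) + ":"
                + Base64.getEncoder().encodeToString(hash);
    }
}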

+ Alternate Terms
Privacy leak
Privacy leakage
+ Time of Introduction
  • Architecture and Design
  • Implementation
  • Operation
+ Applicable Platforms

Languages

Language-independent

Architectural Paradigms

Mobile Application

+ Common Consequences
Scope: Confidentiality

Technical Impact: Read application data

+ Demonstrative Examples

Example 1

In 2004, an employee at AOL sold approximately 92 million private customer e-mail addresses to a spammer marketing an offshore gambling web site [R.359.1]. In response to such high-profile exploits, the collection and management of private data is becoming increasingly regulated.

Example 2

The following code contains a logging statement that tracks the contents of records added to a database by storing them in a log file. Among the other values written, the GetPassword() function returns the user-supplied plaintext password associated with the account.

(Bad Code)
Example Language: C# 
pass = GetPassword();
...
// The plaintext password is concatenated into the record and written to the log file.
dbmsLog.WriteLine(id + ":" + pass + ":" + type + ":" + tstamp);

The code in the example above logs a plaintext password to the filesystem. Although many developers trust the filesystem as a safe storage location for data, it should not be trusted implicitly, particularly when privacy is a concern.
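A safer variant, in the spirit of the CERT rule FIO13-J mapped under Taxonomy Mappings below ("Do not log sensitive information outside a trust boundary"), records the auditable fields but masks or omits the credential. The following is a minimal Java sketch with the same illustrative field names; dbmsLog is assumed here to be a java.io.PrintWriter over the audit log.

(Good Code)
Example Language: Java 
// Record the fields needed for auditing, but never the credential itself.
dbmsLog.println(id + ":********:" + type + ":" + tstamp);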

Example 3

This code uses location data to determine the user's current US state.

First, the application must declare that it requires the ACCESS_FINE_LOCATION permission in its AndroidManifest.xml:

(Bad Code)
Example Language: XML 
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>

During execution, a call to getLastLocation() will return a location based on the application's location permissions. In this case the application has permission for the most accurate location possible:

(Bad Code)
Example Language: Java 
// Connect to the location service; real code should wait for the
// onConnected() callback before requesting a location.
locationClient = new LocationClient(this, this, this);
locationClient.connect();
// With ACCESS_FINE_LOCATION granted, this returns the most precise fix available.
Location userCurrLocation = locationClient.getLastLocation();
deriveStateFromCoords(userCurrLocation);

While the application needs this information, it does not need to use the ACCESS_FINE_LOCATION permission, as the ACCESS_COARSE_LOCATION permission will be sufficient to identify which US state the user is in.
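The corrected manifest declaration, mirroring the snippet above with only the permission name changed, would be:

(Good Code)
Example Language: XML 
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION"/>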

+ Relationships
  • ChildOf: Weakness Class 200, Information Exposure (view: Research Concepts 1000, primary)

  • ChildOf: Category 254, Security Features (views: Development Concepts 699, primary; Seven Pernicious Kingdoms 700, primary)

  • ChildOf: Category 857, CERT Java Secure Coding Section 12 - Input Output (FIO) (view: Weaknesses Addressed by the CERT Java Secure Coding Standard 844, primary)

  • ChildOf: Category 907, SFP Cluster: Other (view: Software Fault Pattern (SFP) Clusters 888, primary)

  • ParentOf: Weakness Variant 202, Exposure of Sensitive Data Through Data Queries (view: Research Concepts 1000, primary)
+ Taxonomy Mappings
  • 7 Pernicious Kingdoms: Privacy Violation

  • CERT Java Secure Coding: FIO13-J, "Do not log sensitive information outside a trust boundary"
+ References
[R.359.1] J. Oates. "AOL man pleads guilty to selling 92m email addies". The Register. 2005. <http://www.theregister.co.uk/2005/02/07/aol_email_theft/>.
NIST. "Guide to Protecting the Confidentiality of Personally Identifiable Information (SP 800-122)". April 2010. <http://csrc.nist.gov/publications/nistpubs/800-122/sp800-122.pdf>.
[R.359.2] U.S. Department of Commerce. "Safe Harbor Privacy Framework". <http://www.export.gov/safeharbor/>.
[R.359.3] Federal Trade Commission. "Financial Privacy: The Gramm-Leach-Bliley Act (GLBA)". <http://www.ftc.gov/privacy/glbact/index.html>.
[R.359.4] U.S. Department of Health and Human Services. "Health Insurance Portability and Accountability Act (HIPAA)". <http://www.hhs.gov/ocr/hipaa/>.
[R.359.5] Government of the State of California. "California SB-1386". 2002. <http://info.sen.ca.gov/pub/01-02/bill/sen/sb_1351-1400/sb_1386_bill_20020926_chaptered.html>.
[R.359.6] Information Technology Laboratory, National Institute of Standards and Technology. "Security Requirements for Cryptographic Modules (FIPS 140-2)". 2001-05-25. <http://csrc.nist.gov/publications/fips/fips140-2/fips1402.pdf>.
Chris Wysopal. "Mobile App Top 10 List". Veracode. 2010-12-13. <http://www.veracode.com/blog/2010/12/mobile-app-top-10-list/>.
+ Content History
Submissions

  • 7 Pernicious Kingdoms (Externally Mined)

Modifications

  • 2008-07-01, Eric Dalci (Cigital), External: updated Time_of_Introduction

  • 2008-09-08, CWE Content Team (MITRE), Internal: updated Relationships, Other_Notes, Taxonomy_Mappings

  • 2009-03-10, CWE Content Team (MITRE), Internal: updated Other_Notes

  • 2009-07-27, CWE Content Team (MITRE), Internal: updated Demonstrative_Examples

  • 2009-12-28, CWE Content Team (MITRE), Internal: updated Other_Notes, References

  • 2010-02-16, CWE Content Team (MITRE), Internal: updated Other_Notes, References

  • 2011-03-29, CWE Content Team (MITRE), Internal: updated Other_Notes

  • 2011-06-01, CWE Content Team (MITRE), Internal: updated Common_Consequences, Relationships, Taxonomy_Mappings

  • 2011-09-13, CWE Content Team (MITRE), Internal: updated Other_Notes, References

  • 2012-05-11, CWE Content Team (MITRE), Internal: updated Related_Attack_Patterns, Relationships, Taxonomy_Mappings

  • 2013-02-21, CWE Content Team (MITRE), Internal: updated Applicable_Platforms, References

  • 2014-02-18, CWE Content Team (MITRE), Internal: updated Alternate_Terms, Demonstrative_Examples, Description, Name, Other_Notes, References

Previous Entry Names

  • 2014-02-18: Privacy Violation