A Sensitive Data Exposure vulnerability exists in a web application when it is poorly designed and does not adequately protect critical information. Examples include data that someone mistakenly uploaded to a public location, weak cryptography that an attacker can break after compromising the target, and missing headers that should prevent the browser from caching sensitive pages. Such flaws allow attackers to apply various techniques to locate sensitive data belonging to a web application or an organization. Through Sensitive Data Exposure, attackers can obtain sensitive data such as authentication credentials, session tokens, and database contents, then use them to exploit the web application and breach the security of the site. The consequences can damage a business, violate customer privacy, destroy customer trust, and in extreme cases affect national security and international relations.
What Makes Data Sensitive?
To decide how sensitive a piece of data is and how it ought to be classified, consider the CIA triad (confidentiality, integrity, and availability) of that data and how its exposure would affect your application, organization, or users. This is a typical method of quantifying data sensitivity, following the framework defined in the Federal Information Processing Standards (FIPS) published by the National Institute of Standards and Technology (NIST).
Sensitive data can be separated into two broad categories:
- Regulated data — This sort of information remains sensitive for the duration of its life cycle, although its level of sensitivity varies over time. Regulated data should be kept classified at all times.
- Unregulated data — This sort of data does not generally appear sensitive at first glance, but it becomes much more critical once its context is considered. For example, publicly known data seems non-sensitive, yet there are times when an organization’s confidential data or intellectual property is freely exposed when it ought to be classified as highly sensitive. This is what characterizes unregulated data.
Whether it is an original or a copy, information is sensitive when it contains:
- Usernames and associated passwords.
- Protected Health Information.
- Banking details (credit card numbers, account numbers).
- Personally Identifiable Information (social security number, full name, driver’s license number).
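A crude first pass over these categories can be automated with pattern matching. The following is a minimal Python sketch; the patterns are simplified assumptions for illustration, and real PII detection needs far more care (checksums, context, locale-specific formats):

```python
import re

# Illustrative patterns only; not a complete PII detector.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),           # US SSN format
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # 13-16 digit card numbers
}

def find_pii(text):
    """Return (kind, match) pairs for every pattern hit in the given text."""
    hits = []
    for kind, pattern in PII_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((kind, match))
    return hits
```

A scanner like this is only a starting point for deciding what to classify as sensitive; the hard part, as noted below, is judging context.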
Root Causes of Sensitive Data Exposure
- Data in Transit — Data is exceptionally vulnerable while being transmitted across unprotected channels or to the APIs that allow applications to communicate with each other. One of the main attacks targeting data in transit is the man-in-the-middle attack, in which the attacker intercepts traffic and monitors communications.
- Data at Rest — Attackers use various vectors to reach stored information, most frequently malware such as Trojan horses and computer worms. Through these, it is possible to gain access to a system’s stored data, whether delivered from a malicious USB drive or through a malicious link sent via email or text.
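The data-in-transit defense against man-in-the-middle interception is to verify server certificates and hostnames rather than disabling checks. A minimal sketch using Python's standard ssl module (the commented URL is a placeholder, not part of any real deployment):

```python
import ssl
import urllib.request

def make_verified_context():
    """Build a TLS context that verifies certificates and hostnames,
    defeating simple man-in-the-middle interception."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = True                     # on by default; shown explicitly
    ctx.verify_mode = ssl.CERT_REQUIRED           # reject unverifiable peers
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
    return ctx

# Usage (hypothetical URL):
# with urllib.request.urlopen("https://example.com/",
#                             context=make_verified_context()) as resp:
#     body = resp.read()
```

The point of the sketch is what it refuses to do: it never sets `verify_mode` to `CERT_NONE`, which is the shortcut that makes interception trivial.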
On the other hand, data exposure is not a vulnerability you can search for in the same sense as traditional vulnerabilities; it cannot simply be scanned for, for two main reasons:
- To judge the risk, you must first decide what information counts as sensitive, a task that is hard to automate.
- A third-party pen tester can never know whether internal data is encrypted, since internal data is not subject to third-party exposure.
To evaluate whether an application is vulnerable to Sensitive Data Exposure, the best practice is to work through the prevention steps and establish which of them have not yet been taken. As a rule, this is the most reliable way to recognize the flaw. Nonetheless, some findings can be scanned for automatically, such as missing headers that should prevent caching of pages behind authentication, or the lack of HTTPS on login pages.
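The automatable part, checking response headers, can be reduced to a small function that flags missing protections. A minimal Python sketch; the two checks shown are common recommendations and an assumption on my part, not an exhaustive audit:

```python
def missing_security_headers(headers):
    """Given a dict of HTTP response headers, return a list of
    protections that appear to be missing."""
    headers = {k.lower(): v for k, v in headers.items()}  # normalize case
    findings = []
    if "no-store" not in headers.get("cache-control", "").lower():
        findings.append("Cache-Control: no-store missing (browser may cache the page)")
    if "strict-transport-security" not in headers:
        findings.append("Strict-Transport-Security missing (HTTPS not enforced)")
    return findings
```

Run against the headers of a page that requires authentication, an empty result means these particular checks pass; anything else is worth a closer look.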
- Directory Busting — In some instances, critical files are stored on a web server to keep a web application functioning smoothly. Using tools such as dirsearch, Burp Suite, or DirBuster, the following are some of the most critical files and paths that can be exploited if found publicly exposed on the internet:
.git | .backup | .htaccess | .sql | /admin
- GitHub — Since GitHub hosts both public and private repositories, imagine a case where a developer uploads the SQL data of a web application to a GitHub repo and forgets to make it private. In that case, using automated tools such as git-hound, gitGraber, and Gitrob, attackers are able to exploit the database of that application.
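The core idea behind directory-busting tools is simply probing a wordlist of sensitive paths and flagging the ones the server actually serves. A minimal offline Python sketch; the `probe` callable is a stand-in assumption for a real HTTP client, which is what dirsearch or DirBuster would use:

```python
# Illustrative wordlist; real tools ship lists with thousands of entries.
SENSITIVE_PATHS = [".git/config", ".htaccess", "backup.sql", "admin/"]

def find_exposed(base_url, probe):
    """Return the sensitive paths that respond with HTTP 200.
    `probe` is any callable mapping a URL to a status code, so the
    same logic works with any HTTP client or a test stub."""
    exposed = []
    for path in SENSITIVE_PATHS:
        url = f"{base_url.rstrip('/')}/{path}"
        if probe(url) == 200:
            exposed.append(path)
    return exposed
```

Running this kind of check against your own application, before an attacker does, is the defensive use of the same technique.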
After-Effects of Sensitive Data Exposure
- Identity hijacking.
- Financial loss.
- Decreased brand trust.
Because this finding applies only to sensitive information, the potential effect is considered very high, and the impact depends on exactly what data is exposed. For instance, if credit card data is exposed, the attacker can drain the victim’s bank account; if credentials are compromised, the attacker can abuse them; and if a site’s certificates are stolen, the attacker can impersonate the target. It all depends on what kind of data is at risk of being exposed.
Protecting against Sensitive Data Exposure
- Data Encryption and Defining Accessibility — Always encrypt sensitive data, whether in transit or at rest. It is also better to limit its accessibility to only a handful of authorized users, each with a separate private key (in the case of encryption).
- Regular Risk Assessment — The level of risk attached to sensitive data changes over time. Regularly monitor your sensitive data and conduct risk assessments for any potential threat.
- Maintain Strong Passwords — Protect stored passwords with strong, salted hashing algorithms, and change your passwords regularly. Maintaining a unique password for every platform is also a best practice.
- Implement High-Quality Security Software — Always use an up-to-date software suite that includes malware and virus protection.
- Use Advanced Standard Security — It is important to have secure authentication gateways. With advanced standard security, such as SSL and TLS, it is possible to ensure that the data flowing between a web browser and a web server is not only encrypted but also remains private. On top of that, applications that use HTTPS offer a secure communication protocol.
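The password advice above amounts to storing a salted, slow hash rather than the password itself. A minimal sketch using the Python standard library's scrypt function; the cost parameters shown are illustrative defaults, not a tuned recommendation:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> bytes:
    """Return salt + scrypt digest; the random salt makes identical
    passwords hash differently across accounts and platforms."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt + digest

def verify_password(password: str, stored: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    salt, digest = stored[:16], stored[16:]
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```

Because scrypt is deliberately memory- and CPU-hard, a leaked database of such hashes is far more expensive to brute-force than one of plain or fast-hashed passwords.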
Most of the time during the development lifecycle of an application, developers are concerned with meeting the functional requirements of the system, from the primary use case through to log out. Consideration for security issues during this phase is therefore very low, and security is often checked only as the last milestone. As a result, developers never get much time to examine how they pass and store passwords and session IDs, or how they share data from one point to another. This is the major point of concern, and it is why sensitive data exposure is most common in small applications. Even in the bigger picture, though, there are plenty of applications that expose sensitive data such as session IDs, and some expose back-end information in error messages. All of this is sensitive data, and it becomes critical because, once it is in the wrong hands, it can be used for unlimited malice. Sensitive data exposure can happen either intentionally or accidentally, and it can be mitigated by taking appropriate measures when storing and transmitting data.