Memory Management is the Leading Cause of Security Vulnerabilities in Google Chrome
Google recently studied the root causes of high severity security vulnerabilities detected in its Chrome browser (specifically the open source Chromium project on which Chrome and other browsers are based) and found that 70 percent were “memory unsafety” problems, which it attributes to mistakes made with C/C++ pointers. The analysis covers 912 high or critical severity bugs found since 2015, so there is a significant amount of data behind it.
Not surprisingly, Microsoft came to a similar conclusion, with similar numbers, in its own internal study: 70 percent of the CVEs (common vulnerabilities and exposures) that Microsoft patched were due to memory safety issues.
Google’s data show that use-after-free errors make up 36 percent of their high impact security vulnerabilities, and another 32 percent are other memory unsafety issues, presumably buffer overflows and out-of-bounds accesses. The following chart was published showing the distribution of high and critical severity security vulnerabilities.
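To make the two dominant bug classes concrete, here is a minimal C++ sketch (hypothetical code, not taken from Chromium) showing the use-after-free and out-of-bounds patterns as comments, alongside safer equivalents that actually run:

```cpp
#include <cstddef>
#include <memory>
#include <stdexcept>
#include <vector>

// Use-after-free pattern: a raw pointer outlives the object it points to.
//   int* p = new int(42);
//   delete p;
//   int v = *p;   // dangling read -- undefined behavior, often exploitable
//
// Safer equivalent: unique ownership, so the object lives as long as
// the pointer that refers to it.
int read_owned() {
    auto p = std::make_unique<int>(42);
    return *p;  // object is alive for the whole scope
}

// Out-of-bounds pattern: operator[] performs no bounds check.
//   std::vector<int> v{1, 2, 3};
//   int x = v[3];  // one past the end -- undefined behavior
//
// Safer equivalent: at() throws instead of reading past the buffer.
int read_checked(const std::vector<int>& v, std::size_t i) {
    return v.at(i);  // throws std::out_of_range on a bad index
}
```

Neither safe variant is free (ownership transfers and bounds checks have a cost), which is exactly why these patterns persist in performance-sensitive C++ codebases like a browser engine.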
The distribution of root causes for the high and critical severity security vulnerabilities in the Google Chromium project. Source: Memory Safety, The Chromium Projects.
Hitting the Limits of Runtime Approaches to Mitigation
Chrome’s security architecture takes various approaches to mitigating the impact of these memory issues, for example customized memory management libraries and garbage collection for C++ that help detect and minimize the effects of these errors at runtime. Notable is the “rule of 2” approach, which denies these vulnerabilities the chance to become serious exploits. The idea is that you can choose at most two of the following three options, but never all three:
Google’s “Rule of 2” which prevents vulnerabilities from being serious security threats.
The idea behind this is that untrustworthy inputs (what we call tainted data) are never handled by C/C++ code outside a sandbox – the intersection of all three risks (Doom!). However, the rule considers it acceptable to handle tainted data in C/C++ inside a sandbox, or to handle that data in a “safe” language without a sandbox.
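The rule can be read as a simple predicate over three risk factors. The sketch below is my own illustrative framing of it, not code from the Chromium project:

```cpp
// Google's "Rule of 2": a component may take on at most two of these
// three risks. All three together ("Doom!") is disallowed.
struct RiskProfile {
    bool untrustworthy_input;  // processes tainted data
    bool unsafe_language;      // written in C/C++
    bool unsandboxed;          // runs outside a sandbox / at high privilege
};

// Returns true only for the forbidden combination of all three risks.
bool violates_rule_of_2(const RiskProfile& r) {
    return r.untrustworthy_input && r.unsafe_language && r.unsandboxed;
}
```

So tainted data parsed by sandboxed C++ passes the rule, as does tainted data handled unsandboxed in a memory-safe language; only the full combination is ruled out.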
Google is finding that the rule of 2 limits its ability to add features to Chrome. Isolation into separate processes has done a lot to prevent a malicious site from accessing data from other open sites, but the cost has been reduced performance and efficiency. Creating a new process to handle untrusted data, for example, is too much overhead for a single feature.
The report mentions the approaches they are trying in order to mitigate the problems, which, surprisingly, do not include static (or dynamic) analysis. There are likely other security controls in place that were not part of this study. Regardless, I think the combination of coding standards, early use of SAST, and application of unit testing and runtime (DAST) tools can remove many of these memory issues before they ever become security vulnerabilities.
Removing Root Causes of Memory Unsafety and the Role of Static Analysis
The report discusses a few approaches to help mitigate the impact of memory safety issues on Chrome security. Obviously, as a static analysis tool vendor, we feel there is an important role to play in mitigating these errors, which tools like CodeSonar can detect with good precision. Considering some of Google’s approaches, the following discusses how tools like GrammaTech CodeSonar can help:
Custom C++ Libraries: This approach hardens the memory management libraries with runtime techniques that, for example, check pointer validity before performing operations. It assumes developers make mistakes and that any inputs used in memory management functions should be considered untrustworthy. This is a good approach, of course, but it still helps to detect and remove poor memory management regardless of a “safer” C++ library. Static analysis tools are well suited to detecting the types of errors that are most commonly found and exploited, and detecting memory errors before they ever enter the code repository and build provides the best return on investment for the tools.
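Chrome’s hardened allocator work is far more sophisticated than this, but the underlying idea – validate a pointer before operating on it, and make dangling uses fail deterministically instead of silently reading freed memory – can be sketched with a tiny hypothetical wrapper:

```cpp
#include <stdexcept>

// Hypothetical checked pointer for illustration only. Real hardened
// allocators track liveness in allocator metadata; this sketch just
// models the validity check itself.
template <typename T>
class CheckedPtr {
public:
    explicit CheckedPtr(T* p) : ptr_(p) {}

    // Dereference only after validating the pointer.
    T& operator*() {
        if (ptr_ == nullptr) {
            throw std::logic_error("dereference of released pointer");
        }
        return *ptr_;
    }

    // Free the target and poison the pointer so any later use traps
    // deterministically instead of becoming a use-after-free.
    void release() {
        delete ptr_;
        ptr_ = nullptr;
    }

private:
    T* ptr_;
};
```

The runtime check turns an exploitable dangling read into a controlled failure, but it costs a branch on every dereference – which is why removing the defect statically, before it ships, remains the cheaper fix.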
Transition to safer languages: Although it is impractical to change languages in a large project such as Chrome, Google is looking to move to other, safer languages to mitigate memory issues. However, C and C++ aren’t going away. To reduce their potential dangers, it is possible to adopt a more secure subset of the language, such as SEI CERT C. Static analysis tools are ideal for helping teams evaluate their code against a known standard and for enforcing its rules on an ongoing basis. Even though C and C++ can be fraught with memory errors, more secure coding practices, used from the start, can eliminate many of these issues.
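As an example of what such a subset rules out, SEI CERT includes rules against unbounded string copies (e.g. STR31-C, which requires sufficient storage for the string and its null terminator). A hedged before/after sketch in C++:

```cpp
#include <cstddef>
#include <cstdio>
#include <string>

// Non-compliant pattern a CERT-style checker would flag:
//   char buf[8];
//   strcpy(buf, input);   // overflows buf whenever input >= 8 chars
//
// Compliant alternative 1: bound the copy explicitly.
void copy_bounded(char* dst, std::size_t dst_size, const char* src) {
    std::snprintf(dst, dst_size, "%s", src);  // truncates, always NUL-terminates
}

// Compliant alternative 2: avoid fixed buffers entirely.
std::string copy_safe(const char* src) {
    return std::string(src);  // grows to fit, no overflow possible
}
```

A static analysis tool can flag the non-compliant pattern mechanically, which is what makes enforcing a subset like this practical on a large codebase.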
Custom C++ Dialects: Limiting the language constructs that developers may use can help prevent memory issues up front. As mentioned above, more secure language subsets are already defined to increase code security. Adopting these standards, plus rigorous detection of memory errors, can eliminate such issues early in the development lifecycle.
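For instance, a restricted dialect might ban raw owning pointers altogether. This hypothetical sketch shows the banned construct as a comment and the permitted RAII replacement:

```cpp
#include <memory>

// Construct a restricted dialect would ban: manual ownership, where an
// early return or exception between new and delete causes a leak, a
// double free, or a use-after-free.
//   Widget* w = new Widget();
//   use(*w);
//   delete w;
//
// Permitted replacement: RAII ownership, freed exactly once,
// automatically, on every exit path.
struct Widget {
    int id;
};

std::unique_ptr<Widget> make_widget(int id) {
    auto w = std::make_unique<Widget>();
    w->id = id;
    return w;
}
```

The dialect rule is enforceable mechanically (flag every owning `new`/`delete`), which again plays to the strengths of static analysis.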
Focus on Memory Issues to Reduce “Noise” in Static Analysis Results
I suspect the first pushback we would get from a Chrome developer would be “we tried static analysis and there were too many false positives!” Without any context, it’s difficult to counter. However, with the right configuration and a focus on the key memory “unsafe” issues, it’s possible for analysis tools to detect these high-risk issues with better precision and much less “noise.” We go into detail on this approach in our whitepaper, “Easing the Adoption of Static Analysis into Existing Projects.”
Google’s analysis shows that 70 percent of their high and critical severity security vulnerabilities are due to poor memory handling in C and C++ code. They mitigate these issues through isolation techniques such as sandboxing, but that has performance and efficiency costs. They also insist, through their Rule of 2 approach, that unsafe languages such as C and C++ be limited to running in a sandbox when handling any kind of untrusted data. This limitation is preventing new features in the product, and sandboxing remains a performance concern.
Static analysis and other techniques used early in the lifecycle can help eliminate unsafe memory handling before it becomes a security vulnerability. There is also a case to be made for adopting established secure subsets of C and C++ to reduce the use of potentially dangerous constructs. Focusing static analysis on very specific memory errors avoids overwhelming developers with too many warnings and alleviates the perception of “noise” or false positives.