To measure application security effectiveness, teams must identify the processes and tools embedded in their CI/CD pipeline, then evaluate the coverage and gaps spanning stages from pre-production to production.
Beyond those, there are several metrics DevSecOps teams should consider when measuring the security of their applications.
Security Starts Left but Doesn’t End There
Application security teams must continually review application code changes as they are developed and deployed to confirm they are free of known vulnerabilities, misconfigurations and other security problems. To accomplish this, application security teams use a variety of scanning and testing tools to uncover potential issues in the application code, configuration or runtime execution.
But application security is a tricky process to navigate for many DevOps and DevSecOps teams. Today’s applications are constantly evolving with new features and updates, continuously introducing the possibility of vulnerabilities and misconfigurations that could heighten risk. Further, organizations navigating the transition from DevOps to DevSecOps may lack the metrics needed to effectively track and measure their application security posture.
Here we discuss the challenges in securing cloud-native applications and identify ways to measure the efficacy of an application security strategy in reducing risk.
DevOps, initially focused on accelerating software development and delivery, has for many organizations evolved into DevSecOps, which integrates security into the CI/CD process.
Moving Beyond DevOps KPIs
While these tools are essential for secure development, no single tool covers the entire lifecycle of the application. Pre-production security tools can miss issues altogether or generate an abundance of false positives. For example, software composition analysis (SCA) tools find known vulnerabilities in open source libraries but cannot identify vulnerabilities in custom code. Static application security testing (SAST) is valuable for analyzing custom code but cannot detect runtime configuration issues or business logic errors. Conversely, dynamic application security testing (DAST) focuses on discovering vulnerabilities that surface during execution, such as injection attacks or misconfigurations, but it has no visibility into the source code, so it may miss issues like hardcoded credentials, logic errors or insecure coding practices that only SAST can detect.
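The complementary coverage described above can be sketched as a simple gap check. The tool-to-issue-class mapping below is an illustrative assumption for the purpose of the sketch, not a complete taxonomy:

```python
# Illustrative mapping of scanner categories to the issue classes they
# cover. The issue-class labels are assumptions, not an exhaustive list.
COVERAGE = {
    "SCA":  {"known CVEs in open source dependencies"},
    "SAST": {"hardcoded credentials", "insecure patterns in custom code"},
    "DAST": {"runtime misconfigurations", "injection flaws observable at runtime"},
}

def coverage_gaps(tools_in_pipeline):
    """Return the issue classes that no tool in the pipeline covers."""
    all_classes = set().union(*COVERAGE.values())
    covered = set()
    for tool in tools_in_pipeline:
        covered |= COVERAGE[tool]
    return sorted(all_classes - covered)

# A pipeline running only SCA and SAST has no runtime visibility:
print(coverage_gaps(["SCA", "SAST"]))
```

Running the same check with all three categories returns an empty list, which is the balance across the lifecycle that the paragraph above argues for.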
Integrating security early and frequently in the software development lifecycle and CI/CD pipeline — a process known as “shifting security left” — is an application security best practice. However, modern applications constantly evolve and frequently receive updates after they’re deployed. According to the CrowdStrike 2024 State of Application Security Report, 71% of organizations update applications weekly, and 19% do so multiple times a day.
Measuring the success of an application security strategy shouldn’t focus solely on the number of detected vulnerabilities or the speed of remediation. Rather, success should be measured by how effectively the organization manages risk and prioritizes the most impactful issues.
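One way to operationalize risk-based prioritization is to weight a finding's base severity by its real-world context. The scoring formula below, with its multipliers for internet exposure and exploit availability, is a hypothetical sketch, not a standard:

```python
# Hypothetical risk-ranking sketch: findings are ordered by contextual
# risk, not raw severity. The weights are illustrative assumptions.
def risk_score(finding):
    # CVSS-style base severity (0-10), scaled up when the component is
    # reachable from the internet or a public exploit exists.
    exposure = 1.5 if finding["internet_facing"] else 1.0
    exploit = 1.3 if finding["exploit_available"] else 1.0
    return finding["severity"] * exposure * exploit

findings = [
    {"id": "VULN-1", "severity": 9.8, "internet_facing": False, "exploit_available": False},
    {"id": "VULN-2", "severity": 7.5, "internet_facing": True, "exploit_available": True},
]

# VULN-2 outranks the nominally "critical" VULN-1 once context is factored in.
for f in sorted(findings, key=risk_score, reverse=True):
    print(f["id"], risk_score(f))
```

The point of the sketch is the ordering, not the exact weights: a lower-severity issue on an exposed, exploitable path can pose more risk than a critical CVE buried in an unreachable component.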
Making Sense of Application Security and CI/CD
As DevOps evolves into DevSecOps, the metrics used to track and measure success must also change. Google’s DevOps Research and Assessment (DORA) team tracks the following five metrics to determine DevOps success and maturity: deployment frequency, lead time for code changes, mean time to recovery, change failure rate, and reliability. While many industry leaders use DORA metrics to measure DevOps success, these metrics capture delivery speed and stability rather than the efficiency and effectiveness of resolving security risk. They lack the specific application security metrics necessary for DevSecOps.
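Two of the DORA metrics above are straightforward to compute from delivery data, which makes the gap concrete. A minimal sketch, assuming a simple deployment log with made-up field names:

```python
# Minimal sketch of two DORA metrics computed from a deployment log.
# The record structure ("day", "failed") is an assumption for illustration.
from datetime import date

deployments = [
    {"day": date(2024, 6, 3), "failed": False},
    {"day": date(2024, 6, 4), "failed": True},
    {"day": date(2024, 6, 5), "failed": False},
    {"day": date(2024, 6, 7), "failed": False},
]

def deployment_frequency(deploys, days_in_window):
    """Average deployments per day over the observation window."""
    return len(deploys) / days_in_window

def change_failure_rate(deploys):
    """Fraction of deployments that caused a failure in production."""
    return sum(d["failed"] for d in deploys) / len(deploys)

print(deployment_frequency(deployments, 7))  # 4 deploys over a 7-day window
print(change_failure_rate(deployments))      # 0.25
```

Note that nothing in these calculations distinguishes a security regression from any other failure, which is exactly why DORA metrics alone cannot measure an application security posture.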