Many organizations trying to mature their Application Security Programs are buying SAST (Static Application Security Testing) and DAST (Dynamic Application Security Testing) solutions. For those unfamiliar, SAST tools analyze binary, byte, or source code and look for flaws at the code level, whereas DAST tools test an application at run time. These tool sets can add a lot of value to an organization, but how they are implemented into the SDLC will determine the true return on investment. Some organizations create a budget, then buy some tools…but beyond that, still need help figuring out next steps. While there may not be a cookie-cutter solution for this, there are common factors that will help you determine the most effective implementation strategy.

Before we talk about implementing SAST and DAST tools into the SDLC, organizations should first understand the size of their application portfolio, how many licenses they can reasonably budget for, and the amount of resources required to implement, tune, support, and run these tools. Once those factors are understood, think ahead to the output: ask how the results from these tests will be reviewed, who will review them, and how they will be tracked and prioritized for remediation.
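To make the resource question concrete, a quick back-of-the-envelope calculation can show whether the team can actually keep up with triage before any licenses are purchased. Every number below is an illustrative assumption, not a benchmark; substitute your own portfolio size and staffing:

```python
# Back-of-the-envelope sizing sketch (all figures are illustrative
# assumptions): compare annual triage demand against AppSec capacity.

apps = 40                      # applications in the portfolio (assumed)
scans_per_app_per_year = 4     # assumed scan cadence
triage_hours_per_scan = 6      # review, validate, analyze, track (assumed)
appsec_staff = 2               # application security resources
hours_per_staff_per_year = 1800
triage_share = 0.5             # fraction of staff time available for triage

demand = apps * scans_per_app_per_year * triage_hours_per_scan
capacity = appsec_staff * hours_per_staff_per_year * triage_share

print(f"Triage demand: {demand} h/yr; capacity: {capacity:.0f} h/yr")
# If demand exceeds capacity, reduce scan frequency or license count,
# or shift first-pass tool runs to trained development leads.
```

With these assumed numbers the two people can absorb the workload; doubling the portfolio or the scan cadence would flip that conclusion, which is exactly why the sizing step belongs before the purchase order.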

Smaller development shops tend to have tighter budgets and a more tactical approach, given that they may only have one or two application security resources. In environments like this, development leads are often asked to help and are trained to run the tools themselves, so that the application security resources can focus their time on reviewing, validating, analyzing, and tracking the results. Organizations should try to avoid tools that are licensed per user. Why should you have to choose which developers are able to proactively find issues in the code being developed? The whole purpose of driving automated tools into the SDLC is to encourage all developers to follow secure coding principles and to test their code as early in the SDLC as possible. When everyone on the development team has the same opportunity to develop securely, a formalized secure coding standard starts to take shape.

Having developers leverage these tools is a very good thing for an organization, but this activity should never replace the more formal review performed by application security professionals. Frequency of testing involves several other considerations that are a bit off topic for this blog, but may be revisited in a future article.

For issue tracking, the organization may leverage its ticketing, bug tracking, or GRC systems, but it must also consider what kind of detail is contained within the tickets. In other words, not everyone who can access the tickets should be able to access vulnerability details or application specifics. The ticket should be as generic as possible, with details tracked in a system that can be restricted to least privilege. Even a developer on one application shouldn’t necessarily have access to the vulnerabilities of another application they don’t work on. It’s important to keep insider threats in mind when deciding how much detail to reveal within an environment. If the application security issues are visible to everyone, and an attack is executed before remediation is in place, this could introduce a great deal of complexity into an internal investigation.
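One way to picture this least-privilege split is a helper that files each finding twice: full details go into a restricted store with its own access controls, while the broadly visible ticket carries only a tracking reference. The field names and the `RESTRICTED_STORE` stand-in below are hypothetical, a minimal sketch rather than a prescribed schema:

```python
# Hypothetical sketch: split a finding into a generic ticket (broad access)
# and a detailed record kept in a restricted, least-privilege system.

RESTRICTED_STORE = {}  # stands in for a GRC/vulnerability system with ACLs

def file_finding(finding_id, app, vuln_class, detail, severity):
    # Detailed record: readable only by the security team and the
    # developers of this specific application.
    RESTRICTED_STORE[finding_id] = {
        "application": app,
        "vulnerability": vuln_class,
        "detail": detail,
        "severity": severity,
    }
    # Generic ticket: safe for the broader ticketing system. It reveals
    # neither the vulnerability class nor the affected code path.
    return {
        "ticket_id": f"SEC-{finding_id}",
        "summary": "Security finding - see restricted record",
        "reference": finding_id,
        "priority": severity,
    }

ticket = file_finding("1042", "payments-api", "SQL injection",
                      "Unsanitized input in checkout handler", "High")
```

The ticket still lets project managers schedule and prioritize the work by severity, while anyone hunting through the ticketing system learns nothing about where or how the application is vulnerable.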

Another important part of the process is aligning the findings that come out of the tools with the security policies/standards that may already be in place. Each tool assigns default levels of severity for each finding. These are typically configurable and should be reviewed, as some organizations may want to change some of these levels based on their own unique environments or controls. It is common for our clients to have a policy or standard in place (whether formal or informal) that requires the remediation of all high or medium severity findings prior to code being deployed to production. Configuring the tools' severity levels to support this standard aligns both the business and the security team with the process. It should be noted that if developers can access and run these tools, they should not be able to reconfigure the severity levels themselves and should not deem anything a false positive without a formal review by the security team. Checks and balances are important to maintain, even in a large development shop or organization.

Overall, automated tools are an important part of a Secure SDLC program and provide a lot of value to any development organization. They can help increase testing coverage, identify “low hanging fruit”, and serve as a great first step to kick-start a new Application Security Program within an existing SDLC. However, to improve the quality and security of the code, and to realize a much more significant return on investment, organizations must implement usage plans and develop supporting processes. Just remember, the solution is as unique as your development environment and overall business. There are no cookie-cutter solutions to implementing tools, but GuidePoint is here to help you, and we might even have cookies!

About the Author

Kristen Bell, Managing Security Consultant – Application Security

Kristen is a Managing Security Consultant at GuidePoint Security who started in Application Security in 2005. Prior to joining GuidePoint, Kristen consulted for numerous companies performing application security services. Kristen has a background in the government sector, building application security programs and providing guidance in secure application design.

Kristen’s experience includes conducting application security assessments and database security reviews, secure SDLC consulting, as well as working with clients to improve their enterprise vulnerability management. Kristen’s ability to bridge the gap between technical and non-technical people, coupled with her strong interpersonal skills, has made her a strong champion for application security frameworks and controls for her customers. Kristen earned a Bachelor of Science degree in Computer Science from Kentucky State University.