A recent Checkmarx survey has shown that 92% of organisations struggle to bake security into their DevOps processes, even though the desire to do so is present.

The principle of DevSecOps has become even more vital with the emergence of the General Data Protection Regulation (GDPR).

Compliance with GDPR makes it compulsory to ensure security is implemented in products and solutions at the early stages of the project lifecycle and further reflected in the continuous integration and deployment pipeline.

This requires various business units working collaboratively to ensure security is a functional requirement in the rapid release of products.

One of the challenges of fast-paced deployment is modernising security practices so that DevOps processes are not slowed down or halted.  This means traditional methods of applying threat modelling and risk management must evolve to cater for the rapid deployment of code, especially where code pushes can reach 50 or more iterations a day.

Checklists and standards need to be orchestrated to meet the high demands of cloud computing and microservices.

The reason for this is that DevOps works at a pace that security struggles to keep up with.  This is why tools and processes need automation more than ever.

A survey by Chef in 2017 found that 64% of DevOps organisations have regulatory standards to follow. Of those, 73% wait to assess compliance after development has already started, and 59% don’t assess compliance until code is already running in production.

Excerpt from GDPR for DevOp(Sec) – The Laws, Controls and Solutions, Alasdair Gilchrist.

So what approach addresses the privacy-by-design requirements of GDPR while meeting the rapid speed at which deployment takes place today?

There are four key activities that will provide coverage in attaining a suitable level of security and compliance that aligns with agile principles.

1. DevOps Pipeline.

The first stage of the DevOps pipeline is where code is reviewed and tested before it is compiled.  This stage is crucial to ensuring the majority, if not all, of the vulnerabilities are reviewed and mitigated. A risk-based approach will ensure unnecessary delays are not encountered at this stage.
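A risk-based gate at this stage might look like the following minimal sketch. The severity levels and threshold here are assumptions for illustration, not taken from any specific scanner; the idea is simply that only findings at or above the agreed threshold should block the pipeline, so low-risk issues do not stall releases.

```python
# Severity ranking is an illustrative assumption, not a standard scale.
SEVERITY = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def may_proceed(findings: list[str], threshold: str = "high") -> bool:
    """Return True if the build can continue to the next pipeline stage.

    A build is blocked only when a finding meets or exceeds the
    agreed severity threshold.
    """
    limit = SEVERITY[threshold]
    return all(SEVERITY[f] < limit for f in findings)

print(may_proceed(["low", "medium"]))    # low-risk findings do not block
print(may_proceed(["low", "critical"]))  # a critical finding blocks the build
```

In practice the threshold would come from the risk tolerance agreed with stakeholders, as discussed in the summary below.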

So how does this work?

Firstly, a benchmark is needed to determine the criteria for deciding whether a system is acceptably secure and compliant.  This is commonly referenced in the requirements document.

Developers must address, as part of their process, the security and data privacy deficiencies that can be introduced at the different stages of the SDLC: requirements deficiencies, design deficiencies, coding defects, and operational deficiencies.  These can then be assessed in light of the competency of the workforce and the types of defects in the system’s code.

This can be achieved by ensuring developers are trained in best coding practices that help prevent security and data privacy vulnerabilities from being introduced in code.

A description of best coding practices is addressed in the OWASP Top 10: https://www.owasp.org/index.php/Category:OWASP_Top_Ten_Project

Once best coding practices are in place, developers can reason about the various classes of coding defects that may introduce vulnerabilities.

There are a number of options for addressing coding defects.  These can be used in combination for severe defects, or individually based on applicability.

Code reviews should be used to identify conditions that match known coding defect types.

As an alternative, or in addition, a static analysis tool can be applied to flag defects automatically.
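To show the kind of check a static analyser performs, here is a minimal sketch using Python's standard `ast` module to flag calls to a small, illustrative set of dangerous built-ins. Real tools cover far more defect classes; the set of flagged names here is an assumption for demonstration only.

```python
import ast

# Illustrative subset of risky built-ins; real analysers have much larger rule sets.
DANGEROUS_CALLS = {"eval", "exec"}

def flag_dangerous_calls(source: str) -> list[tuple[int, str]]:
    """Return (line number, function name) for each call to a flagged built-in."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in DANGEROUS_CALLS:
                findings.append((node.lineno, node.func.id))
    return findings

sample = "user_input = input()\nresult = eval(user_input)\n"
print(flag_dangerous_calls(sample))  # flags the eval() call on line 2
```

Because static analysis inspects the source without running it, checks like this can run on every commit without slowing the pipeline.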

Robustness testing can optionally validate that the code rejects incorrect input and that such input causes no harm.
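A robustness test in this spirit feeds a routine a range of malformed inputs and asserts that each is rejected cleanly rather than accepted or allowed to crash uncontrolled. The validator and the sample inputs below are hypothetical, chosen only to illustrate the pattern:

```python
def parse_age(value: str) -> int:
    """Parse and validate a user-supplied age field (illustrative example)."""
    age = int(value)  # raises ValueError on non-numeric input
    if not 0 <= age <= 150:
        raise ValueError(f"age out of range: {age}")
    return age

# Robustness check: every malformed input must be rejected with a clean error.
for bad in ["-1", "abc", "", "999", "12; DROP TABLE users"]:
    try:
        parse_age(bad)
    except ValueError:
        pass  # rejected as expected
    else:
        raise AssertionError(f"accepted invalid input: {bad!r}")
```

The same pattern scales up to fuzz testing, where the malformed inputs are generated automatically rather than hand-picked.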

The results from these tests then require analysis.

Developers need to be provided with tools to manage, document and address false positives before progressing the compiled code to the next stage.

2.  Pre-production checks.

Dynamic Application Security Testing (DAST) validates compiled code.  Any findings at this quality assurance stage should be remediated after the mitigations applied in the pipeline.  The reason is that findings at this layer are difficult to fix without reviewing the logical and physical security architecture designs, which can incur costs that were not originally accounted for.  These tests should also take into account data lifecycle management for GDPR.

Enterprise tools, like Qualys, can help automate the full spectrum of auditing, compliance, and protection of IT systems and web applications.

3. Continuous monitoring. 

Static and dynamic analysis help ensure that any deployment to a production environment is clear of known vulnerabilities.

This does not, however, mean the systems require no further monitoring.  This is especially the case in a cloud environment.

Continuous monitoring in the cloud consists of two core elements. First, security teams need to implement baseline monitoring and logging for virtual machine instances, containers and cloud services in general, including activity within software-as-a-service environments. Baseline monitoring can be accomplished by gathering and processing logs made available via cloud service provider APIs — such as Amazon Web Services (AWS) CloudTrail logs — gathering specific instance logs, performing image and instance integrity validation to ensure the environment is sound, and gathering any networking logs that may be available, such as flow logs for virtual private clouds in AWS.

AWS Config is a tool offered in AWS that continuously monitors configuration changes. This allows a security team to review configuration changes and determine overall compliance against the configurations specified in the corporate guidelines.
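The underlying check that a tool like AWS Config automates can be sketched as comparing a resource's current configuration against a baseline derived from corporate guidelines. The baseline keys and the sample resource below are hypothetical, chosen only to show the shape of the check:

```python
# Hypothetical baseline derived from corporate guidelines.
BASELINE = {
    "encryption_at_rest": True,
    "public_access": False,
    "logging_enabled": True,
}

def check_compliance(resource_config: dict) -> list[str]:
    """Compare a resource's configuration against the baseline and
    return a list of non-conformities (empty means compliant)."""
    violations = []
    for key, required in BASELINE.items():
        actual = resource_config.get(key)
        if actual != required:
            violations.append(f"{key}: expected {required}, got {actual}")
    return violations

bucket = {"encryption_at_rest": True, "public_access": True, "logging_enabled": True}
print(check_compliance(bucket))  # reports the public_access non-conformity
```

Run continuously against configuration-change events, checks like this turn the corporate guidelines into an enforceable, auditable control.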

Other cloud vendors offer similar tools that provide a layer of control against such non-conformities.

4. Event-Driven Response.

To achieve fully automated security and compliance controls, it is imperative to have an event-driven response to threats and vulnerabilities.

This principle consists of security events and response, as well as security automation.

It offers a kill chain that can minimise impact by applying a layered approach to the mitigation strategy.

The whole nature of event-driven security is automation.  It is an additional layer of control rather than an alternative to traditional security controls, because the response takes place after an event or incident has occurred; it therefore works best within a layered approach to achieve defence in depth.

An enabler is to put the code through the DevOps checks whilst integrating automated compliance checks into the pipeline.  These should take into account: breach notification, the right of access, the right to be forgotten, privacy by design, and the need for data protection officers.

Summary

Trying to mitigate all risks can be a costly exercise.  As a result, all stakeholders must be aware of the risk tolerance and corporate posture, so that risks are prioritised based on severity and likelihood, and the risks most likely to cause harm are addressed.

Organisations are employing legal firms to assist them with GDPR compliance.  Whilst this can prove beneficial in understanding loopholes and ways around the regulation, it fails to address the core requirements of attaining privacy by design and privacy by default.

It is possible to have security without compliance, but it is not the same the other way round.

Provention has a dedicated focus on GDPR compliance and integration to meet the evolving requirements of IoT, microservices, and cloud computing.

We have a programme to address orchestration and automation of checks and controls to meet compliance and security needs, by way of a library of security architecture designs that are easy to implement.

Provention can partner with legal firms to assist with the technical aspects of achieving GDPR controls that align with agile, cloud-friendly frameworks focused on elastic, flexible and rapid-paced processes.

To learn more about the guidance and tools we can provide, or for more information on the research carried out by Provention, please email us at [email protected]