A.8.29: Security testing in development and acceptance

Version: 3.0

Valid until: 2025-04-10

Classification: Low

Version Management


Version 1.0
Author(s): Stefan van Aalst
Change(s): Initiation document
Date approved: 2022-07-07

Version 1.1
Author(s): Edward Robinson, Johanna Hakonen, Sayali Shitole
Change(s): Additions/changes as part of the periodic review and improvement. Created A.14.2.8 System security testing & A.14.2.9 System acceptance testing to replace the Knowledge Base article Software Testing Life Cycle.
Date approved: 2022-12-19

Version 2.0
Author(s): Edward Robinson
Change(s): Additions/changes as part of the annual review. No changes were made.
Date approved: 2023-05-26

Version 3.0
Author(s): Edward Robinson, Sayali Shitole
Change(s): Additions/changes as part of the annual review. Added Production Deployment Approval section.

Purpose & background


In the interest of all stakeholders, the top management of anDREa B.V. (hereafter called anDREa) is actively committed to demonstrably maintaining and continually improving an information security management system in accordance with the requirements of ISO 27001:2017.


The purpose of this document is to describe anDREa's system security testing and system acceptance testing and the associated controls, checks and administrative records.


This document will be reviewed at least annually and whenever a significant change occurs.

Objectives


The objectives of this control are:


  • To ensure that information security is designed and implemented within the development lifecycle of information systems (A.14.2).

Scope

The scope of this document corresponds to Clause 4 Context of the organisation.

Availability

This document is:


  • required reading for:

    • all employees and contractors of anDREa.

  • available for all interested parties as appropriate.

Norm elements

A.14.2 Security in development and support processes

A.14.2.8 System security testing & A.14.2.9 System acceptance testing


“Testing of security functionality shall be carried out during development.”


“Acceptance testing programs and related criteria shall be established for new information systems, upgrades and new versions.”


Software testing life cycle

Requirement analysis

During this phase, the test team goes through the acceptance criteria, security criteria and/or requirements/user stories on the sprint board to identify test scenarios. If the acceptance criteria are not clear, the tester may contact the system architects or business analysts to understand the requirements.


Test planning

This phase involves effort estimation and the creation of a Test plan. Each Test plan is reviewed by the involved developer(s), anDREa management and in some cases involved system architects or business analysts. 


Regression test plan

This phase involves the creation of a Regression Test plan, which identifies the tests that must be executed after every deployment. This usually happens at the end of a sprint.


See the regression test plan


Effort estimation of a functional PBI 

A task is created in the new feature PBI and an estimate in hours is added for the amount of testing required. 


Test design

During this phase, the test team creates test cases for the PBI which will be developed in the current sprint. Test cases are stored in Azure DevOps.


See sample test cases



Test environment setup

This environment is controlled by the development team. The test environment is deployed with new builds/bug fixes or any change requests. Once the environment is ready, it is handed over to the test team to test the deployed changes.


Test execution

During this phase, the testers will carry out the testing based on the Test plans and the test cases prepared. Bugs will be reported back to the development team.


Test execution is done using Azure DevOps.



Test closure

This involves providing a test summary to the entire team after completion of all tests. The final test result can be found in Azure DevOps, and the Regression Test report can be found in the Regression Test plan. Once testing is completed, the Test team updates the Product Backlog Item (PBI) and moves it to the state Ready for Acceptance, after which the PBI is deployed to the acceptance environment for acceptance testing.
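
The state change at test closure can also be scripted against the Azure DevOps work item REST API. The sketch below only builds the request; the organisation, project, and state names are illustrative assumptions, not confirmed values from anDREa's setup.

```python
import json

API_VERSION = "7.0"

def build_state_update(organization: str, project: str, work_item_id: int,
                       new_state: str = "Ready for Acceptance"):
    """Return the PATCH URL and JSON-Patch body for a work item state change."""
    url = (f"https://dev.azure.com/{organization}/{project}"
           f"/_apis/wit/workitems/{work_item_id}?api-version={API_VERSION}")
    # Azure DevOps expects a JSON-Patch document for work item updates.
    body = [{"op": "add", "path": "/fields/System.State", "value": new_state}]
    return url, json.dumps(body)

# Hypothetical organisation/project names, for illustration only.
url, body = build_state_update("andrea", "SharedTenant", 1234)
print(url)
print(body)
```

In practice this request would be sent with a PAT-authenticated HTTP client; the helper keeps the URL and JSON-Patch construction separate so it can be reviewed and tested without network access.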


Types of testing


Sanity testing

This is done to verify that basic functionality of the application is still working. This is done after a new build is received on the test environment. If there are any failures in the build, the Test team immediately informs the Development Team for further investigation.


The tests below are conducted as part of sanity testing:

  • Log in and log out.

  • Verify if Workspaces are loading.

  • Verify if the Data Request page is loading.

  • Verify if the Owner can add/remove members from the Workspace.
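
The sanity checklist above could be driven by a small runner like the sketch below. The check functions here are stand-ins (assumptions); in practice each would exercise the real application, for example through Playwright or an HTTP client.

```python
def run_sanity_checks(checks):
    """Run each named check; return the list of failed check names."""
    failures = []
    for name, check in checks:
        try:
            ok = check()
        except Exception:
            ok = False  # a crashing check counts as a failure
        if not ok:
            failures.append(name)
    return failures

# Stand-in checks mirroring the sanity list; real versions would hit the app.
checks = [
    ("log in and log out", lambda: True),
    ("workspaces page loads", lambda: True),
    ("data request page loads", lambda: True),
    ("owner can add/remove members", lambda: True),
]

failed = run_sanity_checks(checks)
# Any failure here is what gets reported to the Development Team.
print("sanity failures:", failed)
```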


Functional testing

In this testing phase, the system is tested against functional requirements/specifications. The purpose of functional testing is to test features by feeding them input and examining output. Functional tests are written on the PBI level and are tracked in the PBI.




Regression testing

Regression testing is a full or partial re-execution of already executed test cases to ensure that existing functionality still works. Regression testing is performed based on the Regression Test plan.
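
The full-versus-partial choice can be sketched as a simple selection over tagged test cases. The tags and feature names below are illustrative assumptions, not the actual Regression Test plan contents.

```python
def select_regression_tests(test_cases, changed_features=None):
    """Full run when no change set is given; otherwise the affected subset."""
    if not changed_features:
        return [tc["id"] for tc in test_cases]          # full regression
    changed = set(changed_features)
    return [tc["id"] for tc in test_cases
            if changed & set(tc["features"])]           # partial regression

# Hypothetical suite with feature tags.
suite = [
    {"id": "TC-1", "features": ["login"]},
    {"id": "TC-2", "features": ["workspace", "members"]},
    {"id": "TC-3", "features": ["data-request"]},
]

print(select_regression_tests(suite))                # full run
print(select_regression_tests(suite, ["members"]))   # only affected cases
```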

Positive Testing

This testing checks whether an application behaves as expected with positive inputs. 


Examples of positive testing done in Shared Tenant:

  • Verify that only owners can add/remove members from a workspace

  • Verify that an email is sent to all owners of a workspace when a member is added to the workspace

Negative Testing

Negative testing is a method of testing that verifies the application behaves according to the requirements even when it receives unwanted input or unexpected user behaviour, handling such cases gracefully.


Examples of negative testing done on myDRE:

  • Verify that members (non-owners) of a workspace cannot add/remove other members in a workspace

  • Verify that members cannot approve data transfer requests
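
The same ownership rule yields one positive and one negative test. The toy model below is an illustrative assumption, not the myDRE implementation; it only shows how the two test types exercise opposite sides of the rule.

```python
class Workspace:
    """Hypothetical model of the owner-only membership rule."""

    def __init__(self, owners):
        self.owners = set(owners)
        self.members = set(owners)

    def add_member(self, actor, new_member):
        if actor not in self.owners:
            raise PermissionError("only owners can add members")
        self.members.add(new_member)

ws = Workspace(owners={"alice"})

# Positive test: an owner adds a member, and the member appears.
ws.add_member("alice", "bob")
assert "bob" in ws.members

# Negative test: a non-owner attempting the same action is rejected.
try:
    ws.add_member("bob", "carol")
    assert False, "non-owner was allowed to add a member"
except PermissionError:
    pass

print("positive and negative membership tests passed")
```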

Acceptance Testing & Production Deployment Approval

In this testing, a feature or bug fix is tested against the functional requirements/specifications before it is introduced to the production environment. The purpose of acceptance testing is to verify that a feature works in a production-like environment.

Findings of acceptance testing are tracked in a PBI with a separate task. 

At anDREa, our "Support & Assurance" team oversees end-to-end testing across the Test and Acceptance environments. The team executes User Acceptance Testing (UAT) based on predefined test plans. Working closely with the Product Owner, it decides when to give the go-ahead for production deployment of features and maintains the corresponding release notes.
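
The deployment decision described above amounts to a simple gate. The sketch below is a minimal illustration with assumed condition names, not a formal encoding of the approval process.

```python
def production_go_ahead(uat_passed: bool, po_approved: bool,
                        release_notes_ready: bool) -> bool:
    """Deploy only when UAT passed, the Product Owner approved,
    and release notes have been prepared."""
    return uat_passed and po_approved and release_notes_ready

print(production_go_ahead(True, True, True))    # all conditions met: deploy
print(production_go_ahead(True, False, True))   # blocked: no PO sign-off
```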

Defect life cycle

Below is the cycle we follow in Shared Tenant. One exception: we do not create bug work items for the test environment. Bugs found in Production are logged as bugs, while bugs found in the test environment are tracked using tasks; the cycle itself remains the same.
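
The cycle can be sketched as a state machine. The states below assume a typical Azure DevOps-style bug flow (New → Active → Resolved → Closed, with reopening on failed retest); the exact states used in Shared Tenant may differ.

```python
# Allowed defect state transitions (assumed Azure DevOps-style flow).
TRANSITIONS = {
    "New":      {"Active"},
    "Active":   {"Resolved"},
    "Resolved": {"Closed", "Active"},   # reopen if the fix fails retest
    "Closed":   set(),
}

def advance(state: str, target: str) -> str:
    """Move a defect to the target state, rejecting illegal transitions."""
    if target not in TRANSITIONS.get(state, set()):
        raise ValueError(f"illegal transition {state} -> {target}")
    return target

# Walk one defect through the happy path.
state = "New"
for nxt in ("Active", "Resolved", "Closed"):
    state = advance(state, nxt)
print(state)
```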

API Testing

As part of this testing, we validate that changes made to the Shared Tenant APIs do not break any dependencies of those APIs, either on the API surface (contracts) or in the expected results (behaviour). The team has agreed to create a regression test bench for the Shared Tenant APIs using Postman together with the release pipelines.


We are writing API tests for the following APIs to start with:

  • Workspace API

  • Compute API

We are also planning to integrate these tests with the release pipelines. This will be owned by the developers.
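
A contract (surface) check of the kind described above can be sketched as a comparison of a live response against a stored expected shape. The field names below are illustrative assumptions, not the real Workspace API contract; in the Postman bench the same idea would live in a test script.

```python
def contract_violations(expected: dict, response: dict):
    """Return a list of missing fields or type mismatches in the response."""
    problems = []
    for field, expected_type in expected.items():
        if field not in response:
            problems.append(f"missing field: {field}")
        elif not isinstance(response[field], expected_type):
            problems.append(f"wrong type for {field}")
    return problems

# Hypothetical stored contract for a workspace resource.
workspace_contract = {"id": str, "name": str, "owners": list}

good = {"id": "ws-1", "name": "demo", "owners": ["alice"]}
bad = {"id": "ws-1", "owners": "alice"}   # name missing, owners wrong type

print(contract_violations(workspace_contract, good))  # []
print(contract_violations(workspace_contract, bad))
```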


To view the API tests: log in to Postman → go to the Team workspace → Collections.


UI Automation

As part of this implementation, we have automated UI-based scenarios using the Playwright automation tool. We chose Playwright for the following reasons:

  1. Automates web application scenarios.

  2. Supports cross-browser testing.

  3. Auto-waits, with smart assertions that retry until an element is found.

  4. Available as a VS Code extension to run tests in a single click, with step-by-step debugging, selector exploration, and test recording.

  5. Cross-language support, including JavaScript, TypeScript, Python, Java, and .NET – choose the environment that suits you while still covering all areas and formats.

  6. Easy to integrate with Azure Pipelines.

  7. Generates an HTML report for viewing test execution results in the browser.

[Figure: Playwright framework structure]

