Testing

Input sources

Mock data
Raw data that was provided as the output from some security device. This data is stored as code in the indeni-knowledge repository.

Mock server
Raw data that was provided as the output from some security device. This data is stored on a mock server.

Real device
Raw data that is provided in real-time from some security device.

Testing levels

Each level below is described by its Summary, how it is handled At Indeni, the testing Method, and the Input source used.

Unit Testing

A level of the software testing process where individual units of a software are tested.
The purpose is to validate that each unit of the software performs as designed.

Only mock data is to be used, as the only component that can change is the unit code.

At Indeni:

ADE
No unit testing so far, for either AWK or Python INDs.
By Q1/20, all Python INDs are planned to have automated unit testing.

ATE

  • Parser testing for device_task block

  • Unit testing for logic block

CI

  • ADE integration to CI by 15-March-20

  • ATE integration to CI by end of Feb/20

Method: Automatic

Input source: Mock data
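As a concrete illustration of unit testing a parser against stored mock data, here is a minimal pytest-style sketch. The parser function, metric name, and mock payload are all hypothetical, not actual indeni-knowledge code.

```python
# Hypothetical sketch: feed stored mock data (raw device output) to a
# parser function and assert on the parsed metrics. The parser and the
# "uptime-seconds" metric name are illustrative only.

def parse_uptime(raw_output):
    # Toy parser: extract uptime seconds from a "uptime: <n>" line.
    for line in raw_output.splitlines():
        if line.startswith("uptime:"):
            return {"uptime-seconds": float(line.split(":", 1)[1])}
    return {}

def test_parse_uptime_returns_expected_metric():
    mock_data = "hostname: fw1\nuptime: 86400\n"
    assert parse_uptime(mock_data) == {"uptime-seconds": 86400.0}
```

Because the mock data is checked into the repository alongside the test, the unit can be validated in CI without any device access.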

Integration Testing

A level of the software testing process where individual units are combined and tested as a group.
The purpose of this level of testing is to expose faults in the interaction between integrated units.

At Indeni:

ADE

  • TODO

By Q1/20, all Python INDs are planned to have automated integration testing.

ATE

  • WIF (Workflow Integration Testing): a utility that validates that

    • All conclusions are reachable

    • All blocks are reachable

CI

  • Integration to CI by end of Feb/20

Method: Automatic

Input source: Mock data
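The WIF reachability checks described above can be sketched as a simple graph traversal: start from the workflow's entry block and verify that every block (and therefore every conclusion) is visited. The workflow representation below (block name mapped to its possible next blocks) is an assumption for illustration, not the real ATE format.

```python
# Hypothetical sketch of a WIF-style reachability check over a workflow
# graph. Block and conclusion names are illustrative only.
from collections import deque

def unreachable_blocks(workflow, start="start"):
    """Return the set of blocks that can never be reached from `start`."""
    seen = {start}
    queue = deque([start])
    while queue:
        block = queue.popleft()
        for nxt in workflow.get(block, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return set(workflow) - seen

workflow = {
    "start": ["check_cpu"],
    "check_cpu": ["conclusion_ok", "conclusion_high_cpu"],
    "conclusion_ok": [],
    "conclusion_high_cpu": [],
    "orphan_block": ["conclusion_ok"],  # nothing points here -> unreachable
}
```

A non-empty result fails the integration test, flagging blocks or conclusions that no execution path can ever reach.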

System Testing

A level of the software testing process where a complete, integrated system is tested.
The purpose of this test is to evaluate the system’s compliance with the specified requirements.

The Indeni live system acts as the black box:

  • Issue creation

  • Existence of values in DB

  • If an ATE exists

    • Validate workflow execution

    • Validate conclusion is reached

  • Validate issue closure

  • Look for edge cases

  • Negative testing

For each ADE/ATE, a table of the tag variants planned to be tested needs to be created (a table cannot be nested within a table in Confluence, hence this list):

  • Knowledge element (ADE/ATE)

  • Vendor

  • Version

  • no-VSX / VSX / both

  • no-cluster / cluster / both

  • Is Chassis

  • Is Maestro member

Method: Manual

Input source: Mock server
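The tag-variant table above is a cross-product of a few dimensions, so the combinations to cover can be enumerated mechanically. This is a minimal sketch: the dimension names mirror the bullets, while the element, vendor, and version values are hypothetical examples, not an official schema.

```python
# Hypothetical sketch: enumerate the tag variants to test for one
# knowledge element. Dimension names follow the list above; the example
# element/vendor/version values are illustrative only.
from itertools import product

dimensions = {
    "vsx": ["no-VSX", "VSX"],
    "cluster": ["no-cluster", "cluster"],
    "chassis": [False, True],
    "maestro_member": [False, True],
}

def tag_variants(element, vendor, version):
    keys = list(dimensions)
    for combo in product(*dimensions.values()):
        yield {"element": element, "vendor": vendor, "version": version,
               **dict(zip(keys, combo))}

variants = list(tag_variants("example-cpu-usage", "examplevendor", "1.0"))
# 2 * 2 * 2 * 2 = 16 combinations to cover per element/vendor/version
```

Generating the matrix this way keeps the system-test plan exhaustive, and a "both" entry in the source list simply means both values of that dimension appear in the matrix.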

Acceptance Testing

A level of the software testing process where a system is tested for acceptability.
The purpose of this test is to evaluate the system’s compliance with the business requirements and assess whether it is acceptable for delivery.

To be tested manually by IKE/Knowledge developer:

  • ADE

    • Trigger the issue by changing real device state

    • Validate issue existence in UI

  • ATE (if implemented)

    • View mode (Knowledge Explorer)

      • Validate workflow graph existence and accuracy

      • Validate UI text

    • Execution mode (Workflow Overview)

      • Validate workflow execution

      • Validate per issue item, where applicable

      • Validate all flows are covered

      • Validate all flows have the correct conclusion

      • Validate UI text

Method: Manual

Input source: Real device

Testing Process

Each step below is described by its Phase, its Owner, and Where it takes place.

Code development; manual, unit and integration testing as part of development
Phase: Development
Owner: Developer
Where: Private Indeni server

Automated unit testing; automated integration testing
Phase: CI
Owner: Automated by the System
Where: Jenkins / AWS

Manual system testing
Phase: QA
Owner: IKE / Knowledge Developer (who did not write the ADE/ATE)
Where: develop server in KDLab

Manual acceptance testing
Phase: Acceptance
Owner: Support / Customer
Where: Beta production environment

Guidelines

  • Before a PR is created, the new code should be tested on your private Indeni server and/or on one of the development servers we have in KDLab.