
10.6.2 Test Plan

Test execution

Test execution in SiVa is divided into three categories: Automated, Automated SoapUI, and Manual. The execution method is also indicated in the test case description with the TestType field.

Execution of Automated type of tests

These tests are run automatically by Maven every time the SiVa project is built and must pass for the build to succeed. It is also possible to execute the tests separately with the Maven wrapper using the command:

./mvnw verify

These tests are part of the SiVa project and can also be executed using IntelliJ IDEA. To do that, the SiVa project must be loaded into IDEA. The tests are located in the following packages:

  • ee.openeid.siva.integrationtest
  • ee.openeid.siva.resttest
  • ee.openeid.siva.soaptest

To run the tests in IDEA, right-click on the package name and select "Run tests in". It is also possible to run individual tests by right-clicking on the specific test code and selecting "Run".
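A single test class can also be run from the command line through the Maven Failsafe plugin's it.test property, for example (the class name below is only illustrative):

./mvnw verify -Dit.test=SomeIntegrationIT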

Execution of Automated SoapUI type of tests

The description of how to execute these tests, together with the SoapUI project file to be used, can be found in SiVa GitHub

Execution of Manual type of tests

Most of the manual tests require that the SiVa service is set up together with the SiVa Demo Application. Instructions on how to set up SiVa are given in the System integrator's guide

Execution of manual tests depends on the area under test. These tests can be divided into the following categories:

  • Statistics tests - test data can be prepared by executing the ee.openeid.siva.manualtest package. It can also be generated by uploading files into the SiVa Demo Application. Results have to be verified in the logs and in Google Analytics.
  • Report tests - test data can be prepared by executing the ee.openeid.siva.manualtest package. Results have to be verified manually.
  • Configuration tests - SiVa configuration files have to be modified by hand and the service must be set up. Correct behavior of the service must then be checked.
  • Other tests - tests are executed by loading files into the SiVa Demo Application and validating the results shown in the SiVa Demo Application.

Files to use in the manual tests can be found in SiVa GitHub

Integration Test introduction

This section of the document gives an overview of the integration testing carried out on the SiVa web service and the SiVa Demo Application.

SiVa web service integration testing uses the RestAssured library v2.9.0 to implement automatic checks for the REST/SOAP based tests.
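For orientation, such a check is written roughly as follows. This is only a minimal sketch: the /validate path and the request/response field names are assumptions made for illustration, not copied from the actual test code.

import static com.jayway.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import org.junit.Test;

public class ValidationRequestSketchIT {

    // Minimal RestAssured sketch of a positive validation check.
    // The endpoint path and JSON field names are illustrative assumptions.
    @Test
    public void validBdocReturnsOneValidSignature() {
        String request = "{\"filename\": \"valid.bdoc\", \"documentType\": \"BDOC\", \"document\": \"<base64 content>\"}";

        given()
                .contentType("application/json")
                .body(request)
        .when()
                .post("/validate")
        .then()
                .statusCode(200)
                .body("validSignaturesCount", equalTo(1));
    }
}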

The testing of the SiVa web service is divided into sections based on the software architecture and functionalities provided to the users. The sections are:

  • REST API
  • SOAP API
  • DDOC container signature validation
  • BDOC container signature validation
  • ASIC-S container signature validation
  • PDF signature validation
  • X-Road ASIC-E signature validation

The goal is to focus testing on the functionality implemented in the SiVa web service application. Functionality provided by the validation libraries is not explicitly tested.

In addition, the SiVa Demo Application is tested. These tests are carried out manually.

Testing of REST API

The goal of REST API testing is to check that the API accepts requests according to the specification and that the output is in the correct format and contains all the required elements.

Validation request tests

The following areas are tested on input:

  • Wrong (not accepted) values in input parameters
  • Empty values in input parameters
  • Too many parameters
  • Too few parameters
  • Inconsistencies between stated parameters and actual data (wrong document type)
  • Case insensitivity on parameter names
  • Empty request

In all of the negative cases, the correctness of the returned error message is checked.
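As an illustration, one such negative case can be sketched with RestAssured as follows (static imports from RestAssured and Hamcrest are assumed, as in the previous sketch; the error response structure shown here is an assumption, not the authoritative format):

    // Sketch of a negative case: an empty request body should produce a request error.
    // The "requestErrors" path and the expected message fragment are illustrative assumptions.
    @Test
    public void emptyRequestReturnsErrorMessage() {
        given()
                .contentType("application/json")
                .body("{}")
        .when()
                .post("/validate")
        .then()
                .statusCode(400)
                .body("requestErrors[0].message", containsString("not be empty"));
    }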

Specific test cases and input files can be found in:

Get Data Files request tests

The following areas are tested on input:

  • Empty request
  • Empty values in input parameters
  • Too many parameters
  • Too few parameters
  • Changed order of parameters
  • Case insensitivity on parameter names
  • Inconsistencies between stated parameters and actual data

In all of the negative cases, the correctness of the returned error message is checked.

Specific test cases and input files can be found in:

Validation report and report signature tests

The SiVa web service returns a uniform validation report for all the supported document types. This also includes correct document types without an actual signature (for example, a PDF document without a signature).

The following areas are tested on output (validation report):

  • JSON structure on DDOC, BDOC, PDF, ASIC-E, ASIC-S and ASIC-E X-Road document types
  • Presence of the mandatory elements on DDOC, BDOC, PDF, ASIC-E, ASIC-S and ASIC-E X-Road document types
  • Presence of optional elements on DDOC, BDOC, PDF, ASIC-E, ASIC-S and ASIC-E X-Road document types
  • Verification of expected values
  • JSON structure on containers without signatures

Specific test cases and input files can be found in:

Testing of SOAP API

The goal of SOAP API testing is to check that the API accepts requests according to the specification and that the output (validation report) is in the correct format and contains all the required elements. In general, the tests follow the same principles as for the REST API. Compatibility with the X-Road security server is out of scope for these tests and will be covered in the X-Road system test plan.

Validation request tests

The following areas are tested on input:

  • Wrong (not accepted) values in input parameters
  • Empty values in input parameters
  • Too many parameters
  • Too few parameters
  • Inconsistencies between stated parameters and actual data (wrong document type)
  • Case insensitivity on parameter names
  • Empty request

In all of the negative cases, the correctness of the returned error message is checked.

Specific test cases and input files can be found in:

Get Data Files request tests

The following areas are tested on input:

  • Empty request
  • Empty values in input parameters
  • Too many parameters
  • Too few parameters
  • Changed order of parameters
  • Case insensitivity on parameter names
  • Inconsistencies between stated parameters and actual data

In all of the negative cases, the correctness of the returned error message is checked.

Specific test cases and input files can be found in:

Validation report tests

The SiVa web service returns a uniform validation report for all the supported document types. This also includes correct document types without an actual signature (for example, a PDF document without a signature). However, not all values may be present for all document types.

The following areas are tested on output (validation report):

  • Presence of the mandatory elements on DDOC, BDOC, PDF, ASIC-S, ASIC-E and ASIC-E X-Road document types
  • Presence of optional elements on DDOC, BDOC, PDF, ASIC-S, ASIC-E and ASIC-E X-Road document types
  • Verification of expected values

Specific test cases and input files can be found in:

Get Data Files report tests

The following areas are tested on output:

  • Presence of the mandatory elements
  • Verification of expected values
  • Extraction of all data files

Specific test cases and input files can be found in:

Testing of DDOC container signature validation

The goal of the DDOC container signature validation testing is to check that the validation results given by the JDigiDoc library are properly presented in the validation report.

The testing of DDOC signatures consists of the following main cases:

  • Containers with valid signature(s) are validated
  • Containers with invalid signature(s) or no signature are validated
  • Containers with sizes near the maximum are validated
  • Containers with DDOC versions 1.0 - 1.3 are validated

Specific test cases and input files can be found in:

What is not tested:

  • Verification of the different causes of an invalid result within the container is out of scope.

Testing of BDOC container signature validation

The goal of the BDOC container signature validation testing is to check that the validation results given by the DigiDoc4J library are properly presented in the validation report.

The testing of BDOC container signatures consists of the following main cases:

  • Containers with valid signature(s) are validated
  • Containers with invalid signature(s) or no signature are validated
  • Containers with sizes near the maximum are validated
  • Containers with baseline LT, LTA, T and B profiles are validated

Specific test cases and input files can be found in:

What is not tested:

  • Verification of the different causes of an invalid result within the container is out of scope.

Testing of ASIC-S container signature validation

The goal of the ASIC-S container signature validation testing is to check that the validation results given by the DSS library are properly presented in the validation report.

The testing of ASIC-S container signatures consists of the following main cases:

  • Containers with valid signature(s) are validated
  • Containers with invalid signature(s) or no signature are validated
  • Containers with sizes near the maximum are validated

Specific test cases and input files can be found in:

What is not tested:

  • Verification of the different causes of an invalid result within the container is out of scope.

Testing of PDF signature validation

A portion of the validation rules for PDF documents is implemented in the SiVa web application itself. Therefore, a different selection of test areas is used for PDF compared to the other containers.

The testing of PDF signatures consists of the following main cases:

  • Containers with invalid signature(s) (different reasons for failure) are validated
  • Containers with no signature are validated
  • Containers with sizes near the maximum are validated
  • Containers with different baseline profiles are validated
  • Containers with serial and parallel signatures are validated
  • Containers with different signature cryptographic algorithms are validated
  • Containers with OCSP values inside and outside bounds are validated
  • Containers with baseline LT, LTA, T and B profiles are validated

Specific test cases and input files can be found in:

Testing of X-Road ASIC-E container signature validation

The goal of the X-Road ASIC-E container signature validation testing is to check that the validation results given by the X-Road signature validation utility are properly presented in the validation report.

The testing of X-Road ASIC-E signatures consists of the following main cases:

  • Containers with valid signature(s) are validated
  • Containers with invalid signature(s) are validated

Specific test cases and input files can be found in:

What is not tested:

  • Verification of the different causes of an invalid result within the container is out of scope.

Testing of Data Files Extraction

The goal of the data files extraction testing is to check that the right data files are returned for DDOC containers and that error messages are returned for other container types.

The testing of data files extraction consists of the following main cases (a request sketch is given after the list):

  • Extracting the data files from a valid DDOC container
  • Extracting the data files from a hashcoded DDOC returns null for the Base64 encoded string
  • Extracting the data files from a DDOC container with 12 different types of files
  • Extracting the data files from a non-DDOC container
  • Extracting the data files from a DDOC container with a wrong document type
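A data file extraction check of this kind can be sketched as follows (a hedged example in the same style as the earlier sketches; the /getDataFiles path and the JSON field names are assumptions made for illustration):

    // Sketch of a data file extraction check for a DDOC container.
    // The endpoint path and JSON field names are illustrative assumptions.
    @Test
    public void validDdocReturnsDataFiles() {
        String request = "{\"filename\": \"valid.ddoc\", \"documentType\": \"DDOC\", \"document\": \"<base64 content>\"}";

        given()
                .contentType("application/json")
                .body(request)
        .when()
                .post("/getDataFiles")
        .then()
                .statusCode(200)
                .body("dataFiles[0].filename", notNullValue());
    }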

Specific test cases and input files can be found in:

Testing of user statistics

Testing of user statistics is carried out as a combination of automatic data preparation and generation by the integration tests and manual verification of the results. SiVa supports two parallel ways of gathering user statistics:

  • Validation results are printed to system log and can be gathered by any suitable means
  • Validation results are sent to Google Analytics using the Google Measurement Protocol API
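For orientation, a single hit sent over the Google Measurement Protocol is a plain HTTP POST of key-value pairs to the collection endpoint, roughly of the following shape (the tracking ID and the event category/action/label values below are placeholders, not the exact fields SiVa reports):

POST https://www.google-analytics.com/collect
v=1&tid=UA-XXXXXXX-Y&cid=555&t=event&ec=BDOC&ea=validation&el=valid&ev=1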

Note

Testing of Google Analytics requires creating and configuring a Google Analytics account and configuring the SiVa service to send statistics to this account. Configuration of the SiVa service is explained in SiVa system deployment.

As both systems use the same data, the testing follows the same principles for both. The following areas are covered:

  • Statistics values are checked in log and Google Analytics for all container types (this also includes parameters not present in validation report)
  • Valid and invalid signatures are validated
  • Error situations on signature validation (an error message is returned instead of a validation report)

Specific test cases and input files can be found in:

What is not tested:

  • Configuring Google Analytics reports is out of scope. Only verification of data presence is done.

SiVa Demo Application tests

Testing of the SiVa Demo Application is done manually. The main cases are:

  • Cross browser usage (IE, Edge, Chrome, Firefox and Safari)
  • File upload (different sizes, supported and unsupported file types)
  • Display of the validation report for both REST and SOAP
  • Layout of the page
  • Error representation

Sample test cases with input files can be found in:

System Test introduction

While the integration tests are mostly carried out automatically, system testing depends mostly on manual testing.

System testing is carried out using two access points:

  • Testing through SiVa Demo Application
  • Testing through X-Road security server using SoapUI

Note

Testing through the X-Road security server requires the presence and configuration of an X-Road security server set up to use the SiVa service. Tests are run using SoapUI, which simulates requests to the X-Road security server.

Testing through X-Road security server

The following areas are covered for document validation:

  • Validation of a valid signature
  • Validation of an invalid signature
  • Validation that returns a SOAP error

All of the above test cases are run with BDOC, DDOC, PDF, ASIC-S and X-Road ASiC-E containers.

The following areas are covered for file extraction:

  • Extraction of data files from a valid DDOC
  • Extraction of data files from an invalid DDOC
  • Error response from data file extraction

Tests, along with test case descriptions, are available for rerun in GitHub.

Specific test cases and input files can be found in:

Configuration/administration testing

The following areas are covered:

  • SiVa Web Application configuration
  • X-Road validation service configuration
  • Demo application configuration

Specific test cases can be found in:

Load Test introduction

The goals of the load test were to:

  • Determine the throughput capabilities of a single SiVa node and how it handles requests under increasing load.
  • Test whether the SiVa service throughput is horizontally scalable up to 50 requests per second.

Each container type was load-tested separately since the business logic and underlying mechanics for validating specific container types are vastly different.

Load tests were run in three stages – first, a single SiVa service node was tested to determine the baseline performance metrics. The second and third stages involved adding an additional service node to the previous setup and testing the horizontal scalability. All three target SiVa service nodes had identical virtual machine set-ups, and the virtual machines were installed on separate physical hardware. The SiVa web service nodes on the target Linux virtual machines were packaged inside Docker containers (along with Java, with 4 GB allocated for the heap). The test runner (a JMeter plugin used by Maven) resided on a separate machine on the local area network. A simple reverse proxy was used as a load balancer to distribute the load between the nodes (using a round-robin algorithm).
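Assuming the jmeter-maven-plugin is configured in the project (the exact goal and profile bindings used in SiVa are not reproduced here), such a run is typically started from Maven with a command along the lines of:

./mvnw jmeter:jmeter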

Load testing is carried out on the following environments:

  • System under test environment (processor: Intel(R) Xeon(R) CPU E5-2620 v3 @ 2.40GHz; memory: 6GB (4GB allocated for Java heap))
  • Load balancer environment (processor: Intel(R) Xeon(R) CPU E5-2620 v3 @ 1.60GHz; memory: 4GB)
  • JMeter executor environment (processor: Intel(R) Xeon(R) CPU E5-2620 v2 @ 2.10GHz; memory: 10GB)

The following test data is used in the load test:

  • BDOC-TS file with two valid signatures (~100KB and 5MB)
  • BDOC-TM file with two valid signatures (~100KB and 5MB)
  • PDF file with two valid signatures (~200KB and 5MB)
  • DDOC file with two valid signatures (~300KB and 5MB)
  • ASIC-E X-Road container with one valid signature (~10KB)
  • ASIC-S file with two valid signatures (~20KB and 50KB)

Each of the files is validated through the REST interface. The SOAP interface is used with the small files for comparison. It is estimated that the interface (REST or SOAP) does not have a noticeable effect on the overall results.

Each of the tested files follows the same test plan:

  • Five concurrent requests are made per second
  • This load is held for a period of time
  • Concurrent requests are increased by five until 50 concurrent requests per second are reached
  • Latency and throughput are measured at each load step