03 - Test Strategy applied to test tools

Preparation of the test strategy

 

  1. What is tested
  2. When we test
  3. Why we test

Elements to be tested

 

Tool: Test Management
Type: Gazelle Test Bed
Description: TM is the application used to manage the connectathon, from the registration process to the generation of the test report

 

Tool: Proxy
Type: Support Tool
Description: "Man in the middle": captures the messages exchanged between two systems and forwards them to the validation service front-end
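The "man in the middle" idea above can be sketched in a few lines. This is only an illustration of the principle, not the Gazelle Proxy's actual implementation: all names are ours, it relays a single connection, and the `captured` list stands in for the copy handed to the validation front-end.

```python
import socket
import threading

def pipe(src, dst, captured, direction):
    """Forward bytes from src to dst, keeping a copy for later validation."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        captured.append((direction, data))  # the copy a validator would inspect
        dst.sendall(data)
    try:
        dst.shutdown(socket.SHUT_WR)  # propagate end-of-stream to the peer
    except OSError:
        pass

def run_proxy(listen_port, target_host, target_port, captured):
    """Accept one connection and relay it to the target, capturing both directions."""
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", listen_port))
    srv.listen(1)
    client, _ = srv.accept()
    upstream = socket.create_connection((target_host, target_port))
    t1 = threading.Thread(target=pipe, args=(client, upstream, captured, "request"))
    t2 = threading.Thread(target=pipe, args=(upstream, client, captured, "response"))
    t1.start(); t2.start()
    t1.join(); t2.join()
    client.close(); upstream.close(); srv.close()
```

Both systems under test keep talking to each other normally; the proxy merely sits on the path and records each direction of the exchange.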

 

Tool: Gazelle HL7 Validator
Type: Validation Service
Description: Offers web services to validate HL7v2.x and HL7v3 messages exchanged in the context of IHE
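To give a feel for what such validation involves, here is a tiny structural check on an HL7v2 message. The real Gazelle HL7 Validator checks messages against HL7 message profiles; this sketch only verifies segment framing and the declared field separator, and the function name is ours.

```python
def check_hl7v2_structure(message: str):
    """Minimal structural check on an HL7v2 message (sketch only)."""
    errors = []
    segments = [s for s in message.split("\r") if s]  # HL7v2 segments end with <CR>
    if not segments or not segments[0].startswith("MSH"):
        return ["message must start with an MSH segment"]
    if len(segments[0]) < 8:
        return ["MSH segment is too short to declare the encoding characters"]
    field_sep = segments[0][3]  # MSH-1, usually '|'
    for i, seg in enumerate(segments, start=1):
        # every segment: 3-character identifier followed by the field separator
        if len(seg) < 4 or not seg[:3].isalnum() or seg[3] != field_sep:
            errors.append(f"segment {i} ({seg[:3]!r}) is malformed")
    return errors
```

A profile-based validator goes much further (segment order, cardinality, data types, value sets), but the entry point looks the same: a message in, a list of findings out.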

 

Tool: CDA Validator
Type: Validation Service
Description: Embedded in the CDA Generator tool, it offers a web service to validate a large set of different kinds of CDA documents using a model-based architecture

 

Tool: Schematron-Based Validator
Type: Validation Service
Description: Offers a web service to validate XML documents against schematrons
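The principle of schematron validation, checking assertions against nodes selected by XPath, can be sketched with the standard library. This is only an illustration: the real tool runs full ISO Schematron (typically via XSLT), whereas here each rule is a hand-written (context XPath, assertion XPath, message) triple and the element names are invented.

```python
import xml.etree.ElementTree as ET

# A schematron rule pairs a context with assertions; here each rule is
# (context XPath, assertion XPath that must match inside the context, message).
RULES = [
    (".//patient", "name", "a patient element must contain a name"),
    (".//patient", "id", "a patient element must contain an id"),
]

def validate(xml_text, rules=RULES):
    """Return the messages of all failed assertions (empty list = valid)."""
    root = ET.fromstring(xml_text)
    failures = []
    for context, assertion, message in rules:
        for node in root.findall(context):
            if node.find(assertion) is None:
                failures.append(message)
    return failures
```

Real schematrons also support reporting (as opposed to asserting) and rich XPath predicates, but the validation verdict has the same shape: a list of failed assertions per document.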

 

Tool: External Validation Service Front-end
Type: Validation Service
Description: The EVSClient is a front-end which allows the user to call the external validation services from a user-friendly interface instead of the raw web services offered by the Gazelle tools.

 

Tool: XD* Client
Type: Simulator
Description: Emulates the initiating actors of the XD* profiles (XDS.b, XCPD, XDR, XCA, DSUB, ...) and validates XDS metadata using a model-based validation strategy
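Model-based validation of XDS metadata boils down to checking each metadata attribute against a model that says whether it is required. The sketch below is hypothetical (the model and function names are ours), using two real XDS DocumentEntry attributes, uniqueId and patientId, as examples.

```python
# Hypothetical fragment of a DocumentEntry model: attribute name -> required?
DOCUMENT_ENTRY_MODEL = {
    "uniqueId": True,    # required by XDS
    "patientId": True,   # required by XDS
    "comments": False,   # optional
}

def check_metadata(entry: dict, model=DOCUMENT_ENTRY_MODEL):
    """Return the required attributes missing from a DocumentEntry-like dict."""
    return [name for name, required in model.items()
            if required and not entry.get(name)]
```

The actual XD* Client works on ebXML registry objects and covers many more constraints (formats, code sets, associations), but the per-attribute required/optional check is the core of the model-based approach.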

Tool: TLS Simulator
Type: Simulator
Description: Used to test TLS-based transactions for various protocols
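A TLS test transaction exercises settings like the ones below: certificate verification on, old protocol versions refused. This is only a client-side configuration sketch using Python's ssl module, not the simulator itself, which drives real handshakes against the system under test.

```python
import ssl

def strict_client_context() -> ssl.SSLContext:
    """Build a client-side TLS context of the kind a TLS test would exercise."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject SSLv3 / TLS 1.0 / TLS 1.1
    return ctx

ctx = strict_client_context()
```

A simulator can then pair such a context with deliberately misconfigured peers (expired certificate, wrong hostname, downgraded protocol) to check that the system under test rejects them.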

 

Testing effort

The test effort will be prioritized as follows:

  • Major release
    • The version level changes for backwards-incompatible API changes, such as changes that break the user interface
    • The test effort is prioritized so that all tests of levels "Must" and "Should" are executed (a test report is mandatory)
  • Minor release
    • The version level changes for backwards-compatible API changes, such as new functionality/features
    • The test effort is prioritized so that all tests of level "Must" are executed (a test report is not mandatory but recommended)
  • Bug fixes
    • The version level changes for implementation-level detail changes, such as small bug fixes
    • The test effort is decided by the Test Tools Development Team (a test report is not mandatory)
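The prioritization above can be written down as a simple lookup table. The names below are ours, not from the Test Tools code base; the table is just a machine-readable restatement of the three bullet points.

```python
# Release type -> test priorities that must run, and whether a report is mandatory.
TEST_POLICY = {
    "major":  {"levels": ("Must", "Should"), "report_mandatory": True},
    "minor":  {"levels": ("Must",),          "report_mandatory": False},  # recommended
    "bugfix": {"levels": (),                 "report_mandatory": False},  # team's judgment
}

def required_tests(release_type: str) -> dict:
    """Return which test priorities must run for a given release type."""
    return TEST_POLICY[release_type]
```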

See Test Tools Lifecycle

Criticality/Risks to test

A risk analysis was performed to identify project, product and lab risks. It is kept confidential to preserve the test strategy of the Test Tools.

It makes it possible to identify the critical parts of the Test Tools to be tested.

  • Must - Must have this requirement to meet the business needs.
  • Should - Should have this requirement if possible, but project success does not rely on it.
  • Could - Could have this requirement if it does not affect anything else in the project.
  • Would - Would like to have this requirement later, but it won't be delivered this time.

(from the MoSCoW method)


 

Development of the strategy

 

  4. Which tests are performed
  5. How it is tested
  6. Who tests

Test Tools types and levels of testing

 

 Test Management: test levels
 Test types                Unit testing   Integration testing   System testing
 Functional testing        Should         Must                  Must
 Non-functional testing    Could          Must                  Must
 Structural testing        Would          N/A                   N/A
 Tests related to change   Should         Should                Should

 

 Proxy: test levels
 Test types                Unit testing   Integration testing   System testing
 Functional testing        Must           Must                  Must
 Non-functional testing    Could          Should                Could
 Structural testing        N/A            N/A                   N/A
 Tests related to change   Should         Should                Should

 

 HL7 Validator: test levels
 Test types                Unit testing   Integration testing   System testing
 Functional testing        Would          Must                  Must
 Non-functional testing    N/A            Would                 Would
 Structural testing        N/A            Would                 Would
 Tests related to change   Could          Must                  Must

 

 CDA Generator: test levels
 Test types                Unit testing   Integration testing   System testing
 Functional testing        Must           Must                  Should
 Non-functional testing    Must           Should                N/A
 Structural testing        N/A            N/A                   N/A
 Tests related to change   Would          Should                Would

 

 Schematron Validator: test levels
 Test types                Unit testing   Integration testing   System testing
 Functional testing        Should         Should                Should
 Non-functional testing    Should         Should                N/A
 Structural testing        N/A            N/A                   N/A
 Tests related to change   Would          Should                Would

 

 EVS Client: test levels
 Test types                Unit testing   Integration testing   System testing
 Functional testing        Would          Must                  Must
 Non-functional testing    N/A            Would                 Would
 Structural testing        N/A            N/A                   N/A
 Tests related to change   Would          Must                  Would

 

 XD* Client: test levels
 Test types                Unit testing   Integration testing   System testing
 Functional testing        Must           Must                  Should
 Non-functional testing    Must           Should                N/A
 Structural testing        N/A            N/A                   N/A
 Tests related to change   Would          Should                Would

 

 TLS Simulator: test levels
 Test types                Unit testing   Integration testing   System testing
 Functional testing        Should         Should                Could
 Non-functional testing    Would          Would                 Could
 Structural testing        Would          Would                 Would
 Tests related to change   Should         Should                Should

 

Test Tools functional requirements (system testing level)

Each test tool has its features detailed in a requirements management tool (TestLink). A set of high-level requirements provides an overall view of the tool and of the tests that need to be performed.

 

Test Management High level requirements
Application Management
Prepare test session
Execute test session
Report test session

 

Gazelle Proxy High level requirements
Proxy Core functions
Proxy interfacing

 

HL7 Validator High level requirements
HL7 Message Profile Management
HL7 Resources Management
HL7v2.x validation service
HL7v3 validation service

 

 CDA Generator High level requirements
 CDA Validation Service
 Documentation Management

 

Schematron Validator High level requirements
 Schematron validation service
 Schematrons Management

 

EVS Client High level requirements
Clinical Document Architecture - Model Based Validation
Clinical Document Architecture - Schematrons
Cross Enterprise Document Sharing - Model Based Validation
Transverse - Constraints
Functional requirements

 

 XDStar Client High level requirements
 XDS Metadata validation service
 XDS Documentation management

 

TLS Simulator High level requirements
Not logged in users
Logged in users

 


Organization

Division into test campaigns

Two campaigns of system tests are planned in the project:

  • A system testing campaign, including functional testing, to achieve the objectives of the tests above
  • A retest campaign (verification that anomalies have been corrected) and regression testing

Moreover, when a release is planned, a testing day could be prepared to test the platform globally.

Unit and integration tests are managed during the Test Tools development cycle, and their results are analysed during the campaigns to produce a Test Tools test report.

In addition, a security audit of the platform makes it possible to ensure safe use of the platform; it is renewed once a year.

Tests criticality

The test levels and types defined in the "Test Tools types and levels of testing" section of the "Development of the strategy" part allow the actors to organize the testing of the tools quickly and easily. At the unit and integration levels, all tests typed "Must" shall be executed; tests typed "Should" will be executed according to the time allotted for testing during development.
At the system level, the campaigns specified above will be organized according to the planned delivery of the tools, taking the tests typed "Must" and "Should" into account in order of priority.

The Test Tools Development Manager is in charge of requiring and/or organizing tests campaigns, according to the targeted delivery.

Typical test day (example of a test day before the US CAT delivery)

 

Delivery validation process

 

 

Roles and responsibilities

Unit & Integration testing (Test Tools Development Team)

  • Test platform administrator
  • Unit test management
  • Provision of datasets

System Testing (Testers)

  • Test plan writing
  • Test summary writing
  • Monitoring the implementation of system tests
  • Test design
  • Dataset design and management
  • Test execution
  • Bug report management

Acceptance Testing (Testers as Users)

  • Beta testing in real conditions of use
  • Bug or suggestions report

Test environment

Access to the application under test (URLs located on the web)

Each Test Tool is available within the development environment of developers for unit testing.

Test datasets

Datasets are prepared before a test campaign, according to the needs.

Calibration

For some test tools (TM and EVS excluded), reference datasets are specifically managed to calibrate the test tools.

Their purpose is to check that the test results provided by the test tools are still valid, even after a change to the tool.
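A calibration run can be sketched as replaying the reference dataset through a validator and comparing each verdict with the recorded one. The function and names below are hypothetical, not from the Test Tools code base.

```python
def calibrate(validator, reference_dataset):
    """Replay (sample, expected_verdict) pairs through `validator` and
    return the samples whose verdict changed since the last calibration."""
    drifted = []
    for sample, expected in reference_dataset:
        actual = validator(sample)
        if actual != expected:
            drifted.append((sample, expected, actual))
    return drifted
```

An empty result means the tool still produces the recorded verdicts; any drifted sample flags either a regression in the tool or a deliberate behaviour change whose reference verdict must be updated.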

Software Test Management

Test management is handled with the TMS TestLink, in specific projects dedicated to the Test Tools. It is located at http://gazelle.ihe.net/testlink/login.php.

Software bug management

Software bugs are managed in JIRA, located at http://gazelle.ihe.net/jira/secure/Dashboard.jspa. When a failure is observed in the application, a bug report is written in JIRA.


Testing supplies

 

                      To forward   To archive
 Documentation
  Test plan           X            X
  Test folder                      X
  Test report         X            X
  Bug report          X            X
 Data
  Test datasets                    X
  Basic input data                 X
  Basic output data                X
  Records + traces    X            X