Test Tools Development and Test Processes

Test Tools interactions

 


Test Tools Lifecycle

 


 

Development & Validation

Objectives

  • Manage the evolution and maintenance of the KEREVAL HEALTH Lab platform

 

Validation tools

Simulation tools

Test management tool

Support Tools

Tool release management

Gap Analysis

A watch is kept on the evolution of our reference documents (standards, profiles, specifications).

When a reference evolves, an impact analysis is performed by the Lab Manager to determine the changes that may be needed.

Following this analysis, one or more tickets are created in JIRA to take these changes into account.

These changes are then handled through the usual development process of the test tools.
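For illustration only, such a ticket could also be opened programmatically through JIRA's REST API. The minimal Java sketch below assumes a hypothetical JIRA instance, project key and credentials; in practice tickets are created through the JIRA web interface.

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.Base64;

    public class JiraTicketSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical JIRA instance and project key, for illustration only.
            URL url = new URL("https://jira.example.org/rest/api/2/issue");
            String auth = Base64.getEncoder()
                    .encodeToString("user:password".getBytes(StandardCharsets.UTF_8));

            // Minimal issue payload: project, summary, description, issue type.
            String payload = "{\"fields\":{"
                    + "\"project\":{\"key\":\"LAB\"},"
                    + "\"summary\":\"Reference evolution: impact on validator\","
                    + "\"description\":\"Created after impact analysis by the Lab Manager.\","
                    + "\"issuetype\":{\"name\":\"Task\"}}}";

            HttpURLConnection con = (HttpURLConnection) url.openConnection();
            con.setRequestMethod("POST");
            con.setRequestProperty("Authorization", "Basic " + auth);
            con.setRequestProperty("Content-Type", "application/json");
            con.setDoOutput(true);
            try (OutputStream os = con.getOutputStream()) {
                os.write(payload.getBytes(StandardCharsets.UTF_8));
            }
            System.out.println("JIRA responded: " + con.getResponseCode());
        }
    }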

 


Test Strategy applied to test tools

  • Introduction

The purpose of this section is to present the test strategy and the approach of the KEREVAL Health test Lab.

This section takes into account the comprehensive approach to project management within the IHE group and applies to any project in the IHE portfolio.

The test strategy describes the organization of all verification and validation activities in a project, and the methods for conducting them, in order to ensure:

  • That the objectives addressed by the project are met
  • The absence of critical regressions in the system
  • Compliance with the specifications
  • Compliance with quality requirements
  • Robustness of the system

It defines the test phases, tasks and deliverables, the roles and responsibilities of the different stakeholders, the coordination between actors, and the sequencing of tasks.

As ISTQB points out, it is not possible to test everything; the test strategy is therefore based on the analysis of risks and priorities in order to focus the testing effort.

01 - Test Tools Lifecycle

Test Tools release cycle

02 - Development & Validation

01-XML Based Validators Development and Validation Process

01-Requirements

Requirements Process

Objectives

  • Identify new validators needed to implement specifications from standardization bodies

Pre-requisites

  • N/A

Inputs

  • Need for the new validator
  • Requirements covered by the validator

Actions

  • Write the needs in a specification document, or in accessible documentation, so that they can be analysed by the development team

Outputs

  • Specification Documents
  • Normative XSD
  • Formal requirements

02-Specification

Specification Process

Objectives

  • Identify and analyse the impact of the requirements process outputs
  • Identify the functionality and the release
  • Identify the tool impact

Pre-requisites

  • Requirements are identified by standardization bodies

Inputs

  • Specification Documents
  • Normative XSD

Actions

  • Analyse the specification documents
  • Analyse the normative XSD
  • Extract the requirements from the documentation

Outputs

  • XML document containing the requirements, used to link the documentation to the tests
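The exact schema of this XML requirements document is internal to the lab. As a rough illustration of the idea, one requirement entry could be modelled and serialized as below; all field names and values are hypothetical:

    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.Marshaller;
    import javax.xml.bind.annotation.XmlAttribute;
    import javax.xml.bind.annotation.XmlElement;
    import javax.xml.bind.annotation.XmlRootElement;

    // Hypothetical model of one requirement extracted from the specification,
    // linking a normative statement to the test that covers it.
    @XmlRootElement(name = "requirement")
    public class Requirement {
        @XmlAttribute public String id;        // e.g. "CDA-042"
        @XmlElement  public String source;     // section of the specification document
        @XmlElement  public String statement;  // the normative statement itself
        @XmlElement  public String coveredBy;  // identifier of the covering test

        public static void main(String[] args) throws Exception {
            Requirement r = new Requirement();
            r.id = "CDA-042";
            r.source = "Specification section 6.3.1";
            r.statement = "The author element SHALL be present.";
            r.coveredBy = "TEST-CDA-AUTHOR-01";
            Marshaller m = JAXBContext.newInstance(Requirement.class).createMarshaller();
            m.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
            m.marshal(r, System.out); // prints the XML form of the requirement
        }
    }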

03-Design

Design Process

Objectives

  • Derive the design from the received specifications

Pre-requisites

  • Specification documents are delivered and analysed

Inputs

  • Analysed XML document from the specification process
  • Formalized requirements

Actions

  • Processing XSD -> UML
    • Convert the normative XSD document into a UML model
  • OCL Injection
    • Add constraints to the model to enforce the validation rules (a sketch follows the outputs below)

Outputs

  • UML Model
  • UML Model + OCL
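To make the OCL injection step concrete, here is a minimal sketch of how a constraint attached to the UML model could end up as an executable Java check. Both the constraint and the class are hypothetical; the code actually generated by the tool chain is more elaborate:

    import java.util.List;

    // Hypothetical OCL constraint attached to the UML model:
    //   context ClinicalDocument inv authorPresent:
    //       self.author->size() >= 1
    public class ClinicalDocumentChecker {

        /** Returns null when the constraint holds, an error message otherwise. */
        public static String checkAuthorPresent(List<?> authors) {
            if (authors == null || authors.isEmpty()) {
                return "Constraint 'authorPresent' violated: a ClinicalDocument "
                     + "must contain at least one author";
            }
            return null; // constraint satisfied
        }
    }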

04-Realisation

Realisation Process

Objectives

  • Develop a new validator module, or evolve an existing one, in compliance with the normative documents

Pre-requisites

  • Specification documents are translated into a UML model enriched with OCL constraints from the normative documents

Inputs

  • UML model with OCL produced by the design process

Actions

  • Generate JAVA code from the UML model
  • Create a unit test for each function/constraint (a sketch follows the outputs below)
  • Generate the documentation of the module

Outputs

  • JAVA code
  • Unit test JAVA code
  • Documentation of the module's JAVA code
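For the action "create a unit test for each function/constraint", a minimal JUnit sketch for the hypothetical authorPresent check introduced in the design section could be:

    import static org.junit.Assert.assertNotNull;
    import static org.junit.Assert.assertNull;

    import java.util.Collections;
    import org.junit.Test;

    public class ClinicalDocumentCheckerTest {

        @Test
        public void constraintHoldsWhenAuthorIsPresent() {
            // A document with one author satisfies the constraint.
            assertNull(ClinicalDocumentChecker.checkAuthorPresent(
                    Collections.singletonList("author")));
        }

        @Test
        public void constraintFailsWhenAuthorListIsEmpty() {
            // An empty author list must be reported as a violation.
            assertNotNull(ClinicalDocumentChecker.checkAuthorPresent(
                    Collections.emptyList()));
        }
    }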

 

05-Test

Test Process

Objectives

  • Verify that the module respects all rules and specifications

Pre-requisites

  • JAVA code is implemented

Inputs

  • JAVA Code
  • Validator webservice
  • Sample documents

Actions

  • Unit Test
    • Static analysis
    • Unit test generation and verification
  • Integration Test
    • Calls between the service modules
  • System Test
    • System testing against the specification documents
  • Acceptance Test
    • Beta testing by vendors according to their needs

 


Outputs

  • Test report

02-Gazelle HL7 Validator development and validation process

01-Requirements

Requirements Process

Objectives

  • Formalize the need for a new development, evolution or correction of the HL7 validator
  • Describe the expected features in the form of use cases or defects

Pre-requisites

  • N/A

Inputs

  • Need for a new feature, a change or a correction

Actions

  • Write the need in an issue

Outputs

  • Requirements / Defects

02-Specification

Specification Process

Objectives

  • Identify and analyse the impact of the requirements process outputs
  • Identify features and releases
  • Identify tool impacts

Pre-requisites

  • Requirements are identified and created

Inputs

  • Issue containing the need

Actions

  • Analyse the issue to identify existing tools that can be reused and, failing that, the new developments to be performed
  • Create a list of features to be offered by the HL7 Validator
  • Specify the graphical user interface

Outputs

  • Gap analysis (lists the support tools that are required by the HL7 Validator but missing)
  • Features list
  • Mock-up of GUI
  • Development and Test plans

03-Design

Design Process

Objectives

  • Design the solution and create the specification

Pre-requisites

  • A formal expression of the need is available

Inputs

  • Existing specification and design of module
  • Existing specification and design of tool
  • Existing platform architecture and design
  • Change requests (issue)
  • Management of documentation process

Actions

  • Design the module hierarchy: decide whether the HL7 validator inherits from existing Gazelle modules, identify the dependencies on other modules and third-party libraries, and identify the new modules to be developed, keeping them as independent as possible for later reuse
  • Design the HL7 validator model: which entities are needed and how they are linked together
  • Analyse the change requests and update their state
  • Analyse the impact on modules other than those specified in the change request

Outputs

  • Interactions diagram
  • Class and components diagrams
  • Specification and design of module
  • Specification and design of tool
  • HL7 validator architecture and design
  • Document index up to date

04-Realisation

Realisation Process

Objectives

  • Develop the new HL7 validator features in compliance with the normative documents

Pre-requisites

  • Specification documents are validated

Inputs

  • Tool specifications or module specifications
  • Tool design document or module design document
  • Coding rules
  • Development repository

Actions

  • Create the new features or module, applying the KEREVAL development process available in the SMQ

Outputs

  • JAVA code + XHTML
  • Documentation of the module's JAVA code
  • New branch on development repository

 

05-Test

Test Process

Objectives

  • Verify that the new features respect all rules and specifications

Pre-requisites

  • JAVA and XHTML code is compiled

Inputs

  • JAVA/XHTML Code
  • Branch of development on IHE SVN
  • Validation strategy
  • Validation objectives

Actions

  • Unit Test
    • Static analysis
    • Unit testing on critical methods
  • Integration Test
    • Verify service calls (web services); see the sketch at the end of this section
  • System Test
    • System testing against Specifications documents (features & GUI)
  • Acceptance Test
    • Beta testing by vendors according to their needs

Outputs

  • Test report covering all test levels
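As an illustration of the integration-test action "verify service calls", the sketch below posts a SOAP request to the validator's web service. The endpoint URL, operation name and payload structure are placeholders, not the actual Gazelle interface, which is defined by the tool's WSDL:

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class Hl7ValidatorClientSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder endpoint; the real WSDL defines the actual address
            // and operation names.
            URL endpoint = new URL("https://gazelle.example.org/hl7validator-ws");

            // Minimal SOAP envelope with a hypothetical "validate" operation
            // wrapping an escaped HL7v2 message.
            String soap = "<soapenv:Envelope xmlns:soapenv="
                    + "\"http://schemas.xmlsoap.org/soap/envelope/\">"
                    + "<soapenv:Body><validate>"
                    + "<message>MSH|^~\\&amp;|SENDER|...</message>"
                    + "</validate></soapenv:Body></soapenv:Envelope>";

            HttpURLConnection con = (HttpURLConnection) endpoint.openConnection();
            con.setRequestMethod("POST");
            con.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
            con.setDoOutput(true);
            try (OutputStream os = con.getOutputStream()) {
                os.write(soap.getBytes(StandardCharsets.UTF_8));
            }
            // An integration test would assert on the returned validation report.
            System.out.println("HTTP status: " + con.getResponseCode());
        }
    }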

03-Simulators development and validation process

01-Requirements

Requirements Process

Objectives

  • Identify new integration profiles/actors from standardization bodies that need to be emulated

Pre-requisites

  • N/A

Inputs

  • Need for a new tool for testing an integration profile/actor implementation
  • Requirements from the technical framework concerning the chosen integration profile

Actions

  • Write the needs in a specification document, or in accessible documentation, so that they can be analysed by the development team

Outputs

  • Technical Framework
  • Formal requirements

02-Specification

Specification Process

Objectives

  • Identify and analyse the impact of the requirements process outputs
  • Identify features and releases
  • Identify tool impacts

Pre-requisites

  • Requirements are identified by standardization bodies

Inputs

  • Requirements Documents
  • Technical Framework

Actions

  • Analyse the requirements documentation/technical framework to identify existing tools that can be reused and, failing that, the new developments to be performed
  • Create a list of features to be offered by the new simulator
  • Specify the graphical user interface

Outputs

  • Gap analysis (lists the support tools that are required by the new simulator but missing)
  • Features list
  • Mock-up of GUI

03-Design

Design Process

Objectives

  • Derive the design from the received specifications

Pre-requisites

  • Specification documents are delivered and analysed

Inputs

  • Formalized requirements / technical framework

Actions

  • Design how the new simulator will be integrated into the Gazelle platform (as a new application or within an existing one) and how it will communicate with the other tools of the platform
  • Design the module hierarchy: decide whether the new simulator inherits from existing Gazelle modules, identify the dependencies on other modules and third-party libraries, and identify the new modules to be developed, keeping them as independent as possible for later reuse
  • Design the simulator model: which entities are needed, how they are linked together, the CRUD schema, etc. (a sketch follows the outputs below)

Outputs

  • Interactions diagram
  • Class and components diagrams
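To make the "simulator model" design step concrete: entities can be mapped to the database with JPA. A minimal sketch with hypothetical entity and field names (the real Gazelle model is richer) could be:

    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.persistence.ManyToOne;

    // Hypothetical entity: one message sent or received by the simulator,
    // persisted so that exchanges can be browsed and validated afterwards.
    @Entity
    public class SimulatorMessage {

        @Id
        @GeneratedValue
        private Long id;

        private String transactionName; // e.g. the IHE transaction identifier
        private String direction;       // "SENT" or "RECEIVED"

        @ManyToOne
        private SimulatorMessage inResponseTo; // links request/response pairs

        // getters and setters omitted for brevity
    }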

04-Realisation

Realisation Process

Objectives

  • Develop a new simulator module, or evolve an existing one, in compliance with the normative documents

Pre-requisites

  • Specification documents are validated

Inputs

  • All specification documentation

Actions

  • Develop simulator features

Outputs

  • JAVA code + XHTML
  • JAVA code documentation of module

 

05-Test

Test Process

Objectives

  • Verify that the module respects all rules and specifications

Pre-requisites

  • JAVA and XHTML code is produced

Inputs

  • JAVA/XHTML Code
  • Simulator webservice
  • Messages validated with a suitable validator

Actions

  • Unit Test
    • Static analysis
  • Integration Test
    • Message exchange (tests the IHE interface and, if required, the correct integration of the message into the database); see the sketch at the end of this section
  • System Test
    • System testing against the specification documents (features & GUI)
  • Acceptance Test
    • Beta testing by vendors according to their needs

Outputs

  • Test report covering all test levels
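The "message exchange" integration test can be illustrated for HL7v2 over MLLP: the client wraps the message in the MLLP frame (0x0B ... 0x1C 0x0D) and reads the acknowledgement returned by the simulator. A minimal sketch, with placeholder host, port and message content:

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    public class MllpExchangeSketch {
        public static void main(String[] args) throws Exception {
            String hl7 = "MSH|^~\\&|TESTER|LAB|SIMULATOR|GAZELLE|..."; // placeholder

            // Placeholder address of the simulator under test.
            try (Socket socket = new Socket("simulator.example.org", 10000)) {
                OutputStream out = socket.getOutputStream();
                out.write(0x0B);                    // MLLP start block
                out.write(hl7.getBytes(StandardCharsets.UTF_8));
                out.write(new byte[] {0x1C, 0x0D}); // MLLP end block
                out.flush();

                // Read the raw acknowledgement; a real integration test would
                // parse the ACK and assert on its MSA segment.
                InputStream in = socket.getInputStream();
                byte[] buffer = new byte[4096];
                int read = in.read(buffer);
                if (read > 0) {
                    System.out.println(new String(buffer, 0, read, StandardCharsets.UTF_8));
                }
            }
        }
    }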

04-Test Management development and validation process

01-Requirements

Requirements Process

Objectives

  • Formalize the need for a new development, evolution or correction of the platform
  • Describe the expected features in the form of use cases or defects

Pre-requisites

  • N/A

Inputs

  • Need for a new feature, a change or a correction

Actions

  • Write the need in an issue

Outputs

  • Requirements / Defects

02-Specification

Specification Process

Objectives

  • Identify and analyse the impact of the requirements process outputs
  • Identify features and releases
  • Identify tool impacts

Pre-requisites

  • Requirements are identified and created

Inputs

  • Issue which contains the need

Actions

  • Analyse the issue and identify existing tools that could be reused; if there are none, identify the developments to be performed
  • Create a list of features to be offered by Test Management
  • Specify the graphical user interface

Outputs

  • Gap analysis (lists the support tools that are required by Test Management but missing)
  • Features list
  • Mock-up of GUI
  • Development and Test plans

03-Design

Design Process

Objectives

  • Design the solution and create the specification

Pre-requisites

  • A formal expression of the needs is available

Inputs

  • Existing specification and design of module
  • Existing specification and design of tool
  • Existing platform architecture and design
  • Change requests (issue)
  • Management of documentation process

Actions

  • Design the module hierarchy: decide whether Test Management inherits from existing Gazelle modules, identify the dependencies on other modules and third-party libraries, and identify the new modules to be developed, keeping them as independent as possible for later reuse
  • Design the Test Management model: which entities are needed and how they are linked together
  • Analyse the change requests and update their state
  • Analyse the impact on modules other than those specified in the change request

Outputs

  • Interactions diagram
  • Class and components diagrams
  • Specification and design of module
  • Specification and design of tool
  • Platform architecture and design
  • Document index up to date

04-Realisation

Realisation Process

Objectives

  • Develop the new Test Management features in compliance with the normative documents

Pre-requisites

  • Specification documents are validated

Inputs

  • Tool specifications or module specifications
  • Tool design document or module design document
  • Coding rules
  • Development repository

Actions

  • Create the new features or module, applying the KEREVAL development process available in the SMQ

Outputs

  • JAVA code + XHTML
  • Documentation of the module's JAVA code
  • New branch on development repository

 

05-Test

Test Process

Objectives

  • Verify that the new features respect all rules and specifications

Pre-requisites

  • JAVA and XHTML code is compiled

Inputs

  • JAVA/XHTML Code
  • Branch of development on IHE SVN
  • Validation strategy
  • Validation objectives

Actions

  • Unit Test
    • Static analysis
    • Unit testing on critical methods
  • Integration Test
    • Verify build integrity (components/modules)
  • System Test
    • System testing against Specifications documents (features & GUI)
  • Acceptance Test
    • Beta testing according to the expressed needs

Outputs

  • Test report covering all test levels

05-Platform Delivery Process

Release

Objectives

  • Release the Test Tools
  • Write a release note
  • Tag the version

Pre-requisites

  • Test Tool is implemented and tested

Inputs

  • Validation report of the Test Tools
  • Release sheet (Issues in JIRA)
  • Validated Test Tools in SVN

Actions

  • Analyse the state of all change requests in JIRA
  • Analyse the state of validation in the validation report
  • Close all change requests
  • Deliver the test tool
  • Update the tool index

Outputs

  • Release Note
  • Test Tools release

Platform Documentation

Objectives

  • Maintain up-to-date documentation of the Test Tools for technical and marketing purposes
  • Capitalize on Test Tools documents (user guides, presentations)

Pre-requisites

  • N/A

Inputs

  • Addition of major Test Tools features (e.g. coverage)
  • IHE presentations related to major events (e.g. customer demonstration, IHE presentation)

Actions

  • Store the source files of the documents in the network folder (Drupal)

Outputs

  • Any document that helps to understand or promote the Test Tools

06-Gazelle Proxy

Project overview

See the Gazelle Proxy information.

 

Development

Proxy development follows a V-model cycle similar to all other Test Tools projects. The proxy is not critical to test results, and its development process has minor priority.

 

Validation

Proxy validation is included in the global test strategy, with respect to its interactions with the other Test Tools.

07-EVS Client

Project overview

See the EVS Client information.

 

Development

EVS Client development follows a V-model cycle similar to all other Test Tools projects. The EVS Client is not critical to test results, and its development process has minor priority.

 

Validation

EVS Client validation is included in the global test strategy, with respect to its interactions with the other Test Tools.

03 - Test Strategy applied to test tools

Preparation of the test strategy

 

  1. What is tested
  2. When we test
  3. Why we test

Elements to be tested

 

Tool: Test Management
Type: Gazelle Test Bed
Description: TM is the application used to manage the connectathon, from the registration process to the generation of the test report

 

Tool: Proxy
Type: Support Tool
Description: "Man in the middle": captures the messages exchanged between two systems and forwards them to the validation service front-end

 

Tool: Gazelle HL7 Validator
Type: Validation Service
Description: Offers web services to validate HL7v2.x and HL7v3 messages exchanged in the context of IHE

 

Tool: CDA Validator
Type: Validation Service
Description: Embedded in the CDA Generator tool, it offers a web service to validate a large set of different kinds of CDA using a model-based architecture

 

Tool: Schematron-Based Validator
Type: Validation Service
Description: Offers a web service to validate XML documents against schematrons

 

Tool: External Validation Service Front-end
Type: Validation Service
Description: The EVSClient is a front-end which allows the user to use the external validation services from a user-friendly interface instead of the raw web services offered by the Gazelle tools.

 

Tool: XD* Client
Type: Simulator
Description: Emulates the initiating actors of the XD* profiles (XDS.b, XCPD, XDR, XCA, DSUB ...), and validates XDS metadata using a model-based validation strategy

Tool: TLS Simulator
Type: Simulator
Description: Used to test TLS-based transactions for various protocols

 

Testing effort

The test effort will be prioritized as follows:

  • Major release
    • Version-level change for backwards-incompatible API changes, such as changes that break the user interface
    • The test effort is prioritized in order to run all levels of tests typed "Must" and "Should" (a test report is mandatory)
  • Minor release
    • Version-level change for backwards-compatible API changes, such as new functionality/features
    • The test effort is prioritized in order to run all levels of tests typed "Must" (a test report is not mandatory but recommended)
  • Bug fixes
    • Version-level change for implementation-level details, such as small bug fixes
    • The test effort is assessed by the Test Tools Development Team (a test report is not mandatory)

See Test Tools Lifecycle

Criticality/Risks to test

A risk analysis was performed to determine the project, product, and lab risks. It is kept confidential to protect the test strategy of the Test Tools.

It makes it possible to identify the critical parts of the Test Tools that must be tested.

  • Must - Must have this requirement to meet the business needs.
  • Should - Should have this requirement if possible, but project success does not rely on it.
  • Could - Could have this requirement if it does not affect anything else in the project.
  • Would - Would like to have this requirement later, but it won't be delivered this time.

(from the MoSCoW method)


 

Development of the strategy

 

  4. Which tests are performed
  5. How it is tested
  6. Who tests

Test Tools types and levels of testing

 

Test Management: test levels

  Test type                 Unit testing   Integration testing   System testing
  Functional testing        Should         Must                  Must
  Non-functional testing    Could          Must                  Must
  Structural testing        Would          N/A                   N/A
  Tests related to change   Should         Should                Should

 

Proxy: test levels

  Test type                 Unit testing   Integration testing   System testing
  Functional testing        Must           Must                  Must
  Non-functional testing    Could          Should                Could
  Structural testing        N/A            N/A                   N/A
  Tests related to change   Should         Should                Should

 

HL7 Validator: test levels

  Test type                 Unit testing   Integration testing   System testing
  Functional testing        Would          Must                  Must
  Non-functional testing    N/A            Would                 Would
  Structural testing        N/A            Would                 Would
  Tests related to change   Could          Must                  Must

 

CDA Generator: test levels

  Test type                 Unit testing   Integration testing   System testing
  Functional testing        Must           Must                  Should
  Non-functional testing    Must           Should                N/A
  Structural testing        N/A            N/A                   N/A
  Tests related to change   Would          Should                Would

 

Schematron Validator: test levels

  Test type                 Unit testing   Integration testing   System testing
  Functional testing        Should         Should                Should
  Non-functional testing    Should         Should                N/A
  Structural testing        N/A            N/A                   N/A
  Tests related to change   Would          Should                Would

 

EVS Client: test levels

  Test type                 Unit testing   Integration testing   System testing
  Functional testing        Would          Must                  Must
  Non-functional testing    N/A            Would                 Would
  Structural testing        N/A            N/A                   N/A
  Tests related to change   Would          Must                  Would

 

XD* Client: test levels

  Test type                 Unit testing   Integration testing   System testing
  Functional testing        Must           Must                  Should
  Non-functional testing    Must           Should                N/A
  Structural testing        N/A            N/A                   N/A
  Tests related to change   Would          Should                Would

 

TLS Simulator: test levels

  Test type                 Unit testing   Integration testing   System testing
  Functional testing        Should         Should                Could
  Non-functional testing    Would          Would                 Could
  Structural testing        Would          Would                 Would
  Tests related to change   Should         Should                Should

 

Test Tools functional requirements (system testing level)

Each test tool has its features detailed in a requirements management tool (TestLink). A set of high-level requirements provides an overall view of the tool and of the tests that need to be performed.

 

Test Management high-level requirements

  • Application Management
  • Prepare test session
  • Execute test session
  • Report test session

 

Gazelle Proxy high-level requirements

  • Proxy Core functions
  • Proxy interfacing

 

HL7 Validator high-level requirements

  • HL7 Message Profile Management
  • HL7 Resources Management
  • HL7v2.x validation service
  • HL7v3 validation service

 

CDA Generator high-level requirements

  • CDA Validation Service
  • Documentation Management

 

Schematron Validator high-level requirements

  • Schematron validation service
  • Schematrons Management

 

EVS Client high-level requirements

  • Clinical Document Architecture - Model Based Validation
  • Clinical Document Architecture - Schematrons
  • Cross Enterprise Document Sharing - Model Based Validation
  • Transverse - Constraints
  • Functional requirements

 

XDStar Client high-level requirements

  • XDS metadata validation service
  • XDS documentation management

 

TLS Simulator high-level requirements

  • Not logged in users
  • Logged in users

 


Organization

Breakdown into test campaigns

Two campaigns of system tests are planned in the project:

  • A system testing campaign, including functional testing, to achieve the test objectives described above
  • A retest campaign (verification that anomalies have been corrected) and regression testing

Moreover, when a release is planned, a testing day can be organized to test the platform globally.

Unit and integration tests are managed during the Test Tools development cycle, and their results are analysed during the campaigns to produce a Test Tools test report.

In addition, a security audit of the platform is performed to ensure it can be used safely; it is renewed once a year.

Tests criticality

The levels and types of tests defined in the "Test Tools types and levels of testing" tables of the "Development of the strategy" part allow the actors to organize the testing of the tools quickly and easily. At the unit and integration levels, all tests typed "Must" shall be run; tests typed "Should" are run depending on the time allotted to testing during development.
At the system level, the campaigns specified below are organized according to the planned delivery of the tools, taking into account the tests typed "Must" and "Should" in order of priority.

The Test Tools Development Manager is in charge of requesting and/or organizing test campaigns, according to the targeted delivery.

Typical test day (example: test day before the US CAT delivery)

 

Delivery validation process

 

 

Roles and responsibilities

Unit & Integration testing (Test Tools Development Team)

  • Test platform administration
  • Unit test management
  • Provision of datasets

System Testing (Testers)

  • Test plan writing
  • Test summary report writing
  • Monitoring of the execution of the system tests
  • Test design
  • Dataset design and management
  • Test execution
  • Bug report management

Acceptance Testing (Testers as Users)

  • Beta testing in real conditions of use
  • Bug and suggestion reporting

Test environment

Access to the application under test (URLs located on the web)

Each Test Tool is available within the developers' development environment for unit testing.

Test datasets

Datasets are prepared before a test campaign, according to the needs.

Calibration

For some test tools (TM and EVS Client excluded), reference datasets are specifically maintained to calibrate the test tools.

Their purpose is to check that the results provided by the test tools remain valid, even after a change to the tool.
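A minimal sketch of such a calibration check, assuming a hypothetical reference dataset layout and a placeholder validation entry point, could look like this JUnit test:

    import static org.junit.Assert.assertEquals;

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import org.junit.Test;

    // Calibration check: replays a reference sample through the validator and
    // verifies that the verdict recorded with the dataset is still produced.
    public class CalibrationSketchTest {

        // Placeholder standing in for the tool's actual validation entry point.
        private static String validate(byte[] document) {
            return "PASSED";
        }

        @Test
        public void referenceSampleStillValidatesAsExpected() throws Exception {
            // Hypothetical layout: each sample is stored with its expected verdict.
            byte[] sample = Files.readAllBytes(Paths.get("calibration/sample-001.xml"));
            String expected = new String(Files.readAllBytes(
                    Paths.get("calibration/sample-001.verdict")), "UTF-8").trim();

            assertEquals("Calibration drift detected for sample-001",
                    expected, validate(sample));
        }
    }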

Software Test Management

Test management relies on the TMS TestLink, with dedicated projects for the Test Tools. It is located at http://gazelle.ihe.net/testlink/login.php.

Software bug management

The software bug management tool is JIRA; it is located at http://gazelle.ihe.net/jira/secure/Dashboard.jspa . When a failure is observed in the application, a bug report is written in JIRA.


Testing supplies

 

  Supply               To forward   To archive
  Documentation
    Test plan          X            X
    Test folder                     X
    Test report        X            X
    Bug report         X            X
  Data
    Test datasets                   X
    Basic input data                X
    Basic output data               X
    Records + trace    X            X