A watch is kept on the evolution of our references.
When a reference evolves, impact analysis is performed by the Lab Manager to determine possible changes.
Following this analysis, one or more tickets are created in JIRA to take these changes into account.
These changes are then handled through the usual development process of the testing tools.
The purpose of this section is to present the test strategy and the KEREVAL Health Lab testing approach.
This section takes into account the comprehensive approach to project management in the IHE group and applies to any project of the IHE portfolio.
The test strategy describes, within an organization, all the activities and methods used to conduct verification and validation in a project, in order to ensure:
- That the objectives of the project are addressed
- The absence of critical regressions in the system
- Compliance with the specifications
- Compliance with the quality requirements
- Robustness of the system
It defines the test phases, tasks and deliverables, the roles and responsibilities of the different stakeholders, and it manages the coordination of the actors and the sequencing of tasks.
ISTQB advocates that it is not possible to test everything; the test strategy is therefore based on the analysis of risks and priorities to focus the testing effort.
See the Gazelle Proxy information.
Proxy development follows a V-cycle similar to all other Test Tools projects. It is not critical to the test results, and its development process is a lower priority.
Proxy validation is included in the global test strategy with respect to its interactions with the other Test Tools.
EVS Client development follows a V-cycle similar to all other Test Tools projects. It is not critical to the test results, and its development process is a lower priority.
EVS Client validation is included in the global test strategy with respect to its interactions with the other Test Tools.
1. What is tested
2. When we test
3. Why we test
| Tool | Type | Description |
| --- | --- | --- |
| Test Management | Gazelle Test Bed | TM is the application used to manage the connectathon, from the registration process to the generation of the test report |
| Proxy | Support Tool | "Man in the middle": captures the messages exchanged between two systems and forwards them to the validation service front-end |
| Gazelle HL7 Validator | Validation Service | Offers web services to validate HL7v2.x and HL7v3 messages exchanged in the context of IHE |
| CDA Validator | Validation Service | Embedded in the CDA Generator tool, it offers a web service to validate a large set of different kinds of CDA documents using a model-based architecture |
| Schematron-Based Validator | Validation Service | Offers a web service to validate XML documents against schematrons |
| External Validation Service Front-end | Validation Service | The EVSClient is a front-end which allows the user to call the external validation services from a user-friendly interface instead of the raw web services offered by the Gazelle tools |
| XD* Client | Simulator | Emulates the initiating actors of the XD* profiles (XDS.b, XCPD, XDR, XCA, DSUB ...) and validates XDS metadata using a model-based validation strategy |
| TLS Simulator | Simulator | Used to test TLS-based transactions for various protocols |
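To make the role of the validation services more concrete, the sketch below shows the kind of check the Schematron-Based Validator performs behind its web service: validating an XML document against a Schematron rule set. It is a minimal illustration written with Python and lxml, not the Gazelle service itself; the rule set and the sample documents are invented for the example.

```python
# Minimal illustration of Schematron-based XML validation with lxml.
# The rule set and sample documents below are hypothetical; the real
# Schematron-Based Validator exposes this kind of check as a web service.
from lxml import etree, isoschematron

# Hypothetical rule set: every <document> must contain a <title> element.
SCHEMATRON_RULES = b"""
<schema xmlns="http://purl.oclc.org/dsdl/schematron">
  <pattern>
    <rule context="document">
      <assert test="title">A document must contain a title.</assert>
    </rule>
  </pattern>
</schema>
"""

schematron = isoschematron.Schematron(
    etree.fromstring(SCHEMATRON_RULES),
    store_report=True,  # keep the SVRL report produced by the validation
)

valid_doc = etree.fromstring(b"<document><title>Sample</title></document>")
print("valid:", schematron.validate(valid_doc))    # True: the title is present

invalid_doc = etree.fromstring(b"<document/>")
print("valid:", schematron.validate(invalid_doc))  # False: the assert fails
# schematron.validation_report now holds the SVRL report of the last run
```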
The test effort will be prioritized as follows:
A risk analysis was carried out to determine project, product and lab risks. It is kept confidential to preserve the test strategy of the Test Tools.
It makes it possible to identify the critical parts of the Test Tools to be tested.
The priority levels used below (Must, Should, Could, Would) are taken from the MoSCoW method.
4. Which tests are performed
5. How it is tested
6. Who tests
Test levels and types for Test Management:

| Test types | Unit testing | Integration testing | System testing |
| --- | --- | --- | --- |
| Functional testing | Should | Must | Must |
| Non-functional testing | Could | Must | Must |
| Structural testing | Would | N/A | N/A |
| Tests related to change | Should | Should | Should |

Test levels and types for Proxy:

| Test types | Unit testing | Integration testing | System testing |
| --- | --- | --- | --- |
| Functional testing | Must | Must | Must |
| Non-functional testing | Could | Should | Could |
| Structural testing | N/A | N/A | N/A |
| Tests related to change | Should | Should | Should |

Test levels and types for HL7 Validator:

| Test types | Unit testing | Integration testing | System testing |
| --- | --- | --- | --- |
| Functional testing | Would | Must | Must |
| Non-functional testing | N/A | Would | Would |
| Structural testing | N/A | Would | Would |
| Tests related to change | Could | Must | Must |

Test levels and types for CDA Generator:

| Test types | Unit testing | Integration testing | System testing |
| --- | --- | --- | --- |
| Functional testing | Must | Must | Should |
| Non-functional testing | Must | Should | N/A |
| Structural testing | N/A | N/A | N/A |
| Tests related to change | Would | Should | Would |

Test levels and types for Schematron Validator:

| Test types | Unit testing | Integration testing | System testing |
| --- | --- | --- | --- |
| Functional testing | Should | Should | Should |
| Non-functional testing | Should | Should | N/A |
| Structural testing | N/A | N/A | N/A |
| Tests related to change | Would | Should | Would |

Test levels and types for EVS Client:

| Test types | Unit testing | Integration testing | System testing |
| --- | --- | --- | --- |
| Functional testing | Would | Must | Must |
| Non-functional testing | N/A | Would | Would |
| Structural testing | N/A | N/A | N/A |
| Tests related to change | Would | Must | Would |

Test levels and types for XD* Client:

| Test types | Unit testing | Integration testing | System testing |
| --- | --- | --- | --- |
| Functional testing | Must | Must | Should |
| Non-functional testing | Must | Should | N/A |
| Structural testing | N/A | N/A | N/A |
| Tests related to change | Would | Should | Would |

Test levels and types for TLS Simulator:

| Test types | Unit testing | Integration testing | System testing |
| --- | --- | --- | --- |
| Functional testing | Should | Should | Could |
| Non-functional testing | Would | Would | Could |
| Structural testing | Would | Would | Would |
| Tests related to change | Should | Should | Should |
Each test tool has its features detailed in a requirements management tool (TestLink). A set of high level requirements provides an overall view of the tool and of the tests that need to be performed.
High level requirements for Test Management:
- Application Management
- Prepare test session
- Execute test session
- Report test session

High level requirements for Gazelle Proxy:
- Proxy Core functions
- Proxy interfacing

High level requirements for HL7 Validator:
- HL7 Message Profile Management
- HL7 Resources Management
- HL7v2.x validation service
- HL7v3 validation service

High level requirements for CDA Generator:
- CDA Validation Service
- Documentation Management

High level requirements for Schematron Validator:
- Schematron validation service
- Schematrons Management

High level requirements for EVS Client:
- Clinical Document Architecture - Model Based Validation
- Clinical Document Architecture - Schematrons
- Cross Enterprise Document Sharing - Model Based Validation
- Transverse - Constraints
- Functional requirements

High level requirements for XDStar Client:
- XDS metadata validation service
- XDS Documentation management

High level requirements for TLS Simulator:
- Not logged in users
- Logged in users
Two campaigns of system tests are planned in the project:
Moreover, when a release is planned, a testing day could be prepared to test the platform globally.
Unit and integration tests are managed during the Test Tools development cycle, and their results are analysed during the campaigns to produce a Test Tools test report.
In addition, a security audit of the platform is performed to guarantee its safe use; it is renewed once a year.
The levels and types of tests defined in the "Test Tools types and levels of testing" part of the "Development of the strategy" section allow the actors to organize the testing of the tools quickly and easily. At the unit and integration levels, all tests typed "Must" shall be run; tests typed "Should" will be run depending on the time allotted for testing during development.
At the system level, the campaigns specified below will be organized according to the planned delivery of the tools, taking into account the tests typed "Must" and "Should" in order of priority.
The Test Tools Development Manager is in charge of requesting and/or organizing test campaigns, according to the targeted delivery.
Each Test Tool is available within the development environment of developers for unit testing.
Datasets are prepared before a test campaign, according to the needs.
However, for some test tools (TM and EVS excluded), reference datasets are specifically maintained to calibrate the test tools.
Their purpose is to check that the test results provided by the test tools are still valid, even after a change to the tool.
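As an illustration of how such a reference dataset can be exploited, the sketch below (Python, with hypothetical helper names) re-runs a validation tool on every sample of a reference dataset and compares the verdicts with the stored expected verdicts, so that a change to the tool cannot silently alter previously established results.

```python
# Hypothetical sketch of a calibration check against a reference dataset:
# each sample has a stored expected verdict, and any difference between that
# verdict and the verdict of the current tool version is a regression.
import json
from pathlib import Path


def run_validator(sample_path: Path) -> str:
    """Placeholder for the call to the validation tool under test;
    expected to return a verdict such as 'PASSED' or 'FAILED'."""
    raise NotImplementedError("wire this to the actual validation service")


def check_reference_dataset(dataset_dir: Path) -> bool:
    """Return True when every sample still gets its expected verdict."""
    expected = json.loads((dataset_dir / "expected_results.json").read_text())
    regressions = []
    for sample_name, expected_verdict in expected.items():
        verdict = run_validator(dataset_dir / sample_name)
        if verdict != expected_verdict:
            regressions.append((sample_name, expected_verdict, verdict))
    for name, wanted, got in regressions:
        print(f"REGRESSION on {name}: expected {wanted}, got {got}")
    return not regressions
```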
Test management relies on the TMS TestLink, with specific projects dedicated to the Test Tools. It is available at http://gazelle.ihe.net/testlink/login.php.
The bug tracker is JIRA; it is located at http://gazelle.ihe.net/jira/secure/Dashboard.jspa. When a failure is observed in the application, a bug report is written in JIRA.
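For illustration, a bug report can also be created programmatically through the JIRA REST API. The sketch below assumes the standard /rest/api/2/issue endpoint of JIRA Server; the project key, credentials and field values are placeholders, not the actual ones used by the lab.

```python
# Hypothetical sketch: creating a bug report through the JIRA REST API.
# The project key and credentials are placeholders, not the real ones.
import requests

JIRA_BASE_URL = "http://gazelle.ihe.net/jira"


def report_bug(summary: str, description: str, project_key: str = "EXAMPLE") -> str:
    """Create a Bug issue and return its key (e.g. 'EXAMPLE-123')."""
    payload = {
        "fields": {
            "project": {"key": project_key},    # placeholder project key
            "summary": summary,
            "description": description,
            "issuetype": {"name": "Bug"},
        }
    }
    response = requests.post(
        f"{JIRA_BASE_URL}/rest/api/2/issue",
        json=payload,
        auth=("username", "password"),          # placeholder credentials
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["key"]
```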
| Deliverable | To forward | To archive |
| --- | --- | --- |
| Documentation | | |
| Test plan | X | X |
| Test folder | X | |
| Test report | X | X |
| Bug report | X | X |
| Data | | |
| Test datasets | X | |
| Basic input data | X | |
| Basic output data | X | |
| Records + traces | X | X |