IHE has a collection of tools for testing implementations of actors in IHE profiles. You will find links to all tools for testing IHE profiles, and their test descriptions, on the complete Index to IHE Test Tools.
The test cases in this section are for a subset of IHE's tools and are generally organized in the menu at left according to the tool that is used.
User guides for tools associated with these test cases are here: http://gazelle.ihe.net/content/gazelle-user-guides
If you are preparing for an IHE Connectathon, Gazelle Test Management for your testing event contains a list of the Preparatory tests you must perform, customized according to the IHE profiles/actors/options you have registered to test. You can find your list of Preparatory tests in Gazelle Test Management under menu Testing --> Test execution. (For IHE European Connectathons, Gazelle Test Management is at https://gazelle.ihe.net/gazelle/home.seam; other testing events may have a different URL).
This section contains test cases performed with the Gazelle Security Suite tool:
--> Prior to performing ATNA tests, please read this page for guidelines that address frequently asked questions about testing expectations. <--
THIS PAGE APPLIES TO ATNA TESTING AT 2024 IHE CONNECTATHONs.
The ATNA requirements are in the IHE Technical Framework:
NOTE: The following options were retired in 2021 via CP-ITI-1247 and are no longer tested at IHE Connectathons:
Tool-based testing of TLS (node authentication) and of the format and transport of your audit messages is consolidated in one tool - the Gazelle Security Suite (GSS).
To ensure interoperability between systems doing peer-to-peer testing over TLS (e.g. XDS, XCA...), the Connectathon technical managers have selected a TLS version and ciphers to use for interoperability tests during Connectathon week. (This is analogous to a hospital mandating similar requirements for a given deployment.)
TLS POLICY for [ITI-19]:
*** For the 2022 IHE Connectathon, interoperability testing over TLS shall be done using:
AUDIT MESSAGE POLICY for [ITI-20]:
Before 2020, an ATNA Audit Record Repository (ARR) was required to support receiving audit messages in both TLS syslog and UDP syslog. That meant that all Secure Nodes/Applications could send their audit messages to any ARR.
Now, all actors sending and receiving audit messages may choose to support TLS Syslog, UDP Syslog, and/or FHIR Feed for transport. We expect that the Audit Record Repositories at the NA and EU Connectathons will provide good coverage of the options (TLS, UDP, FHIR), though some ARRs may support a subset. In particular, the FHIR Feed Option in ITI-20 may have less support because it was new as of 2020.
Connectathon technical managers will not select one transport for all audit records exchanged during Connectathon. Instead, Secure Nodes/Applications will choose ARR test partners that are compatible with the audit records they send in ITI-20. Gazelle Test Management will show compatible partners for the ITI-20 interoperability tests ("ATNA_Logging_*").
The Gazelle Security Suite (GSS) tool is the SINGLE PROVIDER OF DIGITAL CERTIFICATES for IHE Connectathons.
To obtain a digital certificate from the GSS tool for preparatory & Connectathon testing, follow the instructions in test 11100. That test contains instructions that apply to an IHE Connectathon, whether face-to-face or online.
Some facts about the digital certificates for Connectathon testing:
Systems testing ATNA are required to complete the ATNA Questionnaire in the GSS tool, ideally prior to Connectathon week. Embedded in the questionnaire are Audit Record tests and TLS tests customized for the profiles & actors you registered to test at Connectathon.
Read the Technical Framework documentation; you are responsible for all requirements in Record Audit Event [ITI-20] transaction. We will not repeat the requirements here.
WHICH SCHEMA?: The Record Audit Event [ITI-20] transaction specifies use of the DICOM schema for audit messages sent using the ATX: TLS Syslog and ATX: UDP Syslog options. The DICOM schema is found in DICOM Part 15, Section A.5.1.
We expect implementations to be compliant; we have tested audit messages using the DICOM schema at IHE Connectathons since 2016.
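For orientation only, here is a minimal, illustrative fragment that follows the general structure of the DICOM PS3.15 A.5.1 audit schema. The event code, user, and addresses are placeholder values, not a Connectathon requirement; your messages must carry the content required for the actual events you record:

```xml
<AuditMessage>
  <!-- What happened: event code, action, time, and outcome (0 = success) -->
  <EventIdentification EventActionCode="E"
                       EventDateTime="2024-05-01T10:30:00Z"
                       EventOutcomeIndicator="0">
    <EventID csd-code="110114" codeSystemName="DCM" originalText="User Authentication"/>
  </EventIdentification>
  <!-- Who was involved -->
  <ActiveParticipant UserID="alice@example.org" UserIsRequestor="true"
                     NetworkAccessPointID="192.0.2.10" NetworkAccessPointTypeCode="2"/>
  <!-- Which system generated the audit record -->
  <AuditSourceIdentification AuditSourceID="my-secure-node"/>
</AuditMessage>
```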
SENDING AUDIT MESSAGES: You can send your audit records to the GSS tool simulating an Audit Record Repository. See test 11117.
Contact the Technical Project Manager for the IT Infrastructure domain. Refer to the Contact Us page.
There is no specific evaluation for this test.
Create a text file stating that you found and read the page. Upload that text file into Gazelle Test Management as the Log Return file for test 11099.
This test contains instructions for obtaining a digital certificate for your test system that is registered for an IHE Connectathon. You will obtain your digital certificate(s) from the Gazelle Security Suite tool.
First, please read the ATNA Testing Resources page before proceeding with this test. That page contains important context for using the digital certificates for Connectathon-related tests.
When you generate your digital certificate in Gazelle Security Suite, you will need to know two values:
(1) The hostname(s) for your test system:
(2) Domain Name:
When logging in to GSS, you will use your username & password from Gazelle Test Management for your Connectathon. There are separate CAS systems for different instances of Gazelle Test Management, and you will have to take this into account when logging in to GSS:
On the GSS home page (http://gazelle.ihe.net/gss) find the "Login" link at the upper right of the page.
It is also possible to find your certificate using the menu:
You are now ready to use this certificate for performing:
There is no specific evaluation for this test.
Create a text file stating that you have requested & received your certificate(s). Upload that text file into Gazelle Test Management as the Log Return file for test 11100.
In subsequent tests (e.g. 11109 Authentication test), you will verify the proper operation of your test system with your digital certificate.
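If you want to confirm locally that the certificate you received contains the hostname and domain name you entered, you can inspect it with a tool such as OpenSSL. This is a sketch only; it assumes you downloaded the certificate in PEM format, and the file name is a placeholder:

```sh
# Print the certificate's subject, subjectAltName, validity dates, and issuer
openssl x509 -in my_gss_certificate.pem -noout -text
```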
In this test you complete a form which collects information that will help us evaluate the Audit Logging and Node Authentication (ATNA) capabilities of your test system.
The contents of your ATNA Questionnaire are customized based on the profiles and actors that you have registered in Gazelle Test Management for a given testing event (e.g. an IHE Connectathon). Depending on which profiles/actors you have registered for, the ATNA Questionnaire will ask you to validate audit messages for transactions you support, and you will be asked to demonstrate successful TLS connections for the transports you support (e.g. DICOM, MLLP, HTTP).
Before you can generate your on-line ATNA questionnaire:
When logging in to GSS, you will use your username & password from Gazelle Test Management for your testing event. There are separate CAS systems for different instances of Gazelle Test Management, and you will have to take this into account when logging in to GSS:
On the GSS home page (http://gazelle.ihe.net/gss) find the "Login" link at the upper right of the page.
1. In GSS, select menu Audit Trail --> ATNA Questionnaires
2. First, search for any existing questionnaires for your organization. Use the filters at the top of the page to search based on various criteria. You will only be able to access the questionnaires created for your organization's test systems. Admins and monitors can access all of them.
3. You can use the icons in the right 'Actions' column to:
4. If no questionnaire is available for your test system, you need to create a new one.
5. Complete the questionnaire. You are now in the ATNA Questionnaire Editor.
6. Mark your questionnaire "Ready for review"
Depending on the testing event, the results of this test may be reviewed in advance. More typically, it will be reviewed and graded by a Monitor during the test event itself (e.g. during Connectathon week).
Note: You cannot get connectathon credit (i.e. a "Pass") for your ATNA Secure Node/Application without completing and submitting your questionnaire.
(1) Read the ATNA Testing Resources page before proceeding with this test.
(2) To perform this test, your digital certificate must be set up on your system (server and/or client). Follow the instructions in test 11100 to obtain digital certificate(s) for your test system(s).
(3) You should create your ATNA Questionnaire (test 11106) prior to running this test.
In this test, you will use the Gazelle Security Suite (GSS) tool (https://gazelle.ihe.net/gss) to verify that you are able to communicate with TLS clients and servers using digital certificates.
The GSS tool contains multiple client and server simulators that check:
The TLS simulators available in the GSS tool are listed in Column 1 in the following table, along with notes on which you should use for this test:
| Simulator name (keyword) | To be tested by... | Simulator configuration |
|---|---|---|
| Server DICOM TLS 1.2 Floor; Server HL7 TLS 1.2 Floor; Server HTTPS/WS TLS 1.2 Floor; Server Syslog TLS 1.2 Floor | Connectathon test system that supports the "STX: TLS 1.2 Floor" option and is a client that initiates a TLS connection with the DICOM protocol, with the MLLP protocol (i.e. HL7 v2 sender), for a webservices transaction, or to send an audit message over TLS syslog | TLS 1.2 with 4 'strong' ciphers. You may test with just one of the ciphers. |
| Server RAW TLS 1.2 INVALID FQDN | Connectathon test system that is a client supporting the "FQDN Validation of Server Certificate" option | TLS 1.2 with 4 'strong' ciphers; see list above. Certificate has an invalid value for subjectAltName. |
| Client TLS 1.2 Floor | Connectathon test system that supports the "STX: TLS 1.2 Floor" option and is a server that accepts a TLS connection with the DICOM protocol, with the MLLP protocol (i.e. HL7 v2 responder), for a webservices transaction, or to receive an audit message over TLS syslog | TLS 1.2 with 4 'strong' ciphers; see list above. |
When logging in to GSS, you will use your username & password from Gazelle Test Management for your testing event. There are separate CAS systems for different instances of Gazelle Test Management, and you will have to take this into account when logging in to GSS:
On the GSS home page (http://gazelle.ihe.net/gss) find the "Login" link at the upper right of the page.
If your test system (SUT) does not act as a client (i.e., does not initiate any transactions), then skip this portion of the test and only perform the Server test below.
If your SUT acts as a client, you must be able to access the TLS servers' public IPs. Test your client by connecting to the Server Simulators in the Gazelle Security Suite tool.
1. On the home page for the Gazelle Security Suite, select menu TLS/SSL-->Simulators-->Servers to find the list of server simulators. There are servers for different protocols (DICOM, HL7...) and for different ATNA options (e.g., TLS 1.2 Floor...).
2. Configure your client to connect to the test TLS server.
3. Check that the server is started before trying to connect to it. Click on the link for the server you want and look for status "Running"
4. In your SUT, perform a connection (e.g., send a query) to the test server. The TLS connection is valid, but you will get invalid replies at the transaction level because only the TLS connection is being checked.
5. You should then get a time-stamped entry in the results list at the bottom of the page. A blue dot means OK; red means NOT OK.
6. For each successful connection, view the result with the icon in the "Action" column. Copy the Permanent link (URL) to the result into your ATNA Questionnaire, on the "TLS Tests" tab. The link must be formatted like https://.../connection.seam?id=...
7. Repeat these steps for each supported protocol (HL7v2, DICOM, Syslog server, ...): e.g., if your system has no DICOM capabilities, you can skip that portion of the test.
If your test system (SUT) does not act as a server (i.e., does not respond to any transactions initiated by others), then skip this portion of the test and only perform the Client test above.
If your SUT acts as a server (i.e. a responder to IHE transactions), your server must be accessible from the outside so that the GSS tool, as a client simulator, can connect to your SUT.
1. On the home page for the Gazelle Security Suite, select menu TLS/SSL-->Simulators-->Clients to find the list of client simulators.
2. In the "Start Connection" section of the page, you will have to specify, for each supported protocol :
3. Then click on "Start client".
4. You should then get a time-stamped entry in the results list. Blue means OK; red means NOT OK.
5. For each successful connection, view the result at the bottom of the page using the icon in the "Actions" column. Copy the Permanent Link (URL) to the result into your ATNA Questionnaire, on the "TLS Tests" tab. The link must be formatted like https://.../connection.seam?id=...
6. Repeat these steps for each supported protocol (HL7v2, DICOM, Syslog client, ...): e.g., if your system has no DICOM capabilities, you can skip that portion of the test.
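Before asking a client simulator to connect, you may want to sanity-check your own server's TLS 1.2 configuration. A sketch using OpenSSL is shown below; the host, port, and file names are placeholders, and the CA file must be the CA certificate issued by GSS for your testing event:

```sh
# Attempt a TLS 1.2 handshake against your server, presenting your Connectathon certificate.
# A completed handshake (certificate chain and session details printed) is what matters here;
# the application-level protocol (DICOM, MLLP, HTTP, syslog) is not exercised by this check.
openssl s_client -connect myserver.example.org:8443 -tls1_2 \
  -cert my_connectathon_cert.pem -key my_connectathon_key.pem \
  -CAfile gazelle_ca.pem
```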
Depending on the testing event, the results of this test may be reviewed in advance. More typically, it will be reviewed and graded by a Monitor during the test event itself (e.g. during Connectathon week).
The tool reports success or failure for each test you perform. Your test system must demonstrate successful TLS handshake for each inbound and outbound protocol you support.
If you are performing this test in preparation for an IHE Connectathon, a Connectathon monitor will verify your results as follows. The monitor will:
*** If your ATNA Secure Node/Secure Application is only a client (i.e., it only initiates transactions), then this test case is not applicable for you. Skip it. ***
This test exercises several error cases. You will use the TLS Tool in the Gazelle Security Suite as a simulated client, trying to connect to a Secure Node (SN) or Secure Application (SA) acting as a server.
Perform test 11109 Authentication Test before running this 'error cases' test.
When logging in to GSS, you will use your username & password from Gazelle Test Management for your testing event. There are separate CAS systems for different instances of Gazelle Test Management, and you will have to take this into account when logging in to GSS:
On the GSS home page (http://gazelle.ihe.net/gss) find the "Login" link at the upper right of the page.
Depending on the testing event, the results of this test may be reviewed in advance. More typically, it will be reviewed and graded by a Monitor during the test event itself (e.g. during Connectathon week).
Each error case must have a result of 'PASSED'.
Each transport type (HL7v2, DICOM, HL7, DICOM_ECHO, WEBSERVICE, SYSLOG, or RAW) implemented by your system as a server must have been tested at least once in the list of error cases.
If you are performing this test in preparation for a Connectathon, a Connectathon monitor will verify your results pasted into each test step.
This test applies to a Secure Node/Application that supports the ATX: TLS Syslog or ATX: UDP Syslog Option.
In this test, a Secure Node or Secure Application tests audit messages it sends.
The Gazelle Security Suite tool provides the ability to validate audit messages against the DICOM schema and the audit message definitions for many transactions in IHE Technical Frameworks. (We are no longer testing the RFC 3881 schema; the ATNA profile requires support for the DICOM schema for syslog audit messages sent via ITI-20.)
When logging in to GSS, you will use your username & password from Gazelle Test Management for your testing event. There are separate CAS systems for different instances of Gazelle Test Management, and you will have to take this into account when logging in to GSS:
On the GSS home page (http://gazelle.ihe.net/gss) find the "Login" link at the upper right of the page.
You may perform this test directly in the ATNA Questionnaire **or** you may use the Gazelle EVSClient tool. If you are preparing for an IHE Connectathon, you should use the instructions below for the ATNA Questionnaire.
---->Instructions for checking audit messages using the ATNA Questionnaire:
---->Instructions for checking audit messages using the EVSClient tool:
Depending on the testing event, the results of this test may be reviewed in advance. More typically, it will be reviewed and graded by a Monitor during the test event itself (e.g. during Connectathon week).
The tool reports the results of the validation of your messages. We are looking for PASSED results.
In this test, a client sends audit records or event reports using transaction [ITI-20] Record Audit Event to the Syslog Collector tool acting as an Audit Record Repository or Event Repository. The Syslog Collector is one of the tools embedded in the Gazelle Security Suite.
This test is performed by an ATNA Secure Node, Secure Application or Audit Record Forwarder. It is also performed by a SOLE Event Reporter.
Note that this test checks the transport of audit messages. The content of your audit message is verified in a different test.
When logging in to GSS, you will use your username & password from Gazelle Test Management for your testing event. There are separate CAS systems for different instances of Gazelle Test Management, and you will have to take this into account when logging in to GSS:
On the GSS home page (http://gazelle.ihe.net/gss) find the "Login" link at the upper right of the page.
You must check that your audit message has been received by the Syslog Collector and that the protocol SYSLOG is correctly implemented.
TCP Syslog uses the same framing requirements as TLS Syslog. You can first use the TCP port of the Syslog Collector to debug your implementation. Keep in mind that the IHE ATNA Profile expects at least UDP or TLS for actors that produce SYSLOG messages.
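As an illustration of the framing: the TLS transport (RFC 5425) and the octet-counting variant of the TCP transport (RFC 6587) both prefix each syslog message with its length in octets followed by a space. A sketch follows; the length and message content are illustrative only, not a valid ATNA audit record:

```text
# Frame format: MSG-LEN SP SYSLOG-MSG
170 <85>1 2024-05-01T10:30:00.000Z mynode.example.org mysecureapp - - - <?xml version="1.0"?><AuditMessage>...</AuditMessage>
```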
This test applies to a Secure Node/Application that supports the ATX: FHIR Feed Option.
The RESTful ATNA TI Supplement, Section 3.20.4.2.2.1, defines a mapping between DICOM Audit Messages and FHIR AuditEvent Resources. Implementers should create their AuditEvent Resources according to the defined mappings, and expect that they will be examined according to those mappings at IHE Connectathons.
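For orientation only, a minimal illustrative FHIR R4 AuditEvent is sketched below; the codes, names, and system identifiers are placeholders, and your Resources must follow the mappings in the supplement for the actual events you record:

```json
{
  "resourceType": "AuditEvent",
  "type": { "system": "http://dicom.nema.org/resources/ontology/DCM",
            "code": "110114", "display": "User Authentication" },
  "action": "E",
  "recorded": "2024-05-01T10:30:00Z",
  "outcome": "0",
  "agent": [ { "requestor": true, "who": { "display": "alice@example.org" } } ],
  "source": { "observer": { "display": "my-secure-node" } }
}
```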
---->Instructions for checking additional constraints on AuditEvent Resources (mapping defined in ITI TF-2b: 3.20.4.2.2.1):
Depending on the testing event, the results of this test may be reviewed in advance. More typically, it will be reviewed and graded by a Monitor during the test event itself (e.g. during Connectathon week).
The tool reports the results of the validation of your Resources. We are looking for PASSED results.
Pre-Connectathon testing for systems implementing the CPI (Consistent Presentation of Images) Profile is performed using the DICOMScope tool and associated test plans developed by OFFIS. This tool is also used by Image Display actors in the MAMMO, DBT and SMI profiles.
Find links to specific instructions for each actor below.
Image Displays in some profiles are required to be able to calibrate a monitor according to the DICOM Grayscale Standard Display Function (GSDF):
Ideally, the monitor calibration would be done during the Connectathon; however, we recognize that there are costs associated with shipping a monitor to the Connectathon.
(1) If you are testing an Image Display in the CPI profile but not MAMMO, DBT, or SMI, as an alternative to shipping a monitor to the Connectathon you may:
(2) If you are an Image Display actor in the MAMMO, SMI or DBT profile:
There is no evaluation for this informational 'test'. You will share your calibration results in the DICOMScope test.
Image Displays in some profiles are required to be able to calibrate a monitor according to the DICOM Grayscale Standard Display Function (GSDF):
We use software and test plans developed by OFFIS to test these actors.
Please refer to test Hardware_Rqmts_ImageDisplay.
Please upload the completed gsdf_lum.xls file produced in the DICOMScope test procedure into Gazelle Test Management as the results for this test.
Actors in several profiles are required to implement the [RAD-23] “Print Request with Presentation LUT” transaction as an SCU:
We use software and test plans developed by OFFIS to test these actors.
Capture and submit your results:
Actors in several profiles are required to implement the [RAD-23] “Print Request with Presentation LUT” transaction as an SCP:
We use test plans developed by OFFIS to test these actors. The goal of the Print calibration tests is to demonstrate that printed output conforms to the GSDF.
Capture and submit your results:
This section contains test cases that contain instructions or other background material that prepares a test system for interoperability testing at an IHE Connectathon.
Ideally, the preparation described in these tests should be completed BEFORE the Connectathon begins.
This is an index of Do This First and Read This First tests that apply to testing across multiple domains
This test applies to test systems that have implemented one or more IHE Profiles based on HL7® FHIR® or FHIRcast®.
IHE publishes CapabilityStatements aligned with profile requirements on the Artifacts page of the FHIR Implementation Guide (IG) for that profile (e.g. for the IHE ITI PIXm profile, see https://profiles.ihe.net/ITI/PIXm/artifacts.html).
==> During the Connectathon Preparatory phase: You will create a FHIR or FHIRcast CapabilityStatement Resource that represents the FHIR capabilities in your system/product, i.e. CapabilityStatement.kind has value "instance". You will upload it as a sample into Gazelle Test Management. Finally, you will use Gazelle External Validation Service (EVS) to validate your CapabilityStatement.
==> Later during Connectathon:
Reference: IHE (ITI) Appendix Z on HL7 FHIR, Section Z.3: "HL7 FHIR allows service implementers to publish a CapabilityStatement Resource describing the resources, transport, formats, and operations that can be performed on a series of resources for the service instance. The CapabilityStatement Resource is described in FHIR: http://hl7.org/fhir/CapabilityStatement.html. Actors providing http server functionality shall publish a CapabilityStatement on the metadata endpoint as described in FHIR http://hl7.org/fhir/R4/http.html#capabilities."
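As a sketch of what an instance-kind CapabilityStatement might look like for a server supporting a single resource type (all names, URLs, and interactions below are placeholders; your statement must describe your actual system and satisfy the profile's requirements):

```json
{
  "resourceType": "CapabilityStatement",
  "status": "active",
  "date": "2024-05-01",
  "kind": "instance",
  "implementation": {
    "description": "Acme PIXm Manager - Connectathon instance",
    "url": "https://acme.example.org/fhir"
  },
  "fhirVersion": "4.0.1",
  "format": [ "application/fhir+json" ],
  "rest": [ {
    "mode": "server",
    "resource": [ {
      "type": "Patient",
      "interaction": [ { "code": "read" }, { "code": "search-type" } ]
    } ]
  } ]
}
```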
First, create a Sample entry in Gazelle Test Management for your CapabilityStatement:
Second, validate your CapabilityStatement using Gazelle EVSClient:
NOTE: You will be asked to provide this CapabilityStatement during Connectathon, and Monitors will examine it then, so it is to your benefit to do this preparation in advance of Connectathon.
To enable interoperability testing at an IHE Connectathon, some actors require OIDs to be assigned for various attributes in the messages they will exchange. Gazelle Test Management assigns these OID values.
For example:
Once you find the OIDs you need, configure your Test System in advance so that you are ready to start testing at the beginning of the Connectathon.
There is no result file to upload into Gazelle Test Management for this test.
This is an index of Do This First and Read This First tests defined for profiles in the IHE IT Infrastructure (ITI) domain.
This informational 'test' provides an overview of APPC tests and defines value sets used when creating Patient Privacy Policy Consent documents during Connectathon. Please read what follows and complete the preparation described before the Connectathon.
The APPC profile enables creating Patient Privacy Policy Consent documents in many variations. We have created test cases that cover some of the common use cases in the APPC profile for consent documents that would commonly be created for patients in health care facilities, e.g., a consent to disclose a patient's data to a facility, or to restrict disclosure of data from a specific individual provider.
Content Creators:
Content Consumers:
The APPC Profile does not mandate how Consent documents get from a Content Creator to a Content Consumer. It could be XDS, another IHE profile, or a non-IHE method.
At the Connectathon, we ask Content Creators to upload the Patient Privacy Policy Consent documents they create into the Samples area of Gazelle (menu Connectathon-->Connectathon-->List of samples, on the 'Samples to share' tab). Content Consumers will find uploaded samples under the same menu on the 'Samples available for rendering' tab.
Each APPC Patient Privacy Policy Consent document applies to a single PATIENT. In a consent document, the patient ID and assigning authority values are encoded with the AttributeId urn:ihe:iti:ser:2016:patient-id. A patient may have more than one privacy policy consent.
We do not identify a single 'test patient' that must be used for creating APPC documents for Connectathon testing. The Content Creator may include any valid Patient ID. If the policy restricts or allows access based on values in the XDS metadata for a patient's documents, the Content Creator may use the Patient ID of the clinical document(s) the policy applies to.
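As an illustration of how the patient might be identified in a policy set's target, the sketch below shows one way an XACML Match on that AttributeId could look. The CX-encoded patient ID and assigning authority OID are placeholders; consult the APPC profile and your chosen use case for the exact structure:

```xml
<!-- Illustrative only: match on the patient identifier attribute -->
<Match MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
  <AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">IHERED-1024^^^&amp;1.3.6.1.4.1.21367.13.20.1000&amp;ISO</AttributeValue>
  <AttributeDesignator AttributeId="urn:ihe:iti:ser:2016:patient-id"
                       Category="urn:oasis:names:tc:xacml:3.0:attribute-category:resource"
                       DataType="http://www.w3.org/2001/XMLSchema#string"
                       MustBePresent="false"/>
</Match>
```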
APPC Patient Privacy Policy Consent documents rely on the affinity domain agreeing to a set of PATIENT PRIVACY POLICIES that apply to ORGANIZATIONS and INDIVIDUAL PROVIDERS. These policies, organizations, providers are identified by unique identifiers that are recognized within the affinity domain, and are encoded in a patient's consent document.
To enable the APPC Content Creator to specify different policies, this test defines values for various attributes used in policies:
The tables below contain value sets that are defined for the purpose of Connectathon testing of the APPC profile.
POLICIES FOR CONNECTATHON TESTING OF APPC:
| Policy (APPC use case) | PolicySetIdReference | Policy description |
|---|---|---|
| FOUNDATIONAL POLICY (all use cases) | urn:connectathon:bppc:foundational:policy | By default, in this Connectathon affinity domain, document sharing is based on the value of Confidentiality Code (DocumentEntry.confidentialityCode). This policy (inherited from BPPC) is applied to all documents. A patient may also choose to apply one of the additional policies below. |
| FULL ACCESS TO ALL | urn:connectathon:policy:full-access | The patient agrees that the document may always be shared. (This is equivalent to having a confidentiality code of "U".) |
| DENY ACCESS TO ALL | urn:connectathon:policy:deny-access | The patient prohibits the document from ever being shared. (This is equivalent to having a confidentiality code of "V".) |
| DENY ACCESS EXCEPT TO PROVIDER | urn:connectathon:policy:deny-access-except-to-provider | The patient prohibits the document from being shared except with the provider(s) identified in the consent document. |
| DENY ACCESS TO PROVIDER | urn:connectathon:policy:deny-access-to-provider | The patient prohibits the document from being shared with the provider(s) identified in the consent document. The referenced individual provider(s) is prohibited from accessing this patient's documents (i.e. no read or write access). |
| DENY ACCESS EXCEPT TO FACILITY | urn:connectathon:policy:deny-access-except-to-facility | The patient prohibits the document from being shared except with the facility(ies) identified in the consent document. |
| DENY TO ROLE | urn:connectathon:policy:deny-access-to-role | The patient prohibits the document from being shared with providers who have the role(s) identified in the consent document. |
| FULL ACCESS TO ROLE | urn:connectathon:policy:full-access-to-role | The patient allows the document to be shared with providers who have the role(s) identified in the consent document. The patient prohibits the document from being shared with providers with any other role(s). |
| LIMIT DOCUMENT VISIBILITY (use case 6) | 1.3.6.1.4.1.21367.2017.7.104 | The patient prohibits sharing the referenced clinical document(s) and this privacy policy consent document with any healthcare provider or facility. |
ORGANIZATIONS/FACILITIES defined in the "Connectathon Domain":
| Facility | XACML AttributeId: urn:ihe:iti:appc:2016:document-entry:healthcare-facility-type-code (DocumentEntry.healthcareFacilityTypeCode) | XACML AttributeId: urn:oasis:names:tc:xspa:1.0:subject:organization-id |
|---|---|---|
| Connectathon Radiology Facility for IDN One | code="Fac-A" displayName="Caregiver Office" codeSystem="1.3.6.1.4.1.21367.2017.3" | urn:uuid:e9964293-e169-4298-b4d0-ab07bf0cd78f |
| Connectathon Radiology Facility for NGO Two | code="Fac-A" displayName="Caregiver Office" codeSystem="1.3.6.1.4.1.21367.2017.3" | urn:uuid:e9964293-e169-4298-b4d0-ab07bf0cd12c |
| Connectathon Dialysis Facility One | code="Fac-B" displayName="Outpatient Services" codeSystem="1.3.6.1.4.1.21367.2017.3" | urn:uuid:a3eb03db-0094-4059-9156-8de081cb5885 |
| Connectathon Dialysis Facility Two | code="Fac-B" displayName="Outpatient Services" codeSystem="1.3.6.1.4.1.21367.2017.3" | urn:uuid:be4d27c3-21b8-481f-9fed-6524a8eb9bac |
INDIVIDUAL HEALTHCARE PROVIDERS defined in the "Connectathon Domain":
| Provider | XACML AttributeId: urn:oasis:names:tc:xacml:1.0:subject:subject-id | XACML AttributeId: urn:oasis:names:tc:xspa:1.0:subject:npi | XACML AttributeId: urn:oasis:names:tc:xacml:2.0:subject:role |
|---|---|---|---|
| Dev Banargee | devbanargee | urn:uuid:a97b9397-ce4e-4a57-b12a-0d46ce6f36b7 | code="105-007" |
| Carla Carrara | carlacarrara | urn:uuid:d973d698-5b43-4340-acc9-de48d0acb376 | code="105-114" |
| Jack Johnson | jackjohnson | urn:uuid:4384c07a-86e2-40da-939b-5f7a04a73715 | code="105-114" displayName="Radiology Technician" codeSystem="1.3.6.1.4.1.21367.100.1" |
| Mary McDonald | marymcdonald | urn:uuid:9a879858-8e96-486b-a2be-05a580f0e6ee | code="105-007" displayName="Physician/Medical Oncology" codeSystem="1.3.6.1.4.1.21367.100.1" |
| Robert Robertson | robertrobertson | urn:uuid:b6553152-7a90-4940-8d6a-b1017310a159 | code="105-007" displayName="Physician/Medical Oncology" codeSystem="1.3.6.1.4.1.21367.100.1" |
| William Williamson | williamwilliamson | urn:uuid:51f3fdbe-ed30-4d55-b7f8-50955c86b2cf | code="105-003" displayName="Nurse Practitioner" codeSystem="1.3.6.1.4.1.21367.100.1" |
XDS Document Sources:
| Source Id | XACML AttributeId: urn:ihe:iti:appc:2016:source-system-id (SubmissionSet.sourceId) |
|---|---|
| Use the sourceId as assigned in Gazelle to Connectathon XDS Document Sources | Various XDS Document Source systems |
CONFIDENTIALITY CODES:
| ConfidentialityCode | XACML AttributeId: urn:ihe:iti:appc:2016:confidentiality-code (DocumentEntry.confidentialityCode) |
|---|---|
| normal | code="N" displayName="normal" codeSystem="2.16.840.1.113883.5.25" |
| restricted | code="R" displayName="restricted" codeSystem="2.16.840.1.113883.5.25" |
| very restricted | code="V" displayName="very restricted" codeSystem="2.16.840.1.113883.5.25" |
| unrestricted | code="U" displayName="unrestricted" codeSystem="2.16.840.1.113883.5.25" |
PURPOSE OF USE:
| Purpose of use | XACML AttributeId: urn:oasis:names:tc:xspa:1.0:subject:purposeofuse |
|---|---|
| TREATMENT | code="99-101" displayName="TREATMENT" codeSystem="1.3.6.1.4.1.21367.3000.4.1" |
| EMERGENCY | code="99-102" displayName="EMERGENCY" codeSystem="1.3.6.1.4.1.21367.3000.4.1" |
| PUBLICHEALTH | code="99-103" displayName="PUBLICHEALTH" codeSystem="1.3.6.1.4.1.21367.3000.4.1" |
| RESEARCH | code="99-104" displayName="RESEARCH" codeSystem="1.3.6.1.4.1.21367.3000.4.1" |
There is no evaluation for this informational test. If the systems testing the APPC Profile do not do the set-up described above, then APPC tests at Connectathon will not work.
This is an informational 'test'. We want all actors involved in testing the BPPC Profile and the BPPC Enforcement Option to read the "Privacy Policy Definition for IHE Connectathon Testing".
Prior to arriving at the Connectathon, read this document: Privacy Policy Definition for IHE Connectathon Testing. This contains the policy for XDS Affinity Domains at the Connectathon, including 2 BPPC-related items.
There is no evaluation for this informational test. If the systems do not do the set-up described above, then BPPC Profile tests and BPPC Enforcement Options tests at Connectathon will not work.
This is a "task" (ie not a test) that ensures that your '''CSD Care Services Directory''' is loaded with the entries that we will use as part of Connectathon testing.
The Care Services Directory is loaded with Connectathon test data: (1) Codes, and (2) Organization, Provider, Facility, and Service information.
(1) Load Connectathon code sets:
ITI TF-1:35.1.1.1 states, "Implementing jurisdictions may mandate code sets for Organization Type, Service Type, Facility Type, Facility Status, Provider Type, Provider Status, Contact Point Type, Credential Type, Specialization Code, and language code. A Care Services Directory actors shall be configurable to use these codes, where mandated."
For Connectathon testing, we define these codes and ask that you load them onto your Directory prior to arriving at the Connectathon. They are documented in the format defined in IHE's SVS (Sharing Value Sets) profile, though support for SVS is not mandated in IHE.
The code sets are found here in Google Drive under IHE Documents > Connectathon > test_data > ITI-profiles > CSD-test-data > CSD_Directory_Codes. (They are also available in the SVS Simulator: http://gazelle.ihe.net/SVSSimulator/browser/valueSetBrowser.seam)
(2) Load Connectathon Organization, Provider, Facility, and Services entries
In order to perform query testing with predictable results, Care Services Directories must be populated with the entries in the following files here in Google Drive under IHE Documents > Connectathon > test_data > ITI-profiles > CSD-test-data > CSD_Directory_Entries.
Some directories may support only a subset of these entry types:
(3) Additional Organization, Provider, Facility, and Services entries
The Connectathon entries are limited in scope. We expect Directories to be populated with additional Organization, Provider, Facility & Service entries. We give no specific guidance on the number of entries, but we are looking for a more realistic database. Good entries offer better testing opportunities.
Create a short text file saying that you have finished loading your codes. Upload that text file into Gazelle Test Management as the results for this 'test'. That is your signal to us that you are ready for Connectathon testing.
At the Connectathon, the HPD tests assume that a pre-defined set of Organizational and Individual provider information has been loaded on all of the Provider Information Directory actors under test.
There are no result files to upload into Gazelle Test Management for this test. Preloading these prior to the Connectathon is intended to save you precious time during Connectathon week.
Attachment | Size |
---|---|
HPD_test_providers.xls | 51.5 KB |
This is not an actual "test". Rather it is a task that ensures that the mCSD Care Services Selective Supplier is loaded with the Resources and value sets that we will use as part of Connectathon testing.
The instructions below apply to mCSD Supplier systems. (The mCSD Consumer actor is included on this test so that it is aware of this mCSD test data, but it has no preload work to do. During Connectathon, the Consumer will be performing queries based on the content of these Resources.)
(1) Connectathon FHIR Resources
In order to perform query testing with predictable results, the Care Services Selective Supplier system must be populated with the entries from pre-defined FHIR Resources:
**Some Suppliers may support only a subset of these.**
These resources are available in two places (the test data is the same in both places, so you only need to access/load one set):
(2) Additional Resources
The pre-defined Connectathon test data are limited in scope. We expect Suppliers to be populated with additional Organization, Provider, Facility & Service Resources. We give no specific guidance on the number of Resources, but we are looking for a more realistic database. Good entries offer better testing opportunities.
(3) Value Sets for some codes:
The FHIR Resources for mCSD testing contain codes from some pre-defined ValueSet Resources.
These ValueSets are also found in Github and on the FHIR Read/Write Server at the links above.
code | FHIR ValueSet Resource id |
Organization Type | IHE-CAT-mCSD-organizationTypeCode |
Service Type | IHE-CAT-mCSD-serviceTypeCode |
Facility Type | IHE-CAT-mCSD-facilityTypeCode |
Facility Status | IHE-CAT-mCSD-facilityStatusCode |
Provider Type | IHE-CAT-mCSD-providerTypeCode |
Provider Status | IHE-CAT-mCSD-providerStatusCode |
Credential Type | IHE-CAT-mCSD-credentialTypeCode |
Specialty Type | IHE-CAT-mCSD-specializationCode |
Language | languages |
Provider Qualification | v2-2.7-0360 |
The mCSD Resources also contain codes from these FHIR ValueSets:
There are no result files to upload into Gazelle Test Management for this test. Preloading these prior to the Connectathon is intended to save you precious time during Connectathon week.
This is not an actual "test". The Instructions section below describe the testing approach for the mXDE Profile. It provides context and preparation information prior to performing Connectathon tests with your test partners.
Please read the following material prior to performing Connectathon tests for the mXDE Profile.
Overall Assumptions:
(1) There is a significant overlap between the mXDE and QEDm profiles. Each mXDE actor must be grouped with its QEDm actor counterpart. Thus, you should successfully complete tests for QEDm before attempting mXDE tests.
(2) The mXDE Profile refers to extracting data from documents but does not specify the document types. For purpose of Connectathon testing, we will provide and enforce use of specific patients and specific documents. We intend to use the same clinical test data for both QEDm and mXDE tests. See details about test patients and documents in the QEDm ReadThisFirst and DoThisFirst tests.
(3) The mXDE Data Element Extractor actor is grouped with an XDS Document Registry and Repository or an MHD Document Responder.
(4) The tests reference several patients identified in QEDm: Read_This_First. These same patients are used for mXDE tests. The Data Element Extractor may choose to reference the patients on the Connectathon FHIR Read/Write Server or may import the Patient Resources and host them locally.
(5) The Provenance Resource is required to contain a reference to the device that performed the extraction. Because the device is under control of the Data Element Extractor, the Data Element Extractor will be required to host the appropriate Device Resource. You are welcome to use multiple devices as long as the appropriate Device resources exist. (See QEDm Vol 2, Sec 3.44.4.2.2.1).
(6) The QEDm Profile says the Provenance Resource created by the mXDE Data Element Extractor shall have a [1..1] entity element which points to the document from which the data was extracted. (A sketch of such a Provenance Resource follows this list.)
(7) During the Connectathon, we want you to execute mXDE tests using the Gazelle Proxy. That will simplify the process of collecting transaction data for monitor review.
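To illustrate points (5) and (6) above, a Provenance Resource created by the Data Element Extractor might look roughly like the sketch below. All resource ids and references are placeholders; check QEDm Vol 2, Sec 3.44.4.2.2.1 for the authoritative requirements:

```json
{
  "resourceType": "Provenance",
  "target": [ { "reference": "Observation/extracted-obs-1" } ],
  "recorded": "2024-05-01T10:30:00Z",
  "agent": [ { "who": { "reference": "Device/my-extractor-device" } } ],
  "entity": [ {
    "role": "source",
    "what": { "reference": "DocumentReference/source-document-1" }
  } ]
}
```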
mXDE Data Element Extractor actor:
Overall mXDE test workflow:
(1) Create one or more Device Resources in your server (to be referenced by Provenance Resources you will create).
(2) Import the required test patients or configure your system to reference the test Patient Resources on the FHIR Read/Write Server.
(3) Repeat this loop:
mXDE Data Element Provenance Consumer actor:
Overall mXDE test workflow:
(1) Configure your system with the endpoint of the Data Element Extractor partner.
(2) Repeat this loop for each data element supported (Observation, Medication, ...); some of the items might occur in a different order based on your software implementation:
There are no result files to upload into Gazelle Test Management for this test. Understanding the testing approach in advance is intended to make testing during Connectathon week more efficient.
On this page:
These instructions apply to the Patient Identity Registry actor in the Patient Master Identity Registry (PMIR) Profile. In this test, a PMIR Registry will load its database with Patient Resources formatted for [ITI-93] Mobile Patient Identity Feed, to support subscription tests that will occur later, during the Connectathon.
Read this section for background information about test patients used for PMIR testing at the Connectathon. Otherwise, for instructions for loading test patients, skip to the Instructions below.
In a normal deployment, a product is operating in an environment with a policy for patient identity creation and sharing that remains stable.
However, at the Connectathon, we test multiple profiles (for patient management: PIX, PDQ, PMIR... for document sharing: XDS, MHD...). Thus, the Connectathon provides special challenges when requirements for actors differ across IHE Profiles. Particularly relevant in PMIR and PIXm is the behavior of the server actor (the PMIR Patient Identity Registry & the PIXm Cross-Reference Manager).
A PIXm Patient Identifier Cross-Reference Manager:
Source of Patients in the PIXm Profile: The PIX Manager has many patient records, and a single patient (person) might have several records on the PIX Manager server that are cross-referenced because they apply to the same patient. The Patient Identity Feed [ITI-104] transaction in PIXm was introduced in PIXm Rev 3.0.2 in March 2022. The PIX Manager may also have other sources of patient information (e.g., an HL7 v2 or v3 feed).
At the Connectathon, we ask PIX Managers to preload “Connectathon Patient Demographics” that are provided via the Gazelle Patient Manager tool (in HL7v2, v3, or FHIR Patient Resource format). These Connectathon Patient Demographics contain four Patient records for each ‘patient’, each with identical demographics (name, address, DOB), but with a different Patient.identifier (with system values representing the IHERED, IHEGREEN, IHEBLUE, and IHEFACILITY assigning authority values). We expect that the PIX Manager will cross-reference these multiple records for a single patient since the demographics are the same.
QUERY: When a PIXm Consumer sends a PIXm Query [ITI-83] to the PIXm Manager with a sourceIdentifier representing the assigning authority and patient ID (e.g. urn:oid:1.3.6.1.4.1.21367.3000.1.6|IHEFACILITY-998), the PIXm Manager would respond with [0..*] targetId(s) which contain a Reference to a Patient Resource (one reference for each matching Patient Resource on the server).
At the Connectathon, if a PIXm Consumer queries with a Patient ID in the IHEFACILITY domain (as in the example above) and there is a match, the PIXm Manager would return a response with three matches, one each for RED, GREEN, and BLUE, e.g.:
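For example, the exchange might look roughly like the sketch below. The server base URL and Patient resource ids are placeholders; the sourceIdentifier value is the one from the example above, and the response is a Parameters resource with one targetId per matching Patient (depending on the query, targetIdentifier parameters may also be returned):

```text
GET [base]/Patient/$ihe-pix?sourceIdentifier=urn:oid:1.3.6.1.4.1.21367.3000.1.6|IHEFACILITY-998

HTTP/1.1 200 OK
Content-Type: application/fhir+json

{
  "resourceType": "Parameters",
  "parameter": [
    { "name": "targetId", "valueReference": { "reference": "https://example.org/fhir/Patient/ihe-red-42" } },
    { "name": "targetId", "valueReference": { "reference": "https://example.org/fhir/Patient/ihe-green-42" } },
    { "name": "targetId", "valueReference": { "reference": "https://example.org/fhir/Patient/ihe-blue-42" } }
  ]
}
```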
A PMIR Patient Identity Registry:
Source of Patients in the PMIR Profile: The PMIR Profile is intended for use in an environment where each patient has a single “Golden Patient record”. In PMIR, a patient has a single “Patient Master Identity” (a.k.a. Golden Patient record) that is comprised of identifying information, such as business identifiers, name, phone, gender, birth date, address, marital status, photo, contacts, preference for language, and links to other patient identities (e.g. a mother’s identity linked to a newborn).
The PMIR Patient Identity Source actor sends the Mobile Patient Identity Feed [ITI-93] transaction to the PMIR Patient Identity Registry to create, update, merge, and delete Patient Resources.
The PMIR Registry persists one "Patient identity" per patient. The PMIR Registry relies on the Patient Identity Source actor as the source of truth about new patient records (FHIR Patient Resources), and about updates/merges/deletes of existing patient records (i.e. the Registry does whatever the Source asks). The PMIR Registry does not have algorithms to 'smartly' cross-reference multiple/separate records for a patient.
In the FHIR Patient Resource in PMIR, there are two attributes that hold identifiers for a patient:
At the Connectathon, because some systems support PIX/PIXv3/PIXm and PMIR, we provide separate Patients (in [ITI-93] format) for PMIR testing with identifiers in a single assigning authority - urn:oid:1.3.6.1.4.1.21367.13.20.4000 (aka IHEGOLD). PMIR Registry systems will preload these patients, and they are only used for PMIR tests. These patients have different demographics than the traditional red/green/blue Connectathon patients used for PIX, PDQ, XDS, and XCA.
QUERY: When a PMIR Patient Identifier Cross-Reference Consumer sends a PIXm Query [ITI-83] to the PMIR Registry with a sourceIdentifier representing the assigning authority and patient ID (e.g. urn:oid:1.3.6.1.4.1.21367.13.20.4000|IHEGOLD-555), if there is a match, the PMIR Registry would return a response with the one (matching) Patient Resource (the 'Golden' patient record).
At the Connectathon, if a Patient Identifier Cross-Reference Consumer sends a PIXm Query with a Patient ID in the GOLD domain and there is a match, the PMIR Registry would return a response with a reference to one Patient Resource.
In conclusion, using the RED/GREEN/BLUE Patients for PIX* testing and the GOLD Patients for PMIR testing enables us to separate expected results that differ depending on whether a server is a PIX* Patient Identifier Cross-reference Manager or a PMIR Patient Identity Registry in a given test. We have managed testing expectations by using patients in different domains for testing the two profiles, but we do not tell you how to manage this in your product if you support both PMIR and PIX.
In this test, the PMIR Patient Identity Registry loads a set of FHIR Patient Resources used in PMIR peer-to-peer subscription tests. We use a set of FHIR Patient Resources for PMIR testing that are different than the Connectathon patients for testing the PIX* & PDQ* Profiles.
The patient test data is a Bundle formatted according to the requirements for a Mobile Patient Identity Feed [ITI-93] transaction.
The PMIR Patient Identity Registry should follow these steps to load the patient test data:
There is no log file associated with this 'test', so you do not need to upload results into Gazelle Test Management for this test.
Attachment | Size |
---|---|
Connectathon_TwinUseCases.xls | 43.5 KB |
PIXmResponseExample.png | 335.28 KB |
PIXm-example-response-in-PMIR.png | 86.3 KB |
To prepare for testing the ITI Patient Location Tracking (PLT) Profile, the PLT Supplier and Consumer actors must use a common set of HL7 codes.
We ask these actors to load codes relevant to their system in advance of the Connectathon.
The codes you need are identified in the peer-to-peer test that you will perform at the Connectathon.
1. In Gazelle Test Management, find the test "PLT_TRACKING_WORKFLOW" on your main Connectathon page.
2. Read the entire test to understand the test scenario.
3. Load the codes for PV1-11 Temporary Patient Location. These are in a table at the bottom of the Test Description section.
Note: These codes are a subset of the HL7 codes used during Connectathon. If you already performed pre-Connectathon test "Preload_Codes_for_HL7_and_DICOM", then you already have these codes.
There is no evaluation for this 'test'. If you do not load the codes you need on your test system prior to the Connectathon, you may find yourself wasting valuable time on the first day of Connectathon syncing your codes with those of your test partners.
These instructions apply to:
To support PIX, PDQ, XDS, and XC* tests, we ask you to arrive at the Connectathon with specific patients already loaded in your database. These demographics include two "well-known" patients, FARNSWORTH^STEVE (for PIXv2) and WALTERS^WILLIAM (for PIXv3 & PIXm tests), plus several patients for PDQ/PDQv3/PDQm. There are also several patients used in XDS, SDC/MHD and XC* testing.
We use the Gazelle PatientManager tool to enable you to load these patients onto your Connectathon test system.
You can use the PatientManager to:
Use the PatientManager tool to pre-load selected patients onto your test system. It will help you to get this task done before arriving at Connectathon.
Which patients?
The PatientManager contains patients to load for Connectathon testing.
How?
The User Manual contains documentation about:
There is no log file associated with this 'test', so you do not need to upload results into Gazelle Test Management for this test.
These instructions apply to actors that support the Pediatric Demographics option. This test asks actors to load their database with patients for "twin" use cases.
Actors that support the Pediatric Demographics Option must run this 'test' to ensure that test patients with proper pediatric demographics are in its system in order to run subsequent tests on the Pediatric Demographics option.
The Pediatric Demographics test patients are here: http://gazelle.ihe.net/files/Connectathon_TwinUseCases.xls
We ask you to manually load these patients. Unfortunately, we cannot use the Gazelle Patient Manager tool to load these patients because the 'special' pediatric fields are not supported by the tool.
There is no log file associated with this 'test', so you do not need to upload results into Gazelle Test Management for this test.
Attachment | Size |
---|---|
Connectathon_TwinUseCases.xls | 43.5 KB |
This “Read This First” test helps to prepare you to test RFD-based profiles at an IHE Connectathon.
Documenting RFD_Form_Identifiers: This is a documentation task. We request all Form Managers and Form Processors to help their Form Filler test partners by documenting the Form Identifiers (formID) that the Form Fillers will use during Connectathon testing. Follow the instructions in preparatory test RFD_formIDs
RFD and its 'sister' profiles:
The number of IHE profiles based on RFD has grown over the years. These 'sister' profiles re-use actors (Form Filler, Receiver, Manager, Processor, Archiver) and transactions (ITI-34, -35, -36) from the base RFD profile.
Starting at 2016 Connectathons, to reduce redundant testing, we have removed peer-to-peer tests for RFD only. If you successfully complete testing for an actor in a 'sister' profile, you will automatically get a 'Pass' for the same actor in the baseline RFD profile. For example, if you "Pass" as a Form Filler in CRD, you will get a "Pass" for a Form Filler in RFD for 'free' (no additional tests).
Similar test across profiles:
Baseline Triangle and Integrated tests: These RFD tests exercise the basic RFD functions Retrieve Form and Submit Form.
"Triangle" and "Integrated" refer to the combinations of actors in the tests. A Triangle test uses a Form Filler, Form Manager and Form Receiver (triangle). An Integrated test refers to the Form Processor that is an integrated system (supports both ITI-34 and ITI-35); the Integrated test uses a Form Filler and a Form Processor.
CDA Document Tests: We have tried to be more thorough in our definitions of tests for CDA documents; we still have some work to do. There are “Create_Document” tests that ask actors that create CDA documents to produce a document and submit that document for scrutiny/validation by a monitor. There are test sequences that need those documents for pre-population or as an end product; you cannot run those tests until you have successfully completed the “Create_Document” test. We have modified the test instructions for the sequence tests that use CDA documents to require the creator to document which “Create_Document” test was used. We do not want to run the sequence tests before we know we have good CDA documents.
Archiving Forms: We have a single test -- "RFD-based_Profiles_Archive_Form" to test Form Archivers and Form Fillers that support the 'archive' option. There are separate tests for archiving source documents.
Testing of options: IHE does not report Connectathon test results for Options in IHE profiles. We readily admit that the tests that cover options will vary by domain and integration profile. If you read the tests in domains other than QRPH (or even in QRPH), you may find profiles that have no tests for named options. We continue to try to enhance tests for named options and other combinations of parameters found in the QRPH profiles.
ATNA Logging: If you implement the ATNA profile, one of the requirements is that you send audit messages for certain transactions, primarily those that carry PHI. The ITI-35 can be a PHI export transaction, implying that a Form Filler that also supports ATNA should send an audit message. This is an issue for a Form Filler when the form was retrieved as an unencoded form (it just retrieved the Form URI); the Form Filler does not actually have control over the form.
If your Form Filler has requested an encoded form and has an XML form to process, it does have control over the ITI-35 transaction and should be able to send the appropriate audit message. The same issue exists for the ITI-36 Archive Form transaction.
Form Instance Identifiers: The RFD profile discusses the concept of partial or interim storage of a form. In order to make this work, the Form Filler needs to have a Form Instance ID to retrieve the correct instance of the form that was stored. There are two different mechanisms for a Form Filler to obtain a Form Instance ID:
That is, the Form Processor or Form Manager is going to control when the Form Instance ID is returned to the Form Filler. We need to get clarification from the ITI Technical Committee if the intended sequence included the Form Filler obtaining that Form Instance ID from the ITI-34 or from the ITI-35 transaction. Until we have that clarification, we will have to work with test participants and their interpretation of this mechanism.
Use of the Gazelle Proxy
Because RFD transactions are generally not sent over TLS connections (unless required by a specific profile), RFD tests are good candidates to use the Gazelle proxy. It will record HTTP transactions and allow you and the monitors to review those logs in a centralized manner. We highly encourage you to use the proxy when performing this test. It will allow you to easily see the messages exchanged between test partners and to document them for the monitor.
There is no result to upload into Gazelle Test Management for this informational test.
In this preparatory test, Form Managers and Form Processors document the formID(s) that will be used during a Connectathon in the [ITI-34] transaction.
The formID may apply to the base RFD profile, or it may apply to content profile(s) based on RFD (e.g., CRD, VRDR, and many others).
Form Managers and Form Processors:
For Connectathon testing, we expect that you create your own forms for content profiles with your own identifiers (formID).
Edit the google spreadsheet linked below. Add one row to the spreadsheet for each formID hosted on your test system.
Please do this in advance of the Connectathon. The goal is to provide documentation for your Form Filler test partners.
Form Fillers:
There is no specific task for you; however, access the spreadsheet linked below to see the formIDs you will use during Connectathon testing.
RFD Form Identifiers google spreadsheet: https://docs.google.com/spreadsheets/d/11LM9eKzuA_CKJZKsQA7PRJ8uXjYLQ7LTukNJU4LzkDg/edit#gid=1667031332
When you complete the task above, create a small text file stating that your entries are complete. Upload that file into Gazelle Test Management as the results for this test.
At the Connectathon, the SVCM tests assume that a pre-defined set of FHIR Resources has been loaded on all of the Terminology Repository actors under test.
Prior to performing Connectathon tests, SVCM Terminology Repositories must load FHIR Resources:
These resources are available in Github here: https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/ITI/SVCM
For reference only: This google sheet contains a list of the above resources.
Your Repository may also contain other additional FHIR Resources during Connectathon.
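How you load these Resources is product-specific. If your Terminology Repository exposes a standard FHIR REST update endpoint, one generic approach might look like the sketch below (the server URL, resource id, and file name are placeholders):

```sh
# Upload one ValueSet from a local JSON file, keeping the id that is already in the file
curl -X PUT "https://your-terminology-repo.example.org/fhir/ValueSet/example-valueset-id" \
     -H "Content-Type: application/fhir+json" \
     --data-binary @example-valueset.json
```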
There are no result files to upload into Gazelle Test Management. Preloading these Resources in advance is intended to save you precious time during Connectathon week.
At the Connectathon, the SVS tests assume that a pre-defined set of value sets have been loaded on all of the Value Set Repository actors under test.
(1) Prior to the Connectathon, SVS Repositories must load these value sets: http://gazelle.ihe.net/content/gazelle-value-sets
(2) Ideally, your Repository will also contain other additional value sets during Connectathon.
There are no result files to upload into Gazelle Test Management. Preloading these prior to the Connectathon is intended to save you precious time during Connectathon week.
This informational 'test' provides an overview of peer-to-peer Connectathon testing for the cross-community (XC*) family of IHE Profiles. Please read what follows and complete the preparation described before the Connectathon.
Cross-community test scenarios are more complex than most in the Connectathon because of the effort to set up the initiating and responding communities' configuration.
Try to be efficient in testing gateways that support Cross-Community profiles. If your Gateway also supports multiple XC* profiles, you will want to run these tests while the XCA configurations are in place.
Please refer to these resources used to manage cross-community profile testing. These are shared during the Preparatory phase of the Connectathon:
This is what we will look for when we grade the XCA profile Connectathon tests:
-- Any XC* actor doing Supportive testing:
-- XCA Initiating Gateways doing Thorough testing:
-- XCA Responding Gateways doing Thorough testing:
Each XC* Gateway is assigned a homeCommunityId value for use during Connectathon week. Gateways can find the homeCommunityId for their system and for their partner gateways in Gazelle Test Management under menu Preparation-->OID Registry.
XCA Initiating Gateways...
XCA Responding Gateways...
XCA-I Responding Imaging Gateways...
XCA-I Initiating Imaging Gateways...
XCPD Responding Gateways have test data identified in the "XCPD_Responding_GW_Setup" test.
You must demonstrate your ability to support asynchronous communication. You cannot pass as an XCPD Responding GW without demonstrating async support. ***NEW as of 2019***, Responding Gateways shall support one of two types of async messaging. You may support both.
We have a specific test patient we use for XCA and XCPD tests.
This patient is part of the 'connectathon demographics' which should be pre-loaded on XDS.b Registries, PIX Managers and Patient Identity Sources prior to the Connectathon. (Note that this patient is also available in the 'pre-load' demographics provided in the Gazelle PatientManager tool. See instructions in preparatory test Preload Connectathon Test Patients.)
The XCA Responding Gateway must host documents for the test patient in its community. If you have an XDS Registry/Repository behind your Responding Gateway, host documents there. If your Responding Gateway is not using XDS.b, it will find another way to host documents for the test patient.
For XCA-I, each Initiating & Responding Gateway represents an XDS affinity domain with an Imaging Document Source, an XDS Registry and a Repository. Each XCA-I community must host a DICOM Study and associated Manifest. For XCA-I testing, we use one of the three DICOM studies that the Imaging Document Source used for XDS-I.b tests.
Summary of the DICOM studies
Patient ID | Procedure Code | Modality | Series Count | Image Count |
---|---|---|---|---|
C3L-00277 | 36643-5 | DX | 1 | 1 |
C3N-00953 | 42274-1 | CT | 3 | 11 (we use this one for XCA-I) |
TCGA-G4-6304 | 42274-1 | CT | 3 | 13 |

Procedure Codes (0008,1032):
36643-5 / (LOINC / 2.16.840.1.113883.6.1) / XR Chest 2V
42274-1 / (LOINC / 2.16.840.1.113883.6.1) / CT Abd+Pelvis WO+W contr IV
Patient IDs to use with the XDS-I Manifest for the XCA-I tests.
The Patient ID in the DICOM header for the images is considered the 'local' or 'departmental' Patient ID for this patient, (ie sourcePatientId in the DocumentEntry metadata). When submitting a Manifest for this study to an XDS Repository/Registry, the Imaging Doc Source must use the affinity domain ID for the patient in the XDS metadata for the submitted manifest. This patient has Patient IDs included in the Connectathon Patient Demographics pre-loaded onto each Registry at Connectathon as follows:
For the CT study with "local" Patient ID C3N-00953, the affinity domain Patient IDs are listed here:
The Patient ID in the manifest will depend on the patient ID affinity domain (red, green, blue) of your local Registry & XCA-I Initiating or Responding Imaging Gateway.
There is no evaluation for this informational test. If the systems testing XC* profiles do not do the set-up described above, then cross-community tests at Connectathon will not work.
This test applies to Portable Media Creators in the XDM Profile that create either CD-R or USB media, or a ZIP file for the ZIP over Email option.
This test case is used to create XDM-formatted media (CD-R and/or USB). The media you create will be used by Portable Media Importers during the Connectathon in the XDM_Import_* peer-to-peer tests.
As a Portable Media Creator, when you create your media, we encourage you to put documents from the various IHE content profiles your system supports (eg APR, BPPC, EDES, EDR, IC, XDS-MS, XDS-SD, XD-LAB, XPHR, etc). A larger variety of document types will help the Importer systems find compatible content.
You will also be required to demonstrate that you send an 'export' audit message when you create XDM media.
To create your XDM media for exchange during Connectathon:
STEP 1: Create 2 copies of physical media: USB and/or CD-R, if you support those options.
STEP 2: Label your physical media. The label should contain your system name in Gazelle Test Management, your table location, and the name of a technical contact at your table. Also include the document types you have included on media (eg XPHR, XDS-SD, etc...) (We recognize the space limitations on USB; create a piece of paper that can accompany your media.) Bring your media with you to the Connectathon.
STEP 3: Create a zip file of the file structure on your XDM media. Upload that zip file into the samples area of Gazelle Test Management: menu Connectathon-->Connectathon-->List of Samples. On the 'Samples to share' tab, upload your zip file under the 'XDM' entry.
During Connectathon, a Monitor will do a two-part evaluation of your media. You should do these for yourself in advance of the Connectathon so that you are confident your test will pass.
EVALUATION PART 1 - METADATA VALIDATION:
Earlier versions of this test involved manual scrutiny of the METADATA.XML file. Now, we use the Gazelle EVSClient:
EVALUATION PART 2 - Validate XDM structure using the Edge Test Tool
IHE profiles for Document Sharing (XD*, XC*, MHD) rely on coded values provided in the metadata when documents are submitted and searched. These Document Sharing profiles define the structure of the document metadata as well as coded values for some metadata attributes; however, allowable values for many of the coded values are not constrained by IHE, but are defined by the Affinity Domain that will deploy and support the document sharing systems.
For testing of Document Sharing profiles at IHE North America and Europe Connectathons, the set of allowable code values for document sharing metadata are defined by IHE Technical Project Managers and deployed in the NIST XDS Toolkit.
This page describes where to find the set of allowable codes for document sharing testing at IHE Connectathons. This enables you to configure your test system prior to performing these types of tests.
(NOTE: Some Connectathons or Projectathons may use different codes for metadata. If that is the case, the Technical Project Manager will provide other guidance.)
These documentEntry metadata attributes have defined codes:
Find the coded values, then load them onto your test system. Loading these codes is a prerequisite to performing any preparatory tests with the NIST XDS tools or NIST FHIR Tools. It is also a prerequisite to performing peer-to-peer Connectathon tests for the XD*, XC* and MHD profiles.
For IHE NA and EU Connectathons, allowable codes for Document Sharing metadata are contained in the codes.xml file distributed here: https://tools.iheusa.org/xdstools/sim/codes/default
Note 1: These codes are deployed on the public version of NIST XDS Toolkit hosted here: https://tools.iheusa.org/xdstools/
Note 2 : These codes are also available in SVS format, but values of codes in SVS format may not exactly match those in codes.xml above. See the Gazelle SVS Simulator tool that hosts many value sets, including codes for metadata attributes.
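If you want to script the configuration step, the sketch below is one rough way to pull the codes into your own system. It assumes the URL above serves the raw codes.xml and makes only minimal assumptions about its layout (that code entries carry code, display and codingScheme attributes); check the file you actually download before relying on it.

```python
# Rough sketch only: fetch codes.xml and list the allowable metadata codes.
# Assumes the URL serves the raw XML and that code entries carry 'code',
# 'display' and 'codingScheme' attributes (verify against the real file).
import urllib.request
import xml.etree.ElementTree as ET

CODES_URL = "https://tools.iheusa.org/xdstools/sim/codes/default"

with urllib.request.urlopen(CODES_URL) as resp:
    root = ET.fromstring(resp.read())

for element in root.iter():
    attrs = element.attrib
    if "code" in attrs and "codingScheme" in attrs:
        print(attrs["code"], attrs["codingScheme"], attrs.get("display", ""))
```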
There is no result file to upload to Gazelle Test Management for this test. If you do not do the configuration described above, then tests with tools or your test partners will not work.
If you find an error in codes.xml, or to request that a code be added, please submit an issue in the Github repository for XDS Toolkit: https://github.com/usnistgov/iheos-toolkit2/issues. You may also directly edit the codes.xml file here with your suggested change and submit a Pull request.
Attachment | Size |
---|---|
codes.xml file for EU CAT2023 | 84.66 KB |
XDS.b Document Registries must complete the preparation described here before performing XDS.b Connectathon tests.
(1) Affinity Domains for Registries: During Connectathon, each XDS.b Document Registry has been assigned to an affinity domain that determines the Patient IDs your Registry will accept. These affinity domains are referred to as the "Red", "Blue" or "Green". (If this is your first Connectathon, these affinity domains are explained here.) The Connectathon Project Manager announces the Red/Blue/Green assignments in advance of the Connectathon. It is documented in this google spreadsheet.
(2) Connectathon patients to pre-load on your Registry: To support XDS tests, Registries load patient demographics provided in advance by the Connectathon Technical Project Manager. If you have performed pre-Connectathon test Preload_Connectathon_Test_Patients, you already have these patients in your database; otherwise follow the instructions in that test now. You will only load patients for the Affinity Domain you were assigned to above.
(3) Metadata Codes: Document Registries must also be configured with codes for Connectathon Affinity Domain Metadata. These are documented in the codes.xml file found in the latest release of the NIST XDS Toolkit here: https://github.com/usnistgov/iheos-toolkit2/releases/. First-time Connectathon participants can read background information about metadata codes here.
NOTE: Some Connectathons may use different codes for metadata. If that is the case, the Technical Project Manager will provide other guidance.
There is no result file to upload to Gazelle Test Management for this informational test. If the Document Registry does not do the set-up described above, then peer-to-peer XDS.b tests at Connectathon will not work.
Prior to arriving at the Connectathon, it is important for participants testing XUA (or the IUA profile with the SAML Token option) to be familiar with:
The description that follows:
Locate and use the Gazelle-STS Security Token Service:
To familiarize yourself with the Gazelle-STS tool used for Connectathons:
Assertions used for Connectathon testing:
The [ITI-40] transaction (ITI TF-2: 3.40.4.1.2) specifies the SAML assertion, including that all assertions contain a Subject (principal). The 'Subject' in the assertion could be a user or it could be an 'application'.
For Connectathon, we have pre-defined Subjects (ie HTTP authentication users) that we use for XUA testing. Several different Subject/users are defined, and they are associated with different assertions used for the XUA "success" test - XUA_Transaction_with_Assertion - and the "fail" test XUA_Restrict_Access.
Please refer to the Gazelle STS user manual for the list of assertions available: https://gazelle.ihe.net/gazelle-documentation/Gazelle-STS/user.html#requesting-a-security-token.
The Gazelle-STS tool is able to generate assertions with the success and failure conditions defined in the tests. (We expect that X-Service Users that are generating their own assertions will have less flexibility.)
Note - Many options are possible for the AuthnStatement parameter in the Assertion. For the Connectathon, the assertions provided to the X-Service Users by the X-Assertion Providers will encode a timestamp representing when the authentication occurred and that the password class was used, eg:
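(The snippet below is only an illustrative sketch, not the exact assertion content produced by the Gazelle-STS; it shows how you might confirm that an assertion's AuthnStatement carries the authentication timestamp (AuthnInstant) and the standard SAML 2.0 Password context class.)

```python
# Illustrative sketch: inspect the AuthnStatement of a SAML 2.0 assertion
# to confirm it carries an authentication timestamp (AuthnInstant) and the
# Password authentication context class. Assumes 'assertion_xml' is the
# <saml:Assertion> element itself.
import xml.etree.ElementTree as ET

SAML2 = "urn:oasis:names:tc:SAML:2.0:assertion"
PASSWORD_CLASS = "urn:oasis:names:tc:SAML:2.0:ac:classes:Password"

def check_authn_statement(assertion_xml: str) -> None:
    assertion = ET.fromstring(assertion_xml)
    stmt = assertion.find(f"{{{SAML2}}}AuthnStatement")
    class_ref = stmt.find(
        f"{{{SAML2}}}AuthnContext/{{{SAML2}}}AuthnContextClassRef")
    print("AuthnInstant:         ", stmt.get("AuthnInstant"))
    print("AuthnContextClassRef: ", class_ref.text)
    assert class_ref.text == PASSWORD_CLASS
```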
Configuration Details:
For X-Service Users who will request assertions from the Gazelle-STS, three configuration items have been identified. When requesting a security token, the X-Service User needs to provide the X-Assertion Provider with:
(1) An HTTP authentication user
(2) A valid password
(3) The 'AudienceRestriction' of the X-Service Provider
For item (3) at the Connectathon, to ease configuration, we will apply the same URL to all X-Service Providers, eg all Registries and Repositories. (Note that this URL is **not** the URL the Document Consumer will use to query a Registry or retrieve documents from a Repository). This same, general URL used as the value of 'AudienceRestriction' for all service providers will simplify the STS configuration and will ensure that users can access any Registry/Repository with the SAML token obtained from the STS.
The required URL is :
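As a rough sketch of how the three configuration items fit together, the request below assumes a standard WS-Trust RequestSecurityToken exchange; the endpoint, credentials and AudienceRestriction values are placeholders rather than the actual Connectathon values, and the Gazelle STS user manual remains the authoritative reference for how to request a token.

```python
# Rough sketch of a WS-Trust RequestSecurityToken (RST) call showing where
# the three configuration items fit. All values below are placeholders;
# use the endpoint, user, password, and AudienceRestriction URL published
# for your Connectathon (see the Gazelle STS user manual).
import requests

STS_ENDPOINT = "https://example.org/gazelle-sts"            # placeholder
USERNAME, PASSWORD = "connectathon-user", "secret"          # placeholders
AUDIENCE = "https://example.org/audience-restriction"       # placeholder

RST_ENVELOPE = f"""<s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope"
            xmlns:wst="http://docs.oasis-open.org/ws-sx/ws-trust/200512"
            xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy"
            xmlns:wsa="http://www.w3.org/2005/08/addressing">
  <s:Body>
    <wst:RequestSecurityToken>
      <wst:RequestType>http://docs.oasis-open.org/ws-sx/ws-trust/200512/Issue</wst:RequestType>
      <wsp:AppliesTo>
        <wsa:EndpointReference><wsa:Address>{AUDIENCE}</wsa:Address></wsa:EndpointReference>
      </wsp:AppliesTo>
    </wst:RequestSecurityToken>
  </s:Body>
</s:Envelope>"""

# The HTTP authentication user and password are shown as HTTP Basic auth
# purely for illustration; a WS-Security UsernameToken header is the other
# common way to present them.
response = requests.post(
    STS_ENDPOINT,
    data=RST_ENVELOPE.encode("utf-8"),
    headers={"Content-Type": "application/soap+xml; charset=utf-8"},
    auth=(USERNAME, PASSWORD),
)
print(response.status_code)
```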
Actors in the XUA Profile can be grouped with actors in any IHE profile using webservices transactions (eg. XDS.b, XDR, PDQv3, PIXv3, RFD, XCA, many others...). Thus, you will be testing the exchange of XUA assertions in conjunction with transactions from another profile. This means you not only need to find your XUA test partners, you must also find a partner which supports a webservices transaction that you support.
Here is the sequence of activities you must do to accomplish your XUA testing during the Connectathon:
These notes apply to testing of the XUA profile at the Connectathon:
There is no result file to upload to Gazelle Test Management for this informational test. If the systems testing XUA do not do the set-up described above, then peer-to-peer XUA tests at Connectathon will not work.
If you're testing these profiles, please review this page prior to the IHE Connectathon.
At IHE Connectathons in both Europe and North America, we define three Affinity Domains. These represent three different document sharing (XDS) communities.
Patient IDs in Connectathon Affinity Domains
Each of these domains is associated with its own Patient ID assigning authority. For ease of reference we refer to these as:
We have a tool -- the Gazelle PatientManager -- for creating a patient with a Patient ID with a Red, Green or Blue assigning authority and sending it via HL7v2 or v3 to your test system. It also can create equivalent FHIR Patient Resources. Instructions on how to use this tool to populate your test system with patients for Connectathon testing are found in pre-Connectathon test Preload_Connectathon_Test_Patients.
Explanatory resources:
This is an index of Do This First and Read This First tests defined for profiles in the IHE Patient Care Coordination (PCC) domain.
At the Connectathon, the IPS Content Creator is required to provide documents that meet the requirements of the International Patient Summary (IPS) Profile. The IPS Content Consumer is required to accept/retrieve documents and process them using one or more of the options defined by the IPS Profile.
This page provides a general overview of the IPS testing process.
The table below lists the patients defined for testing. Demographic information can be found in the Connectathon Artifacts GitHub repository (see the IPS-QEDm README.md) or in the Gazelle Patient Manager. The Optionality column indicates if the patient data is required for IPS testing (R for Required, O for Optional).
Name | DOB | Trillium Bridge ID | IHERED ID | Optionality |
Charles Merlot | 1966.04.04 | EUR01P0008 | IHERED-3158 | R |
Mary Gines | 1963.09.09 | EUR01P0020 | IHERED-3159 | R |
Annelise Black | 1988 | EUR01P0002 | IHERED-3160 | O |
Marco Peroni | 1995.07.28 | EUR01P0011 | IHERED-3163 | O |
Allen Perot | 1963.02.18 | EUR01P0013 | IHERED-3161 | O |
As mentioned above, a set of patients is defined for IPS testing. Clinical content should be extracted from the files described here: https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/PCC/IPS-QEDm. The README.md file in the GitHub repository provides an index to files but does not describe the clinical content. Further notes:
| Merlot (DSTU3) | Gines (DSTU3) | Black (DSTU3) | Peroni (R4) | Perot (DSTU3) |
Required | |||||
Medication Summary | 2 | No information | 2 | 2 | 3 |
Allergies and Intolerances | NKA | 1 | NKA | NKA | 1 |
Problem List (Condition) | 2 | No known | 3 | 4 | 5 |
Recommended | |||||
Immunizations | 1 | 1 | 2 | 2 | |
History of Procedures | |||||
Medical Devices | No known | ||||
Diagnostic Results | |||||
Optional | |||||
Vital Signs | |||||
Past History of Illness | |||||
Pregnancy | 1 | ||||
Social History | |||||
Functional Status | |||||
Plan of Care | |||||
Advance Directives |
You will find that clinical content in this folder https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/PCC/IPS-QEDm in the Connectathon Artifacts GitHub repository. Find the data you need by matching the patients listed in the table above with the README.md file in that folder.
Please do not enter less content or more content than is defined for each patient. You might add content to an individual resource, but do not add or subtract resources. Validation of test results is difficult when you do not include the expected data.
You might discover that the data in the patient record is contradictory and/or might generate alerts because medications do not go with diagnoses. Please contact the owner of the test data to help resolve the issue and make corrections.
The document that you create/export is a self-contained document. If you are creating a FHIR Bundle, the resources that are referenced in the document must also exist in the document. The FHIR Bundle does not refer to resources that exist on a FHIR server.
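For illustration only (this is not a complete or valid IPS document), the fragment below shows what 'self-contained' means in practice: the Composition is the first entry, and the resources it references are carried as entries in the same Bundle, addressed by urn:uuid fullUrls rather than server URLs.

```python
# Illustration only (not a complete, valid IPS document): a self-contained
# FHIR document Bundle. The Composition comes first, and everything it
# references (here, just the Patient) is also an entry in the Bundle,
# addressed by urn:uuid fullUrls rather than by server URLs.
document_bundle = {
    "resourceType": "Bundle",
    "type": "document",
    "entry": [
        {
            "fullUrl": "urn:uuid:11111111-1111-1111-1111-111111111111",
            "resource": {
                "resourceType": "Composition",
                "status": "final",
                "title": "International Patient Summary",
                # Resolves to the Patient entry carried below, not to a server.
                "subject": {"reference": "urn:uuid:22222222-2222-2222-2222-222222222222"},
            },
        },
        {
            "fullUrl": "urn:uuid:22222222-2222-2222-2222-222222222222",
            "resource": {"resourceType": "Patient", "name": [{"family": "Merlot"}]},
        },
    ],
}
```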
We use the Samples area of Gazelle Test Management to exchange IPS documents (CDA or FHIR format) between Content Creator and Content Consumer systems. There are separate Preparatory tests containing instructions. See:
In addition, the IHE IPS Profile says that the Content Creator transmits the IPS document to the Content Consumer using the PCC-1 transaction. That transaction contains a number of options:
During Connectathon, we will communicate with test participants and work with them to resolve the mechanism for exchanging the document. The Content Creator creates the document and actively exports the document. This is not a FHIR search request to retrieve a summary document. The HL7 IPS Implementation Guide does not forbid FHIR search/read requests, but the IHE profile uses a push model for the document.
For Preparatory testing purposes, we are more concerned with document content and reliable data import and less concerned with the mechanics of creating/exporting the document.
We use the Gazelle External Validation Service (evs) tool (https://gazelle.ihe.net/evs/home.seam) to validate IPS Content. There are separate Preparatory tests containing instructions. See:
At the Connectathon, the IPS Content Creator is required to provide documents that meet the requirements of the International Patient Summary (IPS) Profile. This test tells you where to find the documentation that describes what is expected for testing.
First, create your IPS Documents:
These instructions are for the IHE IPS Content Creator.
The page IPS Read This First describes the set of patients and where to locate the clinical information for each patient. Please refer to that page prior to creating your IPS content below.
Create the clinical content in your system that you need to produce an IPS CDA document for these two patients:
Create the clinical content in your system that you need to produce an IPS FHIR document for these two patients:
Create the clinical content in your system that you need to produce an IPS CDA document for one or more of these patients:
Create the clinical content in your system that you need to produce an IPS FHIR document for one or more of these patients:
Next, upload your IPS content into Gazelle Test Management:
At the Connectathon, the QEDm Clinical Data Source is required to respond to search requests for one or more FHIR Resources as defined by the QEDm profile. The QEDm Clinical Data Consumer is required to search for one or more FHIR Resources as defined by the QEDm profile. The resources used in the search depend on the QEDm options declared by a system.
This page provides a general overview of the QEDm testing process.
The table below lists the patients defined for testing. Demographic information can be found in the Connectathon Artifacts GitHub repository (see the IPS-QEDm README.md) or in the Gazelle Patient Manager. The Optionality column indicates if the patient data is required for QEDm testing (R for Required, O for Optional).
Name | DOB | Trillium Bridge ID | IHERED ID | Optionality |
Charles Merlot | 1966.04.04 | EUR01P0008 | IHERED-3158 | R |
Mary Gines | 1963.09.09 | EUR01P0020 | IHERED-3159 | R |
Chadwick Ross | 1960.02.29 | | IHERED-3162 | R |
Annelise Black | 1988 | EUR01P0002 | IHERED-3160 | O |
Marco Peroni | 1995.07.28 | EUR01P0011 | IHERED-3163 | O |
Allen Perot | 1963.02.18 | EUR01P0013 | IHERED-3161 | O |
As mentioned above, a set of patients is defined for QEDm testing. Clinical content should be extracted from the files described here: https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/PCC/IPS-QEDm. The README.md file in the GitHub repository provides an index to files but does not describe the clinical content. Further notes:
| Merlot (IPS) | Gines (IPS) | Ross (CCD) | Ross (Procedure) | Ross (Diag Imaging) |
Observation | 4 (3*) | 1 | |||
AllergyIntolerance | 1 | 1 | 2 | ||
Condition | 2 | ||||
Diagnostic Report | 1 | 1 | |||
MedicationStatement | 2 | 1 | 2 | ||
MedicationRequest | |||||
Immunization | 1 | 5 | |||
Procedure | 3 | 1 | |||
Encounter | 1 |
You will find that clinical content in this folder https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/PCC/IPS-QEDm in the Connectathon Artifacts GitHub repository. Find the data you need by matching the patients listed in the table above with the README.md file in that folder.
Please do not enter less content or more content than is defined for each patient. You might add content to an individual resource, but do not add or subtract resources. Validation of test results is difficult when you do not include the expected data.
You might discover that the data in the patient record is contradictory and/or might generate alerts because medications do not go with diagnoses. Please contact the owner of the test data to help resolve the issue and make corrections.
*two of the observations are blood pressure observations (systolic, diastolic) which need to be combined in FHIR according to the Vital Signs profile.
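As a rough illustration of that combination (placeholder values, and the R4 category code system; adjust for DSTU3 where needed), a combined blood-pressure Observation per the Vital Signs profile looks roughly like this:

```python
# Illustrative sketch of combining systolic and diastolic values into one
# FHIR blood-pressure Observation per the Vital Signs profile (LOINC
# 85354-9 panel code with 8480-6 / 8462-4 components). The numeric values
# are placeholders, not the values from the Connectathon test data.
bp_observation = {
    "resourceType": "Observation",
    "status": "final",
    "category": [{"coding": [{
        "system": "http://terminology.hl7.org/CodeSystem/observation-category",
        "code": "vital-signs"}]}],
    "code": {"coding": [{"system": "http://loinc.org", "code": "85354-9",
                         "display": "Blood pressure panel"}]},
    "subject": {"reference": "Patient/example"},
    "component": [
        {"code": {"coding": [{"system": "http://loinc.org", "code": "8480-6"}]},
         "valueQuantity": {"value": 120, "unit": "mmHg",
                           "system": "http://unitsofmeasure.org", "code": "mm[Hg]"}},
        {"code": {"coding": [{"system": "http://loinc.org", "code": "8462-4"}]},
         "valueQuantity": {"value": 80, "unit": "mmHg",
                           "system": "http://unitsofmeasure.org", "code": "mm[Hg]"}},
    ],
}
```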
The search parameters in QEDm depend on the FHIR resource. The summary table below shows the combinations of query parameters defined for the various resources. The last row for Provenance is special because you do not search directly for a Provenance resource. You search for a base resource and ask the server to include Provenance resources in the response.
Resource | Patient | Patient + Category | Patient + Category + Code | Patient + Category + Date | Patient + Category + Code + Date | Patient + Clinical Status | Patient + Date | _include |
Observation | x | x | x | x | ||||
AllergyIntolerance | x | |||||||
Condition | x | x | x | |||||
DiagnosticReport | x | x | x | x | ||||
MedicationStatement | x | x | ||||||
MedicationRequest | x | x | ||||||
Immunization | x | |||||||
Procedure | x | x | ||||||
Encounter | x | x | ||||||
Provenance |
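The sketch below gives rough examples of some of the query shapes in the table above; the base URL and patient ID are placeholders, the parameter names are the standard FHIR search parameters, and Provenance is shown being requested with _revinclude. Check the QEDm Profile and your partner's CapabilityStatement for the exact expectations.

```python
# Rough examples of the QEDm query shapes from the table above. The base
# URL and patient id are placeholders; parameter names are the standard
# FHIR search parameters, and Provenance is requested via _revinclude
# (verify the exact requirements in the QEDm profile).
BASE = "https://example.org/fhir"      # placeholder Clinical Data Source
PATIENT = "IHERED-3158"                # example Connectathon patient id

queries = [
    # Patient + Category + Code + Date
    f"{BASE}/Observation?patient={PATIENT}&category=vital-signs"
    f"&code=http://loinc.org|85354-9&date=ge2015-01-01",
    # Patient + Clinical Status
    f"{BASE}/AllergyIntolerance?patient={PATIENT}&clinical-status=active",
    # Patient only
    f"{BASE}/Immunization?patient={PATIENT}",
    # Base resource search asking the server to return Provenance as well
    f"{BASE}/Observation?patient={PATIENT}&_revinclude=Provenance:target",
]
for q in queries:
    print(q)
```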
This page provides instructions for the Clinical Data Source when testing the QEDm profile. You should read the overview at QEDm: Read This First and then follow the instructions below.
Find clinical content in IPS bundles and CDA documents in the IPS-QEDm folder of the Connectathon Artifacts repository. For each content option you support (e.g., Observation, Allergy/Intolerance), load FHIR resources into your system per the subsections below.
The Observation resource is more difficult than the other clinical items as there are 20 observations in the Ross CCD file and one observation in the Ross Procedure File.
Extract the 4 clinical data items for Allergy/Intolerance from these sources and enter them into your system:
Merlot / IPS | 1 |
Gines / IPS | 1 |
Ross / CCD | 2 |
Extract the 2 clinical data items for Condition from this source and enter them into your system:
Merlot / IPS | 2 |
Extract the 2 clinical data items for DiagnosticReport from these sources and enter them into your system:
Ross / Procedure | 1 |
Ross / Diagnostic Imaging | 1 |
The documents listed below include medications for three patients. Extract the medications from the documents and convert to MedicationStatement resources with the related Medication resources.
Merlot / IPS | 2 |
Gines / IPS | 1 |
Ross / CCD | 2 |
The section above tells you to create MedicationStatement resources from medications found in three documents. Follow the same guidance to create MedicationRequest resources for these three patients. That is, you can assume that each medication is also described by a MedicationRequest.
Merlot / IPS | 1 |
Gines / IPS | 1 |
Ross / CCD | 2 |
Extract the 5 clinical data items for Immunization from these sources and enter them into your system:
Gines / IPS | 1 |
Ross / CCD | 4 |
Extract the 2 clinical data items for Procedure from these sources and enter them into your system:
Ross / CCD | 1 |
Ross / Procedure | 1 |
Extract the 1 clinical data item for Encounter from this source and enter it into your system:
Ross / CCD | 1 |
You might be expecting to find directions for Provenance resources here. Testing for Provenance resources is coupled with mXDE testing. You can read about that testing environment on the mXDE Read This First page.
This is an index of Do This First tests defined for profiles in the IHE Radiology (RAD) domain.
The AI Results (AIR) Profile specifies how AI Results are encoded as DICOM Structured Reports (SRs). Depending on the AI algorithms implemented on the AIR Evidence Creator (EC) actor, the EC will create/encode one or more of the different result primitives in its SRs, e.g. qualitative findings, measurements, locations, regions, parametric maps, tracking identifiers, image references.
For the Connectathon, there is a set of no-peer tests to evaluate how the Evidence Creator encodes its AI results; the tests follow the naming pattern AIR_Content_*. Each of these tests aligns with a different result primitive included in an AI results SR. We have created separate tests for the different result primitives to make test execution and evaluation more manageable. The Evidence Creator will perform Connectathon tests that are applicable to the SRs and primitives it has implemented.
The purpose of this Preparatory test is to have the Evidence Creator describe in narrative form the nature of its AI results implementation. Reading this description will help the Connectathon monitor have the proper context to evaluate your Evidence Creator application, the AI results you produce, and the result primitives included in your AI SR instances.
For this test you (the Evidence Creator) will produce a short document describing your implementation in the context of the AI Results Profile specification. The format of the document is not important. It may be a PDF, a Word or google doc, or some other narrative format.
Your document shall include the following content:
There is no "pass/fail" for this test. However, you must complete it because it is a prerequisite for several Connectathon tests. The Connectathon monitor will be looking for the document you produce here and use it when s/he is evaluating your AI result content.
This Preparatory test is informational. It is intended to prepare the AIR Evidence Creator for Connectathon tests that will be used to evaluate the AI result SRs produced by the Evidence Creator.
Another Preparatory test, AIR_Sample_Exchange, instructs the Evidence Creator to upload AI result SRs into the Samples area of Gazelle Test Management. In that test, the Evidence Creator will also use the Pixelmed DICOM validator to perform DICOM validation of your SRs. The Pixelmed validator checks the baseline requirements of the DICOM SR, including the requirements of the Template IDs (TIDs) within the SR. The tool does not, however, check the requirements and constraints that are part of the content specification in the AIR Profile.
In Gazelle Test Management, on your Test Execution page, you will find a set of no-peer Connectathon tests used to evaluate encoding of AI results; these Connectathon tests follow the naming pattern AIR_Content_*. The different tests align with different result primitives that are included in an AI results SR, e.g. qualitative findings, measurements, locations, regions, parametric maps, tracking identifiers, image references.
Depending on the AI algorithms it implements, we expect an Evidence Creator to create/encode one or more of these types of result primitives. We have created separate tests for the different result primitives to make test execution and evaluation more manageable during the Connectathon.
Prior to the start of the Connectathon, we highly recommend that the Connectathon participant that will test the Evidence Creator actor read each AIR_Content_* Connectathon test.
>>Note: There is a Content test for each of the AI result primitives. The AI algorithm(s) on your Evidence Creator may not include all of the defined result primitives (e.g. you may not produce parametric maps). For the Connectathon, you will only be required to perform the AIR_EC_Content* and AIR_Display* tests that are applicable to your system. (This separation of capabilities into separate tests results in some redundant test steps, but one large test for all primitives would have been difficult for testers and monitors to manage.)
In each AIR_Content_* test, you will find test steps and evaluation criteria for specific encoding requirements for the different result primitives. We recommend that you examine your AI result SR content using these test steps. If you find discrepancies, you may need to update your software to be compliant with the AIR content requirements. If you disagree with any of the tests or test steps, you should contact the IHE Radiology Domain Technical Project Manager to resolve your concern.
If you use the tests to review the SRs during the Preparatory phase, you can be confident that the Connectathon monitor will find no errors when s/he evaluates your SRs during the Connectathon.
There is no result file to submit into Gazelle Test Management for this informational test.
The AI Results (AIR) Profile requires the Image Display to demonstrate specific display capabilities when rendering AI Result SRs. These requirements are in Display Analysis Result [RAD-136].
At the Connectathon, a monitor will sit down at your AIR Image Display and run through a set of tests to evaluate the display requirements in [RAD-136].
In this preparatory test, we are providing you with some test data in advance of the Connectathon that you will use to demonstrate AIR display requirements. The test data includes:
NOTE: During the Connectathon, the Image Display will be required to perform tests with AI Result IODs from the Evidence Creator test partners at that Connectathon. The Image Display may also be asked to use AI Result IODs in this test data, especially where this sample data contains DICOM object types or AIR primitives that the 'live' test partners do not produce.
For AIR IMAGE DISPLAY systems:
>> AIR_Display_Analysis_Result
>> AIR_Display_Parametric_Maps
>> AIR_Display_Segmentation_IOD
>> AIR_Display_* (etc...)
For ALL OTHER AIR ACTORS:
It is OPTIONAL for non-Image-Display actors to access the samples, but we recognize the value of test data to all developers, so you are welcome to access the samples.
IMAGE DISPLAY SYSTEMS: Create a text file that briefly describes your progress in using the SRs with your Image Display. Upload that file into Gazelle Test Management as the result file for this test. There is no pass/fail for this preparatory test. We want to make sure you're making progress toward what is expected during evaluation of your Image Display at the Connectathon.
The AI Workflow for Imaging (AIW-I) Profile specifies how to request, manage, perform, and monitor AI Inference on digital image data.
Both the sequence of transactions in AIW-I and the content of the workitem(s) created by the Task Requester depend on the AI inferences and workflows implemented on the AIW-I Task Performer actor. Therefore, the purpose of this Preparatory test is to gather information from the Task Performer which will influence how it will interact with its test partners during the Connectathon. The Task Performer will describe:
This description will help the Task Requester ensure that the workitems it creates are adequately populated for you, and that you test the workflow(s) you support with your partners at the Connectathon.
For this test you (the Task Performer) will produce a short document describing your implementation in the context of the AIW-I Profile specification. According to AIW-I, Section 50.4.1.1, a DICOM Conformance Statement is the ideal home for these details. If you have one, great! But, for the purpose of this preparatory test, the format of the document is not important. It may be a PDF, a Word or google doc, or some other narrative format.
Your document shall include the following content:
You will find and read the document provided by the Task Performer above.
There is no "pass/fail" for this test. However, you must complete it because it is a prerequisite for several Connectathon tests. Your AIW-I test partners, plus the Connectathon monitor, will be looking for the document produced here.
The Image Display actor in the Basic Image Review (BIR) Profile is unlike other IHE actors in that its requirements are primarily functional and do not require exchange of messages with other actors.
At the Connectathon, a monitor will sit down at your system and run through a set of tests to evaluate the requirements in the BIR profile. In this preparatory test, we are providing you with the test plan and the accompanying images in advance of the Connectathon. To prepare, we expect you to load the test data (images) and run these tests in your lab in preparation for the Connectathon itself.
After loading the test images onto your Image Display, run the tests in the BIR Test Plan document using your display application.
Create a text file that briefly describes your progress in running these tests. Upload that file into Gazelle Test Management as the result file for this test. There is no pass/fail for this preparatory test. We want to make sure you're making progress toward what is expected during evaluation of your Image Display at the Connectathon.
To enable Connectathon testing, the Image Display is required to host studies on its system.
There is one Connectathon test -- IID Invoke Display -- to exercise the Image Display and Image Display Invoker in the IID profile. The 'Special Instructions' for that test ask you to host a set of studies. This preparatory 'test' ensures you have the proper data loaded on your system prior to arriving at the Connectathon.
We do not provide specific studies for you, but rather define the characteristics of the studies you should bring.
Come to the Connectathon with:
There are no result files to upload into Gazelle Test Management for this test. Preloading these prior to the Connectathon is intended to save you precious time during Connectathon week.
The goal of this “test” is for the Portable Media Creator system to prepare, in advance of the Connectathon, your PDI media that the Portable Media Importer partners will test with during the Connectathon. Doing this in your home lab will save you valuable time during Connectathon week.
All PDI Portable Media Creators must support CD media; USB and DVD are optional. The media you create should contain a “representative sample” of the data produced by your system. Complete and representative data on your media makes for a better interoperability test.
At a Connectathon Online, it is not possible for test partners to exchange physical PDI media. In that case, we ask the Portable Media Creator (PMC) to:
Prior to the Connectathon, you should create two copies of your media: CD, USB, and/or DVD, depending on what you support. On the first day of the Connectathon, you will give one copy to the Connectathon monitor who is evaluating PDI tests. You will keep one copy and use it for your peer-to-peer tests with your Importer partners.
Use the following guidelines when creating your media:
Note that you may not have the information to make your label until you arrive at Connectathon.
Optional:
Starting in 2019, the ITI and Radiology Technical Framework contains specifications for including PDI and XDM content on the same media. If your Portable Media Creator supports both the PDI and XDM Profile, you should create media with the appropriate content. For details, see:
There are no test steps to execute for this test.
Instead, create a text file which documents the type of DICOM images your modality creates and lists the DICOM Baseline Template your Acquisition Modality uses when creating Dose SRs for the REM profile.
CT modalities which report on irradiation events shall be capable of producing an SR compliant with TID 10011.
Actors which report on irradiation events for Modalities of type XR, XA, RF, MG, CR, or DX shall be capable of producing an SR compliant with TID 10001.
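If it helps while assembling your text file, the small sketch below (assuming pydicom is available) reports which root template a Dose SR sample declares; it is not a substitute for the DoseUtility validation used elsewhere in REM testing.

```python
# Small sketch (assumes pydicom is installed) that reports which root
# template a Radiation Dose SR declares, so you can confirm TID 10011 (CT)
# or TID 10001 (projection X-ray) before running other validation tools.
# It is not a full content check.
import pydicom

ds = pydicom.dcmread("dose_sr.dcm")           # path to your sample Dose SR
template = ds.ContentTemplateSequence[0]
print("SOP Class UID:   ", ds.SOPClassUID)
print("Mapping resource:", template.MappingResource)     # expect "DCMR"
print("Root template ID:", template.TemplateIdentifier)  # expect "10011" or "10001"
```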
Your text file should have the following naming convention: CompanyName_SystemName_REM.txt.
Submit the text file into Gazelle Test Management as the results for this test.
To prepare for testing the RAD Encounter-based Imaging Workflow (EBIW) Profile, the EBIW actors must prepare to use a common set of DICOM codes.
The codes you need are identified in the peer-to-peer test that you will perform at the Connectathon.
1. In Gazelle Test Management, find the test "EBIW_10_Read_This_First" on your main Test Execution page.
2. Read the entire Test Description to understand the test scenario.
3. For each of the DICOM attributes listed in the Test Description, the Encounter Manager should configure its system to be able to use the values in the bullet lists. This ensures that consistent values will be returned in modality worklist responses for EBIW tests during the Connectathon.
There is no file to upload to Gazelle Test Management for this preparatory test. If you do not load the codes you need on your test system prior to the Connectathon, you may find yourself wasting valuable time on the first day of Connectathon syncing your codes with those of your test partners.
To prepare for testing workflow profiles in RAD, CARD, LAB, and EYECARE domains, and also for the ITI PAM Profile, it is helpful for systems that send HL7 messages (eg patient registration and orders) and/or DICOM messages (modality worklist, storage) to work with a common set of codes.
We ask ADT, Order Placer, Order Filler and Acquisition Modality actors and PAM and PLT actors to load codes relevant to their system in advance of the Connectathon.
These codes include, for example:
The codes that you need depend on the profile/actors you support. HL7 and DICOM codes used for Connectathon testing are the same set that is used in the Gazelle OrderManager tool. OrderManager contains simulators for some actors in workflow profiles.
** HL7 codes ** - are documented here:
Some of these codes are also mapped into DICOM messages. Use the spy-glass icon in the right column to view the value set for each code. (Note that the format of these files is compliant with the IHE SVS Sharing Value Sets profile.)
** DICOM codes ** - Order Filler and Acquisition Modality actors need a mapping between Requested Procedure codes, Scheduled Procedure codes, and Protocol Codes.
For RAD and CARD, that hierarchy is here: https://gazelle.ihe.net/common/order-manager/orderHierarchy4Radiology.xml
For EYECARE, that hierarchy is here: https://gazelle.ihe.net/common/order-manager/orderHierarchy4Eyecare.xml. (Note that this is documented in excel form here.)
There is no result file to upload to Gazelle Test Management for this preparatory test. If you do not load the codes you need on your test system prior to the Connectathon, you may find yourself wasting valuable time on the first day of Connectathon syncing your codes with those of your test partners.
This test gives you access to DICOM studies used to test XDS-I Query & Retrieve, and the QIDO-RS Query [RAD-129] transaction that is used by actors in several profiles (WIA, AIR, ...). The data is also used to test the RAD-14 transaction with the Enterprise Identity option in SWF.b.
Location of the studies
There are four DICOM studies available. The Responder system (e.g. an Image Manager, Imaging Document Source or Imaging Document Responder) must load these four studies onto its system.
Summary of the DICOM studies
The contents of the studies are summarized in the "XDS-I.b, XCA-I and WIA studies" google sheet.
There are 3 tabs in the sheet:
Patient ID | Procedure Code | Modality | Series Count | Image Count |
---|---|---|---|---|
C3L-00277 | 36643-5 | DX | 1 | 1 |
C3N-00953 | 42274-1 | CT | 3 | 11 |
TCGA-G4-6304 | 42274-1 | CT | 3 | 13 |
IHEBLUE-199 | | CT | 1 | 1 |
Prior to the Connectathon, the Imaging Document Source should:
There is no file to upload to Gazelle Test Management for this preparatory test. If you do not load the studies you need on your test system prior to the Connectathon, you may find yourself wasting valuable time on the first day of Connectathon.
This test is for Imaging Document Source actors in the XDS-I.b and XCA-I Profiles that support the "Set of DICOM Instances" option. (If your Imaging Document Source only supports PDF or Text Reports, then this test does not apply to you.)
For this test, we ask you to create manifests for 3 studies that Connectathon Technical Managers provide. This enables us to check both the metadata and manifest for expected values that match data in the images and in the XDS metadata affinity domain codes defined for the Connectathon (i.e. codes.xml). (For other peer-to-peer tests during Connectathon, you will be able to also test with studies that you provide.)
The manifests you create for these 3 studies will be used for some XDS-I/XCA-I tests during Connectathon week.
Before you prepare the manifests using the Instructions below, first load the DICOM Studies in the Test Data. See Preparatory Test DICOM_QR_Test_Data.
Prior to the Connectathon, the Imaging Document Source should:
During Connectathon, a monitor will examine your Manifest; there are two verifications that Connectathon Monitors will perform:
(1) examine the DICOM Manifest for the study
(2) examine the metadata for the submitted manifest
We do not duplicate the Evaluation details here, but we encourage the Imaging Document Source to read those details now to ensure its manifest will pass verification during Connectathon. Find those details in Gazelle Test Management on your Test Execution page in Connectathon test "XDS-I.b_Manifest_and_Metadata".
This section contains test cases performed with the Gazelle External Validation Service (EVS) tool.
Tool: https://gazelle.ihe.net/evs/home.seam
Tool user guide: https://gazelle.ihe.net/gazelle-documentation/EVS-Client/user.html
In this test, an XDS-SD Content Creator will use Gazelle External Validation Service (EVS Client) to:
(1) Instructions for validating your sample XDS-SD document for Connectathon testing:
First, upload your XDS-SD document into Gazelle Test Management. If you support both PDF and Text, you will upload two samples.
1. Create an XDS-SD document according to the capabilities of your application. Name the file using this convention:
2. Upload the document into the Samples area of Gazelle Test Management.
(2) Instructions for validating your XDS-SD using EVSClient directly:
Finally, paste the Permanent Link into Gazelle Test Management as the results for this test.
This test concerns the POCDM actor of the LPOCT profile. You will use the EVSClient tool to validate the IHE conformance of your HL7 messages.
As your system implements the POCDM actor, you will need to test the HL7 messages used in the LAB-32 transaction: "Accepted Observation Set".
Your system must be able to send HL7 messages (to the Order Filler actor) of these types:
To test the IHE conformance of your HL7 messages, go to the EVSClient tool at this location: EVSClient
If it is your first time with this tool, please read the user manual: EVSClient User Manual
In the EVSClient tool, go to the HL7 menu entry and choose "HL7v2" then click on "Message Validation".
Paste your message to validate in the box. (You can hit the "Guess" button to preset the Message Profile OID.)
For example, for the ORU^R30^ORU_R30 message:
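The skeleton below only illustrates the general shape of such a message (placeholder fields and local observation codes, not a conformant LAB-32 message); paste the actual message produced by your POCDM system into the validation box.

```python
# Illustrative skeleton only (placeholder values, not a conformant LAB-32
# message): the general shape of an ORU^R30^ORU_R30 message you would
# paste into the EVSClient validation box. Use the message produced by
# your own POCDM system.
ORU_R30_EXAMPLE = "\r".join([
    "MSH|^~\\&|POCDM_APP|POCDM_FAC|OF_APP|OF_FAC|20240101120000||ORU^R30^ORU_R30|MSG00001|P|2.5",
    "PID|||IHERED-3158^^^IHERED||MERLOT^CHARLES||19660404|M",
    "OBR|1||PLACEHOLDER-ORDER|GLU^Glucose^L",
    "OBX|1|NM|GLU^Glucose^L||5.2|mmol/L|3.9-6.1|N|||F",
])
print(ORU_R30_EXAMPLE)
```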
Do this for all messages and don't forget to copy/paste the "Permanent link" of the validation result to Gazelle.
This test concerns the Order Filler actor of the LPOCT profile. You will use the EVSClient tool to validate the IHE conformance of your HL7 messages.
As your system implements the Order Filler actor, you will need to test the HL7 messages used in the LAB-32 transaction: "Accepted Observation Set".
Your system must be able to send an HL7 message (to the POCDM actor) of this type:
To test the IHE conformance of your HL7 messages, go to the EVSClient tool at this location: EVSClient
If it is your first time with this tool, please read the user manual: EVSClient User Manual
In the EVSClient tool, go to the HL7 menu entry and choose "HL7v2" then click on "Message Validation".
Paste your message to validate in the box. (You can hit the "Guess" button to preset the Message Profile OID.)
For example, for the ACK^R33^ACK message:
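Again, the skeleton below only illustrates the general shape (placeholder values); paste the acknowledgement your own Order Filler produces into the validation box.

```python
# Illustrative skeleton only (placeholder values): the general shape of an
# ACK^R33^ACK acknowledgement returned by the Order Filler in LAB-32.
ACK_R33_EXAMPLE = "\r".join([
    "MSH|^~\\&|OF_APP|OF_FAC|POCDM_APP|POCDM_FAC|20240101120005||ACK^R33^ACK|MSG00002|P|2.5",
    "MSA|AA|MSG00001",
])
print(ACK_R33_EXAMPLE)
```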
Do this for all messages and don't forget to copy/paste the "Permanent link" of the validation result to Gazelle.
This test applies to the Content Data Structure Creator actor in the Aggregate Data Exchange (ADX) profile.
This test ensures that the DSD file produced by the Content Data Structure Creator actor is conformant to the ADX schematron specification.
Reference: QRPH TF-3: 8.2 and Appendix 8A, currently in the ADX Trial Implementation Supplement.
The Gazelle External Validation Service (aka EVSClient) hosts the schematron file and is used to check the DSD.
Instructions
Evaluation
This test applies to the Content Creator actor in the Advanced Patient Privacy Consents (APPC) profile.
This test ensures that the file produced by the Content Creator actor is conformant to the specification.
Reference: ITI TF-3: 5.6, currently in the APPC Trial Implementation Supplement.
The Gazelle External Validation Service (aka EVSClient) hosts the schematron file used to check the APPC documents.
Instructions
Evaluation
You will use the Gazelle EVSClient to validate messages in SAML- and XACML-related profiles.
These messages are used in several IHE profiles. The messages you validate will depend on the profile/actor pairs supported by your test system.
Instructions
Evaluation
You will use the Gazelle EVSClient to evaluate CDA documents defined in IHE profiles in several domains.
The documents you validate will depend upon the profile/actor pairs supported by your test system and on the schematron or model-based validators available in the EVSClient.
Instructions
Evaluation
These validators are available in the Schematron dropdown list. Note that the tool is updated over time, so there may be additional validators available in the tool itself.
These validators are available in the Model Based Validator dropdown list. Note that the tool is updated over time, so there may be additional validators available in the tool itself.
Overview
We use DICOM validator tools hosted in the Gazelle External Validation Service (EVS) to evaluate your DICOM objects.
In this test, Acquisition Modality, Lightweight Modality, or Evidence Creator systems evaluate samples of Composite Objects that you create using the DICOM evaluation tools available in the Gazelle External Validation Service (EVS). This test also applies to actors such as Importers that modify objects originally created by other actors.
The number of evaluations you run depends on the types of images or SRs that you produce. We will not list specific requirements, but ask you to apply good judgment. For example, a CT scanner that produces Localizer and Axial images would evaluate samples from both of those image types. A CR device may evaluate an AP chest, a lateral chest and an image of a limb. A Lightweight Modality might create a VL Photographic Image IOD or a Video Photographic Image IOD.
You must evaluate and provide the output for at least one DICOM Composite Object using one of the available validation tools. If you support multiple profiles and create different DICOM IODs, you should validate each type.
One or more of these tools may be available from within the Gazelle EVS. (Note: The links below are for resource pages for each tool.) (For some testing events, the list of DICOM validators may be smaller or larger.)
Evaluating your objects using the different tools available as well as evaluating different objects can only help your implementation.
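Before uploading samples, a quick local sanity check can save a validation round-trip. The sketch below (assuming pydicom is installed) simply confirms that the object you picked is the IOD you intend to evaluate; it does not replace the validators themselves.

```python
# Quick pre-check sketch (assumes pydicom is installed) before uploading a
# sample to the EVS DICOM validators: confirm the object is the IOD you
# intend to evaluate and that basic identifying attributes are present.
import pydicom

ds = pydicom.dcmread("sample_object.dcm")      # path to your sample object
print("SOP Class UID:    ", ds.SOPClassUID)
print("SOP Instance UID: ", ds.SOPInstanceUID)
print("Modality:         ", ds.get("Modality", "<missing>"))
print("Patient ID:       ", ds.get("PatientID", "<missing>"))
```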
There are two ways to access the validators: (1) in the Samples area of Gazelle Test Management, or (2) directly in the EVS tool. Either method may be used.
(1) Instructions for accessing the DICOM validators from the Samples area of Gazelle Test Management:
(2) Instructions for accessing the DICOM validators in the Gazelle EVS Tool:
Finally, capture your results:
This test applies to the Content Creator actor in the Document Digital Signature (DSG) profile.
This test ensures that the file produced by the Content Creator actor is conformant to the specification.
Reference: ITI TF-3: 5.5.
Instructions
Evaluation
You will use the Gazelle EVSClient to validate messages in the DSUB profile.
The messages you validate will depend upon the profile/actor pairs supported by your test system.
Instructions
Evaluation
The Gazelle External Validation Service (EVS) performs validation using FHIR StructureDefinitions contained in IHE profiles published in FHIR Implementation Guide (IG) format (e.g. the ITI Mobile Access to Health Documents (MHD) IG, and many others). The StructureDefinitions used by EVS are found on the "Artifacts" page of the IG.
In this test, you will use the Gazelle EVS tool to validate:
The validators are available in EVS under menu IHE-->FHIR IG-based -->Validate.
This test appears in Gazelle Test Management as a Preparatory test, so...
The Gazelle External Validation Service (EVS) performs validation using FHIR StructureDefinitions contained in the IHE Radiology Interactive Multimedia Report (IMR) Profile. The StructureDefinitions used by EVS are found on the "Artifacts" page of the IMR IG.
In this test, you will use the Gazelle EVS tool to validate:
The validators are available in EVS under menu IHE-->FHIR IG-based -->Validate.
This test appears in Gazelle Test Management as a Preparatory test for IMR actors.
This table identifies IMR actors (Col 1), the content they create (Col 2), and which validators in EVS to use to validate that content (Col 3).
IMR actor | Content that is validated | RAD IMR Validator in EVS |
Report Creator | create/store a report with RAD-141 | Bundle |
Report Creator | in the IMR bundle | DiagnosticReport |
Report Creator | may include in IMR bundle | ServiceRequest |
Report Creator | may include in IMR bundle | ImagingStudy |
Report Creator | in the ImagingStudy | ImagingStudy Endpoint |
Report Repository, Report Reader, Rendered Report Reader | response to a RAD-141 transaction | Bundle Response |
Report Repository | response to an RAD-143 transaction | Find Multimedia Report Response |
| | Imaging Observation (experimental) |
The Gazelle External Validation Service (EVS) performs validation using FHIR StructureDefinitions contained in the IHE Radiology Integrated Reporting Applications (IRA) Profile. The StructureDefinitions used by EVS are found on the "Artifacts" page of the IRA IG.
In this test, you will use the Gazelle EVS tool to validate:
The validators are available in EVS under menu IHE-->FHIR IG-based -->Validate.
This test appears in Gazelle Test Management as a Preparatory test for IRA actors.
This table identifies IRA actors (Col 1), the content they create (Col 2), and which validators in EVS to use to validate that content (Col 3).
IRA actor | Content that is validated | RAD IRA Validator in EVS |
Image Display, Report Creator, Worklist Client | Open Report Context [RAD-148] | DiagnosticReport Context, Patient Context, ImagingStudy Context |
Content Creator (see Note) | Update Report Content [RAD-150] | DiagnosticReport Update |
Content Creator | Update Report Content [RAD-150] | ImagingSelection Content |
Content Creator | Update Report Content [RAD-150] | Observation Content |
Content Creator | may include in DiagnosticReport | DiagnosticReport associated study |
Note: The Content Creator in Column 1 is grouped with one of these actors: Report Creator, Evidence Creator, Image Display, Stateless Evidence Creator. The type of 'content' it creates (i.e. the FHIR Resource validated in Column 3) depends on this grouped actor.
You will use the Gazelle EVSClient tool to validate:
Scope of testing available in Gazelle EVSClient
This test appears in Gazelle Test Management as a preparatory test for actors in all FHIR-based IHE profiles, so...
You will use the Gazelle EVSClient to validate HL7v2.x based messages.
These messages are applicable across many IHE profiles in several domains. The messages you validate will depend upon the profile/actor pairs supported by your test system.
Instructions
Evaluation
You will use the Gazelle EVSClient to validate HL7v3-based messages.
These messages are applicable across many IHE profiles. The messages you validate will depend upon the profile/actor pairs supported by your test system.
Instructions
Evaluation
You will use the Gazelle EVSClient to validate messages in the HPD profile.
The messages you validate will depend upon the profile/actor pairs supported by your test system.
Instructions
Evaluation
This test uses the Gazelle EVS Tool to validate a QIDO-RS Query Request or Response message.
Actors tested: Any initiator or responder in the [RAD-129] QIDO-RS Query transaction
Instructions
Evaluation
Actors in the REM profile are required to create DICOM-compliant Radiation Dose SR objects. This applies to Acquisition Modalities and also to Dose Information Reporters that perform de-identification for transaction [RAD-63] Submit Dose Information.
In this test, we validate your sample SRs using PixelMed's DoseUtility tool, one of the DICOM validators hosted in the Gazelle EVS tool.
Instructions
This test has two parts:
(1) First, the Event Reporter identifies the SOLE events that it supports. This helps your Connectathon test partners and the Connectathon monitors understand the capabilities of your system.
(2) Then, you will use the Gazelle EVSClient tool to test the content of the event record(s) you create against the DICOM schema & SOLE Event Requirements. This tool does not test the transport of the event record.
You will use the Gazelle EVSClient to validate messages in the SVS profile.
The messages you validate will depend upon the profile/actor pairs supported by your test system.
Instructions
Evaluation
This tests the request in the WADO Retrieve [RAD-55] transaction.
In XDS-I or IOCM, the Imaging Document Consumer is required to support one or more of RAD-55, RAD-68 or RAD-16 (or other C-MOVE).
The aim of this test is to verify that the Imaging Document Consumer is able to create a valid WADO request (the list of parameters is well specified, there are no inconsistencies between them, etc.). To do so, we use the WADO request validator in the EVSClient tool.
In this test we do not verify whether the request was correctly processed by the server side.
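As a rough illustration of the request being validated, the sketch below builds a WADO (URI-based) retrieve request; the endpoint and UIDs are placeholders, while the parameter names (requestType, studyUID, seriesUID, objectUID, contentType) are the ones defined for the WADO URI service.

```python
# Rough sketch of building a WADO (URI-based) retrieve request of the kind
# validated here. The endpoint and UIDs are placeholders; the parameter
# names are those defined for the WADO URI service.
from urllib.parse import urlencode

WADO_ENDPOINT = "https://example.org/wado"     # placeholder Imaging Document Source
params = {
    "requestType": "WADO",
    "studyUID":  "1.2.3.4.5",                  # placeholder Study Instance UID
    "seriesUID": "1.2.3.4.5.6",                # placeholder Series Instance UID
    "objectUID": "1.2.3.4.5.6.7",              # placeholder SOP Instance UID
    "contentType": "application/dicom",
}
request_url = f"{WADO_ENDPOINT}?{urlencode(params)}"
print(request_url)   # paste this request into the EVSClient WADO validator
```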
Instructions
Evaluation
You will use the Gazelle EVSClient to validate messages for the Cross-Enterprise (XD*) and Cross-Community (XC*) family of profiles.
The messages you validate will depend upon the profile/actor pairs supported by your test system.
Instructions
Evaluation
Content Creator creates a Workflow Document and validates it
The Content Creator will create a workflow document produced by your application. This could be a document as defined in the base XDW profile, or one of the specialized workflow documents such as XTHM, XBeR, and others.
Then you will use the XDW Validator in Gazelle EVS Client to verify your Workflow Document.
Finally, you will upload your sample document into Gazelle Test Management so that Content Consumers can access your sample and test with it prior to the Connectathon.
Instructions
For this test, you will create a Workflow Document for a task that is relevant to your product/application; the contents of the task you complete should be in the context of the clinical functions your product performs.
For this test, you do not have to submit your workflow document to an XDS Repository/Registry.
(1) Once you have created/updated the workflow document, upload it into the Samples area of Gazelle Test Management:
Alternative: Access XDW Validator via the Gazelle External Validation Service (EVS):
Evaluation
Your test partners that are Content Consumers will also be able to access your sample in Gazelle Test Management.
Content Updater updates a Workflow Document
In this test you will access a sample workflow made by a Content Creator; then you will update that document using your application, and use the XDW Validator provided in Gazelle EVS Client to validate your updated document.
Finally, you will upload your 'updated' Workflow Document into the Samples area of Gazelle Test Management.
Instructions
For this test, you will update a workflow document
(1) For the base XDW profile, access this sample workflow document: XDW-Document-for-Updater.xml
-- If you are testing XBeR or XTHM, we do not have a generic sample for you to start from. You may use a sample submitted by a 'Creator' vendor in your profile.
(2) Download the Workflow Document into your local environment, then update the document and add a task that is relevant to your product/application.
For this test, you do not have to submit your workflow document to an XDS Repository/Registry
(3) Once you have created the updated workflow document, upload your document into the Samples area of Gazelle Test Management:
Alternative: Access XDW Validator via the Gazelle External Validation Service (EVS):
Evaluation
Your test partners that are Content Consumers will also be able to access your sample in Gazelle Test Management.
Attachment | Size |
---|---|
XDW-Document-for-Updater.xml | 3.9 KB |
This section contains test cases performed with the Healthcare Provider Directory -- HPD -- Simulator tool.
Tool: https://gazelle.ihe.net/HPDSimulator/home.seam
Tool information page: https://gazelle.ihe.net/content/hpd-simulator
We use this 'test' to inform you of the Gazelle HPD Simulator & Validator tool available for your testing.
HPD actors simulated:
Location of the tool: https://gazelle.ihe.net/HPDSimulator/home.seam
Tool user manual: https://gazelle.ihe.net/content/hpd-simulator
We encourage you to test with the simulator prior to the Connectathon.
There are no results to upload into Gazelle Test Management for this test.
This section contains test cases performed with the LBL Simulator.
Tool: http://gazelle.ihe.net/LBLSimulator
Tool information page: http://gazelle.ihe.net/content/lbl-simulator
This test concerns the LB (Label Broker) actor. You will use the Order Manager tool to send a request to your system under test.
Access the LBL Simulator tool at this location : Order Manager
If it is your first time with this tool, please read the user manual : Order Manager user manual
Please be reminded that if you are logged in, your configurations will be private.
In this test, the SUT (System Under Test) must receive the labeling instructions from the LBL Simulator.
As the SUT implements the LB (LB acts as a responder in this test) :
The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.
Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (See the HL7v2 report tutorial for more details.)
If the validation report status is passed for the message and the response, copy the "test result link" and paste it in Gazelle as the result of this test.
For further details, see this tutorial: How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?
This test concerns only the LIP (Label Information Provider) actor. Your system under test will request the Order Manager tool (acting as Label Broker) to deliver the labels (LAB-61 transaction).
Access the LBL Simulator tool at this location : Order Manager
If it is your first time with this tool, please read the user manual : Order Manager user manual
Please be reminded that if you are logged in, your configurations will be private.
In this test, the SUT (System Under Test) must send the labeling instructions to the LBL Simulator.
As your system under test implements the LIP (LIP acts as an initiator in this test) :
The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.
Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (See the HL7v2 report tutorial for more details.)
If the validation report status is passed for the message and the response, copy the "test result link" and paste it in the comment section of the pre-connectathon log return page as the result for this test.
For further details, see this tutorial: How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?
This test concerns only the LIP (Label Information Provider) actor. You will need to communicate with the LBL Simulator, in order to simulate the LAB-62 transaction of the LBL Profile.
Access the LBL Simulator tool at this location : Order Manager
If it is your first time with this tool, please read the user manual : Order Manager User Manual
Please be reminded that if you are logged in, your configurations will be private. Requirements:
As your system implements the LIP (the LIP acts as a responder in this test):
In this test, you must use the LBL Simulator to query the SUT. Send several messages using different parameters. All (required) possibilities are defined in the steps below:
For example, for step 1:
Do this for all steps, and don't forget to copy/paste the "test result link" for each one, linking the step number to its "test result link".
This test concerns the LB (Label Broker) actor. You will need to communicate with the LBL Simulator in order to simulate the LAB-62 transaction of the LBL Profile.
Access the LBL Simulator tool at this location : Order Manager
If it is your first time with this tool, please read the user manual : Order Manager User Manual
Please be reminded that if you are logged in, your configurations will be private. Requirements:
As your system implements the LB (the LB acts as an initiator in this test):
In this test, the SUT must query the LBL Simulator. Send several messages using different parameters. All (required) possibilities are defined in the steps below:
For example, for step 1:
Do this for all steps, and don't forget to copy/paste the "test result link" for each one, linking the step number to its "test result link".
The points below must be verified for each step:
This test concerns the LIP (Label Information Provider) actor. Your SUT will receive the label delivered notification from the Order Manager tool acting as Label Broker.
In this test, the Label Broker notifies the Label Information Provider that the labels have been printed and the labeled containers produced.
The LBL Simulator therefore needs to know the labeling instructions before it can send that notification message to the LIP.
Two steps are necessary in this test :
Access the LBL Simulator tool at this location : Order Manager
If it is your first time with this tool, please read the user manual : Order Manager User Manual
Please be reminded that if you are logged in, your configurations will be private.
This test concerns the LB (Label Broker) actor. You will use the Order Manager tool to simulate the Label Information Provider actor.
In this test, the Label Broker notifies the Label Information Provider that the labels have been printed and the labeled containers produced.
The SUT (System Under Test) therefore needs to know the labeling instructions before it can label the tubes and send that notification message to the LIP.
Two steps are necessary in this test :
Access the LBL Simulator tool at this location : Order Manager
If it is your first time with this tool, please read the user manual : Order Manager User Manual
Please be reminded that if you are logged in, your configurations will be private.
The validation status must be passed for both messages of the transaction. The message type must match the one required by IHE.
The Acknowledgment code (MSA-1) must be "AA" in the acknowledgment message.
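If you want to script a quick sanity check on the acknowledgments you capture from the tool before uploading your result, a minimal extraction of MSA-1 might look like the sketch below; the sample message content (message type, control IDs) is purely illustrative and not taken from the simulator.

```python
# Minimal check that an HL7v2 acknowledgment carries MSA-1 = "AA".
# The sample below is illustrative; validate your real captured message instead.
SAMPLE_ACK = (
    "MSH|^~\\&|ORDER_MANAGER|IHE|SUT_APP|SUT_FAC|20240101120000||ACK|MSG0001|P|2.5.1\r"
    "MSA|AA|MSG0001\r"
)

def ack_code(message: str) -> str:
    for segment in message.split("\r"):
        if segment.startswith("MSA"):
            return segment.split("|")[1]   # MSA-1, the acknowledgment code
    raise ValueError("No MSA segment found")

assert ack_code(SAMPLE_ACK) == "AA", "Message was not accepted (MSA-1 != AA)"
print("MSA-1 =", ack_code(SAMPLE_ACK))
```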
This section contains test cases performed with the LCSD Simulator.
Tool: https://gazelle.ihe.net/LCSDSimulator
Tool information page: https://gazelle.ihe.net/content/lcsd-simulator
This test concerns the CSC (Code Set Consumer) actor. You will need to communicate with the LCSD Simulator, in order to simulate the LAB-51 transaction of the LCSD Profile.
Access the LCSD Simulator tool at this location : LCSD Simulator
If it is your first time with this tool, please read the user manual : LCSD Simulator User Manual
Please be reminded that if you are logged in, your configurations will be private.
As your system implements the CSC (the CSC acts as a responder in this test):
Send at least one code set for each code set category (Battery, Calculated, Not Numeric and Numeric). Note that batch mode is not yet available in the LCSD Simulator.
All (required) possibilities are defined in the steps below.
How to run and log these steps?
For example, for step 1:
The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.
Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (See the HL7v2 report tutorial for more details.)
If the validation report status is passed for the message and the response, copy the "test result link", and paste it in Gazelle.
For further details, see this tutorial: How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?
Do this for all steps and don't forget to copy/paste the "test result link". Link the step number to the "test result link".
This test concerns only the CSM (Code Set Master) actor. You will need to communicate with the LCSD Simulator in order to simulate the LAB-51 transaction of the LCSD Profile.
Access the LCSD Simulator tool at this location : LCSD Simulator
If it is your first time with this tool, please read the user manual : LCSD Simulator User Manual
Please be reminded that if you are logged in, your configurations will be private.
As your system implements the CSM (the CSM acts as an initiator in this test):
Send at least one code set for each code set category (Battery, Calculated, Not Numeric and Numeric). Note that batch mode is not yet available in the LCSD Simulator.
All (required) possibilities are defined in the steps below.
How to run and log these steps?
For example, for step 1:
The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.
Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (See the HL7v2 report tutorial for more details.)
If the validation report status is passed for the message and the response, copy the "test result link" and paste it in Gazelle as the result of this test.
For further details, see this tutorial: How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?
This section contains test cases where sample messages and objects are:
EYECARE-15 is a Patient Registration message
In this test, we ask you to provide a sample message produced by your system, and we will validate your message by manual evaluation using the requirements in the Eye Care Technical Framework.
This is the same evaluation we will perform during the Connectathon. This test enables you to prepare in advance.
Finally, update the status of this pre-Connectathon test in gazelle to signal that your message is ready for evaluation:
EYECARE-16 is an Appointment Scheduling Management message
In this test, we ask you to provide a sample message produced by your system, and we will validate your message by manual evaluation using the requirements in the Eye Care Technical Framework.
This is the same evaluation we will perform during the Connectathon. This test enables you to prepare in advance.
Finally, update the status of this pre-Connectathon test in gazelle to signal that your message is ready for evaluation:
EYECARE-17 is a Charge Posting message
In this test, we ask you to provide a sample message produced by your system, and we will validate your message by manual evaluation using the requirements in the Eye Care Technical Framework.
This is the same evaluation we will perform during the Connectathon. This test enables you to prepare in advance.
Finally, update the status of this pre-Connectathon test in gazelle to signal that your message is ready for evaluation:
EYECARE-21 is a Procedure Scheduled message
In this test, we ask you to provide a sample message produced by your system, and we will validate your message by manual evaluation using the requirements in the Eye Care Technical Framework.
This is the same evaluation we will perform during the Connectathon. This test enables you to prepare in advance.
Finally, update the status of this pre-Connectathon test in gazelle to signal that your message is ready for evaluation:
EYECARE-23 is XML for Refractive Measurement (no Pat ID)
EYECARE-24 is XML for Refractive Measurement (valid Pat ID)
In this test, we ask you to provide a sample message produced by your system, and we will validate your message by manual evaluation using the requirements in the Eye Care Technical Framework and in the JOIA 1.5 specification.
This is the same evaluation we will perform during the Connectathon. This test enables you to prepare in advance.
Finally, update the status of this pre-Connectathon test in gazelle to signal that your message is ready for evaluation:
This is the same evaluation that a monitor will perform during the Connectathon. This test enables you to prepare in advance.
In this test, we will evaluate a sample RAD-2 OMG message produced by your system.
Since CDS-OAT is a new profile, we do not yet have a tool to evaluate the specific requirements for RAD-2 in CDS-OAT, so we will validate your message using:
This is the same evaluation we will perform during the Connectathon. This test enables you to prepare in advance.
In this test, we will evaluate a sample RAD-3 OMG message produced by your system.
Since CDS-OAT is a new profile, we do not yet have a tool to evaluate the specific requirements for RAD-3 in CDS-OAT, so we will validate your message using:
This is the same evaluation we will perform during the Connectathon. This test enables you to prepare in advance.
In this test, we will evaluate a sample RAD-4 OMI message produced by your system.
Since CDS-OAT is a new profile, we do not yet have a tool to evaluate the specific requirements for RAD-4 in CDS-OAT, so we will validate your message using:
This is the same evaluation we will perform during the Connectathon. This test enables you to prepare in advance.
This section contains pre-Connectathon test cases performed with the NIST FHIR Toolkit (aka "Asbestos") prior to IHE Connectathons. Follow the link below.
The NIST FHIR Toolkit is used by developers before and during IHE North American and European Connectathons.
The FHIR Toolkit is used by participants and monitors during Connectathon week, so participants should prepare by using the tools in their home lab prior to Connectathon.
Location of tool and associated documentation:
Home page for documentation: https://github.com/usnistgov/asbestos/wiki
Source code and release notes for FHIR Toolkit are found here: https://github.com/usnistgov/asbestos/releases/
Installation Guide: https://github.com/usnistgov/asbestos/wiki/Installation-Guide
IHE profiles/actors tested:
The NIST FHIR Toolkit contains tests for these actors:
Test instructions for each actor:
The list of tests and detail test instructions for each actor reside within the toolkit package.
--> You should perform the tests for the actor(s) and options you have implemented, e.g., support for Comprehensive metadata vs. Minimal metadata.
Preparatory test list in Gazelle Test Management:
In Gazelle Test Management, your preparatory test list will contain one test per actor you are testing with the FHIR Toolkit. The naming convention is FHIR_Toolkit_<profile>-<actor>, e.g., FHIR_Toolkit_MHD-Doc_Source.
Because this tool is new, you will not upload any logs as part of preparatory testing. Instead, you may add a note in the test instance that says, "I have successfully performed testing with the FHIR Toolkit."
Stay informed:
Join the MHD implementer google group to receive information on updates to the tools and to follow Q&A within the developer community. Users of FHIR Toolkit should subscribe.
NIST maintains a tool known as the PCD Test tools. It offers a suite of Preparatory Connectathon tests for the HL7v2 based transactions for many profiles in the Devices domain.
As a preparation for the Connectathon, you are requested to execute the test cases relevant to your system under test.
When done, post evidence in Gazelle Test Management (e.g. screen shots) that the tests have been run.
The tool is available here: https://ihe-pcd.nist.gov/pcdtool
When you run tests in the "Context-based" tab, make sure you select the test plan for the latest development cycle.
January 2024: Note that the PIX/PDQ tool has been retired by NIST. IHE is grateful to NIST for their efforts to develop and support this tool. It was used for over 10 years by participants testing their PIX/PDQ and XPID implementations at IHE Connectathons.
This section contains pre-Connectathon test cases performed with the NIST PIX/PDQ tools prior to IHE Connectathons. Follow the link below.
January 2024: Note that the PIX/PDQ tool used in this test has been retired by NIST. IHE is grateful to NIST for their efforts to develop and support this tool. It was used for over 10 years by participants testing their PIX/PDQ and XPID implementations at IHE Connectathons.
The test description below is now DEPRECATED, but kept here for historical purposes.
========
The NIST PIX/PDQ tool is used by developers before and during IHE North American and European Connectathons. The tools enable developers to test actors in several IHE profiles listed below.
Location of tool and associated documentation:
The tool is found here: https://pixpdqtests.nist.gov/pixpdqtool/
For help using the tool, see the Documentation tab in the tool
IHE profiles/actors tested:
The NIST PIX/PDQ Tool contains tests for these actors:
Test instructions for each actor:
The list of tests and the test instructions for each actor reside on the tool website.
Evaluation
In Gazelle Test Management, your preparatory test list will contain one test per actor you are testing with the tools. The naming convention is NIST_<profile>-<actor>, e.g., NIST_PIX-Manager
When you have successfully finished testing your actor, capture evidence of success (e.g., a screenshot).
Upload that file into Gazelle Test Management as the result for this actor.
This section contains pre-Connectathon test cases performed with the NIST XDS Toolkit prior to IHE Connectathons. Follow the link below.
The NIST XDS Toolkit is used by developers before and during IHE North American and European Connectathons.
The XDS Tools will be used during Connectathon week, so participants should prepare by using the tools in their home lab prior to Connectathon.
Location of tool and associated documentation:
Refer to the home page for the NIST Document Conformant Test Tools here: https://ihexds.nist.gov/
Source code and release notes for XDS Toolkit are found here: https://github.com/usnistgov/iheos-toolkit2/releases
IHE profiles/actors tested:
The NIST XDS Toolkit contains tests for these actors:
Test instructions for each actor:
The list of tests for each actor now resides within the toolkit package. Likewise, the test definitions are distributed in the package.
--> You should perform the tests for the actor(s) you have implemented. You will be instructed to create a 'test session' within the tool to represent the actor you are testing. When you do this, all of your results are collected within one directory.
Evaluation:
In Gazelle Test Management, your preparatory test list will contain one test per actor you are testing with the XDS Toolkit. The naming convention is XDS_Toolkit_<profile>-<actor>, e.g., XDS_Toolkit_XDS.b-Doc_Repository.
When you have successfully finished testing your actor, create a zip file of the result files located in toolkit in the {ExternalCache}/TestLogCache directory.
Upload that zip file into Gazelle Test Management as the result for this actor.
Stay informed:
Join the XDS implementer google group to receive information on updates to the tools and to follow Q&A within the developer community. Users of XDS Toolkit should subscribe.
Pre-connectathon testing for systems implementing the PAM (Patient Administration Management) integration profile is performed against the Patient Manager simulator available at http://gazelle.ihe.net/PatientManager
Before starting your tests, please set up your system properly and/or give the correct information to the simulator so that it can access your system under test. We also strongly recommend reading the documentation located at http://gazelle.ihe.net/content/patient-manager-user-manual
Read the configuration parameters of the Patient Demographic Consumer part of the simulator and configure your system to send messages to this part of the simulator. You will find this information following the menu: Patient Identification Management/Patient Demographic Consumer/Configuration and messages. Be careful to select the right character encoding before checking the receiving port.
The messages you will send to the simulator will also be available on that page.
The pre-connectathon test dedicated to your system is located here.
Register your system under test in the Gazelle simulator under the SUT Configurations menu, then click on "Create a configuration". Select the SUT actor as "PDC" and select the encoding character set expected by your SUT, otherwise your system will not be able to decode the messages. Make sure your system is available from the Internet and that no firewall prevents Gazelle from accessing your system.
The pre-connectathon test dedicated to your system is located here.
Read the configuration parameters of the Patient Encounter Consumer part of the simulator and configure your system to send messages to this part of the simulator. You will find this information by following the menu: Patient Encounter Management/Patient Encounter Consumer/Configuration and messages. Be careful to select the right character encoding before checking the receiving port.
The messages you will send to the simulator will also be available on that page.
The pre-connectathon test dedicated to your system is located here.
Register your system under test in the Gazelle simulator under the SUT Configurations menu, then click on "Create a configuration". Select the SUT actor as "PEC" and select the right encoding character set, otherwise you may receive messages your system will not be able to decode. Make sure your system is available from the Internet and that no firewall prevents Gazelle from accessing your system.
The pre-connectathon test dedicated to your system is located here.
This test will be performed against the PAMSimulator tool. The goal of this test is to check the capability of your system to send/receive the messages defined within the ITI-31 (Patient Encounter Management) transaction. This test deals only with the basic set of trigger events defined for ITI-31, that is, patient admission, registration, and discharge, and their respective cancellations.
You will retrieve the patients the simulator has sent or received under the "All patients" menu; for each patient, the list of related encounters is available under the tab entitled "Patient's encounters". You may want to use the filter to facilitate your search. If you are using the simulator as a PES, you can log onto the application using the CAS mechanism (use your Gazelle credentials) and easily retrieve the patients you have created within the application by checking the "see only patients created by me" checkbox. If you use the simulator as a PEC, the creator of the patients/encounters received by the simulator is the sending facility_sending application of your system under test. Once you have found the right patient, click on the magnifying glass to get the permanent link to this patient; copy and paste it into the comment box of your pre-connectathon test instance.
Before starting your test, please read the instructions at http://gazelle.ihe.net/content/pre-connectathon-tests/pam
This test requires three patients we will name Patient1, Patient2 and Patient3. According to the PAM profile, there is no need for the consumer to be aware of these patients before receiving encounter notifications for them.
This test is divided into two parts:
You will use the PAM Simulator as a Patient Encounter Consumer. Go to the Patient Encounter Management/Patient Encounter Consumer page in order to retrieve the configuration (IP address, port, receiving facility/application) of the simulator.
1. Admit patient
2. Register patient
3. Cancel admission
4. Discharge patient
5. Cancel discharge
In order to help the connectathon manager with checking this test, go to "All patients" page and retrieve Patient1, Patient2, Patient3. For each of those patients, copy the permanent link and paste it in Gazelle Test Management.
You will use the PAM Simulator as a Patient Encounter Supplier. You may want to log onto the application to easily retrieve the patients/encounter you will create. Go to the Patient Encounter Management/Patient Encounter Supplier page.
1. Admit patient
In this step, you are going to create Patient1 and Patient2 and to admit them as inpatients.
2. Register patient
In this step, you are expected to create Patient3 and to register him/her as an outpatient.
3. Cancel admission
In this step, you are expected to cancel the admission of Patient1.
4. Discharge patient
In this step, you are expected to discharge Patient2.
5. Cancel discharge
In this step, you are expected to cancel the discharge of Patient2.
In order to help the connectathon manager with checking this test, go to "All patients" page and retrieve Patient1, Patient2, Patient3. For each of those patients, copy the permanent link and paste it in Gazelle Test Management.
The aim of this pre-connectathon test is to check that your system under test is able to receive/send the messages exchanged within the ITI-30 (Patient Identification Management) transaction. Here, we are testing your capability to create a new patient, update his/her demographics and identifiers and, depending on the set of options you support, to merge and/or link patient demographic information.
Both actors (Patient Demographic Consumer and Patient Demographic Supplier) will be asked to test against the PAM Simulator. For each step of this test, you are expected to provide the permanent link to the patient sent or received by the simulator and the permanent link to the test report.
You will retrieve the patients the simulator has sent or received under the "All patients" menu. You may want to use the filter to facilitate your search. If you are using the simulator as a PDS or PES, you can log onto the application using the CAS mechanism (use your Gazelle credentials) and easily retrieve the patients you have created within the application by checking the "see only patients created by me" checkbox. If you use the simulator as a PDC or PEC, the creator of the patients received by the simulator is the sending facility_sending application of your system under test. Once you have found the right patient, click on the magnifying glass to get the permanent link to this patient; copy and paste it into the comment box of your pre-connectathon test instance.
The test report is also available through a permanent link. Go to the "HL7 Messages" page and select the message related to the current step; there you will see the link to the test report.
Before starting your test, please read the instructions at http://gazelle.ihe.net/content/pre-connectathon-tests/pam
Test definition for Patient Demographic Consumer actor
Test definition for Patient Demographic Supplier actor
You will use the PAM Simulator as a Patient Demographic Supplier. You may want to log onto the application to easily retrieve the patients you will create. You will have to switch among the pages available under the Patient Identification Management/Patient Demographic Supplier menus.
1. Patient creation
In this step, you are expected to use the simulator to send two new patients (ADT^A28^ADT_A05 messages) to your system under test. Go to Patient Identification Management/Patient Demographic Supplier/Create new Patient.
2. Update patient information
In this step, you are expected to update the first name of Patient1 and send the notification to your system under test. Go to Patient Identification Management/Patient Demographic Supplier/ Update patient information.
3. Change patient's identifier list
In this step, you are expected to change one of the identifiers of Patient1. Go to Patient Identification Management/Patient Demographic Supplier/Change patient identifier list.
4. Merge patients (if option supported by your SUT)
In this step, you will reuse Patient1 and Patient2 and merge them. Go to Patient Identification Management/Patient Demographic Supplier/Merge patients.
5. Link patients (if option supported by your SUT)
In this step, you will reuse Patient1 and Patient2 and link them. If your SUT supports both the merge and link options and you have already performed step 4, please create a third patient to replace Patient2 in this step. Go to Patient Identification Management/Patient Demographic Supplier/Link Unlink patients.
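To help correlate the simulator's notifications with what your Consumer should process, the sketch below summarizes the ADT message types commonly associated with each step of ITI-30; treat it as a memory aid and verify the exact message structures against ITI TF-2 before relying on it.

```python
# Informal mapping of the test steps above to the ADT messages an ITI-30
# Patient Demographic Consumer typically receives (verify against ITI TF-2).
ITI_30_STEPS = {
    "1. Patient creation":                 "ADT^A28^ADT_A05",
    "2. Update patient information":       "ADT^A31^ADT_A05",
    "3. Change patient's identifier list": "ADT^A47^ADT_A30",
    "4. Merge patients":                   "ADT^A40^ADT_A39",
    "5. Link patients":                    "ADT^A24^ADT_A24",
}
for step, msh9 in ITI_30_STEPS.items():
    print(f"{step}: expect MSH-9 = {msh9}")
```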
You will use the PAM Simulator as a Patient Demographic Consumer. The creator of the patients you will send to the simulator will be set up from the sending facility and application values contained in the received HL7 messages. The configuration of this part of the simulator is available under Patient Identification Management/Patient Demographic Consumer/Configuration and messages.
1. Patient creation
2. Update patient information
3. Change patient's identifier list
4. Merge patients (if option supported by your SUT)
5. Link patients (if option supported by your SUT)
This test deals with the subset of trigger events defined for the Inpatient/Outpatient Encounter Management option of ITI-31 transaction. As a reminder, here is the list of events your system must support to fulfill the requirements of this option:
This test is written for both the Patient Encounter Supplier and the Patient Encounter Consumer; refer to the appropriate section according to the actor(s) your system under test supports.
In this test, we check the capability of your system under test to send messages notifying the PEC actor of new events. You are asked to test against the PAMSimulator. Your first task is to configure your system under test so that it sends its messages to the PAMSimulator. To do so, retrieve the configuration of the PEC part of the simulator under Patient Encounter Management/Patient Encounter Consumer/Configuration. Do not forget to select the right character encoding before specifying the port to your system.
In this first step, you will feed the PAMSimulator with a new patient and encounter.
Once the patient is admitted, we will transfer him/her to a new bed.
In this step, we will change the patient class to outpatient.
In this step, we will change the patient class back to inpatient.
This last step is used to check the ability of your system to send a pre-admission notification and its cancellation.
In this step, we check the capability of your system under test to act as a Patient Encounter Consumer for the PAM profile, and for the Inpatient/Outpatient Encounter Management option in particular. We want you to demonstrate that your system is able to integrate the notifications received from the PAM Simulator and to correctly acknowledge them.
In this step, we will transfer the patient to a new bed.
This step is used to change the patient class to outpatient (code = O)
This step is used to change the patient class to inpatient (code = I)
In this step, we test the capability of your system to pre-admit a patient and to cancel this pre-admission.
This tests the ability of your application to receive an ITI-10 PIX update notification message. This applies only to PIX Consumers that support the PIX Update Notification option.
This test is performed with the Patient Manager simulator http://gazelle.ihe.net/PatientManager sending ITI-10 messages to your system under test.
Tool documentation is located at http://gazelle.ihe.net/content/patient-manager-user-manual
In these steps, you will use the Patient Manager tool to send a PIX update notification to your application.
The screen shots demonstrate that you have successfully processed the received message(s).
This test applies to Patient Demographic Consumers in the PDQ or PDQv3 Profiles.
This test is performed with the Patient Manager simulator http://gazelle.ihe.net/PatientManager acting as a Patient Demographic Supplier responding to these queries. Note that the test steps are the same no matter the transaction...you will choose the transaction(s) supported by your Consumer.
Tool documentation is located at http://gazelle.ihe.net/content/patient-manager-user-manual
In these steps, you will use the Patient Manager as a Patient Demographics Supplier (PDS) Simulator to respond to your PDQ Query.
The permanent link captures the message exchange. The screen shot demonstrates that you have successfully processed the received query response(s).
This test applies to Patient Demographics Suppliers in the PDQ or PDQv3 Profiles.
This test is performed with the Patient Manager simulator http://gazelle.ihe.net/PatientManager acting as a Patient Demographic Consumer to initiate these queries. Note that the test steps are the same no matter the transaction...you will choose the transaction(s) supported by your Supplier.
Tool documentation is located at http://gazelle.ihe.net/content/patient-manager-user-manual
In these steps, you will use the Patient Manager as a Patient Demographic Consumer (PDC) Simulator to initiate a PDQ Query to your Supplier.
The permanent link captures the message exchange. The screen shots demonstrate that you have successfully processed the received message(s).
This test applies to initiators of the [ITI-78] Patient Demographics Query for Mobile transaction: Patient Demographics Consumers in the PDQm or PMIR Profiles.
This test is performed with the Patient Manager simulator https://gazelle.ihe.net/PatientManager acting as a Patient Demographic Supplier to response to PDQm queries.
The list of patients available on the supplier side is available under the Patients menu. Select "simulated actor" = "PDS" to see which ones can be returned in a PDQm query response.
The endpoint to contact is available under menu PDQ* > Patient Demographics Supplier> FHIR configuration.
Verify the conformance of each query issued by your system (blue play icon in the "query" column) and copy the permanent link to the message in your pre-connectathon test in Gazelle Test Management (available from the magnifying glass icon).
The messages received by the simulator are available under HL7 Messages. To restrict the search, either access the page using the "history" icon on the FHIR Configuration page, or set the following filters in the search criteria panel:
Not all of the following test cases may be relevant for your system under test. The purpose of this test is to make sure you correctly implement the portions of the specification that are of interest for your system (based on the use cases it supports).
For this first step, we assume that the operator wants to retrieve a list of patients based on some demographic traits. You might want to repeat this step with various combinations of the following parameters to see how the supplier understands your query:
For each query, your system should at least display the number of retrieved entries and some of the demographic traits for each entry in the list.
Note that queries with no search parameter will return no entry at all.
If your system supports both JSON and XML encodings, repeat at least one of the previous queries with the second encoding so that you can verify that your system correctly sets the requested encoding. You might use the HTTP Accept header or the _format query parameter.
In addition to the query feature, your system shall support the retrieve patient feature. Choose one patient from the list and directly retrieve the associated resource by performing the retrieve operation.
Example: https://gazelle.ihe.net/PatientManager/fhir/Patient/UUID.
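At the HTTP level, the query and retrieve steps above are plain FHIR REST calls against the Patient Manager endpoint shown in the example URL. The sketch below is a minimal illustration; the search values are made up, and your system would of course issue these requests itself.

```python
import requests

# Base of the Patient Manager FHIR endpoint (taken from the example URL above)
FHIR_BASE = "https://gazelle.ihe.net/PatientManager/fhir"

# [ITI-78] search on demographic traits; repeat with other parameter combinations
search = requests.get(
    f"{FHIR_BASE}/Patient",
    params={"family": "MOORE", "gender": "female", "_format": "json"},
    headers={"Accept": "application/fhir+json"},
)
bundle = search.json()
print("matches:", bundle.get("total"))

# Retrieve one matching Patient resource directly by its id
if bundle.get("entry"):
    patient_id = bundle["entry"][0]["resource"]["id"]
    patient = requests.get(f"{FHIR_BASE}/Patient/{patient_id}",
                           headers={"Accept": "application/fhir+json"})
    print(patient.json().get("resourceType"), patient_id)
```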
Using your system under test, send a new query, making sure the query parameters do not match any patient in the tool. Your system is expected to inform the final user that no match has been found.
You must execute the steps below if your system is able to constrain the domains from which patient identifiers are returned from the Patient Demographics Supplier.
If you can, turn off the domain restriction in your system. First, choose a combination of parameters that will return at least one patient with identifiers in several domains. Under the Connectathon > Patient Demographics menu, you will find the patients that will be pre-loaded by suppliers during the Connectathon; they are also known by the Patient Demographics Supplier implemented in the tool. Each patient has an identifier in at least four domains.
Your system shall receive the patient(s) with identifiers in the IHEBLUE, IHEFACILITY, IHEREF and IHEGREEN domains at least.
Reuse the same query parameters as for the previous test but restrict the domain to the IHEBLUE (urn:oid:1.3.6.1.4.1.21367.13.20.3000) domain.
If your query is correctly formatted, the returned patient(s) should only have identifiers with system = urn:oid:1.3.6.1.4.1.21367.13.20.3000.
If your system supports more than one domain, repeat the operation above: constrain the identification domains to IHEBLUE and IHERED (urn:oid:1.3.6.1.4.1.21367.13.20.1000).
Once again, if your query is correctly formatted, the returned patient(s) should only have identifiers with system urn:oid:1.3.6.1.4.1.21367.13.20.3000 and urn:oid:1.3.6.1.4.1.21367.13.20.1000.
Reuse the same query parameters once again and restrict the domain to urn:oid:1.3.6.1.4.1.21367.13.20.9999999. This domain is unknown to the Patient Demographics Supplier.
If your query is correctly formatted, you should receive an HTTP 404 error code with an OperationOutcome resource in which the unknown domain is indicated. You may or may not give feedback on such an error to the final user. No entry shall be displayed to the user (none will be returned by the Patient Demographics Supplier).
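At the query level, the domain restriction is, per our reading of ITI TF-2: 3.78, expressed with the identifier search parameter in its system-only form (the domain URI followed by a bare '|'); double-check this against the current PDQm text and your own implementation. A minimal sketch with the IHEBLUE domain:

```python
import requests

FHIR_BASE = "https://gazelle.ihe.net/PatientManager/fhir"
IHEBLUE = "urn:oid:1.3.6.1.4.1.21367.13.20.3000"

# Same demographic search as before, restricted to the IHEBLUE domain.
# The system-only form "system|" asks for identifiers from that domain only;
# for several domains, combine values per the ITI-78 rules.
resp = requests.get(
    f"{FHIR_BASE}/Patient",
    params={"family": "MOORE", "identifier": f"{IHEBLUE}|"},
    headers={"Accept": "application/fhir+json"},
)
for entry in resp.json().get("entry", []):
    for ident in entry["resource"].get("identifier", []):
        print(ident.get("system"), ident.get("value"))
```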
Execute this step if your system supports the paging mechanism, meaning that it can add the _count parameter to the query. In this step we assume that the user is able to set the number of records to fetch at one time. If your system does not provide this ability (default quantity, or quantity to choose from a list), simply adapt the test data below.
Set the search parameters so that they select at least 3 entries (usually given=rob is a good candidate) and ask for only two records at a time. If your query is correctly formatted, the received Bundle should contain only 2 entries and a link to retrieve the next page of results.
Ask for the next batch of results. You should be able to see at least one more patient.
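The paging step maps to the standard FHIR _count parameter plus the 'next' link that the supplier returns in the searchset Bundle. A minimal sketch (the search values are examples only):

```python
import requests

FHIR_BASE = "https://gazelle.ihe.net/PatientManager/fhir"

# Ask for two records at a time; given=rob usually matches several test patients
resp = requests.get(f"{FHIR_BASE}/Patient",
                    params={"given": "rob", "_count": 2},
                    headers={"Accept": "application/fhir+json"})
bundle = resp.json()
print("entries in this page:", len(bundle.get("entry", [])))

# Follow the 'next' link to fetch the following batch, if the supplier provides one
next_url = next((link["url"] for link in bundle.get("link", [])
                 if link.get("relation") == "next"), None)
if next_url:
    next_page = requests.get(next_url, headers={"Accept": "application/fhir+json"}).json()
    print("entries in next page:", len(next_page.get("entry", [])))
```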
This test applies to responders to the [ITI-78] Patient Demographics Query for Mobile transaction: Patient Demographics Suppliers in the PDQm Profile or the Patient Identity Registry in the PMIR Profile.
This test is performed with the Patient Manager simulator https://gazelle.ihe.net/PatientManager acting as a Patient Demographic Consumer to initiate these queries.
There is no prerequisite in terms of data to be loaded into your system. As such, choose relevant values for the various parameters based on the patient demographics known by your system under test so that matches are returned.
First of all, register your system under test within the tool as a FHIR Responder under SUT Configurations > FHIR Responders. Make sure to select IHE => ITI-78 (PDQm Consumer) in the list of usages.
Access the Patient Demographics Consumer for PDQm in Patient Manager from menu PDQ* > Patient Demographics Consumer > [ITI-78] Patient Demographics Query FHIR.
Verify the conformance of each response issued by your system (blue play icon in the "response" column) and copy the permanent link to the message in your pre-connectathon test in Gazelle Test Management (available from the magnifying glass icon). Also verify that the response format expected in the query matches the response format (XML vs JSON) returned by your system.
Upload the capability statement of your system (showing at least the PDQm Supplier features) in the pre-connectathon test in Gazelle Test Management.
For this first step, we assume that the consumer actor wants to retrieve a list of patients based on some demographic traits. You might want to repeat this step with various combinations of the following parameters to test the behavior of your system:
Note that for parameters of type string, the "exact" modifier will be added to the query if no wildcard is used in the field (when valued). If you want to search for patients with a given name starting with Ro, enter Ro* in the form.
If your system supports the Pediatric Demographics Option, you might also want to make sure that your system supports the mothersMaidenName search extension.
For this step, you are asked not to modify the default parameters in the "Additional information" section of the page.
Once you have filled out the form, push the "Send message" button. If matches are returned by your system, they will be displayed at the bottom of the page.
After you have retrieved a first batch of patients, you should also be able to use the resource ID search parameter (_id). You can find the value to use in the response returned by your system in Bundle.entry.resource.Patient.id.value.
Your system under test shall support both JSON and XML encodings. The previous step was executed using "XML" as the format to be returned. Repeat the step above at least once, but select Response format = json before sending the message to your system.
Access the detail of the response content and for one of the entries, access the URL displayed in field entry.fullUrl.value. You should retrieve the content of the Patient resource.
Send a new query to your system under test, make sure the query parameters do not match any patient in your database. Your system is expected to send back a Bundle resource with Bundle.total.value = 0.
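For reference, an empty searchset Bundle of the kind your supplier is expected to return when nothing matches has roughly the shape sketched below (field values are illustrative, not prescribed by the tool):

```python
import json

# Illustrative shape of an empty searchset Bundle (no matching patients)
empty_bundle = {
    "resourceType": "Bundle",
    "type": "searchset",
    "total": 0,
    "link": [{"relation": "self",
              "url": "https://example.org/fhir/Patient?family=NOSUCHNAME"}],
}
print(json.dumps(empty_bundle, indent=2))
```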
In this step, we focus on the domain restriction feature. We assume that your system manages at least one domain for patient identification.
First, choose a combination of parameters that will return at least one patient. If your system supports multiple identifier domains, make sure the returned patients will show identifiers from at least two different domains. DO NOT restrict the search to a particular domain. We are interested in knowing which identifiers are known for this patient.
Repeat the search above but, in the "Additional information" section, add the identifier of one of the domains for which the returned patient has a PID assigned. Click on "Add domain to list" so that the tool takes the value into account.
Your system is expected to return the patients that have a PID assigned in this domain; only one PID shall appear for each of them.
If your system supports more than one domain, repeat the operation above to add a second domain to the list of domains to be returned.
Once again, your system is expected to return the patients with PID in the mentioned domains. No other PID shall be returned.
Repeat the test, but first clean up the list of domains to return and add "1.3.6.1.4.1.21367.13.20.9999" instead (this domain is probably not known by your system; otherwise choose another value).
No entry shall be returned. Refer to section 3.78.4.1.3 / Case 4 for details on the expected response (Code HTTP 404 with OperationOutcome or HTTP 200 with no content).
The Patient Demographics Supplier shall represent the incremental responses as specified in FHIR Paging.
Empty the query form and enter parameters that will allow your system to match more than two patients.
In the "Additional information" section, check the box for option "Limit number of responses" and set the limit value to "1".
Click on "Send message", the tool shall display the single entry that is returned by your system and the following message "More results are available on supplier side". As for the other steps, copy the link to this message in Gazelle Test Management.
Click on "Get next results".
In the "Returned patients" section, the number of pages increases. A single patient is displayed.
This test applies to Patient Identity Consumers who initiate queries in the PIX, PIXv3, PIXm, or PMIR Profiles.
This test is performed with the Patient Manager simulator http://gazelle.ihe.net/PatientManager acting as a Patient Identity Cross-Reference Manager (PIX Manager) responding to these queries. Note that the test steps are the same no matter the transaction...you will choose the transaction(s) supported by your Consumer.
Tool documentation is located at http://gazelle.ihe.net/content/patient-manager-user-manual
In these steps, you will use the Patient Manager as a PIX Manager Simulator to respond to your PIX Query.
The permanent link captures the message exchange. The screen shot demonstrates that you have successfully processed the received query response(s).
This test applies to Patient Identifier Cross-Reference Managers (PIX Managers) in the PIX, PIXv3, PIXm, or PMIR Profiles.
This test is performed with the Patient Manager simulator http://gazelle.ihe.net/PatientManager acting as a PIX Consumer to initiate these queries. Note that the test steps are the same no matter the transaction...you will choose the transaction(s) supported by your Supplier.
Tool documentation is located at http://gazelle.ihe.net/content/patient-manager-user-manual
In these steps, you will use the Patient Manager as a PIX Consumer Simulator to initiate a PIX Query to your PIX Manager
The permanent link captures the message exchange.
This tests the ability of your application to receive RAD-1 and RAD-12 patient registration and update messages.
This test is performed with the Patient Manager simulator http://gazelle.ihe.net/PatientManager sending messages to your system under test.
Tools documentation is located at http://gazelle.ihe.net/content/patient-manager-user-manual
Before starting your tests, please configure the tool to send to your application using the SUT Configurations menu in the tool.
1. Patient creation
In this step, you are expected to use the tool to send a new patient to your application.
2. Update patient information
In this step, you are expected to update the first name of the new patient and send the notification to your system under test.
The screen shots demonstrate that you have successfully handled the received ADT messages.
This test applies to the XCPD Initiating Gateway actor that supports the Deferred Response Option. See ITI TF-1: 27.2.2 and ITI TF-2b: 3.55.6.2.
This test is performed with the Gazelle Patient Manager simulator: https://gazelle.ihe.net/PatientManager
See also the User Manual for testing XCPD with the Patient Manager here.
The instructions for this test are detailed in the User Manual for the Patient Manager tool, so they are not repeated here. See: https://gazelle.ihe.net/gazelle-documentation/Patient-Manager/user.html#deferred-response-on-initiating-gateway
After you perform the test, the Patient Manager tool will produce a Test Report. Copy the Permanent Link to the Test Report, then paste that link into Gazelle Test Management as the result for this test.
The baseline requirements of the XCPD profile do not require the actors to also implement XUA; however, this test enables you to test your XCPD Initiating Gateway actor that also supports the XUA X-Service User actor.
This test is performed with the Gazelle Patient Manager simulator: https://gazelle.ihe.net/PatientManager
See also the User Manual for testing XCPD with the Patient Manager here.
The instructions for this test are detailed in the User Manual for the Patient Manager tool, so they are not repeated here. See the instructions for XUA over XCPD for the X-Service User here: https://gazelle.ihe.net/gazelle-documentation/Patient-Manager/user.html#xua-over-xcpd
After you perform the test, find your result in the Patient Manager tool under menu XUA > X-Service User logs. Copy the URL for your result and paste it into Gazelle Test Management as the result for this test.
This test applies to the XCPD Responding Gateway actor that supports the Deferred Response Option. See ITI TF-1: 27.2.2 and ITI TF-2b: 3.55.6.2.
This test is performed with the Gazelle Patient Manager simulator: https://gazelle.ihe.net/PatientManager
See also the User Manual for testing XCPD with the Patient Manager here.
The instructions for this test are detailed in the User Manual for the Patient Manager tool, so they are not repeated here. See: https://gazelle.ihe.net/gazelle-documentation/Patient-Manager/user.html#deferred-response-on-responding-gateway
After you perform the test, the Patient Manager tool will produce a Test Report. Copy the Permanent Link to the Test Report, then paste that link into Gazelle Test Management as the result for this test.
The baseline requirements of the XCPD profile do not require the actors to also implement XUA; however, this test enables you to test your XCPD Responding Gateway actor that also supports the XUA X-Service Provider actor.
This test is performed with the Gazelle Patient Manager simulator: https://gazelle.ihe.net/PatientManager
See also the User Manual for testing XCPD with the Patient Manager here.
The instructions for this test are detailed in the User Manual for the Patient Manager tool, so they are not repeated here. See the instructions for XUA over XCPD for the X-Service Provider here: https://gazelle.ihe.net/gazelle-documentation/Patient-Manager/user.html#xua-over-xcpd
After you perform the test, find your result in the Patient Manager tool under menu XUA > X-Service Provider logs. Copy the URL for your result and paste it into Gazelle Test Management as the result for this test.
This section contains test cases performed with the Order Manager tool.
Tool: http://gazelle.ihe.net/OrderManager
Tool user manual: https://gazelle.ihe.net/gazelle-documentation/Order-Manager/user.html
This test will be performed against the Order Filler part of the OrderManager Gazelle simulator. Here, we are only checking the initiator part of the Order Placer; that means that, in this test, your system is only asked to send messages to the simulator to create and cancel orders. The receiver part of the Order Placer (used to receive notifications from the Order Filler) is tested in test #30002.
First step to perform: retrieve the configuration of the Order Filler to which you will send messages:
The LAB-1 (Lab Placer Order Management) transaction defines three structures of messages (OML^O21^OML_O21, OML^O33^OML_O33, OML^O35^OML_O35). As an initiator in this test, you are free to use the structure your system under test supports. Please add a comment to the pre-connectathon test instance you have started to tell us which structures your system uses.
Your Order Placer is assumed to be coupled with a PAM actor; that means that you should be able either to enter a new patient/encounter into your system or to receive a new patient/encounter from a PAM/PDS or PAM/PES actor. If you need to populate your system with a patient and encounter using an external system, you are free to use the PAMSimulator tool to do so.
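While you are setting up connectivity with the Order Filler simulator, it can help to push a test message at it from a small script before wiring up your real Order Placer. The sketch below is a bare-bones MLLP sender; the host, port, and message content are placeholders to replace with the simulator's published configuration and one of your own OML messages.

```python
import socket

# MLLP framing bytes
VT, FS, CR = b"\x0b", b"\x1c", b"\x0d"

# Placeholders: use the Order Filler configuration published by the Order Manager tool
HOST, PORT = "orderfiller.example.org", 10001

# Placeholder OML^O21 message; send a message produced by your own Order Placer instead
OML = (
    "MSH|^~\\&|SUT_APP|SUT_FAC|ORDER_MANAGER|IHE|20240101120000||OML^O21^OML_O21|MSG0001|P|2.5.1\r"
    "PID|1||PAT123^^^IHEBLUE&1.3.6.1.4.1.21367.13.20.3000&ISO||DOE^JOHN\r"
    "ORC|NW|PLACER123\r"
    "OBR|1|PLACER123||24331-1^Lipid panel^LN\r"
)

with socket.create_connection((HOST, PORT), timeout=10) as s:
    s.sendall(VT + OML.encode("utf-8") + FS + CR)
    ack = s.recv(65535)   # a single recv is enough for a short acknowledgment in this sketch
    print(ack.decode("utf-8", errors="replace"))
```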
The purpose of this test is to check that the Order Placer is able to integrate messages received from the Order Filler in the context of the LAB-2 (creation of new orders) and LAB-1 (cancellation and status updates) transactions. We will also check that the acknowledgements are properly built.
Before beginning this test, do not forget to check that the configuration (IP address, port, application/facility names) of your system under test is entered in the simulator.
Your Order Placer is assumed to be coupled with a PAM actor; that means that you should be able either to enter a new patient/encounter into your system or to receive a new patient/encounter from a PAM/PDS or PAM/PES actor. If you need to populate your system with a patient and encounter using an external system, you are free to use the PAMSimulator tool to do so. The Order Manager must also know this patient/encounter; to share the patient between the PAMSimulator and the Order Manager tools, read the tutorial available here. If you have already performed test #30001, you can use the same patient.
We strongly recommend reading the tutorial available here.
In this step, you show the ability of your Order Placer to accept the creation of orders by the laboratory (ORC-1 = "SN") and to assign placer order numbers to these orders. The acknowledgement message (ORL of the corresponding message structure, with ORC-1="NA") must carry the placer order number (ORC-2).
As a receiver in this test, your Order Placer shall be able to integrate all of the three message structures defined in the technical framework. As a consequence, you are asked to perform this step three times.
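Before submitting your acknowledgment for validation, it can be handy to pull ORC-1 and ORC-2 out of the ORL message with a small script. The sample below is illustrative only (an ORL^O22 acknowledging an OML^O21; use the ORL structure that matches the OML you actually received):

```python
# Check that the ORL acknowledgment accepts the filler-created order (ORC-1 = "NA")
# and carries the placer order number assigned by your Order Placer in ORC-2.
SAMPLE_ORL = (
    "MSH|^~\\&|SUT_APP|SUT_FAC|ORDER_MANAGER|IHE|20240101120500||ORL^O22^ORL_O22|MSG0002|P|2.5.1\r"
    "MSA|AA|MSG0001\r"
    "ORC|NA|PLACER456|FILLER789\r"
)

orc = next(seg for seg in SAMPLE_ORL.split("\r") if seg.startswith("ORC"))
fields = orc.split("|")
assert fields[1] == "NA", "ORC-1 should be NA when accepting the order created by the lab"
assert fields[2], "ORC-2 (placer order number) must be populated"
print("ORC-1:", fields[1], "ORC-2:", fields[2])
```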
In this step, you show the ability of your Order Placer to integrate the cancellation of orders by the laboratory (ORC-1="OC") and to acknowledge it (ORL of the corresponding message structure, with ORC-1 = "OK").
In this step, you show the ability of your Order Placer to integrate a change of status of an order (OBR-25 must change value, and this must be visible in your application screenshot), notified by the Order Filler (ORC-1="SC"), and to acknowledge it (ORL of the corresponding message structure, with ORC-1 = "OK").
This test is dedicated to the LAB-1 transaction from the point of view of the Order Filler. In this test, we will check that the Order Filler is able to integrate the notifications of creation and cancellation of orders received from the Order Placer. In this test, you are asked to use the Order Manager tool as an Order Placer to send messages to your system under test.
Before beginning this test, do not forget to check that the configuration (IP address, port, application/facility names) of your system under test is entered in the simulator.
Your Order Filler is assumed to be coupled with a PAM actor; that means that you should be able either to enter a new patient/encounter into your system or to receive a new patient/encounter from a PAM/PDS or PAM/PES actor. If you need to populate your system with a patient and encounter using an external system, you are free to use the PAMSimulator tool to do so. The Order Manager must also know this patient/encounter; to share the patient between the PAMSimulator and the Order Manager tools, read the tutorial available here. If you have already performed test #30004, you can use the same patient.
We strongly recommend reading the tutorial available here.
As a receiver in this test, your Order Filler shall be able to integrate all of the three message structures defined in the technical framework. As a consequence, you are asked to perform this step three times.
In this step, your Order Filler shall prove its ability to accept and integrate an order cancellation sent by the Order Placer (ORC-1="CA") and to acknowledge it (ORC-1="CR").
This test will be performed against the OrderPlacer part of the OrderManager Gazelle simulator. Here, we are only checking the initiator part of the Order Filler; that means that, in this test, your system is only asked to send messages to the simulator to create and cancel orders and to update the status of orders. The receiver part of the Order Filler (used to receive notifications from the Order Placer) is tested in the test #30003.
First step to perform: retrieve the configuration of the Order Placer to which your system will send messages:
The LAB-1 (Lab Placer Order Management) and LAB-2 (Lab Filler Order Management) transactions define three message structures (OML^O21^OML_O21, OML^O33^OML_O33, OML^O35^OML_O35). As an initiator in this test, you are free to use the structure your system under test supports. Please add a comment in the pre-connectathon test instance you have started to tell us which structures your system uses.
As described in the Technical Framework (TF-LAB Volume 1), your system under test is assumed to implement actors from the PAM or PDQ profile in addition to the Order Filler actor from the LTW integration profile. That means that your system is able either to create a new patient and encounter or to receive that information from an external system. If your system under test implements PEC and/or PDC actors from the PAM profile, feel free to use the PAMSimulator tool to receive a new patient and encounter to initialize this test.
As a first step for this test, you will have to create two new orders (the status of the first one will be updated in step 2 and the second one will be cancelled in step 3).
In this second step, the status of the first placed order will be updated to "A" (Some, but not all, results available).
This third step is dedicated to the cancellation of the second order you have sent.
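If you want to verify basic connectivity to the Order Placer endpoint you retrieved above before running these three steps, a minimal MLLP sender such as the sketch below (Python standard library only) can help. The host, port, and message content are placeholders; use the values from the simulator's configuration page and an OML message produced by your own Order Filler.

```python
import socket

# MLLP framing bytes (start block, end block, carriage return)
SB, EB, CR = b"\x0b", b"\x1c", b"\x0d"

def send_hl7(host: str, port: int, message: str, timeout: float = 10.0) -> str:
    """Send one HL7v2 message over MLLP and return the raw acknowledgement."""
    framed = SB + message.replace("\n", "\r").encode("utf-8") + EB + CR
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(framed)
        response = b""
        while EB not in response:              # read until the end-of-block byte arrives
            chunk = sock.recv(4096)
            if not chunk:
                break
            response += chunk
    return response.strip(SB + EB + CR).decode("utf-8", errors="replace")

if __name__ == "__main__":
    # Placeholder host/port and message -- replace with the Order Manager
    # configuration and a real OML^O21/O33/O35 message from your system.
    print(send_hl7("ordermanager.example.org", 10101, "MSH|^~\\&|..."))
```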
This test checks the capability of the Order Filler and Automation Manager actors to manage work orders. For both actors, the test is performed against the Order Manager Gazelle tool.
First step to perform: retrieve the configuration of the Automation Manager to which your system will send messages:
The LAB-4 (Work Order Management) transaction defines three message structures (OML^O21^OML_O21, OML^O33^OML_O33, OML^O35^OML_O35). As an initiator in this test, you are free to use the structure your system under test supports. Please add a comment in the pre-connectathon test instance you have started to tell us which structures your system uses.
Your Order Filler is assumed to be coupled with a PAM actor; that means you should be able either to enter a new patient/encounter into your system or to receive a new patient/encounter from a PAM/PDS or PAM/PES actor. If you need to populate your system with a patient and encounter using an external system, you are free to use the PAMSimulator tool to do so.
In this test, you will use the Order Manager tool to create a work order to send to your system under test. Before starting this test, make sure you have properly registered your system in the "SUT Configurations" section of the tool and that your system under test is reachable from the Internet (no firewall prevents it from receiving messages from our tools).
We strongly recommend reading the tutorial available here.
As a receiver in this test, your Automation Manager shall be able to integrate all of the three message structures defined in the technical framework. As a consequence, you are asked to perform this step three times (if your system does not support all three messages, please leave a comment in Gazelle to explain which ones it supports and why it does not support all of them).
In this step, you will cancel the first work order received by your Automation Manager.
The aim of this test is to prove the capability of the Order Filler, Automation Manager and Order Result Tracker to manage laboratory test results. In other words, we check that your system is able to send, receive and/or integrate the messages defined in Order Result Management (LAB-3) and Test Result Management (LAB-5).
Those tests have to be performed against the Order Manager Gazelle tool, which plays the role of Order Result Tracker, Order Filler or Automation Manager depending on the case.
The Order Filler actor is involved in both LAB-3 (as initiator) and LAB-5 (as receiver) transactions. In this test, we check that your system is able to integrate the test results sent by the Automation Manager (role played by the Order Manager Gazelle tool) and to send order results to the Order Result Tracker (role played by the Order Manager tool).
This part of the test will use the Order Manager as an Order Result Tracker. First step to perform: retrieve the configuration of the Order Result Tracker to which your system will send messages:
The LAB-3 (Order Result Management) transaction defines two structures of messages (OUL^R22^OUL_R22 and ORU^R01^ORU_R01). As an initiator in this transaction, you are free to use the structure your system under test supports.
You are assumed to perform this test after working on test #30004 so that you can reuse the order previously placed in the Order Placer. If you have followed the instructions of test #30004, the order shall have the values ORC-5="A" and OBR-25="P".
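If you want to double-check those field values in the message your system will send, a crude field lookup like the sketch below is sufficient; it assumes the default HL7 field separator | and carriage-return segment delimiters, and the file name is a placeholder for a message exported from your system.

```python
def hl7_field(message: str, segment: str, index: int) -> str:
    """Return field <index> (HL7 numbering) of the first <segment> in the message."""
    for seg in message.replace("\n", "\r").split("\r"):
        if seg.startswith(segment + "|"):
            fields = seg.split("|")
            return fields[index] if index < len(fields) else ""
    raise ValueError(f"segment {segment} not found")

# Example: the order reused from test #30004 is expected to carry
# ORC-5 = "A" and OBR-25 = "P" at the start of this LAB-3 test.
message = open("lab3_message.hl7", encoding="utf-8").read()   # placeholder file name
print("ORC-5 :", hl7_field(message, "ORC", 5))
print("OBR-25:", hl7_field(message, "OBR", 25))
```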
This part of the test will use the Order Manager part as an Automation Manager. If you have already performed tests #30003 and #30004, your system under test is already registered within the tool.
The LAB-5 (Test Results Management) transaction defines two structures of messages (OUL^R22^OUL_R22 and OUL^R23^OUL_R23). As a responder in this transaction, your system under test must be able to integrate both. Go to Results management --> Automation Manager.
The Automation Manager is involved in LAB-5 transaction as an initiator. In this test, we check the capability of your system to send messages to the Order Filler part of the Order Manager Gazelle tool. LAB-5 transaction defines two message structures (OUL^R22^OUL_R22 and OUL^R23^OUL_R23); as an initiator for this transaction, your system under test must support one out of these two structures. If your system supports both, please repeat this test twice so that we can check the conformance of the messages produced by your system.
This part of the test will use the Order Manager as an Order Filler. First step to perform: retrieve the configuration of the Order Filler to which your system will send messages:
The Order Result Tracker is involved in LAB-3 transaction as a responder. In this test, we check the capability of your system to integrate the messages received from an Order Filler (role played by the Order Manager Gazelle tool). This transaction defines two message structures (OUL^R22^OUL_R22 and ORU^R01^ORU_R01); as a responder your system must support both of them.
This test uses the Order Manager tool as an Order Filler. In order to tell the simulator to which system to send messages, your first action will be to create a new configuration for your system under test within the Order Manager. Go to the "SUT configurations" section to do so.
Then, go to Results management --> Order Filler --> Send test results to your SUT to start the test.
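If the Order Manager cannot reach your Order Result Tracker, a throwaway MLLP listener such as the sketch below (Python standard library only) can help you confirm that the host/port you registered under "SUT configurations" is reachable from the Internet, independently of your application. The port is a placeholder and the generated ACK is deliberately minimal (no MSH-7 timestamp); this is a network check, not a conformant Order Result Tracker.

```python
import socket

SB, EB, CR = b"\x0b", b"\x1c", b"\x0d"   # MLLP framing bytes

def build_minimal_ack(message: str) -> str:
    """Build a bare-bones ACK (MSA|AA) echoing the inbound message control id (MSH-10)."""
    msh = message.split("\r")[0].split("|")
    control_id = msh[9] if len(msh) > 9 else ""
    version = msh[11] if len(msh) > 11 else "2.5"
    ack_msh = "|".join(["MSH", "^~\\&", msh[4], msh[5], msh[2], msh[3],
                        "", "", "ACK", control_id, "P", version])
    return ack_msh + "\r" + "MSA|AA|" + control_id

def serve_once(port: int) -> None:
    """Accept one MLLP connection, print the received message, reply with an ACK."""
    with socket.create_server(("", port)) as server:
        conn, addr = server.accept()
        with conn:
            data = b""
            while EB not in data:
                chunk = conn.recv(4096)
                if not chunk:
                    break
                data += chunk
            message = data.strip(SB + EB + CR).decode("utf-8", errors="replace")
            print(f"Received from {addr}:\n{message}")
            conn.sendall(SB + build_minimal_ack(message).encode("utf-8") + EB + CR)

if __name__ == "__main__":
    serve_once(10201)   # placeholder port -- must match your SUT configuration in the tool
```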
This test concerns the Analyzer Manager actor. You will need to validate the IHE conformance of your HL7 messages with the EVSClient tool.
As your system implements the Analyzer Manager actor, you will need to test the HL7 messages used in the LAB-27, LAB-28 and LAB-29 transactions.
Your system must be able to send HL7 messages (to the Analyzer actor) of types:
To test the IHE conformance of your HL7 messages, go to the EVSClient tool at this location: EVSClient
If it is your first time with this tool, please read the user manual: EVSClient User Manual
In the EVSClient tool, go to the HL7 menu entry and choose "HL7v2" then click on "Message Validation".
Paste your message to validate in the box. (You can hit the "Guess" button to preset the Message Profile OID.)
For example, for the RSP^K11^RSP_K11 message:
Do this for all messages and don't forget to copy/paste the "Permanent link" of the validation result to Gazelle.
This test concerns the bi-directional Analyzer actor. You will need to validate the IHE conformance of your HL7 messages with the EVSClient tool.
As your system implements the Analyzer actor and supports the bi-directional communication option, you will need to test the HL7 messages used in the LAB-27, LAB-28 and LAB-29 transactions.
Your system must be able to send HL7 messages (to the Analyzer Manager actor) of types:
To test the IHE conformance of your HL7 messages, go to the EVSClient tool at this location: EVSClient
If it is your first time with this tool, please read the user manual: EVSClient User Manual
In the EVSClient tool, go to the HL7 menu entry and choose "HL7v2" then click on "Message Validation".
Paste your message to validate in the box, and hit the "Guess" button to preset the Profile OID.
For example, for the QBP^Q11^QBP_Q11 message:
Do this for all messages and don't forget to copy/paste the "Permanent link" of the validation result to Gazelle.
This test concerns the Analyzer actor. You will need to validate the IHE conformance of your HL7 messages with the EVSClient tool.
As your system implements the Analyzer actor and supports the bi-directional communication option, you will need to test the HL7 message used in the LAB-29 transaction.
Your system must be able to send HL7 messages (to the Analyzer Manager actor) of type:
OUL^R22^OUL_R22, for the LAB-29 transaction (AWOS Status Change)
To test the IHE conformance of your HL7 messages, go to the EVSClient tool at this location: EVSClient
If it is your first time with this tool, please read the user manual: EVSClient User Manual
In the EVSClient tool, go to the HL7 menu entry and choose "HL7v2" then click on "Message Validation".
Paste your message to validate in the box, and hit the "Guess" button to preset the Profile OID.
For example:
This test concerns only the Analyzer Manager actor. You will need to communicate with the LAW Simulator (which is included in the Order Manager Simulator), in order to simulate the LAB-27 transaction of the LAW Profile.
Access the LAW Simulator tool at this location: Order Manager
If it is your first time with this tool, please read the user manual: Order Manager User Manual
Please be reminded that if you are logged in, your configurations will be private; otherwise they will be public.
In this test, the LAW Simulator plays the role of the "Analyzer". It is used to query the SUT (System Under Test) for an AWOS related to the specimen.
The SUT implements the Analyzer Manager (Analyzer Manager acts as a responder in this test):
The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.
Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (You can refer to the HL7v2 report tutorial for more details.)
If the validation report status is passed for the message and the response, copy the "test result link" and paste it in Gazelle as the result of this test.
For further details, see this tutorial: How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?
This test concerns only the Analyzer actor. You will need to communicate with the LAW Simulator (which is included in the Order Manager Simulator), in order to simulate the LAB-27 transaction of the LAW Profile.
Access the LAW Simulator tool at this location: Order Manager
If it is your first time with this tool, please read the user manual: Order Manager User Manual
In this test, the LAW Simulator will be used to respond to the SUT (System Under Test) query.
As the SUT implements the Analyzer (Analyzer acts as an initiator in this test):
The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.
Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (You can refer to the HL7v2 report tutorial for more details.)
If the validation report status is passed for the message and the response, copy the "test result link" and paste it in the comment section of the pre-connectathon log return page as the result for this test.
For further details, see this tutorial: How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?
This test concerns only the Analyzer Manager actor. You will need to communicate with the LAW Simulator (which is included in the Order Manager Simulator), in order to simulate the LAB-28 transaction of the LAW Profile.
Access the LAW Simulator tool at this location: Order Manager
If it is your first time with this tool, please read the user manual: Order Manager User Manual
In this test, the LAW Simulator will receive the AWOS from the SUT, save all the information, and respond with an acknowledgment.
As the SUT implements the Analyzer Manager (Analyzer Manager acts as an initiator in this test):
The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.
Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (You can refer to the HL7v2 report tutorial for more details.)
If the validation report status is passed for the message and the response, copy the "test result link" and paste it in the comment section of the pre-connectathon log return page as the result for this test.
For further details, see this tutorial: How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?
This test concerns only the Analyzer actor. You will need to communicate with the LAW Simulator (which is included in the Order Manager Simulator), in order to simulate the LAB-28 transaction of the LAW Profile.
Access the LAW Simulator tool at this location: Order Manager
If it is your first time with this tool, please read the user manual: Order Manager User Manual
Please be reminded that if you are logged in, your configurations will be private; otherwise they will be public.
In this test, the LAW Simulator will be used to send to the SUT (System Under Test) an AWOS related to a specimen.
As the SUT implements the Analyzer (Analyzer acts as a responder in this test):
The messages exchanged between the simulator and the SUT can be found in the message table on the very same page (see the "Test report" panel) or from the HL7 Message menu entry.
Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (You can refer to the HL7v2 report tutorial for more details.)
If the validation report status is passed for the message and the response, copy the "test result link" and paste it in Gazelle as the result of this test.
For further details, see this tutorial: How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?
This test concerns only the Analyzer Manager actor. You will need to communicate with the LAW Simulator (which is included in the Order Manager Simulator), in order to simulate the LAB-29 transaction of the LAW Profile.
Access the LAW Simulator tool at this location: Order Manager
If it is your first time with this tool, please read the user manual: Order Manager User Manual
Please be reminded that if you are logged in, your configurations will be private; otherwise they will be public.
In this test, the LAW Simulator will be used to send to the SUT (System Under Test) the test results.
As the SUT implements the Analyzer Manager (Analyzer Manager acts as a responder in this test):
The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.
Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (You can refer to the HL7v2 report tutorial for more details.)
If the validation report status is passed for the message and the response, copy the "test result link" and paste it in Gazelle as the result of this test.
For further details, see this tutorial: How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?
This test concerns only the Analyzer actor. You will need to communicate with the LAW Simulator (which is included in the Order Manager Simulator), in order to simulate the LAB-29 transaction of the LAW Profile.
Access the LAW Simulator tool at this location: Order Manager
If it is your first time with this tool, please read the user manual: Order Manager User Manual
In this test, the LAW Simulator will be used to respond to the messages sent by the SUT (System Under Test).
As the SUT implements the Analyzer (Analyzer acts as an initiator in this test):
The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.
Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (You can refer to the HL7v2 report tutorial for more details.)
If the validation report status is passed for the message and the response, copy the "test result link" and paste it in the comment section of the pre-connectathon log return page as the result for this test.
For further details, see this tutorial: How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?
In this test, the OrderManager tool is a DICOM Modality Worklist SCP, and your application (most commonly an Acquisition Modality actor) is the MWL SCU.
The test is a sanity check of your worklist query capabilities for pre-Connectathon testing. In some cases, the Order Manager is used as a MWL SCP during a Connectathon, and this test helps prepare you for that.
Refer to the OrderManager User Manual for instructions.
As a MWL SCU in this test, your application will query the OrderManager (MWL SCP) for a worklist.
The link to the worklist entry & the screen shot demonstrate that you have successfully received the worklist.
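As background, a worklist query of this kind can also be exercised from a script; the sketch below uses the pydicom/pynetdicom libraries, and the host, port, AE titles, and modality value are placeholders that you should replace with the values published on the OrderManager configuration pages.

```python
from pydicom.dataset import Dataset
from pynetdicom import AE
from pynetdicom.sop_class import ModalityWorklistInformationFind

ae = AE(ae_title="MY_MODALITY")                      # placeholder calling AE title
ae.add_requested_context(ModalityWorklistInformationFind)

# Broad MWL query: match on modality, ask for patient name and accession number.
query = Dataset()
query.PatientName = "*"
query.AccessionNumber = ""
sps = Dataset()
sps.Modality = "OT"                                  # placeholder modality
sps.ScheduledProcedureStepStartDate = ""
query.ScheduledProcedureStepSequence = [sps]

assoc = ae.associate("ordermanager.example.org", 11112, ae_title="ORDER_MANAGER")
if assoc.is_established:
    for status, identifier in assoc.send_c_find(query, ModalityWorklistInformationFind):
        if status and status.Status in (0xFF00, 0xFF01):    # pending = one worklist item
            print(identifier.PatientName, identifier.get("AccessionNumber", ""))
    assoc.release()
else:
    print("Association rejected, aborted, or never connected")
```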
This tests the RAD-2 transaction from the point of view of the Order Filler as system under test.
In this test, we will check that your Order Filler is able to integrate the creation and cancellation of orders received from the OrderManager tool playing the role of the Order Placer actor.
Refer to the OrderManager user manual and these details about sending RAD-2.
As a receiver in this test, your Order Filler shall be able to integrate all of the three message structures defined in the technical framework. As a consequence, you are asked to perform three steps.
In this step, your Order Filler proves its ability to accept and integrate an order creation sent by the Order Placer (ORC-1="NW").
In this step, your Order Filler proves its ability to accept and integrate an order cancellation sent by the Order Placer (ORC-1="CA") and to acknowledge it (with an ACK message).
In this step, your Order Filler proves its ability to accept the discontinuation of an ongoing order sent by the Order Placer (ORC-1="DC") and to acknowledge it (with an ACK message).
The permanent links to the test report & the screen shots demonstrate that you have successfully handled the received order messages.
This tests the RAD-3 transaction from the point of view of the Order Placer as system under test.
In this test, we will check that your Order Placer is able to integrate the creation and cancellation of orders received from the Order Manager tool playing the role of the Order Filler actor.
Refer to the Order Manager user manual.
As a receiver in this test, your Order Placer shall be able to integrate all of the three message structures defined in the technical framework. As a consequence, you are asked to perform three steps.
In this test, your Order Placer receives an ORM (v2.3.1) or OMG (v2.5.1) from the Order Manager acting as Order Filler. You respond with an ORR (v2.3.1) or ORG (v2.5.1).
In this test, your Order Placer receives an ORM-Cancel (v2.3.1) or OMG-Cancel (v2.5.1) from the Order Manager acting as Order Filler. You respond with an ACK.
In this test, your Order Placer receives an ORM-Status update (v2.3.1) or OMG-Status update (v2.5.1) from the Order Manager acting as Order Filler. You respond with an ACK.
The permanent links to the test report & the screen shots demonstrate that you have successfully handled the received order messages.
This tests the RAD-4 (Procedure Scheduled) and RAD-13 (Procedure Updated) transactions from the point of view of the Image Manager as system under test.
In this test, we will check that your Image Manager is able to integrate the scheduling and cancellation of procedures received from the Order Manager tool playing the role of the Order Filler actor.
You may use the Order Manager to send
Refer to the Order Manager user manual.
As a receiver in this test, your Image Manager shall be able to integrate all of the message structures defined in the technical framework. As a consequence, you are asked to perform these four steps.
In this step, your Image Manager proves its ability to accept and integrate a new scheduled procedure sent by the Order Filler.
In this step, your Image Manager proves its ability to accept and integrate an order cancellation sent by the Order Filler (ORC-1="CA") and to acknowledge it.
In this step, your Image Manager proves its ability to accept the discontinuation of an ongoing order sent by the Order Filler (ORC-1="DC") and to acknowledge it.
In this step, your Image Manager proves its ability to accept the procedure update/change order request (order still scheduled or in progress) sent by the Order Filler (ORC-1="XO") and to acknowledge it.
In this step, your Image Manager proves its ability to accept the procedure update/order completed sent by the Order Filler (ORC-1="XO") and to acknowledge it.
The permanent links to the test report & the screen shots demonstrate that you have successfully handled the received order messages.
Pre-Connectathon testing for systems implementing the PDI (Portable Data for Images) Profile as a Portable Media Creator is performed using the PDI Media Tester tool and associated test plans originally developed by Northwestern University.
Location of tool and test plan documentation:
Specific instructions for the Portable Media Creator actor are in the test cases at the link above.
We use the Portable Media Tester application and test plans developed by Northwestern to test PDI media created by a Portable Media Creator.
Connectathon-related Considerations
When you prepare your media for Connectathon testing, you should include on your media DICOM objects that represent the range of images, structured reports, GSPS objects, Key Image Notes, etc. that can be produced by your application. Including a full set (rather than a single image) enhances the interoperability testing with your Portable Media Importer test partners.
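If you want a quick inventory of what you actually burned onto the media before testing, a small script like the one below can list the SOP classes present. It is a sketch that relies on the pydicom library; the mount point is a placeholder, and files that are not DICOM (README, HTML index files) are simply skipped.

```python
from collections import Counter
from pathlib import Path
import pydicom

def sop_class_inventory(media_root: str) -> Counter:
    """Count the SOP Class UIDs of the DICOM files found on the media (headers only)."""
    counts = Counter()
    for path in Path(media_root).rglob("*"):
        if not path.is_file():
            continue
        try:
            ds = pydicom.dcmread(path, stop_before_pixels=True)
        except Exception:
            continue                         # not a DICOM file (e.g. README, HTML index)
        counts[str(ds.get("SOPClassUID", "no SOPClassUID (e.g. DICOMDIR)"))] += 1
    return counts

if __name__ == "__main__":
    for sop_class, n in sorted(sop_class_inventory("/media/PDI_DISC").items()):  # placeholder path
        print(f"{n:4d}  {sop_class}")
```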
Testing Instructions
Gazelle Test Management has a feature that allows participants in a testing session to share sample objects with other participants.
In Gazelle Test Management, a "sample" is any object or message that an application creates and is used by another application. Typical samples include:
Gazelle Test Management uses profiles and actors selected during System Registration to determine which systems are 'creators' of samples and which systems are 'consumers' of samples.
Creators upload a file containing their sample into Gazelle Test Management (Testing-->Sample exchange). Consumers find samples uploaded by their Creator peers. Consumers download the samples and are able to test them with their application.
Test cases follow below...
Overview
The AIR Evidence Creator will submit sample AI Results (SR objects) produced by its system and the associated DICOM images. The goal of this test is to prepare Consumer actors so they are not surprised during Connectathon.
In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual preparatory test deadlines.
Sample | Creator(s) | Consumer(s) |
DICOM SR(s), DICOM Segmentation, Parametric Map or KOS IOD(s), DICOM image(s) | AIR Evidence Creator | AIR Image Mgr, Image Display, Imaging Doc Consumer |
Instructions for Evidence Creators:
FIRST, prepare the sample(s)-->
1. The Evidence Creator system must provide a sample that includes both DICOM SRs and image objects. Although it is likely that an Acquisition Modality (i.e., not the Evidence Creator) will be the source of the images for which AIR SR objects are created, we ask the Evidence Creator to submit them together as a 'sample'.
2. The AI result SR object(s) created by the Evidence Creator should be representative of what the system would create during expected clinical use when an AI algorithm is run.
3. Validate the SR using Gazelle EVS (green arrow on the sample page). You must select the DICOM validator from Pixelmed. During the Connectathon, we will be looking for a validation result with no errors (warnings are OK).
4. Evidence Creator of the AI result SR object should also upload the images of the study to which the SR pertains.
5. If you have this ability, render the objects you produced as a reference. The goal during Connectathon is for an Image Display partner to examine your rendered images and SR objects and compare that to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format. Upload the screenshot onto the sample page in Gazelle Test Management.
Repeat Steps 1 - 5 for each AI algorithm and resulting AI sample.
Finally, create a short txt file indicating you have completed the upload step. Upload '''that''' txt file into Gazelle Test Management as the result file for this preparatory test.
There is no specific evaluation for this test. Feedback comes as your Image Display and Image Manager partners test with your images in their lab.
Instructions for Consumers:
Finally, for both Creators & Consumers
The APPC profile enables creating Patient Privacy Policy Consent documents of many, many variations. For Connectathon testing, we have defined tests based on use cases documented in the APPC Profile.
It will help APPC Content Creators & Content Consumers to be familiar with the APPC tests prior arriving at Connectathon.
(Note: We currently have no tool for evaluating APPC documents)
Before Connectathon -- for APPC Content Creators:
(1) Read Connectathon test APPC_10_Read_This_First. Find this on your main Connectathon page in gazelle (menu Connectathon-->Connectathon). This test specifies policies, organizations, facilities, and providers that you will need to encode in the APPC Policy Consent documents that you will create for Connectathon testing.
(2) Read the APPC Connectathon tests for Case1, Case5, Case6, and "Other". For Connectathon, you will be required to create 3 documents.
You must perform any 2 of these 3 tests: Case1, Case5, Case6.
You must perform test APPC_Other_Use_Case.
(3) We highly encourage Content Creators to create these documents prior to arriving at Connectathon.
Before Connectathon -- for APPC Content Consumers:
(1) Read Connectathon test APPC_10_Read_This_First. Find this on your Test Execution page in Gazelle Test Management. This test specifies policies, organizations, facilities, and providers that Content Creators will use in the policy consent documents they create.
(2) Above, we asked Content Creators to provide their sample APPC documents in advance of Connectathon, so...
(3) Check for sample APPC documents provided by Content Creators:
Finally, for both Creators & Consumers:
This is a family of tests. Each of the tests (40180-01, 02, 03, ...) is for a specific document type, but the instructions are the same. We have different test numbers to let us keep track of the document type in Gazelle Test Management.
Please refer to the list of document types in the table below.
The goal is 'No Surprises' at the Connectathon
Creators....please upload your samples two weeks before the normal due date for Preparatory tests. This is to allow other participants to review your result.
1. Create a document according to the test number (see table below). Name the file using this convention:
2. Upload the document into Gazelle Test Management under menu Testing-->Sample exchange.
3. In Gazelle Test Management for this test instance, upload a brief note (txt file) indicating this task is done and upload the file as the results for this test.
4. Finally, change the status of the test to "Verified by Vendor". This is a signal to the Technical Manager that you have completed the task associated with your actor.
5. Repeat these instructions for each document type you can create.
1. Find samples uploaded by other vendors for test 40180-xx in Gazelle Test Management under menu Testing -> Sample exchange on the Samples available for rendering tab. (When a Content Creator or Form Filler has uploaded a sample, you will see a small triangle in front of their system name.) This page will evolve as vendors add samples, so be patient. The deadline for Creators to submit samples is typically two weeks prior to the Preparatory test deadline. Technical Managers of each testing event publish the deadline.
2. Retrieve the documents created by the other vendors. "Process/render" them so that you are confident your software understands the content.
For Content Consumer actors, "Process/render" means to apply one or more of the options:
For Form Manager actors, "Process/render" means to take the prepop data and make it part of the form.
3. You will perform one or more of those actions on the sample documents and then provide evidence that you have performed this action. That evidence will be some screen capture or database dump from your system. Upload that evidence into Gazelle Test Management as the results for this Preparatory test.
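As an illustration of the "view" option, one common approach is to apply a CDA stylesheet to the downloaded sample and open the resulting HTML. The sketch below uses the lxml library; the stylesheet name CDA.xsl and the file names are placeholders, so substitute whatever stylesheet your product actually ships or a generic CDA stylesheet you have locally.

```python
from lxml import etree

def render_cda(cda_path: str, stylesheet_path: str, output_path: str) -> None:
    """Apply an XSLT stylesheet to a CDA document and write the resulting HTML."""
    transform = etree.XSLT(etree.parse(stylesheet_path))
    result = transform(etree.parse(cda_path))
    with open(output_path, "wb") as out:
        out.write(etree.tostring(result, pretty_print=True))

# Placeholder file names -- the sample downloaded from Sample exchange and a local CDA stylesheet.
render_cda("sample_from_gazelle.xml", "CDA.xsl", "rendered_sample.html")
```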
Document Types for this sample-sharing test:
Preparatory test | IHE Profile / Document type |
40180-01 | XDS-MS Referral Document |
40180-02 | XDS-MS Discharge Summary |
40180-03 | ED Referral |
40180-04 | XPHR Extract |
40180-05 | XPHR Update |
40180-06 | Antepartum History and Physical |
40180-07 | Antepartum Summary |
40180-08 | Antepartum Laboratory Report |
40180-09 | Antepartum Education |
40180-10 | Triage Note |
40180-11 | ED Nursing Note |
40180-12 | Composite Triage and ED Nursing Note |
40180-13 | ED Physician Note |
40180-14 | Immunization Content |
40180-15 | Sharing Lab Report (XD-LAB) |
40180-16 | ITI - Basic Patient Privacy Consent acknowledgement (not CDA, but BPPC) |
40180-17 | ITI - XDS-SD Scanned Document |
40180-18 | Labor/Delivery Admission History and Physical |
40180-19 | Labor/Delivery Summary |
40180-20 | Maternal Discharge Summary |
40180-21 | EMS Transfer of Care |
40180-22 | Patient Plan of Care (PPOC) |
40180-26 | eNursing Summary |
40180-27 | Newborn Discharge Summary |
40180-28 | Postpartum Visit Summary |
40180-29 | EMS Transport Summary |
40180-30 | Interfacility Transport Summary |
40180-31 | RECON |
40180-32 | Patient Care Plan (PtCP) |
40180-33 | RIPT |
40180-34 | International Patient Summary (IPS) - CDA, CDA Complete, and CDA Occ Data Health Options |
40180-100 | QRPH - CRD: Clinical Research Document |
40180-101 | QRPH - DSC: Drug Safety Content |
40180-106 | QRPH - PRPH-Ca: Physician Reporting to a Public Health Repository-Cancer Registry |
40180-108 | QRPH - BFDR-E: Birth and Fetal Death Reporting - Enhanced |
40180-109 | QRPH - EHDI - HPoC: UV Realm: Hearing Plan of Care UV Realm |
40180-110 | QRPH - EHDI - HPoC: US Realm: Hearing Plan of Care US Realm |
40180-111 | QRPH - HW: Healthy Weight |
40180-113 | QRPH - QME-EH: Quality Measure Execution - Early Hearing |
40180-114 | QRPH - VRDR: Vital Records Death Reporting |
40180-200 | CARD - CIRC: Cardiology Imaging Report Content |
40180-201 | CARD - CRC: Cath Report Content |
40180-202 | CARD - RCS-C: Registry Content Submission - Cardiology |
40180-203 | CARD - EPRC-I/E: Electrophysiology Report Content - Implant/Explant |
40180-300 | EYECARE - GEE: General Eye Evaluation Content |
40180-301 | EYECARE - EC-Summary: Eye Care Summary Record Content |
In this “test”, Creators of DICOM objects submit a sample data set that will be reviewed by other Consumer test partners. The goal of this test is to prepare Consumer actors (Image Manager, Image Display) so they are not surprised during Connectathon events.
In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadline.
Sample | Creator(s) | Consumer(s) |
DICOM image(s), DICOM SR(s) | Acquisition Modality in various profiles; Evidence Creator or Modality in various profiles | Image Mgr, Image Display, Dose Reporter or Consumer |
Some image sets are too large to upload, and Gazelle reports an error. If you encounter this, please contact the Connectathon Technical Project Manager for instructions on alternatives.
Create a short txt file indicating you have completed the upload step. Upload '''that''' txt file into Gazelle Test Management as the result file for this preparatory test.
You may submit more than one set.
Creators and Consumers: In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor". This is a signal to the Technical Manager that you have completed this task.
There is no specific evaluation for this test. Feedback comes as Consumer partners test with your DICOM objects in their lab. The goal is no surprises.
In this “test”, Creators of FHIR Resources submit a sample data set that will be reviewed by other Consumer test partners. The goal of this test is to prepare Consumer actors so they are not surprised during Connectathon events.
In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadline.
Sample | Creator(s) | Consumer(s) |
FHIR Resource (JSON or XML format) | Content Creator | Content Consumer |
These are generic instructions for uploading samples into Gazelle Test Management. Individual IHE Profiles and FHIR IGs contain definitions and constraints for specific content types.
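Before uploading a JSON sample, a quick sanity check that the file parses and declares a resourceType can save a round trip. The sketch below is only that, not a FHIR validator, and the file name is a placeholder (XML samples would need an XML parser instead).

```python
import json
from pathlib import Path

def describe_fhir_sample(path: str) -> str:
    """Return the resourceType and id declared in a FHIR JSON sample."""
    resource = json.loads(Path(path).read_text(encoding="utf-8"))
    return f'{resource.get("resourceType", "<missing resourceType>")}/{resource.get("id", "<no id>")}'

print(describe_fhir_sample("my_fhir_sample.json"))   # placeholder file name
```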
Create a short txt file indicating you have completed the upload step. Upload '''that''' txt file into Gazelle Test Management as the result file for this preparatory test.
You may submit more than one set.
Creators and Consumers: In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor". This is a signal to the Technical Manager that you have completed this task.
There is no specific evaluation for this test. Feedback comes as Consumer partners test with your FHIR resources in their lab. The goal is no surprises.
Prior to the Connectathon, it is very beneficial for participants to have access to sample DICOM objects, HL7 messages, CDA documents, etc. produced by their test partners' applications. We use Gazelle as an intermediary to exchange samples. In Gazelle Test Management, a "sample" is any object or message that an application creates and is used by another application.
This table lists samples from IHE Eye Care domain profiles U-EYECARE, GEE, and EC-Summary. Beneath the table are instructions for Creators to upload samples into Gazelle prior to the Connectathon, and instructions for Consumers who want to download samples to test in advance of Connectathon. Note that due to the size of some DICOM objects, we have separate instructions for exchange of DICOM samples.
The deadline for sharing samples is typically 2-3 weeks prior to the Connectathon. The due date will be announced by the Technical Project Manager.
Sample | Type | Creator(s) | Consumer(s) |
EYECARE-15 Patient Registration | ADT^A04 | Pat Registration Src | Pat Registration Cons, DSS/OF in Model I & III |
EYECARE-16 Appt Scheduling | SIU^S* | Appt Scheduler | Appt Consumer |
EYECARE-17 Charge Posting | DFT^P03 | DSS/OF w/ Chg Posting option | Chg Processor |
EYECARE-19 Pat Demog Update | ADT^A08 | DSS/OF in Model I & III | Img Mgr in Model I & III |
EYECARE-20 Merge Pat IDs | ADT^A40 | Pat Registration Src w/ Merging option | Pat Reg Cons, DSS/OF, Img Mrg w/ Merging option |
EYECARE-21 Procedure Scheduling | OMG^O19 | DSS/OF in Model I & III | Img Mgr in Model I & III |
EYECARE-22 Procedure Status Upd | OMG^O19 | Img Mgr w/ Proc Status Upd HL7 option | DSS/OF w/ Proc Status Upd HL7 option |
EYECARE-23 Refractive Meas (no PatID) | XML | Refractive Measurement Source (RMS) | Refractive Measurement Consumer (RMC) |
EYECARE-24 Refractive Meas (valid PatID) | XML | RMSI, RMS w/ Pat Device List option | RMC w/ Pat Device List option |
GEE Document | CDA | Content Creator | Content Consumer |
EC Summary Document | CDA | Content Creator | Content Consumer |
Overview
The goal of this “test” is to provide samples for other vendors to display. You should submit a “representative sample” of the data produced by your system.
GSPS objects are supported in the Consistent Presentation of Images profile. CPI Evidence Creator actors create GSPS objects (requirement) and may optionally produce images. Likewise, CPI Acquisition Modality actors create images and GSPS objects.
Each system (Modality or Evidence Creator) should submit samples of the Image and/or GSPS objects. The goal of this test is to prepare Consumer actors (Image Manager, Image Display) so they are not surprised during Connectathon.
In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the Preparatory test deadlines.
Sample | Creator(s) | Consumer(s) |
DICOM image(s), DICOM GSPS(s) | CPI Acquisition Modality and Evidence Creator | Image Mgr, Image Display |
Instructions for DICOM Image and GSPS Creators:
FIRST, prepare the samples-->
Both Modality and Evidence Creator systems must provide samples that include both images and GSPS objects. Acquisition Modalities will be the source of images for which GSPS objects will be created.
The GSPS objects and images created by the Modality or Evidence Creator should be of the same genre as those normally generated by the Evidence Creator during expected clinical use.
To ensure adequate testing of capabilities, the set of GSPS objects you create should include at least 15 elements drawn from the following GSPS capabilities:
SECOND, upload your samples -->
Instructions for DICOM Image and GSPS Consumers:
Finally, for both Creators & Consumers
The purpose of this exchange of IMR samples is to enable testing prior to Connectathon week and reduce 'surprises'.
In this “test”, Creators of IMR reports submit a sample data set that will be reviewed by other Consumer test partners. The goal of this test is to prepare Consumer actors (Image Manager, Image Display) so they are not surprised during Connectathon events.
In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadline. (During Connectathon, your Report Creator will also be required to create a report based on a study provided by the Connectathon Technical Manager.)
Sample | Creator(s) | Consumer(s) |
IMR Bundle | Report Creator | Report Reader, Rendered Report Reader, Report Repository |
DICOM study | Report Creator. The Report Creator must also provide the DICOM study(ies) associated with its report(s). Each study will include images, and may optionally contain Structured Report (SR), different types of Presentation States (PR), or Segmentation objects. Providing this sample ensures that the Report Creator has access to images (that the Image Manager will store) that are compatible with its reporting application. | Image Manager |
Some image sets are too large to upload, and Gazelle reports an error. If you encounter this, please contact the Connectathon Technical Project Manager for instructions on alternatives.
Creators and Consumers: In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor". This is a signal to the Technical Manager that you have completed this task.
There is no specific evaluation for this test. Feedback comes as your Consumer partners test with the report & images in their lab. The goal is no surprises.
In this “test”, the Image Display in the Integrated Reporting Application (IRA) profile submits a sample DICOM study that will be the subject of reporting using the interactions defined in IRA. The goal of this test is to enable Consumer actors (e.g., Report Creator, Evidence Creator) to have access to these studies prior to Connectathon events.
In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadline.
Sample | Creator(s) | Consumer(s) |
DICOM study | Image Display. The Image Display must provide at least one DICOM study that it will have available for reporting using the IRA profile. The study will include images, and may optionally contain Structured Report (SR), different types of Presentation States (PR), or Segmentation objects. Providing this sample ensures that the Consumer systems have access to studies on the Image Display that are compatible with their reporting-related applications. | Content Creator (grouped w/ Report Creator, Evidence Creator...) |
Some image sets are too large to upload, and Gazelle reports an error. If you encounter this, please contact the Connectathon Technical Project Manager for instructions on alternatives.
Creators and Consumers: In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor". This is a signal to the Technical Manager that you have completed this task.
There is no specific evaluation for this test. The goal is no surprises.
Overview
The goal of this test is to provide samples for other vendors to display. The KOS creator will submit a “representative sample” of the data produced by its system.
KOS objects are supported in the Key Image Notes (KIN) profile. KIN Evidence Creator actors create KOS objects and may optionally produce images. Likewise, KIN Acquisition Modality actors create images and KOS objects.
Each system (Modality or Evidence Creator) should submit samples of the Images and the KOS objects containing references to some images. The goal of this test is to prepare Consumer actors (Image Manager, Image Display) so they are not surprised during Connectathon.
In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual preparatory test deadlines.
Sample | Creator(s) | Consumer(s) |
DICOM image(s), DICOM KOS(s) | KIN Acquisition Modality and Evidence Creator | Image Mgr, Image Display |
Instructions for DICOM Image and KOS Creators:
FIRST, prepare the samples-->
Both Modality and Evidence Creator systems must provide samples that include both images and KOS objects. Acquisition Modalities will be the source of images for which KOS objects will be created.
The KOS objects and images created by the Modality or Evidence Creator should be representative of what the system would create during expected clinical use.
Note: In order to identify the creator of the note, it is most beneficial if the Patient Name chosen reflects the company/product of the system creating the Key Object Note. This is relatively easy for Modalities as they create the original images; it may take more imagination for Evidence Creators as they typically do not create the original images. By having the company/product name in the Patient Name field, other tests will be much easier to perform because your test partners will easily identify your images/KOS.
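If your application cannot set that Patient Name at creation time, a small batch edit on a copy of the sample set is a pragmatic workaround. The sketch below uses the pydicom library; it only overwrites PatientName (it does not touch UIDs or other identifying attributes), and the folder and name values are made-up placeholders.

```python
from pathlib import Path
import pydicom

def stamp_patient_name(folder: str, name: str) -> None:
    """Overwrite PatientName in every .dcm file in <folder> (images and KOS alike)."""
    for path in Path(folder).glob("*.dcm"):
        ds = pydicom.dcmread(path)
        ds.PatientName = name               # company^product, as suggested in the note above
        ds.save_as(path)

# Placeholder folder and name -- run this on a *copy* of your sample set before uploading it.
stamp_patient_name("kin_sample_copy", "ACME^ProductX")
```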
Instructions for DICOM Image and KOS Consumers:
Finally, for both Creators & Consumers
The goal of this “test” is to provide MRRT Report Templates that are 'consumed' by other systems.
In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadlines.
Sample | Creator(s) | Consumer(s) |
MRRT Report Template | Report Template Creator | Report Template Manager, Report Creator |
FIRST, prepare the samples-->
We expect the Report Template Creator to create and provide a “representative sample” of templates that represent the template-creating capabilities of your application AND that incorporate as many of the template structures defined in the MRRT Profile as possible. The better the samples you provide, the better interoperability testing we will have.
Using your Report Template Creator application, create a template using this guidance:
SECOND, upload your samples -->
In the REM profile, Dose Information Reporter systems are required to have the capability to de-identify Dose SR objects. See RAD TF-2: 4.63.4.1.2.1.
In this test, Dose Information Reporters perform de-identification and then submit the result as a sample into Gazelle Test Management. Dose Register systems can retrieve and test with those samples.
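For orientation only, the sketch below shows the general idea of producing a de-identified copy of a Dose SR with the pydicom library. It blanks only a handful of attributes and is deliberately incomplete: the attributes your Dose Information Reporter must handle are defined by RAD TF-2: 4.63.4.1.2.1 (and DICOM PS3.15), not by this example. File names are placeholders.

```python
import pydicom

# A very partial illustration -- the real requirement covers far more attributes.
ATTRIBUTES_TO_BLANK = ["PatientName", "PatientID", "PatientBirthDate", "AccessionNumber"]

def blank_some_identifiers(src: str, dst: str) -> None:
    """Write a copy of the Dose SR with a few identifying attributes blanked."""
    ds = pydicom.dcmread(src)
    for keyword in ATTRIBUTES_TO_BLANK:
        if keyword in ds:
            setattr(ds, keyword, "")
    ds.save_as(dst)

# Keep the original SR as well: the sample to upload consists of both files.
blank_some_identifiers("dose_sr_original.dcm", "dose_sr_deidentified.dcm")
```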
In order to facilitate this testing, please submit your samples 2-3 weeks before the usual Preparatory test deadlines.
Sample | Creator(s) | Consumer(s) |
Two files: one Radiation Dose SR (de-identified) *and* the original SR prior to de-identification | REM Dose Information Reporter | Dose Registry |
FIRST, prepare the samples-->
SECOND, upload your samples -->
For the Connectathon, meaningful interoperability testing for the REM profile between Modalities and Dose Info Consumers, Reporters and Registers will rely largely on the quality of the data sets supplied by the Acquisition Modality vendors. The goal of this “test” is to provide REM-compliant Radiation Dose SRs that are 'consumed' by other systems.
In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadlines.
Sample | Creator(s) | Consumer(s) |
Radiation Dose SR *plus* image(s) referenced in the SR | REM Acquisition Modality | Dose Information Consumer, Dose Information Reporter, Image Manager, Dose Registry |
FIRST, prepare the samples-->
SECOND, upload your samples -->
We have been testing REM for several years, and we've collected sample Dose SRs from Modalities that have tested REM at various connectathons. To expand the number of samples you have to test with, we point you to this collection of SRs.
There is no log to upload for this test.
For the Connectathon, meaningful interoperability testing for the REM-NM profile between Radiopharmaceutical Activity Suppliers (RAS) and Modalities as creators of DICOM objects, and Image Managers, Dose Info Consumers, Reporters, and Registers as consumers of those objects, will rely largely on the quality of the data sets supplied by the RAS and Modality vendors. In this “test”, you create a sample data set that will be reviewed by other participants. The goal of this test is to prepare other actors (Image Manager, Dose Info Consumer, Dose Info Reporter, Dose Register) so they are not surprised during Connectathon events.
In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadlines.
Sample | Creator(s) | Consumer(s) |
Radiopharmaceutical Radiation Dose Structured Reports (RRDSR), images with dose information encoded | REM-NM RAS, REM-NM Acquisition Modality | Dose Information Consumer, Dose Information Reporter, Dose Registry, Image Manager |
FIRST, prepare the samples-->
SECOND, upload your samples -->
This section contains test cases performed with the Sharing Value Sets Simulator tool.
Tool: http://gazelle.ihe.net/SVSSimulator
Tool information page: http://gazelle.ihe.net/content/svs-simulator
We use this 'test' to inform you of the gazelle SVS simulator available for your testing.
SVS actors simulated:
Location of the tool: http://gazelle.ihe.net/SVSSimulator
Tool user manual: https://gazelle.ihe.net/content/svs-simulator
We encourage you to test with the simulator prior to the Connectathon.
There are no pre-Connectathon results to upload for this 'test'.
This section contains test cases performed with the XDStarClient tool.
Tool: http://gazelle.ihe.net/XDStarClient
Tool information page: http://gazelle.ihe.net/content/xdstarclient
Test Your Server with Gazelle XDStar Client Simulator
We use this test to inform you of the gazelle XDStar Client simulator tool available for your testing. It simulates 'client' actors in XDS.b, XDR, XCA, MPQ, DSUB, XCPD, XDS.b On-demand Docs option, XCF, XCA-I and XDS-I.b
We encourage you to test with the simulator prior to the Connectathon, but there are no pre-Connectathon test results files to upload for this test.
***Note that at this time, the CAS login on the XDStarClient only works with username/passwords for the European instance of gazelle (EU-CAT), not with the North American gazelle (gazelle-na). Until this is implemented, testers will have to create a new user account. This can be done using the CAS login link at the upper-right of the XDStarClient.
Instructions
1. Access the XDStarClient: http://gazelle.ihe.net/XDStarClient/home.seam and configure your server using the "SUT Configurations" menu
2. Under the SIMU-Initiators menu, find the message you want to receive:
3. Follow the instructions on the page to send the selected message to the server you have configured
Evaluation
The purpose of this test is to provide sample messages to test with your server. There are no pre-Connectathon test results files to upload for this test.
This section contains information about testing with the Security Token Service (STS) used with XUA tests.
This page aims to show you how to retrieve the test report required in most of the tests performed against one of the Gazelle HL7v2.x simulators.
All the HL7v2.x messages exchanged between your system under test and a Gazelle simulator are logged in the application's database. In the context of pre-connectathon testing, you will be asked to provide the test report within Test Management as proof of your success.
Once you are ready to log your result into Gazelle, go to the HL7 messages page of the simulator you have tested against. If you were already on that page before sending the message, click on the "Refresh List" button. Look for the exchanged message you need (filters are available to narrow your search). Note that the most recent messages are at the top of the table.
Once you have found the message to log, click on its id (left-hand column).
The permanent page gathering all the information about the selected message will be displayed. Among this information, you will find a link entitled "Permanent link to test report"; this is the link you are asked to provide within your pre-connectathon test instance.