Preparatory tests

IHE has a collection of tools for testing implementations of actors in IHE profiles.  You will find links to all tools for testing IHE profiles and their test descriptions on the complete Index to IHE Test Tools page.

The test cases in this section are for a subset of IHE's tools and are generally organized in the menu at left according to the tool that is used.

User guides for tools associated with these test cases are here: http://gazelle.ihe.net/content/gazelle-user-guides

If you are preparing for an IHE Connectathon, Gazelle Test Management for your testing event contains a list of the Preparatory tests you must perform, customized according to the IHE profiles/actors/options you have registered to test.  You can find your list of Preparatory tests in Gazelle Test Management under menu Testing --> Test execution.  (For IHE European Connectathons, Gazelle Test Management is at https://gazelle.ihe.net/gazelle/home.seam; other testing events may have a different URL).

 

ATNA tests

This section contains test cases performed with the Gazelle Security Suite tool:

 

11099: Read ATNA Resources page

--> Prior to performing ATNA tests, please read this page for guidelines that address frequently asked questions about testing expectations. <--

THIS PAGE APPLIES TO ATNA TESTING AT 2024 IHE CONNECTATHONs. 

ATNA Requirements

The ATNA requirements are in the IHE Technical Framework:

NOTE:  The following options were retired in 2021 via CP-ITI-1247 and are no longer tested at IHE Connectathons:

      • STX: TLS 1.0 Floor with AES Option
      • STX: TLS 1.0 Floor using BCP195 Option

Gazelle Security Suite (GSS) tool for ATNA testing:

Tool-based testing of TLS (node authentication) and of the format and transport of your audit messages is consolidated in one tool - the Gazelle Security Suite (GSS).

Security Policy (TLS & audit) for the 2024 IHE EU/NA Connectathon

In order to ensure interoperability between systems doing interoperability (peer-to-peer) testing over TLS (e.g. XDS, XCA...) the Connectathon technical managers have selected a TLS version and ciphers to use for interoperability tests during Connectathon week.  (This is analogous to a hospital mandating similar requirements at a given deployment.)

TLS POLICY for [ITI-19]:

*** For the 2024 IHE Connectathon, interoperability testing over TLS shall be done using:

        • TLS 1.2
        • cipher suite  - any one of:
          • TLS_DHE_RSA_WITH_AES_128_GCM_SHA256
          • TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
          • TLS_DHE_RSA_WITH_AES_256_GCM_SHA384
          • TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
        • A digital certificate, issued by the Gazelle Security Suite (GSS) tool.  See details below.
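If you want to script a quick check of this policy on your own system, it can be expressed directly in code.  The sketch below (Python, standard ssl module) builds a client context pinned to TLS 1.2 and the four policy cipher suites; the cipher strings are the OpenSSL equivalents of the IANA names listed above, and the commented-out file names are placeholders for the certificate and CA you obtain from GSS.

```python
import ssl

# OpenSSL names for the four cipher suites in the Connectathon TLS policy.
POLICY_CIPHERS = (
    "DHE-RSA-AES128-GCM-SHA256:"    # TLS_DHE_RSA_WITH_AES_128_GCM_SHA256
    "ECDHE-RSA-AES128-GCM-SHA256:"  # TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
    "DHE-RSA-AES256-GCM-SHA384:"    # TLS_DHE_RSA_WITH_AES_256_GCM_SHA384
    "ECDHE-RSA-AES256-GCM-SHA384"   # TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
)

def make_policy_context() -> ssl.SSLContext:
    """Client-side context restricted to TLS 1.2 and the policy ciphers."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    # Pin the protocol version to exactly TLS 1.2.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    ctx.maximum_version = ssl.TLSVersion.TLSv1_2
    ctx.set_ciphers(POLICY_CIPHERS)
    # Placeholder file names -- use the certificate and CA you download
    # from GSS (see test 11100):
    # ctx.load_cert_chain("my_system.pem", "my_system.key")
    # ctx.load_verify_locations("gazelle_ca.pem")
    return ctx
```

A context built this way will refuse to negotiate anything outside the policy, which is a convenient way to confirm your stack can satisfy it before testing against partners.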

AUDIT MESSAGE POLICY for [ITI-20]:

Before 2020, an ATNA Audit Record Repository (ARR) was required to support receiving audit messages in both TLS syslog and UDP syslog.   That meant that all Secure Node/Applications could send their audit messages to any ARR.

Now, all actors sending and receiving audit messages may choose to support TLS Syslog, UDP Syslog, and/or FHIR Feed for transport.   We expect that the Audit Record Repositories at the NA and EU Connectathons will provide good coverage of the options (TLS, UDP, FHIR), though some ARRs may support a subset.  In particular, the FHIR Feed Option in ITI-20 may have less support because it was new as of 2020.

Connectathon technical managers will not select one transport for all audit records exchanged during Connectathon.  Instead, Secure Node/Applications will choose ARRs for test partners that are compatible with the audit records they send in ITI-20.  Gazelle Test Management will show compatible partners for ITI-20 interoperability tests:  "ATNA_Logging_*.
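As an illustration of the simplest of these transports, the sketch below (Python, standard library) wraps an audit payload in an RFC 5424 syslog header and sends it over UDP, in the spirit of the ATX: UDP Syslog Option.  The hostname and application name are invented example values, and the PRI value 85 (facility 10, security/authorization; severity 5, notice) is a common choice for audit messages -- check the [ITI-20] transaction text for the exact framing your ARR expects.

```python
import socket
from datetime import datetime, timezone

def send_udp_syslog(payload_xml: str, host: str, port: int,
                    hostname: str = "acme0.ihe-test.net",   # example value
                    app_name: str = "EHR_MyMedicalCo"       # example value
                    ) -> bytes:
    """Send an audit payload in an RFC 5424 syslog message over UDP.

    Returns the bytes actually sent, so callers can log or verify them.
    """
    timestamp = datetime.now(timezone.utc).isoformat()
    # RFC 5424 header: <PRI>VERSION TIMESTAMP HOSTNAME APP-NAME PROCID MSGID SD
    header = f"<85>1 {timestamp} {hostname} {app_name} - - - "
    message = (header + payload_xml).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message, (host, port))
    return message
```

TLS syslog differs mainly in transport (a TLS-protected TCP stream with octet-counting framing), and FHIR Feed is a RESTful POST of an AuditEvent resource; this sketch covers only the UDP case.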

==> GSS: Digital Certificates for IHE Connectathons

The Gazelle Security Suite (GSS) tool is the SINGLE PROVIDER OF DIGITAL CERTIFICATES for IHE Connectathons.  

To obtain a digital certificate from the GSS tool for preparatory & Connectathon testing, follow the instructions in test 11100.   That test contains instructions that apply to an IHE Connectathon, whether face-to-face or online.

Some facts about the digital certificates for Connectathon testing:

    1. The digital certificate you generate in GSS:
      1. is issued by a Certificate Authority (CA) with a 2048-bit key.  You must add the certificate for this CA to your trust store.
      2. will contain the fully-qualified domain name (FQDN) of your Connectathon test system.   When you use GSS to request the certificate, the tool will prompt you for this value.  The FQDN value(s) will be in the subjectAltName entry of your digital certificate.  (You may need to provide more than one FQDN when you generate your certificate, e.g., if you will use your system to test TLS connections outside of the Connectathon network, such as using the NIST XDS Tools in your local test lab.)
    2. Test 11100 contains detailed instructions for generating your certificate, including how to get the fully-qualified domain name for your test system.
    3. Item (1.b.) means that each system testing TLS transactions during Connectathon week will have a digital certificate that is compatible with the 'FQDN Validation Option' in ATNA.  Thus, TLS connections with test partners will work whether the client is performing FQDN validation, or not.  This is intentional.
    4. The certificates are only for testing purposes and cannot be used outside of the IHE Connectathon context.
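To make item (3) concrete, the comparison a client performs under the 'FQDN Validation Option' looks roughly like the sketch below.  This is illustrative logic only -- real TLS stacks do this check for you (e.g. via check_hostname in Python's ssl module) -- and the SAN values shown are made-up examples, not real Connectathon names.

```python
def fqdn_matches(requested_fqdn: str, san_dns_names: list[str]) -> bool:
    """Illustrative FQDN check: the name the client dialed must appear
    among the certificate's subjectAltName dNSName entries.
    Comparison is case-insensitive and ignores a trailing root dot.
    """
    requested = requested_fqdn.rstrip(".").lower()
    return any(requested == name.rstrip(".").lower()
               for name in san_dns_names)

# Example: a certificate generated in GSS with two SAN entries
# (both values are hypothetical).
san = ["acme0.ihe-test.net", "acme.example.com"]
```

Because every GSS-issued certificate carries the system's FQDN(s) in subjectAltName, this check passes for validating and non-validating clients alike, which is exactly the point made in item (3) above.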

    ==> GSS: ATNA Questionnaire

    Systems testing ATNA are required to complete the ATNA Questionnaire in the GSS tool, ideally prior to Connectathon week.  Embedded in the questionnaire are Audit Record tests and TLS tests customized for the profiles & actors you registered to test at Connectathon.

        • Follow instructions in test 11106.

    ==> GSS: ATNA Logging Tests - ATX: TLS Syslog Option

    Read the Technical Framework documentation; you are responsible for all requirements in the Record Audit Event [ITI-20] transaction. We will not repeat the requirements here.

    WHICH SCHEMA?:  The Record Audit Event [ITI-20] transaction specifies use of the DICOM schema for audit messages sent using the ATX: TLS Syslog and ATX: UDP Syslog options.  The DICOM schema is found in DICOM Part 15, Section A.5.1.  

    We expect implementations to be compliant; we have tested audit messages using the DICOM schema at IHE Connectathons since 2016.

        • The GSS tool will only provide validation against the DICOM schema. If you fail that test, it is our signal to you that your audit messages are not compliant with the latest DICOM schema.  See test 11116.
        • We expect interoperability testing at the Connectathon to occur using audit records that are compliant with the DICOM schema.
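To make the expectation concrete, here is an illustrative fragment shaped like the DICOM Part 15, Section A.5.1 schema -- a user-authentication event.  All identifiers, names, and times are invented for illustration, and the code below only checks that the fragment is well-formed XML; it is not a substitute for schema validation, so validate your real messages with GSS (test 11116) rather than against this sketch.

```python
import xml.etree.ElementTree as ET

# Illustrative audit message in the shape of the DICOM PS3.15 A.5.1 schema.
# All values (user, source, timestamp) are made up.
AUDIT_XML = """\
<AuditMessage>
  <EventIdentification EventActionCode="E"
                       EventDateTime="2024-01-15T10:30:00Z"
                       EventOutcomeIndicator="0">
    <EventID csd-code="110114" codeSystemName="DCM"
             originalText="User Authentication"/>
    <EventTypeCode csd-code="110122" codeSystemName="DCM"
                   originalText="Login"/>
  </EventIdentification>
  <ActiveParticipant UserID="alice@acme0.ihe-test.net" UserIsRequestor="true"/>
  <AuditSourceIdentification AuditSourceID="EHR_MyMedicalCo"/>
</AuditMessage>
"""

# Well-formedness check only -- NOT schema validation.
root = ET.fromstring(AUDIT_XML)
print(root.find("EventIdentification/EventID").get("csd-code"))  # prints 110114
```

A message that fails even this well-formedness check will certainly fail the GSS validation, so a parse like this makes a cheap first-line smoke test in your build.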

    SENDING AUDIT MESSAGES:   You can send your audit records to the GSS tool simulating an Audit Record Repository.  See test 11117.

    Questions about ATNA Testing?

    Contact the Technical Project Manager for the IT Infrastructure domain.  Refer to the Contact Us page.

    Evaluation 

    There is no specific evaluation for this test.  

    Create a text file stating that you found and read the page. Upload that text file into Gazelle Test Management as the Log Return file for test 11099.

     

     

    11100: Obtain Digital Certificate for TLS Testing

    Overview of the test

    This test contains instructions for obtaining a digital certificate for your test system that is registered for an IHE Connectathon.   You will obtain your digital certificate(s) from the Gazelle Security Suite tool.

    Prerequisites for this test

    First, please read the ATNA Testing Resources page before proceeding with this test.  That page contains important context for using the digital certificates for Connectathon-related tests. 

    When you generate your digital certificate in Gazelle Security Suite, you will need to know two values:

    (1) The hostname(s) for your test system:

    • For IHE Connectathons face-to-face:  The hostname(s) are assigned to your test system by Gazelle Test Management.  (See https://gazelle.ihe.net/TM/ for the 2024 IHE Connectathon; the link may differ for other testing events). 
      To find the hostname for your test system, log into Gazelle Test Management, then select menu Preparation-->Network Interfaces.
    • For IHE Connectathons Online:  This is the public hostname(s) for your test system.  For Connectathons Online, hostnames and IP addresses are determined by the operator of the test system.   (The operator still shares its hostname(s) with other participants using Gazelle Test Management.)

    (2) Domain Name:

    • For IHE Connectathons face-to-face:  The domain name of the Connectathon network.  This information is published by the Technical Manager of each IHE Connectathon.  (E.g., for the IHE Connectathon 2022, the Domain Name is ihe-europe.net).
    • For IHE Connectathons Online:  Your public domain name.

     

    Location: Gazelle Security Suite (GSS) tool

    Log in to the GSS tool

    When logging in to GSS, you will use your username & password from Gazelle Test Management for your Connectathon.  There are separate CAS systems for different instances of Gazelle Test Management, and you will have to take this into account when logging in to GSS:

    • The European CAS is linked to Gazelle Test Management at http://gazelle.ihe.net/TM/ <---This will be used for the 2024 IHE EU/NA Connectathon
    • The North American CAS is linked to Gazelle Test Management at https://gazelle.iheusa.org/gazelle-na/
    • If you don't have an account, you can create a new one on the Gazelle Test Management home page.

    On the GSS home page (http://gazelle.ihe.net/gss) find the "Login" link at the upper right of the page.  

    • Select either "European Authentication" or "North American Authentication"
    • Enter the username and password from either Gazelle Test Management linked above.

    Instructions - Obtain a Certificate

    • In GSS, select menu PKI-->Request a certificate
    • Complete the fields on the page:
      • Certificate type:  Choose "Client and Server" from the dropdown list  (Required field)
      • Key size: 2048
      • Country (C): (required)
      • Organization (O):  Your organization name in Gazelle Test Management   (Required field)
      • Common Name (CN):  The Keyword for your test system in Gazelle Test Management (e.g., EHR_MyMedicalCo)  (Required field)
      • Title:  (optional)
      • Given name: (optional)
      • Surname: (optional)
      • Organizational Unit: (optional)
      • eMail:  (optional) email of a technical contact making the request
      • Subject Alternative Names: 
        • You must enter at least one value in this field:   the fully-qualified domain name of your test system.
        • For a face-to-face Connectathon, this is a combination of the hostname of your test system and the domain name. (See the Prerequisites section above)
          • E.g., for Connectathon network, the hostname of your system might be acme0, and the domain name might be ihe-test.net.  So, an example of a fully-qualified domain name entered in this field for a digital certificate is acme0.ihe-test.net
        • This value may contain additional fully-qualified domain name(s) for your test system when it is operating outside of a face-to-face Connectathon, e.g. when you are testing with the NIST XDS Tools in your home test lab, or if you are participating in an online Connectathon.
        • If you have more than one hostname, multiple values are separated by a comma.
    • Click the "Request" button.
    • You will then be taken to a page listing all requested certificates.  Find yours on the top of the list, or use the filters at the top.
    • In the "Action" column, click the "View Certificate" (sun) icon.  Your certificate details are displayed.  Use the "Download" menu to download your certificate and/or the Keystore.

    It is also possible to find your certificate using the menu:

    • Select menu PKI-->List certificates
    • In the "Requester" column, filter the list by entering your username at the top of the column (the username you used to log in to GSS)
    • Use the icon in the "Action" column to find and download your certificate, as described above.

    You are now ready to use this certificate for performing:

    • authentication tests with the Gazelle Security Suite tool
    • interoperability (peer-to-peer) tests with your Connectathon partners

    Evaluation 

    There is no specific evaluation for this test.  

    Create a text file stating that you have requested & received your certificate(s). Upload that text file into Gazelle Test Management as the Log Return file for test 11100.

    In subsequent tests (eg 11109 Authentication test), you will verify the proper operation of your test system with your digital certificate.

     

    11106: ATNA Questionnaire

    Overview of the test

    In this test you complete a form which collects information that will help us evaluate the Audit Logging and Node Authentication (ATNA) capabilities of your test system.

    The contents of your ATNA Questionnaire are customized based on the profiles and actors that you have registered in Gazelle Test Management for a given testing event (e.g. an IHE Connectathon).  Depending on which profiles/actors you have registered for, the ATNA Questionnaire will ask you to validate audit messages for transactions you support, and you will be asked to demonstrate successful TLS connections for the transports you support (e.g. DICOM, MLLP, HTTP).

    Prerequisites for this test

    Before you can generate your on-line ATNA questionnaire:

    • You must have a test system registered in Gazelle Test Management for an upcoming testing event.
    • Your test system must be registered to test an ATNA actor, e.g., Secure Node, Secure Application, Audit Record Repository.
    • Your test system must have a status of "Completed" in Gazelle Test Management.
      • This is because the content of the Questionnaire is built based on the profiles & actors you support. We want to know that your registration is complete.
      • To check this, log in to Gazelle Test Management.
      • Select menu Registration.
      • On the System summary tab for your test system, you must set your Registration Status to "Completed" before you start your ATNA Questionnaire.

    Location of the ATNA Tools:  Gazelle Security Suite (GSS)

    Log in to the GSS tool

    When logging in to GSS, you will use your username & password from Gazelle Test Management for your testing event.  There are separate CAS systems for different instances of Gazelle Test Management, and you will have to take this into account when logging in to GSS:

    • The European CAS is linked to Gazelle Test Management at http://gazelle.ihe.net/TM/ <---This will be used for the 2024 IHE EU/NA Connectathon
    • The North American CAS is linked to Gazelle Test Management at https://gazelle.iheusa.org/gazelle-na/
    • If you don't have an account, you can create a new one on the Gazelle Test Management home page.

    On the GSS home page (http://gazelle.ihe.net/gss) find the "Login" link at the upper right of the page.  

    • Select either "European Authentication" or "North American Authentication"
    • Enter the username and password from either Gazelle Test Management linked above.

    Instructions

    1. In GSS, select menu Audit Trail --> ATNA Questionnaires

    2. First, search for any existing questionnaires for your organization. Use the filters at the top of the page to search based on various criteria.  You will only be able to access the questionnaires created for your organization's test systems.  Admins and monitors can access all of them.

    Filters

     3. You can use the icons in the right 'Actions' column to:

    • View the content of the questionnaire
    • Edit it
    • Delete it (administrators only)
    • Review it (monitors only)

     

    4. If no questionnaire is available for your test system, you need to create a new one.  

    • Click on the "New ATNA Questionnaire" button
    • From the dropdown list, select the name of your test system.   Note: If your system doesn't appear...
      • ...is your test system registered with status of "Completed"?
      • ...are you registered for ATNA Secure Node or Secure Application?
      • ...is the testing session closed (i.e., is the Connectathon over)?
    • Next, click the "Back to list" button.  Use the filter at the top to find your questionnaire in the list.  Use the "Edit" icon in the "Action" column to begin.

    5. Complete the questionnaire.  You are now in the ATNA Questionnaire Editor.

    • In the System details, identify the ATNA actor you support.  Choose either "Secure Node (SN)" or "Secure Application (SA)"
    • Complete the "Inbound network communications" tab
    • Complete the "Outbound network communications" tab
    • Complete the "Authentication process for local users" tab
    • Complete the "Audit messages" tab.  This tab is used with test 11116.
    • Secure Nodes only:  Complete the "Non network means for accessing PHI" tab 
    • Complete the "TLS Tests" tab.  This tab is used with test 11109

    6.  Mark your questionnaire "Ready for review"

    • When all tabs in the questionnaire are complete, set the Questionnaire's status to "Ready for review" in the "Questionnaire details" section.  This is a signal that you have completed your work; we do not want to have monitors evaluating incomplete questionnaires.

    Evaluation

    Depending on the testing event, the results of this test may be reviewed in advance.  More typically, it will be reviewed and graded by a Monitor during the test event itself (e.g. during Connectathon week).

    Note:  You cannot get connectathon credit (i.e. a "Pass") for your ATNA Secure Node/Application without completing and submitting your questionnaire.

    11109: Authentication Test

    Prerequisites for this test

    (1) Read the ATNA Testing Resources page before proceeding with this test.

    (2) To perform this test, your digital certificate must be set up on your system (server and/or client).  Follow the instructions in test 11100 to obtain digital certificate(s) for your test system(s).

    (3) You should create your ATNA Questionnaire (test 11106) prior to running this test.  

    • The ATNA Questionnaire has a "TLS Tests" tab that identifies the inbound /outbound communications you support.  
      • That tab determines which of the "Server" and "Client" tests that you must run below.  
      • You will also record your successful results on that tab.

    Overview of the test

    In this test, you will use the Gazelle Security Suite (GSS) tool (https://gazelle.ihe.net/gss) to verify that you are able to communicate with TLS clients and servers using digital certificates.

    The GSS tool contains multiple client and server simulators that check:

    • transport over TLS v1.2, including protocol (DICOM, HL7/MLLP, HTTPS/WS, or syslog)
    • cipher suite (TLS_DHE_RSA_WITH_AES_256_GCM_SHA384, and more....),
    • certificate authentication
      • Digital certificates for pre-Connectathon & Connectathon testing are generated by GSS.  See test 11100.

    The TLS simulators available in the GSS tool are listed below, along with who should test against each simulator and how each simulator is configured:

    Simulator names (keyword):
    -- Server DICOM TLS 1.2 Floor
    -- Server HL7 TLS 1.2 Floor
    -- Server HTTPS/WS TLS 1.2 Floor
    -- Server Syslog TLS 1.2 Floor
    To be tested by: a Connectathon test system that supports the "STX: TLS 1.2 Floor" option and is a client that...
    -- initiates a TLS connection with the DICOM protocol
    -- initiates a TLS connection with the MLLP protocol (i.e., an HL7 v2 sender)
    -- initiates a TLS connection for a webservices transaction
    -- initiates a TLS connection to send an audit message over TLS syslog
    Simulator configuration: TLS 1.2 with 4 'strong' ciphers:
    • TLS_DHE_RSA_WITH_AES_128_GCM_SHA256
    • TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
    • TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
    • TLS_DHE_RSA_WITH_AES_256_GCM_SHA384
    You may test with just one of the ciphers.

    Simulator name (keyword):
    -- Server RAW TLS 1.2 INVALID FQDN
    To be tested by: a Connectathon test system that is a client supporting the "FQDN Validation of Server Certificate" option.
    Simulator configuration: TLS 1.2 with the 4 'strong' ciphers listed above.  The certificate has an invalid value for subjectAltName.

    Simulator name (keyword):
    -- Client TLS 1.2 Floor
    To be tested by: a Connectathon test system that supports the "STX: TLS 1.2 Floor" option and is a server that...
    -- accepts a TLS connection with the DICOM protocol
    -- accepts a TLS connection with the MLLP protocol (i.e., an HL7 v2 responder)
    -- accepts a TLS connection for a webservices transaction
    -- accepts a TLS connection to receive an audit message over TLS syslog
    Simulator configuration: TLS 1.2 with the 4 'strong' ciphers listed above.

     

    Location: Gazelle Security Suite (GSS) tool

    Log in to the GSS tool

    When logging in to GSS, you will use your username & password from Gazelle Test Management for your testing event.  There are separate CAS systems for different instances of Gazelle Test Management, and you will have to take this into account when logging in to GSS:

    • The European CAS is linked to Gazelle Test Management at http://gazelle.ihe.net/TM/ <---This will be used for the 2024 IHE EU/NA Connectathon
    • The North American CAS is linked to Gazelle Test Management at https://gazelle.iheusa.org/gazelle-na/
    • If you don't have an account, you can create a new one on the Gazelle Test Management home page.

    On the GSS home page (http://gazelle.ihe.net/gss) find the "Login" link at the upper right of the page.  

    • Select either "European Authentication" or "North American Authentication"
    • Enter the username and password from either Gazelle Test Management linked above.

    Instructions for outbound transactions (Client side is tested)

    If your test system (SUT) does not act as a client (i.e., does not initiate any transactions), then skip this portion of the test and only test the Server side below.

    If your SUT acts as a client, it must be able to reach the public IP of the TLS server simulators.  Test your client by connecting to the Server Simulators in the Gazelle Security Suite tool.

    1. On the home page for the Gazelle Security Suite, select menu TLS/SSL-->Simulators-->Servers to find the list of server simulators.  There are servers for different protocols (DICOM, HL7...) and for different ATNA options (e.g., TLS 1.2 Floor...).

    • You will test only the protocols you support -- those listed on the "TLS Tests" tab of your ATNA questionnaire.

    2. Configure your client to connect to the test TLS server.

    3. Check that the server is started before trying to connect to it. Click on the link for the server you want and look for status "Running"

    4. In your SUT, perform a connection (e.g., send a query) to the test server. The TLS connection will succeed, but at the transaction level you will get invalid replies because the simulator only checks the TLS connection.

    5. You should then get a time-stamped entry in the results list at the bottom of the page.   A blue dot means OK; red means NOT OK.

    6. For each successful connection, view the result with the icon in the "Action" column.  Copy the Permanent link (URL) to the result into your ATNA Questionnaire, on the "TLS Tests" tab.  The link must be formatted like https://.../connection.seam?id=...

    7. Repeat these steps for each supported protocol (HL7v2, DICOM, Syslog server, ...): e.g., if your system has no DICOM capabilities, you can skip that portion of the test.

    Instructions for inbound transactions (Server side is tested)

    If your test system (SUT) does not act as a server (i.e., does not respond to any transactions initiated by others), then skip this portion of the test and only perform the Client test above.

    If your SUT acts as a server (i.e. a responder to IHE transactions), your server must be accessible from the outside so that the GSS tool, as a client simulator, can connect to your SUT. 

    1. On the home page for the Gazelle Security Suite, select menu TLS/SSL-->Simulators-->Clients to find the list of client simulators. 

    2. In the "Start Connection" section of the page, you will have to specify, for each supported protocol:

    • Client type: the protocol supported (HL7, DICOM, WS, SYSLOG, or RAW)
      • You will test only the protocols you support -- those listed on the "TLS Tests" tab of your ATNA questionnaire.
    • Target host: the public IP of your server
    • Target port: the public port of your server

    3. Then click on "Start client".

    4. You should then get a time-stamped entry in the results list.   A blue dot means OK; red means NOT OK.

    5. For each successful connection, view the result at the bottom of the page using the icon in the "Actions" column.  Copy the Permanent Link (URL) to the result into your ATNA Questionnaire, on the "TLS Tests" tab. The link must be formatted like https://.../connection.seam?id=...

    6. Repeat these steps for each supported protocol (HL7v2, DICOM, Syslog client, ...): e.g., if your system has no DICOM capabilities, you can skip that portion of the test.

    Evaluation 

    Depending on the testing event, the results of this test may be reviewed in advance.  More typically, it will be reviewed and graded by a Monitor during the test event itself (e.g. during Connectathon week).

    The tool reports success or failure for each test you perform.  Your test system must demonstrate successful TLS handshake for each inbound and outbound protocol you support.

    If you are performing this test in preparation for an IHE Connectathon, a Connectathon monitor will verify your results as follows. The monitor will:

    1. Access the TLS tests tab in the ATNA Questionnaire.  (The SUT only performs tests for the protocols it supports, and skips the ones not supported.)
    2. For each tested "SERVER" side:
      • The test result must be PASSED.
      • During a Connectathon, these items can also be verified:
        • the SUT host must be the IP specified in the configuration of the system.
        • the SUT port must be the one specified in the configuration of the system for the protocol.
    3. For each tested "CLIENT" side:
      • The connection must succeed (blue dot).
      • During a Connectathon, this item can also be verified:
        • the host in the SUT address must be the IP specified in the configuration of the system. The port is not verified for outbound transactions.
    4. During the Connectathon, the monitor may choose to ask the vendor to re-run a test if the results raise questions about the system's support of TLS.

     

    11110: Authentication error cases

    Overview of the test

    *** If your ATNA Secure Node/Secure Application is only a client (i.e., it only initiates transactions), then this test case is not applicable for you.  Skip it. ***

    This test exercises several error cases.  You will use the TLS Tool in the Gazelle Security Suite as a simulated client, trying to connect to a Secure Node (SN) or Secure Application (SA) acting as a server.

    Prerequisite for this test

    Perform test 11109 Authentication Test before running this 'error cases' test.

    Location of the ATNA Tools:  Gazelle Security Suite

    Log in to the GSS tool

    When logging in to GSS, you will use your username & password from Gazelle Test Management for your testing event.  There are separate CAS systems for different instances of Gazelle Test Management, and you will have to take this into account when logging in to GSS:

    • The European CAS is linked to Gazelle Test Management at http://gazelle.ihe.net/TM/ <---This will be used for the 2024 IHE Connectathon
    • The North American CAS is linked to Gazelle Test Management at https://gazelle.iheusa.org/gazelle-na/
    • If you don't have an account, you can create a new one on the Gazelle Test Management home page.

    On the GSS home page (http://gazelle.ihe.net/gss) find the "Login" link at the upper right of the page.  

    • Select either "European Authentication" or "North American Authentication"
    • Enter the username and password from either Gazelle Test Management linked above.

    Instructions

    1. Select menu TLS/SSL-->Testing-->Test Cases
    2. Run each of the error test cases listed:
      1. IHE_ErrorCase_Corrupted
      2. IHE_ErrorCase_Expired
      3. IHE_ErrorCase_Revoked
      4. IHE_ErrorCase-Self-Signed
      5. IHE_ErrorCase_Unknown
      6. IHE_ErrorCase_Without_Authentication
      7. IHE_ErrorCase_Wrong_Key
    3. Once you are on the 'Run a test' page, use the 'Client type' dropdown list to select the transport supported on your server (HL7v2, DICOM, HL7, DICOM_ECHO, WEBSERVICE, SYSLOG, or RAW)
    4. Input the host / IP address and port of your system and click on 'Run'.
    5. If you implement several transports as a server, spread the transports across the error test cases so that each implemented transport is covered by at least one test case.   It is not necessary to run each of the test cases for each transport.
    6. After each test case, find your result in the list of Test Executions.
    7. Capture the permanent links to your PASSED results.  Copy/paste the links into Gazelle Test Management as your results for test 11110.
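The "spread the transports across the cases" requirement in step 5 can be planned mechanically.  This is a small illustrative sketch (the transport list shown is an example; substitute the transports your server actually implements) that assigns one transport to each error case round-robin, so every implemented transport is covered at least once:

```python
from itertools import cycle

# The seven error test cases listed in step 2 above.
ERROR_CASES = [
    "IHE_ErrorCase_Corrupted",
    "IHE_ErrorCase_Expired",
    "IHE_ErrorCase_Revoked",
    "IHE_ErrorCase-Self-Signed",
    "IHE_ErrorCase_Unknown",
    "IHE_ErrorCase_Without_Authentication",
    "IHE_ErrorCase_Wrong_Key",
]

def plan_runs(supported_transports: list[str]) -> dict[str, str]:
    """Assign one transport to each error case, round-robin.

    As long as you support no more transports than there are error cases,
    every transport is exercised at least once -- the minimum this test
    requires.
    """
    transports = cycle(supported_transports)
    return {case: next(transports) for case in ERROR_CASES}

# Example: a server implementing three transports (example list).
plan = plan_runs(["HL7", "DICOM", "SYSLOG"])
```

The resulting plan covers all seven error cases while touching each transport at least once, matching the evaluation criteria below.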

    Evaluation

    Depending on the testing event, the results of this test may be reviewed in advance.  More typically, it will be reviewed and graded by a Monitor during the test event itself (e.g. during Connectathon week).

    Each error case must have a result of 'PASSED'. 

    Each transport type (HL7v2, DICOM, HL7, DICOM_ECHO, WEBSERVICE, SYSLOG, or RAW) implemented by your system as a server must have been tested at least one time in the list of error cases.

    If you are performing this test in preparation for a Connectathon, a Connectathon monitor will verify your results pasted into each test step.

    11116: Audit message check

    Overview of the test

    This test applies to a Secure Node/Application that supports the ATX: TLS Syslog or ATX: UDP Syslog Option.

    In this test, a Secure Node or Secure Application tests audit messages it sends.  

    • We use the Gazelle Security Suite (GSS) tool to test the content of the audit message against the schema and against requirements documented in the IHE Technical Framework for some transactions.  
    • In this test, we do not test the transport of the audit message (TLS or UDP).

    The Gazelle Security Suite tool provides the ability to validate audit messages against the DICOM schema and the audit message definitions for many transactions in IHE Technical Frameworks.  (We are no longer testing the RFC 3881 schema; the ATNA profile requires support for the DICOM schema for syslog audit messages sent via ITI-20.)

    Location of the ATNA Tools:  Gazelle Security Suite

    Log in to the GSS tool

    When logging in to GSS, you will use your username & password from Gazelle Test Management for your testing event.  There are separate CAS systems for different instances of Gazelle Test Management, and you will have to take this into account when logging in to GSS:

    • The European CAS is linked to Gazelle Test Management at http://gazelle.ihe.net/TM/ <---This will be used for the 2024 IHE Connectathon
    • The North American CAS is linked to Gazelle Test Management at https://gazelle.iheusa.org/gazelle-na/
    • If you don't have an account, you can create a new one on the Gazelle Test Management home page.

    On the GSS home page (http://gazelle.ihe.net/gss) find the "Login" link at the upper right of the page.  

    • Select either "European Authentication" or "North American Authentication"
    • Enter the username and password from either Gazelle Test Management linked above.

    Instructions

    You may perform this test directly in the ATNA Questionnaire or you may use the Gazelle EVSClient tool.  If you are preparing for an IHE Connectathon, you should use the instructions below for the ATNA Questionnaire.

    ---->Instructions for checking audit messages using the ATNA Questionnaire:

    1. Create a new ATNA Questionnaire for your test system using the instructions for test 11106.
    2. Find the Audit Messages tab in the questionnaire.  That tab contains "Instructions" and enables you to upload and validate audit messages directly on that tab.   
    3. You should validate all messages that you have marked "Implemented".
    4. When you are done, find the Permanent Link to your ATNA Questionnaire.  Copy/paste that link into the chat window in Gazelle Test Management for test 11116.  

    ---->Instructions for checking audit messages using the EVSClient tool:

    1. In the Gazelle EVSClient, select menu IHE-->Audit messages-->Validate
    2. Select the Add button, and upload the XML file for your audit message
    3. From the Model based validation dropdown list, select the entry that matches your audit message. (Note that additional validations will be added over time.)
    4. Select the Validate button.  
    5. You should validate all audit messages associated with functionality & transactions supported by your test system.
    6. In the Validation Results displayed, find the Permanent Link to the results.  Copy/paste the link(s) into the chat window in Gazelle Test Management for 11116.

    Evaluation

    Depending on the testing event, the results of this test may be reviewed in advance.  More typically, it will be reviewed and graded by a Monitor during the test event itself (e.g. during Connectathon week).

    The tool reports the results of the validation of your messages.  We are looking for PASSED results.

    11117: Send audit or event message to Syslog Collector

    Overview of the test

    In this test, a client sends audit records or event reports using transaction [ITI-20] Record Audit Event to the Syslog Collector tool acting as an Audit Record Repository or Event Repository.   The Syslog Collector is one of the tools embedded in the Gazelle Security Suite.  

    This test is performed by an ATNA Secure Node, Secure Application or Audit Record Forwarder.  It is also performed by a SOLE Event Reporter.

    Note that this test checks the transport of audit messages.  The content of your audit message is verified in a different test.   

    Location of the ATNA Tools:  Gazelle Security Suite (GSS)

    Log in to the GSS tool

    When logging in to GSS, you will use your username & password from Gazelle Test Management for your testing event.  There are separate CAS systems for different instances of Gazelle Test Management, and you will have to take this into account when logging in to GSS:

    • The European CAS is linked to Gazelle Test Management at http://gazelle.ihe.net/TM/ <---This will be used for the 2024 IHE Connectathon
    • The North American CAS is linked to Gazelle Test Management at https://gazelle.iheusa.org/gazelle-na/
    • If you don't have an account, you can create a new account on the Gazelle Test Management home page.

    On the GSS home page (http://gazelle.ihe.net/gss) find the "Login" link at the upper right of the page.  

    • Select either "European Authentication" or "North American Authentication"
    • Enter the username and password from either Gazelle Test Management linked above.

    Instructions

    • Access the Syslog Collector in GSS under menu Audit Trail --> Syslog Collector.  This page displays the tool's IP address and UDP and TCP-TLS ports.
    • Configure your application to send your audit messages (event reports) to the Syslog Collector.
    • Then trigger an event that initiates an ITI-20 transaction. This may be an IHE transaction or other system activity (e.g., system start/stop or one of the SOLE events). Your system should then send the message to the Syslog Collector.
    • IMPORTANT NOTE:  The Syslog Collector tool is a free, shared resource. It is intended for brief, intermittent use.  Developers SHOULD NOT configure their system to send syslog messages to the tool on a long-term basis.  Flooding the tool with audit messages can make it unavailable for use by others.
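    For orientation, sending an ITI-20 audit message over UDP reduces to building an RFC 5424 syslog message around the audit payload and sending the datagram. A minimal sketch; the collector host and port are placeholders (use the values shown on the Syslog Collector page), and PRI 85 (facility 10, severity 5) is the value commonly used for ATNA audit messages (verify against ITI-20 for your actor):

```python
# Sketch: wrap an ATNA audit payload in a minimal RFC 5424 syslog message.
from datetime import datetime, timezone

def build_rfc5424(payload: str, hostname: str = "mysystem",
                  app_name: str = "myapp", pri: int = 85) -> bytes:
    """Build a minimal RFC 5424 syslog message around an audit payload."""
    # PRI 85 = facility 10 (security/authorization) * 8 + severity 5 (notice);
    # check the required PRI value against ITI-20.
    timestamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    # HEADER fields after APP-NAME are PROCID, MSGID, STRUCTURED-DATA (all nil "-" here).
    header = f"<{pri}>1 {timestamp} {hostname} {app_name} - - -"
    return f"{header} {payload}".encode("utf-8")

message = build_rfc5424("<AuditMessage>...</AuditMessage>")

# Placeholder address: use the IP and UDP port shown on the Syslog Collector page.
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(message, ("COLLECTOR_HOST", 514))
```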

    Evaluation

    You must check that your audit message has been received by the Syslog Collector and that the protocol SYSLOG is correctly implemented.

    • Go to Gazelle Security Suite, on page Audit Trail > Syslog Collector.
    • Filter the list of received messages by the host or the IP of the sender, and find the message you sent according to the timestamps.
    • Click on the magnifying glass to display the message details.
    • If the protocol is UDP or TLS, and there is a message with content, no errors, and successful RFC 5424 parsing, then the test is successful.  There is an example screenshot below.
    • Copy the URL to your successful result and paste it into your local Gazelle Test Management as the Log Return file for test 11117.  
    • Do not forget to stop sending audit messages to the Syslog Collector once you’ve finished the test. If your system sends a large number of messages, administrators of the tool may decide to block all your incoming transactions to prevent spam.

    Tips

    TCP Syslog uses the same framing requirements as TLS Syslog. You can first use the TCP port of the Syslog Collector to debug your implementation. Keep in mind that the IHE ATNA Profile expects at least UDP or TLS for actors that produce SYSLOG messages.
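    The shared framing is octet counting in the style of RFC 5425: each syslog message is prefixed with its length in octets and a space. A sketch, with placeholder host and port names:

```python
# RFC 5425-style octet-counting framing, used for syslog over TLS and the
# same framing over plain TCP for debugging: "MSG-LEN SP SYSLOG-MSG".
def frame_syslog(message: bytes) -> bytes:
    return str(len(message)).encode("ascii") + b" " + message

framed = frame_syslog(b"<85>1 2024-01-01T00:00:00Z host app - - - hello")

# To debug against the collector's TCP port before moving to TLS
# (COLLECTOR_HOST and TCP_PORT are placeholders from the tool page):
# import socket
# with socket.create_connection(("COLLECTOR_HOST", 601)) as conn:
#     conn.sendall(framed)
```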


    11118: AuditEvent Resource check

    THIS TEST IS UNDER CONSTRUCTION AND NOT YET AVAILABLE...

    Overview of the test

    This test applies to a Secure Node/Application that supports the ATX: FHIR Feed Option.

    The RESTful ATNA TI Supplement, Section 3.20.4.2.2.1, defines a mapping between DICOM Audit Messages and FHIR AuditEvent Resources. Implementers should be creating their AuditEvent Resources according to the defined mappings, and expect that they will be examined according to those mappings at IHE Connectathons.

    • In test EVS_FHIR_Resource_Validation, a Secure Node or Secure Application is asked to use the Gazelle EVSClient tool to test the content of the AuditEvent Resource against the baseline FHIR requirements.
    • In this AuditEvent Resource check test:  Gazelle tools will be enhanced to provide additional validation of AuditEvent Resources based on constraints in [ITI-20] and the mapping in the RESTful ATNA TI Supplement, Section 3.20.4.2.2.1. That mapping will enable Gazelle tooling to verify specifics of the IHE-defined audit records that are sent as FHIR AuditEvent Resources.  
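    To illustrate the target resource shape only, here is a skeletal FHIR R4 AuditEvent instance. All element values are placeholders; the authoritative element-by-element mapping from DICOM audit messages is the one in the RESTful ATNA supplement:

```python
# A minimal FHIR R4 AuditEvent, showing the resource shape only.
# Values are illustrative placeholders, not the normative ITI-20 mapping.
import json

audit_event = {
    "resourceType": "AuditEvent",
    "type": {                      # event type, here a DICOM audit event code
        "system": "http://dicom.nema.org/resources/ontology/DCM",
        "code": "110114",
        "display": "User Authentication",
    },
    "recorded": "2024-01-01T12:00:00Z",
    "agent": [{
        "who": {"identifier": {"value": "alice"}},  # placeholder user id
        "requestor": True,
    }],
    "source": {
        "observer": {"display": "mysystem"},        # placeholder system name
    },
}

print(json.dumps(audit_event, indent=2))
```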

    Instructions

     

    ---->Instructions for checking additional constraints on AuditEvent Resources (mapping defined in ITI TF-2b: 3.20.4.2.2.1):

    1. Instructions TBD when tool update is complete.

    Evaluation

    Depending on the testing event, the results of this test may be reviewed in advance.  More typically, it will be reviewed and graded by a Monitor during the test event itself (e.g. during Connectathon week).

    The tool reports the results of the validation of your Resources.  We are looking for PASSED results.

    DICOMScope tests

    Pre-Connectathon testing for systems implementing the CPI (Consistent Presentation of Images) Profile is performed using the DICOMScope tool and associated test plans developed by OFFIS.  This tool is also used by Image Display actors in the MAMMO, DBT and SMI profiles.

    Find links to specific instructions for each actor below.

     

    Hardware_Rqmts_ImageDisplay

    Image Displays in some profiles are required to be able to calibrate a monitor according to the DICOM Grayscale Standard Display Function (GSDF):

    • CPI Image Display
    • MAMMO Image Display
    • DBT Image Display
    • SMI Image Display

    Connectathon-related Instructions

    Ideally, the monitor calibration would be done during the Connectathon; however, we recognize that there are costs associated with shipping a monitor to the Connectathon.

     

    (1) If you are testing an Image Display in the CPI profile but not MAMMO, DBT, or SMI, as an alternative to shipping a monitor to the Connectathon you may:

    • Perform the calibration in your home lab as described in test DICOMScope_ImageDisplay.
    • Record a video of the calibration task as performed in your lab
    • Bring the recording to Connectathon as evidence for this test in order to convince the Connectathon monitor that it was performed correctly. Then, you will not need to perform calibration on-site during the Connectathon.
    • If you do not perform/record calibration in advance of Connectathon, you will be required to perform it on-site at Connectathon (with diagnostic-quality monitor and photometer provided by you) as described in (2) below.

     

    (2) If you are an Image Display actor in the MAMMO, SMI or DBT profile:

    • Perform calibration as described in (1) above.  You are highly encouraged to do it in advance, as described.  Otherwise, you will have to do that task during Connectathon week.
    • Also, plan to bring a diagnostic-quality monitor and a photometer to the Connectathon. The test managers do not provide a photometer on-site. For these profiles, you must bring a diagnostic-quality monitor to Connectathon in order to perform the required display tests with your modality partner's images.

     

    Evaluation

    There is no evaluation for this informational 'test'. You will share your calibration results in the DICOMScope test.

    DICOMScope_ImageDisplay

    Introduction

    Image Displays in some profiles are required to be able to calibrate a monitor according to the DICOM Grayscale Standard Display Function (GSDF):

    • CPI Image Display
    • MAMMO Image Display
    • DBT Image Display
    • SMI Image Display

    We use software and test plans developed by OFFIS to test these actors.

    Hardware Instructions for Image Displays

    Please refer to test Hardware_Rqmts_ImageDisplay.

    Instructions

    1. Install the DICOMScope software available here: https://dicom.offis.de/dscope.php.en   See also https://dicom.offis.de/download/dscope/dscope364/
    2. Access imgdisplay_testplan.doc and gsdf_lum.xls on Google Drive in: IHE Documents > Connectathon > tools > RAD-CPI-DICOMScope > CPI_Tests_2004_Release > documents
    3. Access imgdisplay_gsdf.zip for test cases referenced in the Image Display test plan in: IHE Documents > Connectathon > tools > RAD-CPI-DICOMScope > CPI_Tests_2004_Release
    4. Follow the instructions in Section 2.2 of the "Test Plan for Image Displays" document.

    Evaluation

    Please upload the completed gsdf_lum.xls file produced in the DICOMScope test procedure into Gazelle Test Management as the results for this test.

    DICOMScope_PrintComposer

    Introduction

    Actors in several profiles are required to implement the [RAD-23] “Print Request with Presentation LUT” transaction as an SCU:

    • CPI Print Composer
    • MAMMO Print Composer
    • DBT Print Composer

    We use software and test plans developed by OFFIS to test these actors.

    Instructions

    1. Install the DICOMScope software available here: https://dicom.offis.de/dscope.php.en See also https://dicom.offis.de/download/dscope/dscope364/
    2. Access printscu_testplan.doc from Google Drive in: IHE Documents > Connectathon > tools > RAD-CPI-DICOMScope > CPI_Tests_2004_Release > documents
    3. Follow the instructions in Section 2 of the "Test Plan for Print Composers" document.

    Evaluation

    Capture and submit your results:

    1. Section 2.3 of the "Test Plan for Print Composers" describes log files created during the test procedure.  When you have successfully completed the print jobs described, capture the log files and upload them into Gazelle Test Management as results for this pre-Connectathon test.
    2. Change the status of the test to Verified by vendor
     

    PrintServer_tests

    Introduction

    Actors in several profiles are required to implement the [RAD-23] “Print Request with Presentation LUT” transaction as an SCP:

    • CPI Print Server
    • MAMMO Print Server
    • DBT Print Server

    We use test plans developed by OFFIS to test these actors.  The goal of the Print calibration tests is to demonstrate that printed output conforms to the GSDF. 

    Instructions

    1. Access printscp_testplan.doc and gsdf_od.xls on Google Drive in: IHE Documents > Connectathon > tools > RAD-CPI-DICOMScope > CPI_Tests_2004_Release > documents
    2. Access printscp_testcases.zip for test cases referenced in the test plan.  See Google Drive in: IHE Documents > Connectathon > tools > RAD-CPI-DICOMScope > CPI_Tests_2004_Release
    3. Follow the instructions in Section 2 of the "Test Plan for Print Servers" document.

    Evaluation

    Capture and submit your results:

    1. Upload the completed gsdf_od.xls file into Gazelle Test Management as results for this pre-Connectathon test.
    2. Change the status of the test to Verified by vendor

    Do_This_First tests

    This section contains test cases that contain instructions or other background material that prepares a test system for interoperability testing at an IHE Connectathon.

    Ideally, the preparation described in these tests should be completed BEFORE the Connectathon begins.

     

    General 'Do/Read This First' Tests

    This is an index of Do This First  and Read This First tests that apply to testing across multiple domains

     

    FHIR_CapabilityStatement

    Overview

    This test applies to test systems that have implemented one or more IHE Profiles based on HL7® FHIR® or FHIRcast®.  

    • The test primarily applies to FHIR Servers, which publish a FHIR CapabilityStatement that documents the capabilities (behaviors) of the server implementation.  See ITI TF-2: Appendix Z.3.  
    • Some IHE Profiles may also require that client actors publish a CapabilityStatement to document the FHIR Resources they support; e.g., this is required for IHE RAD IRA actors, including the Report Creator.

    IHE publishes CapabilityStatements aligned with profile requirements on the  Artifacts page of the FHIR Implementation Guide (IG) for that profile (e.g. for the IHE ITI PIXm profile, see https://profiles.ihe.net/ITI/PIXm/artifacts.html).

    ==> During the Connectathon Preparatory phase: You will create a FHIR or FHIRcast CapabilityStatement Resource that represents the FHIR capabilities in your system/product, i.e. CapabilityStatement.kind has value "instance".  You will upload it as a sample into Gazelle Test Management.  Finally, you will use Gazelle External Validation Service (EVS) to validate your CapabilityStatement.

    ==> Later during Connectathon: 

    • Connectathon test "01_CapabilityStmt_ResourceCheck" describes handling of FHIR CapabilityStatements during the IHE Connectathon.
    • If your test system implements FHIR Server capabilities, we expect you to be able to respond to a FHIR capabilities interaction (https://www.hl7.org/fhir/http.html#capabilities) to make your CapabilityStatement available to test partners and monitors.  Connectathon monitors will have tools to enable them to retrieve and review the CapabilityStatement for your test system.

    Reference: IHE ITI Appendix Z on HL7 FHIR, Section Z.3: "HL7 FHIR allows service implementers to publish a CapabilityStatement Resource describing the resources, transport, formats, and operations that can be performed on a series of resources for the service instance. The CapabilityStatement Resource is described in FHIR: http://hl7.org/fhir/CapabilityStatement.html.  Actors providing http server functionality shall publish a CapabilityStatement on the metadata endpoint as described in FHIR http://hl7.org/fhir/R4/http.html#capabilities."
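    The capabilities interaction referenced above is simply an HTTP GET on [base]/metadata. A sketch, with a placeholder base URL:

```python
# The FHIR capabilities interaction: GET [base]/metadata.
# http://fhir.example.com/r4 is a placeholder base URL.
from urllib.request import Request

def capabilities_url(base: str) -> str:
    """Build the metadata endpoint URL for a FHIR server base URL."""
    return base.rstrip("/") + "/metadata"

req = Request(capabilities_url("http://fhir.example.com/r4"),
              headers={"Accept": "application/fhir+json"})
print(req.full_url)  # http://fhir.example.com/r4/metadata
# urllib.request.urlopen(req) would then return the server's CapabilityStatement.
```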

    Instructions for the Preparatory phase:

     

    First, create a Sample entry in Gazelle Test Management for your CapabilityStatement:

    1. Create a CapabilityStatement Resource that represents the FHIR capabilities in your test system/product.  
    2. Upload the XML or JSON file for the CapabilityStatement Resource into the Sample Exchange area of Gazelle Test Management. On the Samples to share tab, upload your file under the FHIR CapabilityStatement entry.    Though most systems will have one CapabilityStatement, you may upload more than one file.
      1. Important note:  Unlike other samples, you will not validate the CapabilityStatement within the Samples UI.  See next step...
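    For orientation, a CapabilityStatement that "represents the FHIR capabilities in your system/product" is one with kind set to "instance". A skeletal sketch; all element values here are illustrative placeholders, not a complete or validated resource:

```python
# Skeletal CapabilityStatement with kind "instance" (a deployed system).
# Values are illustrative placeholders; validate the real resource in EVS.
import json

capability_statement = {
    "resourceType": "CapabilityStatement",
    "status": "active",
    "date": "2024-01-01",
    "kind": "instance",          # describes this deployed instance, not a spec
    "fhirVersion": "4.0.1",
    "format": ["json", "xml"],
    "rest": [{
        "mode": "server",
        "resource": [{
            "type": "Patient",   # placeholder: list the Resources you support
            "interaction": [{"code": "read"}, {"code": "search-type"}],
        }],
    }],
}

print(json.dumps(capability_statement, indent=2))
```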

    Second, validate your CapabilityStatement using Gazelle EVSClient:

    1. Access Gazelle EVS, menu 'IHE --> FHIR - IG-based'
    2. Under that menu, upload your CapabilityStatement into the tool and select the correct validator from the dropdown list.
    3. The validation result should show "Passed".

    Evaluation

    • The EVS creates a Permanent link to your results
    • To record your results for this test in Gazelle Test Management, paste the Permanent link(s) to your EVS validation results into the proper test instance as evidence.

    NOTE:  You will be asked to provide this CapabilityStatement during Connectathon, and Monitors will examine it then, so it is to your benefit to do this preparation in advance of Connectathon.

    OIDs_Do_This_First

    Introduction

    To enable interoperability testing at an IHE Connectathon, some actors require OIDs to be assigned for various attributes in the messages they will exchange.  Gazelle Test Management assigns these OID values.

    For example:

    • homeCommunityID OID for Gateway actors in XCA, XCPD...
    • patient ID assigning authority OID for actors that create Patient IDs:  e.g., Patient Identity Sources, Radiology ADT systems, more...
    • organization ID OID for actors sending/receiving HL7v3 messages
    • repositoryUniqueId OID for XDS Document Repositories
    • and more...

    Instructions

    • Log in to Gazelle Test Management (https://gazelle.ihe.net/TM - the link may be different for your testing session)
    • Access menu Preparation-->OID registry
        • Note that the Connectathon Technical Manager will announce when OIDs have been generated for a test session.  If you don't see any OID entries on this page, then they have not been generated yet.
    • Find the OIDS for your system:
        • Use the dropdown list at the top of the page to search for the OIDs for your "System". 
        • If there is no entry for your system, it means that you have not registered for an actor that needs an OID assignment
    • Find the OIDs for your test partners:
        • Use the dropdown list to filter by OID "Label" or "Profile / Actor" to find the OIDs for your test partners,
          e.g.:
          • Initiating Gateways need the homeCommunityId OID of their Responding Gateway partners
          • XDS Document Repositories need the sourceId OID of their Document Source partners

    • Special note for FHIR Servers:  For some FHIR-based profiles, a server base URL may be used instead of an assigning authority OID.  For example, for a PIXm Patient Identity Source system, a FHIR server with base http://fhir.mydomain.com would use that value as the patient ID assigning authority (rather than an assigning authority such as urn:oid:1.2.3.4.5).    In this case, the FHIR Server may use its own base URL, rather than the OID value assigned in Gazelle Test Management.

    Once you find the OIDs you need, configure your Test System in advance so that you are ready to start testing at the beginning of the Connectathon.

    Evaluation

    There is no result file to upload into Gazelle Test Management for this test.

    ITI 'Do/Read This First' Tests

    This is an index of Do This First  and Read This First tests defined for profiles in the IHE IT Infrastructure (ITI) domain.

     

    APPC_Read_This_First

    Introduction

    This informational 'test' provides an overview of APPC tests and defines value sets used when creating Patient Privacy Policy Consent documents during Connectathon. Please read what follows and complete the preparation described before the Connectathon.

    Instructions

    WHICH APPC TESTS SHOULD I RUN?

    The APPC profile enables creating Patient Privacy Policy Consent documents of many variations.  We have created test cases that cover some of the common use cases in the APPC profile, i.e., consent documents that would commonly be created for patients in health care facilities, e.g., consent to disclose a patient's data to a facility, or to restrict disclosure of data by a specific individual provider.

    Content Creators

    • Run tests for two of the APPC use cases; you may choose which two you want to run from the Meta test.  Test names are APPC_Case*.
    • Run one test -- APPC_Other_Use_Case -- that allows you to create a Patient Privacy Policy Consent document of your choosing. We hope you use this to demonstrate the depth & breadth of your capabilities as an APPC Content Creator.
    • Support your APPC Content Consumer test partners that want to demonstrate their View or Structured Policy Processing capabilities using your APPC document(s).
    • Optional:  There is no requirement in APPC that the Content Creator be able to act as an XDS Doc Source and thus be able to submit your APPC documents to an XDS Repository/Registry.  If you have this capability, you can run test APPC_Submit_PolicyConsentDoc.

    Content Consumers

    • If you support the View Option, we assume that you can render any APPC document from a Content Creator
    • If you support the Structured Policy Processing Option, you will run the associated test.

     

    HOW DO APPC DOCUMENTS GET FROM THE CONTENT CREATOR TO THE CONTENT CONSUMER?

    The APPC Profile does not mandate how Consent documents get from a Content Creator to a Content Consumer.  It could be XDS, another IHE profile, or a non-IHE method.  

    At the Connectathon, we ask Content Creators to upload the Patient Privacy Policy Consent documents they create into the Samples area of Gazelle (menu Connectathon-->Connectathon-->List of samples, on the 'Samples to share' tab).   Content Consumers will find uploaded samples under the same menu on the 'Samples available for rendering' tab.

     

    WHICH PATIENT SHOULD BE USED FOR APPC TESTS?

    Each APPC Patient Privacy Policy Consent document applies to a single PATIENT.  In a consent document, the patient ID and assigning authority values are encoded with the AttributeId urn:ihe:iti:ser:2016:patient-id.  A patient may have more than one privacy policy consent.  

    We do not identify a single 'test patient' that must be used for creating APPC documents for Connectathon testing.  The Content Creator may include any valid Patient ID.  If the policy restricts or allows access based on values in the XDS metadata for a patient's documents, the Content Creator may use the Patient ID for the clinical document(s) the policy applies to.

     

    WHAT PRIVACY POLICIES, & OTHER VALUE SETS ARE DEFINED FOR APPC TESTING?

    APPC Patient Privacy Policy Consent documents rely on the affinity domain agreeing to a set of PATIENT PRIVACY POLICIES that apply to ORGANIZATIONS and INDIVIDUAL PROVIDERS.  These policies, organizations, providers are identified by unique identifiers that are recognized within the affinity domain, and are encoded in a patient's consent document.

    To enable the APPC Content Creator to specify different policies, this test defines values for various attributes used in policies:

    • Policies - that a patient may apply to his/her documents
    • Facilities - where a patient could receive care; facilities have an identified type.
    • Providers - who provide care for a patient; providers have an identified role.
    • Doc Sources - who create a patient's document(s) shared in an XDS environment
    • Confidentiality Codes - applied to a patient's document that can be part of a policy
    • Purpose of Use

    The tables below contain value sets that are defined for the purpose of Connectathon testing of the APPC profile.

    • APPC tests will ask Content Creators to use these when creating APPC Patient Privacy documents.  
    • APPC Content Consumers should recognize these values.

     


    POLICIES FOR CONNECTATHON TESTING OF APPC:

    Policies:

    • FOUNDATIONAL POLICY (all use cases)
      PolicySetIdReference: urn:connectathon:bppc:foundational:policy
      Description: By default, in this Connectathon affinity domain, document sharing is based on the value of the Confidentiality Code (DocumentEntry.confidentialityCode).  This policy (inherited from BPPC) is applied to all documents.
        • Documents with a confidentialityCode of "N" (normal) are always shared unless a patient selects a more restrictive policy.
        • Documents with a confidentialityCode of "R" (restricted) are not shared unless a patient explicitly "opts in" by agreeing to a less restrictive policy below.
        • Documents with a confidentialityCode of "V" (very restricted) are never shared, even if a patient also chooses a less restrictive policy. This is an artificial condition that enables testing.
        • Documents with a confidentialityCode of "U" (unrestricted) are always shared.  It is not allowed to apply a more restrictive policy to these documents.
      A patient may also choose to apply one of the additional policies below.  This means that when an APPC document is created with multiple rules, RuleCombiningAlgId=urn:oasis:names:tc:xacml:1.0:rule-combining-algorithm:permit-overrides.

    • FULL ACCESS TO ALL
      PolicySetIdReference: urn:connectathon:policy:full-access
      Description: The patient agrees that the document may always be shared.  (This is equivalent to having a confidentiality code of "U".)

    • DENY ACCESS TO ALL
      PolicySetIdReference: urn:connectathon:policy:deny-access
      Description: The patient prohibits the document from ever being shared.  (This is equivalent to having a confidentiality code of "V".)

    • DENY ACCESS EXCEPT TO PROVIDER
      PolicySetIdReference: urn:connectathon:policy:deny-access-except-to-provider
      Description: The patient prohibits the document from being shared except with the provider(s) identified in the consent document.

    • DENY ACCESS TO PROVIDER (use case 5)
      PolicySetIdReference: urn:connectathon:policy:deny-access-to-provider
      Description: The patient prohibits the document from being shared with the provider(s) identified in the consent document.  The referenced individual provider(s) is prohibited from accessing this patient's documents (i.e., no read or write access).

    • DENY ACCESS EXCEPT TO FACILITY (use case 1)
      PolicySetIdReference: urn:connectathon:policy:deny-access-except-to-facility
      Description: The patient prohibits the document from being shared except with the facility(ies) identified in the consent document.

    • DENY TO ROLE
      PolicySetIdReference: urn:connectathon:policy:deny-access-to-role
      Description: The patient prohibits the document from being shared with providers who have the role(s) identified in the consent document.

    • FULL ACCESS TO ROLE
      PolicySetIdReference: urn:connectathon:policy:full-access-to-role
      Description: The patient allows the document to be shared with providers who have the role(s) identified in the consent document.  The patient prohibits the document from being shared with providers with any other role(s).

    • LIMIT DOCUMENT VISIBILITY (use case 6)
      PolicySetIdReference: 1.3.6.1.4.1.21367.2017.7.104
      Description: The patient prohibits sharing the referenced clinical document(s) and this privacy policy consent document with any healthcare provider or facility.
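    The foundational policy's confidentiality-code rules can be sketched as a small decision function (a simplified illustration only; real enforcement evaluates the full XACML policy set with permit-overrides rule combining):

```python
# Simplified sketch of the foundational policy's confidentiality-code rules.
# Illustration only -- not a substitute for XACML policy evaluation.
def is_shared(confidentiality_code: str,
              patient_opted_in: bool = False,
              patient_restricted: bool = False) -> bool:
    if confidentiality_code == "U":   # unrestricted: always shared
        return True
    if confidentiality_code == "V":   # very restricted: never shared
        return False
    if confidentiality_code == "R":   # restricted: shared only on explicit opt-in
        return patient_opted_in
    if confidentiality_code == "N":   # normal: shared unless patient restricts
        return not patient_restricted
    raise ValueError(f"unknown confidentialityCode: {confidentiality_code}")

print(is_shared("N"))                         # True
print(is_shared("R"))                         # False
print(is_shared("R", patient_opted_in=True))  # True
```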

     

    ORGANIZATIONS/FACILITIES defined in the "Connectathon Domain":

    Each facility is identified by two XACML attributes:
    • urn:ihe:iti:appc:2016:document-entry:healthcare-facility-type-code (DocumentEntry.healthcareFacilityTypeCode)
    • urn:oasis:names:tc:xspa:1.0:subject:organization-id

    • Connectathon Radiology Facility for IDN One
      facility type: code="Fac-A", displayName="Caregiver Office", codeSystem="1.3.6.1.4.1.21367.2017.3"
      organization-id: urn:uuid:e9964293-e169-4298-b4d0-ab07bf0cd78f
    • Connectathon Radiology Facility for NGO Two
      facility type: code="Fac-A", displayName="Caregiver Office", codeSystem="1.3.6.1.4.1.21367.2017.3"
      organization-id: urn:uuid:e9964293-e169-4298-b4d0-ab07bf0cd12c
    • Connectathon Dialysis Facility One
      facility type: code="Fac-B", displayName="Outpatient Services", codeSystem="1.3.6.1.4.1.21367.2017.3"
      organization-id: urn:uuid:a3eb03db-0094-4059-9156-8de081cb5885
    • Connectathon Dialysis Facility Two
      facility type: code="Fac-B", displayName="Outpatient Services", codeSystem="1.3.6.1.4.1.21367.2017.3"
      organization-id: urn:uuid:be4d27c3-21b8-481f-9fed-6524a8eb9bac

     

    INDIVIDUAL HEALTHCARE PROVIDERS defined in the "Connectathon Domain":

    Each provider is identified by three XACML attributes:
    • urn:oasis:names:tc:xacml:1.0:subject:subject-id
    • urn:oasis:names:tc:xspa:1.0:subject:npi
    • urn:oasis:names:tc:xacml:2.0:subject:role

    • Dev Banargee
      subject-id: devbanargee
      npi: urn:uuid:a97b9397-ce4e-4a57-b12a-0d46ce6f36b7
      role: code="105-007", displayName="Physician/Medical Oncology", codeSystem="1.3.6.1.4.1.21367.100.1"
    • Carla Carrara
      subject-id: carlacarrara
      npi: urn:uuid:d973d698-5b43-4340-acc9-de48d0acb376
      role: code="105-114", displayName="Radiology Technician", codeSystem="1.3.6.1.4.1.21367.100.1"
    • Jack Johnson
      subject-id: jackjohnson
      npi: urn:uuid:4384c07a-86e2-40da-939b-5f7a04a73715
      role: code="105-114", displayName="Radiology Technician", codeSystem="1.3.6.1.4.1.21367.100.1"
    • Mary McDonald
      subject-id: marymcdonald
      npi: urn:uuid:9a879858-8e96-486b-a2be-05a580f0e6ee
      role: code="105-007", displayName="Physician/Medical Oncology", codeSystem="1.3.6.1.4.1.21367.100.1"
    • Robert Robertson
      subject-id: robertrobertson
      npi: urn:uuid:b6553152-7a90-4940-8d6a-b1017310a159
      role: code="105-007", displayName="Physician/Medical Oncology", codeSystem="1.3.6.1.4.1.21367.100.1"
    • William Williamson
      subject-id: williamwilliamson
      npi: urn:uuid:51f3fdbe-ed30-4d55-b7f8-50955c86b2cf
      role: code="105-003", displayName="Nurse Practitioner", codeSystem="1.3.6.1.4.1.21367.100.1"

     

    XDS Document Sources:

    XACML AttributeId: urn:ihe:iti:appc:2016:source-system-id (SubmissionSet.sourceId)

    • Source Id: use the sourceId value assigned in Gazelle to each Connectathon XDS Document Source system.

     

    CONFIDENTIALITY CODES:

    XACML AttributeId: urn:ihe:iti:appc:2016:confidentiality-code (DocumentEntry.confidentialityCode)

    • normal: code="N", displayName="normal", codeSystem="2.16.840.1.113883.5.25"
    • restricted: code="R", displayName="restricted", codeSystem="2.16.840.1.113883.5.25"
    • very restricted: code="V", displayName="very restricted", codeSystem="2.16.840.1.113883.5.25"
    • unrestricted: code="U", displayName="unrestricted", codeSystem="2.16.840.1.113883.5.25"

     

    PURPOSE OF USE:

    XACML AttributeId: urn:oasis:names:tc:xspa:1.0:subject:purposeofuse

    • TREATMENT: code="99-101", displayName="TREATMENT", codeSystem="1.3.6.1.4.1.21367.3000.4.1"
    • EMERGENCY: code="99-102", displayName="EMERGENCY", codeSystem="1.3.6.1.4.1.21367.3000.4.1"
    • PUBLICHEALTH: code="99-103", displayName="PUBLICHEALTH", codeSystem="1.3.6.1.4.1.21367.3000.4.1"
    • RESEARCH: code="99-104", displayName="RESEARCH", codeSystem="1.3.6.1.4.1.21367.3000.4.1"

    Evaluation

    There is no evaluation for this informational test. If the systems testing the APPC Profile do not do the set-up described above, then APPC tests at Connectathon will not work.

    BPPC_Read_This_First

    Introduction

    This is an informational 'test'. We want all actors involved in testing the BPPC Profile and the BPPC Enforcement Option to read the "Privacy Policy Definition for IHE Connectathon Testing".

    Instructions

    Prior to arriving at the Connectathon, read this document: Privacy Policy Definition for IHE Connectathon Testing. This contains the policy for XDS Affinity Domains at the Connectathon, including 2 BPPC-related items.

    Evaluation

    There is no evaluation for this informational test. If the systems do not do the set-up described above, then BPPC Profile tests and BPPC Enforcement Options tests at Connectathon will not work.

    CSD_Load_Directory_Test_Data

    Introduction

    This is a "task" (i.e., not a test) that ensures that your CSD Care Services Directory is loaded with the entries that we will use as part of Connectathon testing.

    Instructions

    The Care Services Directory is loaded with Connectathon test data: (1) Codes, and (2) Organization, Provider, Facility, and Service information. 

    (1) Load Connectathon code sets:

    ITI TF-1:35.1.1.1 states, "Implementing jurisdictions may mandate code sets for Organization Type, Service Type, Facility Type, Facility Status, Provider Type, Provider Status, Contact Point Type, Credential Type, Specialization Code, and language code. A Care Services Directory actors shall be configurable to use these codes, where mandated."

    For Connectathon testing, we define these codes and ask that you load them onto your Directory prior to arriving at the connectathon. They are documented in the format defined in IHE's SVS (Sharing Value Sets) profile, though support for SVS is not mandated in IHE.

    The code sets are found here in Google Drive under IHE Documents > Connectathon > test_data > ITI-profiles > CSD-test-data > CSD_Directory_Codes. (They are also available in the SVS Simulator: http://gazelle.ihe.net/SVSSimulator/browser/valueSetBrowser.seam)

    • 1.3.6.1.4.1.21367.200.101-CSD-organizationTypeCode.xml
    • 1.3.6.1.4.1.21367.200.102-CSD-serviceTypeCode.xml
    • 1.3.6.1.4.1.21367.200.103-CSD-facilityTypeCode.xml
    • 1.3.6.1.4.1.21367.200.104-CSD-facilityStatusCode.xml
    • 1.3.6.1.4.1.21367.200.105-CSD-providerTypeCode.xml
    • 1.3.6.1.4.1.21367.200.106-CSD-providerStatusCode.xml
    • 1.3.6.1.4.1.21367.200.108-CSD-credentialTypeCode.xml
    • 1.3.6.1.4.1.21367.200.109-CSD-specializationTypeCode.xml
    • 1.3.6.1.4.1.21367.200.110-CSD-languageCode.xml

    (2) Load Connectathon Organization, Provider, Facility, and Services entries

    In order to perform query testing with predictable results, Care Services Directories must be populated with the entries in the following files here in Google Drive under IHE Documents > Connectathon > test_data > ITI-profiles > CSD-test-data >  CSD_Directory_Entries.

    Some directories may support only a subset of these entry types:

    • CSD-Organizations-Connectathon-<date>.xml
    • CSD-Providers-Connectathon-<date>.xml
    • CSD-Facilities-Connectathon-<date>.xml
    • CSD-Services-Connectathon-<date>.xml

    (3) Additional Organization, Provider, Facility, and Services entries

    The Connectathon entries are limited in scope.  We expect Directories to be populated with additional Organization, Provider, Facility & Service entries.  We give no specific guidance on the number of entries, but we are looking for a more realistic database.  Good entries offer better testing opportunities.

    Evaluation

    Create a short text file saying that you have finished loading your codes. Upload that text file into Gazelle Test Management as the results for this 'test'. That is your signal to us that you are ready for Connectathon testing.

    HPD: Load Provider Test Data for Connectathon testing

    Introduction

    At the Connectathon, the HPD tests assume that a pre-defined set of Organizational and Individual provider information has been loaded on all of the Provider Information Directory actors under test.

    Instructions

    • We expect that your Directory will also contain other provider information, beyond what is in this test set.

    Evaluation

    There are no result files to upload into Gazelle Test Management for this test.  Preloading these prior to the Connectathon is intended to save you precious time during Connectathon week.

     

    AttachmentSize
    Office spreadsheet icon HPD_test_providers.xls51.5 KB

    mCSD Load Test Data

    Introduction

    This is not an actual "test". Rather it is a task that ensures that the mCSD Care Services Selective Supplier is loaded with the Resources and value sets that we will use as part of Connectathon testing.

    Instructions for Supplier Systems

    The instructions below apply to mCSD Supplier systems. (The mCSD Consumer actor is included on this test so that it is aware of this mCSD test data, but it has no preload work to do. During Connectathon, the Consumer will be performing queries based on the content of these Resources.)

    (1) Connectathon FHIR Resources

    In order to perform query testing with predictable results, the Care Services Selective Supplier system must be populated with the entries from pre-defined FHIR Resources:

    • Organization
    • Location
    • HealthcareService
    • Practitioner
    • PractitionerRole

    Some Suppliers may support only a subset of these.

    These resources are available in two places  (the test data is the same in both places, so you only need to access/load one set):

    • On the HAPI FHIR Read/Write Server deployed for Connectathon. (Note that the link to this server will be published by the technical manager of your testing event.)
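Once loaded, a Consumer exercises these Resources with ordinary FHIR searches. As a rough, non-normative sketch of the kind of query URLs involved (the base URL and search values below are invented for illustration):

```python
from urllib.parse import urlencode

BASE = "https://example.org/fhir"  # hypothetical Supplier endpoint

# Illustrative mCSD-style searches against the preloaded Resource types
searches = [
    ("Organization", {"name": "General Hospital"}),  # invented name
    ("Practitioner", {"family": "Smith"}),           # invented name
    ("Location",     {"status": "active"}),
]
urls = [f"{BASE}/{rt}?{urlencode(q)}" for rt, q in searches]
for u in urls:
    print(u)
```

The same pattern extends to HealthcareService and PractitionerRole; only the resource type and search parameters change.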

    (2) Additional Resources

    The pre-defined Connectathon test data are limited in scope.  We expect Suppliers to be populated with additional Organization, Provider, Facility & Service Resources.  We give no specific guidance on the number of Resources, but we are looking for a more realistic database.  Good entries offer better testing opportunities.

    (3) Value Sets for some codes:

    The FHIR Resources for mCSD testing contain codes from some pre-defined ValueSet Resources.

    These ValueSets are also found in GitHub and on the FHIR Read/Write Server at the links above.

    Code                     FHIR ValueSet Resource id
    Organization Type        IHE-CAT-mCSD-organizationTypeCode
    Service Type             IHE-CAT-mCSD-serviceTypeCode
    Facility Type            IHE-CAT-mCSD-facilityTypeCode
    Facility Status          IHE-CAT-mCSD-facilityStatusCode
    Provider Type            IHE-CAT-mCSD-providerTypeCode
    Provider Status          IHE-CAT-mCSD-providerStatusCode
    Credential Type          IHE-CAT-mCSD-credentialTypeCode
    Specialty Type           IHE-CAT-mCSD-specializationCode
    Language                 languages
    Provider Qualification   v2-2.7-0360

     

    The mCSD Resources also contain codes from these FHIR ValueSets:

    Evaluation

    There are no result files to upload into Gazelle Test Management for this test.  Preloading these prior to the Connectathon is intended to save you precious time during Connectathon week.

     

    mXDE_Read_This_First

    Introduction

    This is not an actual "test". The Instructions section below describes the testing approach for the mXDE Profile and provides context and preparation information prior to performing Connectathon tests with your test partners.

    Instructions

    Please read the following material prior to performing Connectathon tests for the mXDE Profile.

    Overall Assumptions:

    (1) There is a significant overlap between the mXDE and QEDm profiles.  Each mXDE actor must be grouped with its QEDm actor counterpart.  Thus, you should successfully complete tests for QEDm before attempting mXDE tests.

    (2) The mXDE Profile refers to extracting data from documents but does not specify the document types. For purposes of Connectathon testing, we will provide and enforce the use of specific patients and specific documents.  We intend to use the same clinical test data for both QEDm and mXDE tests.  See details about test patients and documents in the QEDm ReadThisFirst and DoThisFirst tests.

    • For mXDE, we will extract data from documents for patient Chadwick Ross.
    • If our test data (CDA documents, FHIR Resources) are not compatible with your system, please let us know as soon as possible. We would prefer to use realistic test data. It is not our intention to have you write extra software just to accommodate our test data.
    • Some of the coded data within the documents may use code systems that are not in your configuration. For example, they might be local to a region. You will be expected to configure your systems to use these codes just as you would at a customer site.
    • We will require that the extraction process be automated on the mXDE Data Element Extractor system.

    (3) The mXDE Data Element Extractor actor is grouped with an XDS Document Registry and Repository or an MHD Document Responder.

    • When grouped with the XDS Document Registry and Repository, we will expect the Data Element Extractor to accept (CDA) documents that are submitted by XDS transactions to initiate the extraction process.
    • When grouped with an MHD Document Responder, we will expect the Data Element Extractor to accept (CDA) documents submitted to an MHD Document Recipient to initiate the extraction process.
    • In summary, you may not extract from your own internal documents.  You must use the Connectathon test data provided.

    (4) The tests reference several patients identified in QEDm: Read_This_First. These same patients are used for mXDE tests.  The Data Element Extractor may choose to reference the patients on the Connectathon FHIR Read/Write Server or may import the Patient Resources and host them locally.

    (5) The Provenance Resource is required to contain a reference to the device that performed the extraction. Because the device is under control of the Data Element Extractor, the Data Element Extractor will be required to host the appropriate Device Resource. You are welcome to use multiple devices as long as the appropriate Device resources exist.  (See QEDm Vol 2, Sec 3.44.4.2.2.1).

    (6) The QEDm Profile says the Provenance Resource created by the mXDE Data Element Extractor shall have a [1..1] entity element which points to the document from which the data was extracted.

    • The Data Element Extractor that supports both the MHD and XDS grouping may create one Provenance Resource that references both forms of access (via MHD, via XDS) or may create separate Provenance resources to reference the individual documents.

    (7) During the Connectathon, we want you to execute mXDE tests using the Gazelle Proxy. That will simplify the process of collecting transaction data for monitor review.

    mXDE Data Element Extractor actor:

    Overall mXDE test workflow:

    (1) Create one or more Device Resources in your server (to be referenced by Provenance Resources you will create).

    (2) Import the required test patients or configure your system to reference the test Patient Resources on the FHIR Read/Write Server.

    (3) Repeat this loop:

    • The Extractor will host the test document(s) on your system (you may have received them via MHD or XDS).
    • Your system will extract the relevant data and create FHIR Resources (Observation, and/or Medication, and/or ...) on your server.
    • Your system will create at least one Provenance Resource for each of the FHIR Resources created.
    • The Provenance Resource will reference at least one entity (MHD or XDS) and may reference both entities if supported.
    • A software tool or Data Element Provenance Consumer will send FHIR search requests to your system (see the relevant *Search* test for examples of queries that might be sent).

    mXDE Data Element Provenance Consumer actor:

    Overall mXDE test workflow:

    (1) Configure your system with the endpoint of the Data Element Extractor partner.

    (2) Repeat this loop for each data element supported (Observation, Medication, ...); some of the items might occur in a different order based on your software implementation:

    • Send a search request of the form: GET [base]/[Resource-type]?_revinclude=Provenance:target&criteria
    • Retrieve the source document or documents referenced in the Provenance Resource(s).
    • IHE profiles usually do not define how a consumer system makes use of clinical data. The mXDE Profile is no different. We will expect you to demonstrate some reasonable action with the FHIR Resources that have been retrieved and the source document(s).
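The search in the first bullet can be assembled programmatically. A minimal, non-normative sketch (the base URL, resource type, and search criteria are invented placeholders):

```python
from urllib.parse import urlencode

def build_mxde_search(base: str, resource_type: str, criteria: dict) -> str:
    """Build a search URL that also pulls back the Provenance Resources
    targeting the matched resources (_revinclude=Provenance:target)."""
    params = dict(criteria)
    params["_revinclude"] = "Provenance:target"
    return f"{base}/{resource_type}?{urlencode(params)}"

# Hypothetical endpoint and patient reference, for illustration only
url = build_mxde_search(
    "https://example.org/fhir",            # assumed base URL
    "Observation",
    {"patient": "Patient/chadwick-ross"},  # hypothetical Patient id
)
print(url)
```

The same pattern applies to Medication or any other extracted Resource type; only the resource type and criteria change.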

    Evaluation

    There are no result files to upload into Gazelle Test Management for this test.  Understanding the testing approach in advance is intended to make testing during Connectathon week more efficient.

     


    PMIR_Connectathon_Test_Patients

    Overview:

    These instructions apply to the Patient Identity Registry actor in the Patient Master Identity Registry (PMIR) Profile.  In this test, a PMIR Registry will load its database with Patient Resources formatted for [ITI-93] Mobile Patient Identity Feed, to support subscription tests that will occur later, during the Connectathon.

    Background Information:

    Read this section for background information about test patients used for PMIR testing at the Connectathon. Otherwise, for instructions for loading test patients, skip to the Instructions below.

     

    In a normal deployment, a product is operating in an environment with a policy for patient identity creation and sharing that remains stable.

    However, at the Connectathon, we test multiple profiles (for patient management: PIX, PDQ, PMIR...; for document sharing: XDS, MHD...). Thus, the Connectathon provides special challenges when requirements for actors differ across IHE Profiles. Particularly relevant in PMIR and PIXm is the behavior of the server actor (the PMIR Patient Identity Registry & the PIXm Patient Identifier Cross-Reference Manager).

    A PIXm Patient Identifier Cross-Reference Manager:

    Source of Patients in the PIXm Profile: The PIX Manager has many patient records, and a single patient (person) might have several records on the PIX Manager server that are cross-referenced because they apply to the same patient.   The Patient Identity Feed [ITI-104] transaction in PIXm was introduced in PIXm Rev 3.0.2 in March 2022.   The PIX Manager may also have other sources of patient information (eg HL7v2 or v3 Feed).
    At the Connectathon, we ask PIX Managers to preload “Connectathon Patient Demographics” that are provided via the Gazelle Patient Manager tool (in HL7v2, v3, or FHIR Patient Resource format).   These Connectathon Patient Demographics contain four Patient records for each ‘patient’, each with identical demographics (name, address, DOB), but with a different Patient.identifier (with system values representing the IHERED, IHEGREEN, IHEBLUE, and IHEFACILITY assigning authority values).   We expect that the PIX Manager will cross-reference these multiple records for a single patient since the demographics are the same.

    QUERY: When a PIXm Consumer sends a PIXm Query [ITI-83] to the PIXm Manager with a sourceIdentifier representing the assigning authority and patient ID (e.g. urn:oid:1.3.6.1.4.1.21367.3000.1.6|IHEFACILITY-998), the PIXm Manager would respond with [0..*] targetId(s) which contain a Reference to a Patient Resource (one reference for each matching Patient Resource on the server).
    At the Connectathon, if a PIXm Consumer queries by a Patient ID in the IHEFACILITY domain (as in the example above) and there is a match, the PIXm Manager would return a response with three matches, one each for RED, GREEN, and BLUE, e.g.:

    PIXm Response Example 
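As a concrete, non-normative illustration of the paragraph above, the sketch below builds an [ITI-83] query URL and shows the general shape of a Parameters response carrying three matches. The base URL and Patient Resource ids are invented; consult the PIXm specification for the authoritative message structure.

```python
from urllib.parse import quote

BASE = "https://example.org/fhir"  # hypothetical PIXm Manager base URL

# ITI-83 query: GET [base]/Patient/$ihe-pix?sourceIdentifier=<system>|<value>
source = "urn:oid:1.3.6.1.4.1.21367.3000.1.6|IHEFACILITY-998"
query_url = f"{BASE}/Patient/$ihe-pix?sourceIdentifier={quote(source)}"

# Sketch of a matching response: a Parameters Resource with one targetId
# Reference per matching Patient Resource (RED, GREEN, BLUE here).
response = {
    "resourceType": "Parameters",
    "parameter": [
        {"name": "targetId",
         "valueReference": {"reference": f"{BASE}/Patient/{pid}"}}
        for pid in ("red-998", "green-998", "blue-998")  # invented ids
    ],
}
print(query_url)
```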

    A PMIR Patient Identity Registry:

    Source of Patients in the PMIR Profile: The PMIR Profile is intended for use in an environment where each patient has a single “Golden Patient record”. In PMIR, a patient has a single “Patient Master Identity” (a.k.a. Golden Patient record) that is comprised of identifying information, such as business identifiers, name, phone, gender, birth date, address, marital status, photo, contacts, preference for language, and links to other patient identities (e.g. a mother’s identity linked to a newborn).

    The PMIR Patient Identity Source actor sends the Mobile Patient Identity Feed [ITI-93] transaction to the PMIR Patient Identity Registry to create, update, merge, and delete Patient Resources.  

    The PMIR Registry persists one "Patient identity" per patient.  The PMIR Registry relies on the Patient Identity Source actor as the source of truth about new patient records (FHIR Patient Resources), and about updates/merges/deletes of existing patient records (i.e. the Registry does whatever the Source asks). The PMIR Registry does not have algorithms to 'smartly' cross-reference multiple/separate records for a patient.

    In the FHIR Patient Resource in PMIR, there are two attributes that hold identifiers for a patient:

      1. Patient.id - in response to a Mobile Patient Identity Feed [ITI-93] from a Source requesting to 'create' a new patient, the Patient Identity Registry assigns the value of Patient.id in the new Patient Resource. This value is *the* unique id for the patient's 'golden identity' in the domain.
      2. Patient.identifier - the Patient Resource may contain [0..*] other identifiers for the patient, e.g. Patient ID(s) with assigning authority, a driver's license number, Social Security Number, etc.  In the pre-defined Patient Resources used for some PMIR tests, we have defined a single assigning authority value (urn:oid:1.3.6.1.4.1.21367.13.20.4000, aka the "IHEGOLD" domain). This distinguishes these patients from patients in the red/green/blue domains used to test other profiles.
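A minimal sketch of how these two kinds of identifiers sit in a Patient Resource (the id, identifier value, and demographics below are invented for illustration; only the element names come from FHIR):

```python
import json

patient = {
    "resourceType": "Patient",
    # Patient.id: the Registry-assigned id of the 'golden identity' (hypothetical value)
    "id": "golden-0001",
    # Patient.identifier: [0..*] business identifiers; here one in the IHEGOLD domain
    "identifier": [{
        "system": "urn:oid:1.3.6.1.4.1.21367.13.20.4000",  # IHEGOLD assigning authority
        "value": "IHEGOLD-555",                             # hypothetical Patient ID
    }],
    "name": [{"family": "Alfa", "given": ["Alice"]}],       # invented demographics
}
print(json.dumps(patient, indent=2))
```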

    At the Connectathon, because some systems support PIX/PIXv3/PIXm and PMIR, we provide separate Patients (in [ITI-93] format) for PMIR testing with identifiers in a single assigning authority - urn:oid:1.3.6.1.4.1.21367.13.20.4000 (aka IHEGOLD). PMIR Registry systems will preload these patients, and they are only used for PMIR tests.  These patients have different demographics than the traditional red/green/blue Connectathon patients used for PIX, PDQ, XDS, and XCA.

    QUERY: When a PMIR Patient Identifier Cross-Reference Consumer sends a PIXm Query [ITI-83] to the PMIR Registry with a sourceIdentifier representing the assigning authority and patient ID (e.g. urn:oid:1.3.6.1.4.1.21367.13.20.4000|IHEGOLD-555), if there is a match, the PMIR Registry would return a response with the one (matching) Patient Resource (the 'Golden' patient record).

    At the Connectathon, if a Patient Identifier Cross-Reference Consumer sends a PIXm Query by a Patient ID in the GOLD domain, and there is a match, the PMIR Registry would return a response with a reference to one Patient Resource.

     

    In conclusion, using the RED/GREEN/BLUE Patients for PIX* testing, and the GOLD Patients for PMIR testing, enables us to separate expected results that differ depending on whether a server is a PIX* Patient Identifier Cross-reference Manager or a PMIR Patient Identity Registry in a given test.  We have managed testing expectations by using patients in different domains for the two profiles, but we do not dictate how you manage this in your product if you support both PMIR and PIX.

    Instructions:

    In this test, the PMIR Patient Identity Registry loads a set of FHIR Patient Resources used in PMIR peer-to-peer subscription tests.  We use a set of FHIR Patient Resources for PMIR testing that are different than the Connectathon patients for testing the PIX* & PDQ* Profiles.

    The patient test data is a Bundle formatted according to the requirements for a Mobile Patient Identity Feed [ITI-93] transaction.

    The PMIR Patient Identity Registry should follow these steps to load the patient test data:

    1. Find the test patients in Github here: https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/ITI/PMIR
    2. Download this file:  001_PMIR_Bundle_POST-AlfaBravoCharlie-GOLD.xml   (we only provide the data in xml format)
      That .xml file is a Bundle formatted according to the requirements for ITI-93.
    3. POST that file to your PMIR Registry so that you are hosting the three Patient Resources within that file.
    4. Note that there are other files in that directory that you can access now, but do not POST them to your Registry until instructed to do so in the Subscription test during the Connectathon.  It is important to wait because posting them during the Connectathon will be a trigger for a subscription.
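Step 3 boils down to POSTing the downloaded Bundle to your server. The sketch below uses a tiny stand-in Bundle (the real file from GitHub is authoritative) and simply counts the Patient Resources it carries before you load it; the endpoint in the final comment is hypothetical:

```python
import xml.etree.ElementTree as ET

FHIR_NS = "{http://hl7.org/fhir}"

# Tiny stand-in for 001_PMIR_Bundle_POST-AlfaBravoCharlie-GOLD.xml;
# the real file defines the authoritative ITI-93 structure.
SAMPLE_BUNDLE = """<Bundle xmlns="http://hl7.org/fhir">
  <entry><resource><Patient><id value="alfa"/></Patient></resource></entry>
  <entry><resource><Patient><id value="bravo"/></Patient></resource></entry>
  <entry><resource><Patient><id value="charlie"/></Patient></resource></entry>
</Bundle>"""

def count_patients(bundle_xml: str) -> int:
    """Count Patient Resources carried in a FHIR Bundle (XML form)."""
    root = ET.fromstring(bundle_xml)
    return len(root.findall(f".//{FHIR_NS}entry/{FHIR_NS}resource/{FHIR_NS}Patient"))

print(count_patients(SAMPLE_BUNDLE))  # → 3, matching the three Patients in the real file
# To actually load the real file (hypothetical endpoint):
#   requests.post("https://example.org/fhir", data=open(path, "rb").read(),
#                 headers={"Content-Type": "application/fhir+xml"})
```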

    Evaluation:

    There is no log file associated with this 'test', so you do not need to upload results into Gazelle Test Management for this test.

    PLT_Preload_Codes

    Introduction

    To prepare for testing the ITI Patient Location Tracking (PLT) Profile, the PLT Supplier and Consumer actors must use a common set of HL7 codes.

    We ask these actors to load codes relevant to their system in advance of the Connectathon.

     

    Instructions

    The codes you need are identified in the peer-to-peer test that you will perform at the Connectathon.

    1.  In Gazelle Test Management, find the test "PLT_TRACKING_WORKFLOW" on your main Connectathon page.

    2.  Read the entire test to understand the test scenario.

    3.  Load the codes for PV1-11 Temporary Patient Location.  These are in a table at the bottom of the Test Description section.

     

    Note:  These codes are a subset of the HL7 codes used during Connectathon.  If you already performed pre-Connectathon test "Preload_Codes_for_HL7_and_DICOM", then you already have these codes.

     

    Evaluation

    There is no evaluation for this 'test'.   If you do not load the codes you need on your test system prior to the Connectathon, you may find yourself wasting valuable time on the first day of Connectathon syncing your codes with those of your test partners.

    Preload Connectathon Test Patients

    Introduction:

    These instructions apply to:

    • PIXv2/PIXv3/PIXm - PIX Managers
    • PDQv2/PDQv3/PDQm - Patient Demographics Suppliers
    • XD*, MHD - Document Sources submitting docs to XDS Reg/Rep or MHD Recipient
    • XC* - Initiating & Responding Gateways
    • MHD testing

    To support PIX, PDQ, XDS, and XC* tests, we ask you to arrive at the Connectathon with specific patients already loaded in your database.  These demographics include two "well-known" patients, FARNSWORTH^STEVE (for PIXv2 tests) and WALTERS^WILLIAM (for PIXv3 & PIXm tests), plus several patients for PDQ/PDQv3/PDQm. There are also several patients used in XDS, SDC/MHD and XC* testing.

    We use the Gazelle PatientManager tool to enable you to load these patients on to your Connectathon test system. 

    You can use the PatientManager to:

    • Download FHIR Patient Resources for these patients, individually or as a FHIR Bundle.
    • Send the patients to your system via an HL7v2 feed [ITI-8], an HL7v3 feed [ITI-44], or by loading FHIR Resources. (These are easier methods because all patients can be loaded at once.)
      --or--
    • Respond to PIX* or PDQ* or XCPD queries for these patients (These methods are not as easy because you have to query for patients individually.)

    Gazelle PatientManager Tool:

    Instructions:

    Use the PatientManager tool to pre-load selected patients onto your test system.  It will help you to get this task done before arriving at Connectathon.

    Which patients?

    The PatientManager contains patients to load for Connectathon testing.

    • In the PatientManager tool, select menu Connectathon-->Patient Demographics
      • All patients listed are applicable to the Connectathon.  These records each have Patient IDs with the 4 Assigning Authorities used at the Connectathon (IHERED, IHEGREEN, IHEBLUE, and IHEFACILITY).  If you are new to Connectathon testing, this is explained here.
    • The patients you should load depend on what actor(s) you support:

    How?

    • To download patients as FHIR Patient Resources, use the download icon to download an individual Patient Resource, or use that icon at the top of the Actions column to download all selected patients as a FHIR Bundle.

    The User Manual contains documentation about:

    Evaluation:

    There is no log file associated with this 'test', so you do not need to upload results into Gazelle Test Management for this test.

    Preload PDO: Load Patients for Pediatric Demographics option

    Introduction

    These instructions apply to actors that support the Pediatric Demographics option.  This test asks actors to load their database with patients for "twin" use cases.

    Instructions

    Actors that support the Pediatric Demographics Option must run this 'test' to ensure that test patients with proper pediatric demographics are in their systems in order to run subsequent tests on the Pediatric Demographics Option.

    • PIX/PIXv3 Patient Identity Source systems - load these patients into your database.  At the Connectathon, during peer-to-peer tests, you will be asked to send them to a PIX Manager.
    • PDQ/PDQv3/PDQm Patient Demographics Supplier systems - load these patients into your database.  At the Connectathon, during peer-to-peer tests, you will be asked to respond to queries for these patients.

    The Pediatric Demographics test patients are here: http://gazelle.ihe.net/files/Connectathon_TwinUseCases.xls

    We ask you to manually load these patients.  Unfortunately, we cannot use the Gazelle Patient Manager tool to load these patients because the 'special' pediatric fields are not supported by the tool.

    Evaluation

    There is no log file associated with this 'test', so you do not need to upload results into Gazelle Test Management for this test.

    AttachmentSize
    Office spreadsheet icon Connectathon_TwinUseCases.xls43.5 KB

    RFD_Read_This_First

    This “Read This First” test helps to prepare you to test RFD-based profiles at an IHE Connectathon.

    A Pre-Connectathon Task for Form Managers & Form Processors

    Documenting RFD_Form_Identifiers: This is a documentation task. We request that all Form Managers and Form Processors help their Form Filler test partners by documenting the Form Identifiers (formID) that the Form Fillers will use during Connectathon testing. Follow the instructions in preparatory test RFD_formIDs.

    Overview of RFD Connectathon Testing

    RFD and its 'sister' profiles:

    The number of IHE profiles based on RFD has grown over the years.  These 'sister' profiles re-use actors (Form Filler, Receiver, Manager, Processor, Archiver) and transactions (ITI-34, -35, -36) from the base RFD profile.

    Starting with the 2016 Connectathons, to reduce redundant testing, we removed peer-to-peer tests for the base RFD profile itself.  If you successfully complete testing for an actor in a 'sister' profile, you will automatically get a 'Pass' for the same actor in the baseline RFD profile.  For example, if you "Pass" as a Form Filler in CRD, you will get a "Pass" for a Form Filler in RFD for 'free' (no additional tests).

    Similar test across profiles:

    Baseline Triangle and Integrated tests: These RFD tests exercise the basic RFD functions Retrieve Form and Submit Form.

    "Triangle" and "Integrated" refer to the combinations of actors in the tests. A Triangle test uses a Form Filler, Form Manager and Form Receiver (triangle). An Integrated test refers to the Form Processor that is an integrated system (supports both ITI-34 and ITI-35); the Integrated test uses a Form Filler and a Form Processor.

    CDA Document Tests: We have tried to be more thorough in our definitions of tests for CDA documents; we still have some work to do. There are “Create_Document” tests that ask actors that create CDA documents to produce a document and submit that document for scrutiny/validation by a monitor. There are test sequences that need those documents for pre-population or as an end product; you cannot run those tests until you have successfully completed the “Create_Document” test. We have modified the test instructions for the sequence tests that use CDA documents to require the creator to document which “Create_Document” test was used. We do not want to run the sequence tests before we know we have good CDA documents.

    Archiving Forms:  We have a single test -- "RFD-based_Profiles_Archive_Form" to test Form Archivers and Form Fillers that support the 'archive' option.  There are separate tests for archiving source documents.

    Testing of options: IHE does not report Connectathon test results for Options in IHE profiles. We readily admit that the tests that cover options will vary by domain and integration profile. If you read the tests in domains other than QRPH (or even in QRPH), you may find profiles that have no tests for named options. We continue to try to enhance tests for named options and other combinations of parameters found in the QRPH profiles.

    1. This has created an explosion of tests. You can look in the BFDR-E, FP, SDC and VRDR tests and see this. BFDR-E and VRDR are interesting because they specify two different classes of pre-pop documents. SDC has three named options that control the Retrieve Form transaction. Finally, we still have to consider the Form Processor vs. Form Manager + Form Receiver issue.
    2. This is mainly directed at Form Fillers, but is relevant for other actors: If you have registered as a Form Filler with multiple options (say in SDC for HTML Package, URI Form and XML Form), you only have to complete testing for one of the paths to receive Connectathon credit. We are happy to work with you to test the other options; we do not have the systems in place for our public reporting to make this distinction.
    3. An option that you support might be something we need to test in order to validate a different actor from a different organization. If you decide to not test an option, we may still request your help with that option to help test another system for whom that behavior is required.

    ATNA Logging: If you implement the ATNA profile, one of the requirements is that you send audit messages for certain transactions, primarily those that carry PHI. The ITI-35 can be a PHI export transaction, implying that a Form Filler who also supports ATNA should send an audit message. This is an issue for a Form Filler when the form was retrieved as an unencoded form (just retrieved the Form URI); the Form Filler does not actually have control over the form.

    If your Form Filler has requested an encoded form and has an XML form to process, it does have control over the ITI-35 transaction and should be able to send the appropriate audit message. The same issue exists for the ITI-36 Archive Form transaction.

    Form Instance Identifiers: The RFD profile discusses the concept of partial or interim storage of a form. In order to make this work, the Form Filler needs to have a Form Instance ID to retrieve the correct instance of the form that was stored. There are two different mechanisms for a Form Filler to obtain a Form Instance ID:

    1. The Form Instance ID is an optional field in the response to ITI-34.
    2. The Form Instance ID is an optional field in the response to ITI-35.

    That is, the Form Processor or Form Manager is going to control when the Form Instance ID is returned to the Form Filler. We need to get clarification from the ITI Technical Committee if the intended sequence included the Form Filler obtaining that Form Instance ID from the ITI-34 or from the ITI-35 transaction. Until we have that clarification, we will have to work with test participants and their interpretation of this mechanism.

    Use of the Gazelle Proxy

    Because RFD transactions are generally not sent over TLS connections (unless required by a specific profile), RFD tests are good candidates to use the Gazelle proxy. It will record HTTP transactions and allow you and the monitors to review those logs in a centralized manner.  We highly encourage you to use the proxy when performing this test.   It will allow you to easily see the messages exchanged between test partners and to document them for the monitor.

    Evaluation

    There is no result to upload into Gazelle Test Management for this informational test.

    RFD_formIDs

    Introduction

    In this preparatory test, Form Managers and Form Processors document the formID(s) that will be used during a Connectathon in the [ITI-34] transaction.

    The formID may apply to the base RFD profile, or it may apply to content profile(s) based on RFD (eg CRD, VRDR, many others).

    Instructions

    Form Managers and Form Processors:

    For Connectathon testing, we expect that you create your own forms for content profiles with your own identifiers (formID).

    Edit the google spreadsheet linked below. Add one row to the spreadsheet for each formID hosted on your test system.

    Please do this in advance of the Connectathon. The goal is to provide documentation for your Form Filler test partners.

    Form Fillers:

    There is no specific task for you; however, access the spreadsheet linked below to see the formIDs you will use during Connectathon testing.

    RFD Form Identifiers google spreadsheet: https://docs.google.com/spreadsheets/d/11LM9eKzuA_CKJZKsQA7PRJ8uXjYLQ7LTukNJU4LzkDg/edit#gid=1667031332

    Evaluation

    When you complete the task above, create a small text file stating that your entries are complete. Upload that file into Gazelle Test Management as the results for this test.

    SVCM: Load FHIR Resources for Connectathon testing

    Introduction

    At the Connectathon, the SVCM Connectathon tests assume that a pre-defined set of FHIR Resources has been loaded on all of the Terminology Repository actors under test.

    Instructions

    Prior to performing Connectathon tests, SVCM Terminology Repositories must load FHIR Resources:

    • CodeSystems
    • ValueSets
    • ConceptMaps

    These resources are available in Github here:  https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/ITI/SVCM
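    If your Terminology Repository supports FHIR transaction Bundles, one way to load the downloaded resources in bulk is to wrap them in a transaction Bundle and POST it to the server base URL. A minimal sketch (the resource content and ids below are illustrative placeholders, not the actual Connectathon artifacts):

```python
import json

def build_transaction_bundle(resources):
    """Wrap FHIR resources (as dicts) in a transaction Bundle.

    Each entry uses PUT on ResourceType/id so that re-running the
    load is idempotent (update-or-create).
    """
    entries = []
    for res in resources:
        ref = f"{res['resourceType']}/{res['id']}"
        entries.append({
            "resource": res,
            "request": {"method": "PUT", "url": ref},
        })
    return {
        "resourceType": "Bundle",
        "type": "transaction",
        "entry": entries,
    }

# Minimal illustrative resources -- replace with the CodeSystems,
# ValueSets and ConceptMaps downloaded from the Github link above.
code_system = {"resourceType": "CodeSystem", "id": "example-cs",
               "status": "active", "content": "complete"}
value_set = {"resourceType": "ValueSet", "id": "example-vs",
             "status": "active"}

bundle = build_transaction_bundle([code_system, value_set])
print(json.dumps(bundle["entry"][0]["request"]))
```

    The resulting Bundle would be POSTed to your repository's FHIR base endpoint; if your server does not accept transactions, the same entries can be sent as individual PUT requests.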

    For reference only: This google sheet contains a list of the above resources.

     

    Your Repository may also contain additional FHIR Resources during Connectathon.

    Evaluation

    There are no result files to upload into Gazelle Test Management.  Preloading these Resources in advance is intended to save you precious time during Connectathon week.

     

    SVS: Load Value Sets for Connectathon testing

    Introduction

    At the Connectathon, the SVS tests assume that a pre-defined set of value sets have been loaded on all of the Value Set Repository actors under test.

    Instructions

    (1) Prior to the Connectathon, SVS Repositories must load these value sets: http://gazelle.ihe.net/content/gazelle-value-sets

    (2) Ideally, your Repository will also contain additional value sets during Connectathon.

    Evaluation

    There are no result files to upload into Gazelle Test Management.  Preloading these prior to the Connectathon is intended to save you precious time during Connectathon week.

     

    XC_Read_This_First

    This informational 'test' provides an overview of peer-to-peer Connectathon testing for the cross-community (XC*) family of IHE Profiles. Please read what follows and complete the preparation described before the Connectathon.

    Overview

    Approach for Connectathon testing of cross-community profiles (XCA, XCPD, others):

    Cross-community test scenarios are more complex than most in the Connectathon because of the effort to set up the initiating and responding communities' configuration.

    Try to be efficient in testing gateways that support cross-community profiles.  If your Gateway supports multiple XC* profiles, you will want to run those tests while the XCA configurations are in place. 

    Explanatory resource material for these tests:

    Please refer to these resources used to manage cross-community profile testing.  These are shared during the Preparatory phase of the Connectathon:

     

     

    Testing Scope and 'Pass' Criteria

     

    How do I know if I have enough tests to "pass" at Connectathon?

    This is what we will look for when we grade the XCA profile Connectathon tests:

    -- Any XC* actor doing Supportive testing: 

    1. You will Pass if you have **one verified peer-to-peer test**  for a given profile/actor pair.  The Connectathon Monitors will focus most of their attention on assisting Gateways doing Thorough testing.
    2. Supportive systems are not required to test async (because you tested it at a previous Connectathon).  Because you are supportive, a test partner may ask you to help them test async.

    -- XCA Initiating Gateways doing Thorough testing:

    1. For Cross-Gateway Query [ITI-38], you must demonstrate the ability to translate the Patient ID known in your community into the patient ID with the assigning authority known in two different responding communities (eg, you take the 'red' patient ID as known in your initiating community and query a Responding Gateway with a 'blue' patient ID and another Responding Gateway with a 'green' patient ID).
    2. For Cross-Gateway Retrieve [ITI-39], you must demonstrate the ability to retrieve from two different communities and consolidate those retrieved documents into your initiating community (e.g., if you are a 'red' Initiating Gateway, you must retrieve from both a 'blue' and a 'green' Responding Gateway).
    3. You do not have to do #1 and #2 three different times.  If you have tested with Responding GWs from two different communities, you will have enough to pass.  You may be asked to perform more tests in order to help your partners complete their work.
    4. You must do all your tests using TLS.
    5. Async testing:
    • **If you have chosen to support the "Asynchronous Web Services Exchange (WS-Addressing-based)" option**, you must demonstrate once, your ability to do the query & retrieve using async communication.  If you have signed up for this option, you will see a related test to perform, but you can pass as an Initiating Gateway without supporting this option.
    • **If you have chosen to support the "AS4 Asynchronous Web Services Exchange" option**, you must demonstrate  your ability to do the query & retrieve using AS4 async communication.  If you have signed up for this option, you will see a related test to perform, but you can pass as an Initiating Gateway without supporting this option.  Because AS4 is a newer requirement, we ask you to perform the test with two partners (if there are two).
  • **If you have chosen to support the "XDS Affinity Domain" or "On Demand Docs" option**, you must pass one of these tests to get credit for the option.  You can pass as an Initiating Gateway without supporting these options.
  • -- XCA Responding Gateways doing Thorough testing:

    1. You must complete the XCA_Query_and_Retrieve test with at least two different Initiating Gateways.  You may be asked to perform more tests in order to help your partners complete their work.  (While your Init GW partner is likely to be in a different patient ID domain -- eg the Init GW is Red, your Resp GW is Blue -- note that since the Init GW does the patient ID translation, all queries look the same from the Resp GW's point of view.)
    2. You must demonstrate your ability to support asynchronous communication.  You cannot pass as an XCA Responding GW without demonstrating async support.  ***NEW as of 2019***, Responding Gateways shall support one of two types of async messaging.  You may support both:
      • Asynchronous Web Services Exchange (WS-Addressing based).  This is the original async protocol included in the XCA profile.
      • AS4 Asynchronous Web Services Exchange.  This was added to the XCA Profile in fall of 2018.
    3. You must do all of your tests using TLS.
    4. **If you have chosen to support the "On Demand Documents" option**, you must do one of these tests.  You can pass as a Responding Gateway without supporting this option.

    Configuration / Test Data

    homeCommunityID:

    Each XC* Gateway is assigned a homeCommunityId value for use during Connectathon week.  Gateways can find the homeCommunityId for their system and for their partner gateways in Gazelle Test Management under menu Preparation-->OID Registry.
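    XCA messages carry the homeCommunityId in URN form. A small sketch that normalizes an assigned OID into the urn:oid: format used in [ITI-38]/[ITI-39] messages (the example OID below is made up; use the value assigned to your gateway in the OID Registry):

```python
import re

# Dotted-decimal OID, e.g. 1.3.6.1.4.1.21367...
OID_RE = re.compile(r"^\d+(\.\d+)+$")

def to_home_community_urn(oid: str) -> str:
    """Format a homeCommunityId OID as the urn:oid: form carried in
    XCA query/retrieve messages."""
    oid = oid.strip()
    if oid.startswith("urn:oid:"):
        oid = oid[len("urn:oid:"):]
    if not OID_RE.match(oid):
        raise ValueError(f"not a valid OID: {oid!r}")
    return f"urn:oid:{oid}"

# Hypothetical OID for illustration only:
print(to_home_community_urn("1.3.6.1.4.1.21367.2017.2.6.99"))
# -> urn:oid:1.3.6.1.4.1.21367.2017.2.6.99
```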

    Configuration Setup:

    XCA Initiating Gateways...

    • must be configured for all Responding Gateways identified as test partners, eg, if you are a "Red" Initiating Gateway, you should be prepared to test with "Blue" and "Green" Responding Gateways
    • must be prepared to map Patient IDs in its community to patient IDs known by the responding community(ies). The Initiating Gateway must query each Responding Gateway with the correct patient identifier corresponding to the Responding GW's community.
    • for Initiating GWs that support the XDS Affinity Domain option, you must send a "Find Documents" query and a "Retrieve Document Set" request to the Initiating Gateway to trigger the Cross-Gateway query and retrieve.   You can do this several ways:
      • use the NIST XDS Toolkit to act as an XDS.b Document Consumer simulator to query & retrieve
      • ask another vendor's XDS.b Doc Consumer to help with the test
      • or the Initiating Gateway may  use its own application to trigger the Cross-Gateway Query and Retrieve

    XCA Responding Gateways...

    • The Responding Gateway will be configured to query the XDS.b Document Registry and retrieve from the Document Repository in its community.  For XCA tests, it is OK if it is your own company's Registry or Repository in your local community.
      --or--
    • The Responding Gateway will have access to the test patient's documents through non-IHE mechanisms.

    XCA-I Responding Imaging Gateways... 

    • must be configured to access one (or more) XDS.b Registry/Repository pairs that hold documents (ie DICOM manifests) for the XCA-I test patients. For XCA-I tests, it is OK if it is your own company's Registry or Repository in your 'local' community.  
    • must have access to an Imaging Document Source with the XDS-I DICOM Studies in order for the XCA-I query/retrieves to work.   This Imaging Document Source must be on a test system from another vendor (eg company AAA's Responding Imaging Gateway cannot retrieve images from its own Imaging Document Source)

    XCA-I Initiating Imaging Gateways...

    • The Initiating Imaging Gateway must be in a 'local community' with a Registry/Repository and an Imaging Doc Source that contains the XDS-I test patients.   It is OK for the Init Imaging GW, Registry, Repository and Imaging Doc Source to be from the same company.
    • For XCA-I testing (unlike for XCA testing), we do not use the NIST XDS toolkit to trigger query and retrieves in the initiating community.  The Imaging Document Consumer must initiate the query and retrieve. 
    • Initiating Imaging Gateways must be configured to query Responding Imaging Gateway(s) identified as a test partner.  Initiating Imaging Gateways must query for the correct patient ID (ie you must be aware of whether you are in a 'red', 'green' or 'blue' patient ID domain).

    XCPD Responding Gateways have test data identified in the "XCPD_Responding_GW_Setup" test.

    You must demonstrate your ability to support asynchronous communication.  You cannot pass as an XCPD Responding GW without demonstrating async support.  ***NEW as of 2019***, Responding Gateways shall support one of two types of async messaging.  You may support both.

    • Asynchronous Web Services Exchange (WS-Addressing based).  This is the original async protocol included in the XCA profile.
    • AS4 Asynchronous Web Services Exchange.  This was added to the profile in fall of 2018.

     

     

     

    XCA / XCPD test patient:

    We have a specific test patient we use for XCA and XCPD tests.

    • Patient XCPD^Pat with patient IDs in 3 different communities: IHERED-1039 or IHEGREEN-1039 or IHEBLUE-1039  
    • The XC* profiles do not define how the Gateways learn of the Patient IDs. The Gateways may have been pre-loaded with connectathon demographics, may receive a Patient Identity Feed, or may learn of Patient IDs by some other means. Initiating Gateways will need to map these patient IDs to the appropriate value in the Responding Gateways they are connected to.

    This patient is part of the 'connectathon demographics' which should be pre-loaded on XDS.b Registries, PIX Managers and Patient Identity Sources prior to the Connectathon.   (Note that this patient is also available in the 'pre-load' demographics provided in the Gazelle PatientManager tool. See instructions in preparatory test Preload Connectathon Test Patients.)
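    An Initiating Gateway's patient ID translation for this test patient amounts to a lookup across the three communities. A sketch using the IDs listed above (a real gateway would resolve these through its own cross-reference service rather than a hard-coded table):

```python
# Patient IDs for test patient XCPD^Pat in each Connectathon community,
# taken from the test description above.
XCPD_PAT_IDS = {
    "red": "IHERED-1039",
    "green": "IHEGREEN-1039",
    "blue": "IHEBLUE-1039",
}

def map_patient_id(local_community: str, responding_community: str,
                   local_id: str) -> str:
    """Translate the patient ID known in the local community into the
    ID expected by the responding community's gateway."""
    if XCPD_PAT_IDS.get(local_community) != local_id:
        raise KeyError(f"unknown local ID {local_id!r} "
                       f"for community {local_community!r}")
    return XCPD_PAT_IDS[responding_community]

# A 'red' Initiating Gateway querying a 'blue' Responding Gateway:
print(map_patient_id("red", "blue", "IHERED-1039"))  # IHEBLUE-1039
```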

     

    XCA test data -- documents  to query/retrieve:

    The XCA Responding Gateway must host documents for the test patient in its community.  If you have an XDS Registry/Repository behind your Responding Gateway, host documents there.  If your Responding Gateway is not using XDS.b, it must find another way to host documents for the test patient.

    • If you have documents with a patient ID in metadata that matches the test patient, you may use them.  
    • Alternatively, in the NIST XDS Toolkit for the Connectathon we have test documents for test patient *1039*. You can use the NIST XDS Toolkit to Provide-and-Register the document to the Repository/Registry in your responding community, using correct patient ID IHERED-1039, IHEGREEN-1039 or IHEBLUE-1039. Find the templates for these documents in the XDS Toolkit: Select  Connectathon Tools.  Under "Load Test Data", you will find an entry to "Provide and Register" test data. 

     

    XCA-I patient and test data:

    For XCA-I, each Initiating & Responding Gateway represents an XDS affinity domain with an Imaging Document Source, an XDS Registry, and a Repository.  Each XCA-I community must host a DICOM Study and associated Manifest.  For XCA-I testing, we use one of the three DICOM studies that the Imaging Document Source used for XDS-I.b tests.  

    Summary of the DICOM studies

    Patient ID   Procedure Code  Modality   Series Count    Image Count
    ---------------------------------------------------------------------
    C3L-00277           36643-5     DX                 1              1
    C3N-00953           42274-1     CT                 3             11    <-----we use this one for XCA-I
    TCGA-G4-6304        42274-1     CT                 3             13
    
    Procedure Codes (0008,1032)
    ---------------------------
    36643-5 / (LOINC / 2.16.840.1.113883.6.1) / XR Chest 2V
    42274-1 / (LOINC / 2.16.840.1.113883.6.1) / CT Abd+Pelvis WO+W contr IV
    

     

    Patient IDs to use with the XDS-I Manifest for the XCA-I tests.

    The Patient ID in the DICOM header for the images is considered the 'local' or 'departmental' Patient ID for this patient (ie, sourcePatientId in the DocumentEntry metadata). When submitting a Manifest for this study to an XDS Repository/Registry, the Imaging Doc Source must use the affinity domain ID for the patient in the XDS metadata for the submitted manifest. This patient has Patient IDs included in the Connectathon Patient Demographics pre-loaded onto each Registry at Connectathon as follows:

    For the CT study with "local" Patient ID C3N-00953, the affinity domain Patient IDs are listed here:

    • IHERED-2737^^^IHERED&1.3.6.1.4.1.21367.13.20.1000&ISO
    • IHEGREEN-2737^^^IHEGREEN&1.3.6.1.4.1.21367.13.20.2000&ISO
    • IHEBLUE-2737^^^IHEBLUE&1.3.6.1.4.1.21367.13.20.3000&ISO

    The Patient ID in the manifest will depend on the patient ID affinity domain (red, green, blue) of your local Registry & XCA-I Initiating or Responding Imaging Gateway.
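    Building the affinity-domain patient identifier from the values listed above is a matter of assembling the HL7v2 CX components. A sketch using the assigning-authority OIDs given in the list:

```python
# Assigning-authority OIDs for the three Connectathon affinity
# domains, as listed in the bullet points above.
AFFINITY_DOMAINS = {
    "IHERED": "1.3.6.1.4.1.21367.13.20.1000",
    "IHEGREEN": "1.3.6.1.4.1.21367.13.20.2000",
    "IHEBLUE": "1.3.6.1.4.1.21367.13.20.3000",
}

def cx_patient_id(id_number: str, namespace: str) -> str:
    """Build the CX-formatted patient identifier used in the XDS
    metadata: id^^^namespace&OID&ISO."""
    oid = AFFINITY_DOMAINS[namespace]
    return f"{id_number}^^^{namespace}&{oid}&ISO"

print(cx_patient_id("IHERED-2737", "IHERED"))
# -> IHERED-2737^^^IHERED&1.3.6.1.4.1.21367.13.20.1000&ISO
```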

     

    Evaluation

    There is no evaluation for this informational test.   If the systems testing XC* profiles do not do the set-up described above, then cross-community tests at Connectathon will not work.

    XDM_Create_Media

    Introduction

    This test applies to Portable Media Creators in the XDM Profile that create either CD-R or USB media, or a ZIP file for the ZIP over Email option.

    This test case is used to create XDM-formatted media (CD-R and/or USB). The media you create will be used by Portable Media Importers during the Connectathon in the XDM_Import_* peer-to-peer tests.

    Instructions

    As a Portable Media Creator, when you create your media, we encourage you to put documents from the various IHE content profiles your system supports (eg APR, BPPC, EDES, EDR, IC, XDS-MS, XDS-SD, XD-LAB, XPHR, etc).  A larger variety of document types will help the Importer systems find compatible content.

    You will also be required to demonstrate that you send an 'export' audit message when you create XDM media.

    To create your XDM media for exchange during Connectathon:

    STEP 1: Create 2 copies of physical media:  USB and/or CD-R, if you support those options.

    STEP 2:  Label your physical media.  The label should contain your system name in Gazelle Test Management, your table location, and the name of a technical contact at your table.  Also include the document types you have included on media (eg XPHR, XDS-SD, etc...)  (We recognize the space limitations on USB; create a piece of paper that can accompany your media.) Bring your media with you to the Connectathon.

    STEP 3: Create a zip file of the file structure on your XDM media. Upload that zip file into the samples area of Gazelle Test Management: menu Connectathon-->Connectathon-->List of Samples. On the 'Samples to share' tab, upload your zip file under the 'XDM' entry.
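    For STEP 3, the ZIP should mirror the directory layout of your XDM media. A hedged sketch of assembling such a ZIP in memory (the layout shown -- INDEX.HTM and README.TXT at the root plus an IHE_XDM/SUBSET01 directory holding METADATA.XML and the documents -- follows the common XDM structure; confirm the details against ITI TF-2: [ITI-32]):

```python
import io
import zipfile

def build_xdm_zip(metadata_xml: bytes, documents: dict) -> bytes:
    """Assemble a minimal XDM media layout as a ZIP in memory.

    documents maps file names (as referenced from METADATA.XML)
    to their byte content.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("INDEX.HTM", "<html><body>XDM media</body></html>")
        zf.writestr("README.TXT", "System name / table / contact info")
        zf.writestr("IHE_XDM/SUBSET01/METADATA.XML", metadata_xml)
        for name, content in documents.items():
            zf.writestr(f"IHE_XDM/SUBSET01/{name}", content)
    return buf.getvalue()

# Placeholder content for illustration:
data = build_xdm_zip(b"<SubmitObjectsRequest/>",
                     {"DOC0001.XML": b"<ClinicalDocument/>"})
with zipfile.ZipFile(io.BytesIO(data)) as zf:
    print(zf.namelist())
```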

    Evaluation

    During Connectathon, a Monitor will do a two-part evaluation of your media.   You should do these for yourself in advance of the Connectathon so that you are confident your test will pass.

    EVALUATION PART 1 - METADATA  VALIDATION:

    Earlier versions of this test involved manual scrutiny of the METADATA.XML file.  Now, we use the Gazelle EVSClient:

    1. Access https://gazelle.ihe.net/EVSClient/home.seam
    2. Select menu IHE--> XD* Metadata -->Validate
    3. Click the "Add" button then upload one METADATA.XML from the media under test.
    4. From the "Model Based Validation" dropdown list, select "IHE XDM ITI-32 Distribute Document Set on Media"
    5. The Monitor will be looking for a validation result with no errors.

     

    EVALUATION PART 2 - Validate XDM structure using the Edge Test Tool

    1. Validate the zip file that you created above using the XDM validator in https://ett.healthit.gov/ett/#/validators/xdm
    2. Create a screen capture of the successful results from the validator and upload it into the Samples area of Gazelle Test Management, along with your ZIP file.

    XD*_Metadata_Do_This_First

    This page provides instructions for preloading coded values used during Connectathon testing of IHE document sharing profiles -- XD*, XC*, and MHD.

    Introduction

    IHE profiles for Document Sharing (XD*, XC*, MHD) rely on coded values provided in the metadata when documents are submitted and searched. These Document Sharing profiles define the structure of the document metadata as well as coded values for some metadata attributes; however, allowable values for many of the coded values are not constrained by IHE, but are defined by the Affinity Domain that will deploy and support the document sharing systems.

    • Source-type actors include coded values in metadata in document submission transactions
    • Recipient-type actors perform metadata validation based on coded values received in a document submission
    • Consumer-type actors use coded values for metadata when performing queries for documents

    For testing of Document Sharing profiles at IHE North America and Europe Connectathons, the set of allowable code values for document sharing metadata are defined by IHE Technical Project Managers and deployed in the NIST XDS Toolkit. 

    This page describes where to find the set of allowable codes for document sharing testing at IHE Connectathons.  This enables you to configure your test system prior to performing these types of tests. 

    (NOTE:  Some Connectathons or Projectathons may use different codes for metadata.  If that is the case, the Technical Project Manager will provide other guidance.)

    Which metadata attributes have coded values?

    These documentEntry metadata attributes have defined codes:

    • associationDocumentation
    • classCode
    • confidentialityCode
    • contentTypeCode
    • eventCodeList
    • folderCodeList
    • formatCode
    • healthcareFacilityTypeCode
    • mimeType
    • practiceSettingCode
    • typeCode

     

    Instructions

    Find the coded values then load these coded values onto your test system.  Loading these codes is a prerequisite to performing any preparatory tests with the NIST XDS tools or NIST FHIR Tools.  It is also a prerequisite to performing peer-to-peer Connectathon tests for the XD*, XC* and MHD profiles.

     

    For IHE NA and EU Connectathons, allowable codes for Document Sharing metadata are contained in the codes.xml file distributed here: https://tools.iheusa.org/xdstools/sim/codes/default 
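    Once downloaded, codes.xml can be parsed to populate your system's metadata code tables. This sketch assumes the CodeType/Code element and attribute names shown in the embedded sample, modeled on the NIST file; verify them against the actual codes.xml before relying on the parser:

```python
import xml.etree.ElementTree as ET

# Illustrative snippet modeled on the codes.xml layout: one CodeType
# element per metadata attribute, each holding Code entries.  The
# element/attribute names here are an assumption -- check the real file.
SAMPLE = """
<Codes>
  <CodeType name="classCode">
    <Code code="REPORTS" display="Reports"
          codingScheme="1.3.6.1.4.1.21367.100.1"/>
  </CodeType>
</Codes>
"""

def load_codes(xml_text):
    """Return {metadata attribute: [(code, display, codingScheme), ...]}."""
    root = ET.fromstring(xml_text)
    out = {}
    for code_type in root.findall("CodeType"):
        out[code_type.get("name")] = [
            (c.get("code"), c.get("display"), c.get("codingScheme"))
            for c in code_type.findall("Code")
        ]
    return out

print(load_codes(SAMPLE)["classCode"][0])
```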

    Note 1:  These codes are deployed on the public version of NIST XDS Toolkit hosted here:  https://tools.iheusa.org/xdstools/

     

    Note 2 : These codes are also available in SVS format, but values of codes in SVS format may not exactly match those in codes.xml above.  See the Gazelle SVS Simulator tool that hosts many value sets, including codes for metadata attributes.

     

    Evaluation

    There is no result file to upload to Gazelle Test Management for this test. If you do not do the configuration described above, then tests with tools or your test partners will not work.

     

    How to request an update to codes.xml

    If you find an error in codes.xml, or to request that a code be added, please submit an issue in the Github repository for XDS Toolkit: https://github.com/usnistgov/iheos-toolkit2/issues. You may also directly edit the codes.xml file here with your suggested change and submit a Pull request.

    Attachment: codes.xml file for EU CAT2023 (84.66 KB)

    XDS.b_Registry_Set_Up

    Introduction

    XDS.b Document Registries must complete the preparation described here before performing XDS.b Connectathon tests.

    Instructions

    (1) Affinity Domains for Registries
    During Connectathon, each XDS.b Document Registry is assigned to an affinity domain that determines the Patient IDs your Registry will accept.  These affinity domains are referred to as "Red", "Blue" or "Green". (If this is your first Connectathon, these affinity domains are explained here.)  The Connectathon Project Manager announces the Red/Blue/Green assignments in advance of the Connectathon.  It is documented in this google spreadsheet.  

    (2) Connectathon patients to pre-load on your Registry
    To support XDS tests, Registries load patient demographics provided in advance by the Connectathon Technical Project Manager.  If you have performed pre-Connectathon test Preload_Connectathon_Test_Patients, you already have these patients in your database; otherwise follow the instructions in that test now.   You will only load patients for the Affinity Domain you were assigned above.

    (3) Metadata Codes
    Document Registries must also be configured with codes for Connectathon Affinity Domain Metadata.  These are documented in the codes.xml file found in the latest release of the NIST XDS Toolkit here:  https://github.com/usnistgov/iheos-toolkit2/releases/.  First-time Connectathon participants can read background information about metadata codes here.

    NOTE:  Some Connectathons may use different codes for metadata.  If that is the case, the Technical Project Manager will provide other guidance.

    Evaluation

    There is no result file to upload to Gazelle Test Management for this informational test. If the Document Registry does not do the set-up described above, then peer-to-peer XDS.b tests at Connectathon will not work.

    XUA_Read_This_First

    Introduction

    Prior to arriving at the Connectathon, it is important for participants testing XUA (or the IUA profile with the SAML Token option) to be familiar with:

    • the Connectathon testing scenario
    • the tool used to provide assertions for XUA tests -- the Gazelle-STS Security Token Service

    The description that follows:

    • explains the structure of the Connectathon tests and related tools
    • describes preparation to do before you start testing XUA at the Connectathon.

    Tool and SAML Token Information

    Locate and use the Gazelle-STS Security Token Service:

    To familiarize yourself with the Gazelle-STS tool used for Connectathons:

    • Read the STS information page: https://gazelle.ihe.net/gazelle-documentation/Gazelle-STS/user.html.  That page describes how to use the tool to request, renew, cancel, and validate a security token.  
    • Systems testing X-Service User and X-Service Provider should use the STS prior to the Connectathon so you are familiar with its use.

    Assertions used for Connectathon testing:  

    The [ITI-40] transaction (ITI TF-2: 3.40.4.1.2) specifies the SAML assertion, including that all assertions contain a Subject (principal).  The 'Subject' in the assertion could be a user or it could be an 'application'. 

    For Connectathon, we have pre-defined Subjects (ie HTTP authentication users) that we use for XUA testing.  Several different Subjects/users are defined, and they are associated with different assertions used for the XUA "success" test (XUA_Transaction_with_Assertion) and the "fail" test (XUA_Restrict_Access).

    Please refer to the Gazelle STS user manual for the list of assertions available:  https://gazelle.ihe.net/gazelle-documentation/Gazelle-STS/user.html#requesting-a-security-token.

    The Gazelle-STS tool is able to generate assertions with the success and failure conditions defined in the tests.  (We expect that X-Service Users that are generating their own assertions will have less flexibility.)

    Note  - Many options are possible for the AuthnStatement parameter in the Assertion. For the Connectathon, the assertions provided to the X-Service Users by the X-Assertion Providers will encode a timestamp representing when the authentication occurred and that the password class was used, eg:

    • urn:oasis:names:tc:SAML:2.0:ac:classes:Password

    Configuration Details:

    For X-Service Users who will request assertions from the Gazelle-STS, three configuration items have been identified. When requesting a security token, the X-Service User needs to provide the X-Assertion Provider with:
    1. An HTTP authentication user
    2. A valid password
    3. The 'AudienceRestriction' of the X-Service Provider

    For item (3) at the Connectathon, to ease configuration, we will apply the same URL to all X-Service Providers, eg all Registries and Repositories. (Note that this URL is **not** the URL the Document Consumer will use to query a Registry or retrieve documents from a Repository).  This same, general URL used as the value of 'AudienceRestriction' for all service providers will simplify the STS configuration and will ensure that users can access any Registry/Repository with the SAML token obtained from the STS.

    The required URL is :

    • http://ihe.connectathon.XUA/X-ServiceProvider-IHE-Connectathon
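    On the X-Service Provider side, one of the validation steps is checking that the assertion's AudienceRestriction carries this URL. A sketch using the standard SAML 2.0 assertion namespace (the sample assertion below is a stripped-down illustration; the signature and other required elements are omitted):

```python
import xml.etree.ElementTree as ET

SAML_NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}
EXPECTED_AUDIENCE = (
    "http://ihe.connectathon.XUA/X-ServiceProvider-IHE-Connectathon")

def audience_ok(assertion_xml: str) -> bool:
    """Check (as one part of assertion validation) that the
    AudienceRestriction matches the Connectathon value."""
    root = ET.fromstring(assertion_xml)
    audiences = [a.text for a in root.findall(
        ".//saml:Conditions/saml:AudienceRestriction/saml:Audience",
        SAML_NS)]
    return EXPECTED_AUDIENCE in audiences

# Minimal assertion fragment for illustration only:
sample = """<saml:Assertion
    xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:Conditions>
    <saml:AudienceRestriction>
      <saml:Audience>http://ihe.connectathon.XUA/X-ServiceProvider-IHE-Connectathon</saml:Audience>
    </saml:AudienceRestriction>
  </saml:Conditions>
</saml:Assertion>"""
print(audience_ok(sample))  # True
```

    A real X-Service Provider would perform this check alongside signature verification, validity-period checks, and the other [ITI-40] rules.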

    XUA Connectathon Test Approach

    Actors in the XUA Profile can be grouped with actors in any IHE profile using webservices transactions (eg. XDS.b, XDR, PDQv3, PIXv3, RFD, XCA, many others...). Thus, you will be testing the exchange of XUA assertions in conjunction with transactions from another profile. This means you not only need to find your XUA test partners, you must also find a partner which supports a webservices transaction that you support.

    Here is the sequence of activities you must do to accomplish your XUA testing during the Connectathon:

    1. Before any XUA peer-to-peer tests can be run, each X-Service Provider actor must run one instance of test XUA_X-Service_Provider_Setup. Do that first.
    2. The XUA profile allows X-Service User actors to get their assertions from an external assertion provider or generate their own. You need to talk to your test partners about how your system works in this regard.  
    3. You will need to find XUA test partners who support a webservices-based IHE profile that you also support.
    4. There are 3 'flavors' of XUA Connectathon tests.  You can do these tests in conjunction with any webservices transaction:
      • The 'success' case (XUA_Transaction_with_Assertion test). 
      • The 'fail' case (XUA_Restrict_Access test).
      • A test where the X-Service Provider does not trust the Assertion Provider (XUA_Policy_Test).
    5. When you perform XUA tests, you must use TLS and enable ATNA logging.

    These notes apply to testing of the XUA profile at the Connectathon:

    1. The method that the X-Service User uses to get the SAML Identity Assertion is not constrained by the XUA profile.  Test cases for this profile allow use of these two methods.  The first is most common.
      • SAML Assertion Provider -- The X-Service User will authenticate against an identity provider (XUA X-Assertion Provider).
        For Connectathon testing, we provide the Gazelle-STS tool as an X-Assertion Provider from which X-Service Users can request assertions, and against which X-Service Providers can validate assertions received. The certificate to trust in order to access that service is available here.
        The Gazelle-STS will verify signature, validity and trusted chain relations for Assertions, but will not check specific content such as AudienceRestriction value, IHE rules (or other standards business rules). This is the responsibility of the X-Service Provider, so it should not fully delegate the validation of the Assertion to the Gazelle-STS. See associated documentation here.
      • Self-assertion -- The X-Service User does not rely on an external assertion provider; rather, it generates its own valid SAML assertions
        (In this scenario, if you want Gazelle-STS to be able to verify signature and trust-chain, the certificate/or its CA signing the assertion must be available in Gazelle Security Suite (GSS) PKI. You can either use a certificate delivered by GSS, or you can contact a Gazelle Administrator to update your public certificate into GSS.)
    2. XUA X-Service Providers need to be able to 'trust' assertions provided by a mix of X-Service Users who either:
      • do self-assertion
      • or, send an assertion which originated at an external STS (X-Assertion Provider). This is the more common case.
    3. X-Service Providers should be configurable to 'not trust' assertions from a selected assertion provider.
    4. (This test has been deferred: X-Service Providers need to be able to be configured to 'not trust' assertions containing certain parameters. For example, for the Connectathon, we will define a 'policy' that assertions containing an authentication method of 'InternetProtocol' in its AuthnStatement should fail validation.)
    5. The XUA Profile constrains only a subset of the possible parameters that can be included in a SAML assertion. The proper use of these specific parameters is tested.  It is valid to include other parameters in the assertion, but these are not tested. Other parameters will not affect the validity of the assertion. The XUA X-Service Provider must accept all valid SAML assertions and process attributes profiled in ITI-40.
    6. In these tests, XDS.b Document Registry and Document Repository actors which are also XUA X-Service Providers do not need to support a scenario where some Document Consumers in the Affinity Domain support XUA, and some do not. This will not be a required test at Connectathon; however, vendors are welcome to test this scenario if they wish.
    7. All transactions, including those with the X-Assertion Provider, shall be done using TLS.
    8. Auditing will be tested:
      • The XUA profile requires that the X-Service Provider generate an ATNA audit event for 'Authentication Failure' whenever its validation of the assertion fails. E.g., the Document Consumer must send an audit message when it initiates a Registry Stored Query. These audit events are tested in these tests.
      • The XUA profile requires that "Any ATNA Audit Messages recorded by Actor grouped with the X-Service Provider Actor, shall have the user identity recorded according to the XUA specific ATNA encoding rules" (See ITI TF-2:3.40.4.2 ATNA Audit encoding). For example: when the XDS.b Document Consumer actor records the Stored Query event, this event record will include the identity provided in the XUA Identity Assertion. This assures that the X-Service User and X-Service Provider ATNA Audit messages can be correlated at the ATNA Audit Repository. This will be tested in the XUA tests.
    9. Authorization is out-of-scope for XUA (that is tested in the IUA and SeR Profiles), i.e., there are no XUA tests that allow or restrict a user's access to a document based on who a user (principal) is, nor are there tests that allow a user to see a subset of a patient's document based on the contents of the assertion. Either the validation succeeds at the X-Service Provider, and all of a patient's record is returned to the X-Service User, or validation fails, and the X-Service User receives nothing.
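    Item 4 above describes a deferred policy check on the authentication method declared in an assertion's AuthnStatement. The following is a minimal sketch only (it is not part of Gazelle-STS, and the assertion fragment is invented for illustration) of what such a check might look like:

    ```python
    # Minimal sketch (not the Gazelle-STS implementation) of the kind of
    # policy check described in item 4: reject assertions whose AuthnStatement
    # declares an authentication method of 'InternetProtocol'.
    import xml.etree.ElementTree as ET

    SAML_NS = "urn:oasis:names:tc:SAML:2.0:assertion"
    # The SAML 2.0 authentication context class for IP-based authentication.
    IP_AUTHN_CONTEXT = "urn:oasis:names:tc:SAML:2.0:ac:classes:InternetProtocol"

    def authn_method_allowed(assertion_xml: str) -> bool:
        """Return False if any AuthnStatement uses the InternetProtocol method."""
        root = ET.fromstring(assertion_xml)
        for ref in root.iter(f"{{{SAML_NS}}}AuthnContextClassRef"):
            if ref.text and ref.text.strip() == IP_AUTHN_CONTEXT:
                return False
        return True

    # A hypothetical assertion fragment, for illustration only.
    assertion = f"""
    <saml:Assertion xmlns:saml="{SAML_NS}">
      <saml:AuthnStatement>
        <saml:AuthnContext>
          <saml:AuthnContextClassRef>{IP_AUTHN_CONTEXT}</saml:AuthnContextClassRef>
        </saml:AuthnContext>
      </saml:AuthnStatement>
    </saml:Assertion>
    """
    print(authn_method_allowed(assertion))  # → False
    ```

    A real X-Service Provider would perform this check only after signature and trust-chain validation, alongside its other assertion content checks.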

    Evaluation

    There is no result file to upload to Gazelle Test Management for this informational test. If the systems testing XUA do not do the set-up described above, then peer-to-peer XUA tests at Connectathon will not work.

     

    Connectathon Doc Sharing testing -- 3 XDS Affinity Domains


    This page applies to vendors testing IHE's document sharing profiles:  XD*, XC*, MHD
    and supporting Patient ID management profiles:  PIX*, PDQ*

     

    If you're testing these profiles, please review this page prior to the IHE Connectathon.

    Note that there are updates below related to Patient IDs in Patient Resources in some IHE profiles based on HL7® FHIR®.

     

    At IHE Connectathons in both Europe and North America, we define three Affinity Domains. These represent three different document sharing (XDS) communities.

    Patient IDs in Connectathon Affinity Domains

    Each of these domains is associated with its own Patient ID assigning authority. For ease of reference we refer to these as:

    • Red Patient ID domain.  The Patient ID assigning authority value is:
      • 1.3.6.1.4.1.21367.13.20.1000 for Patient IDs used in the PIX, PIXv3, PDQ, PDQv3 and XDS.b profiles
      • urn:oid:1.3.6.1.4.1.21367.13.20.1000 for Patient IDs used in FHIR-based profiles PIXm, PDQm, MHD
    • Green Patient ID domain.  The Patient ID assigning authority value is:
      • 1.3.6.1.4.1.21367.13.20.2000 for Patient IDs used in the PIX, PIXv3, PDQ, PDQv3 and XDS.b profiles
      • urn:oid:1.3.6.1.4.1.21367.13.20.2000 for Patient IDs used in FHIR-based profiles PIXm, PDQm, MHD
    • Blue Patient ID domain.  The Patient ID assigning authority value is:
      • 1.3.6.1.4.1.21367.13.20.3000 for Patient IDs used in the PIX, PIXv3, PDQ, PDQv3 and XDS.b profiles
      • urn:oid:1.3.6.1.4.1.21367.13.20.3000 for Patient IDs used in FHIR-based profiles PIXm, PDQm, MHD
    • We also have a 'local' Patient ID assigning authority (1.3.6.1.4.1.21367.3000.1.6 and urn:oid:1.3.6.1.4.1.21367.3000.1.6).

    We have a tool -- the Gazelle PatientManager --  for creating a patient with a Patient ID with a Red, Green or Blue assigning authority and sending it via HL7v2 or v3 to your test system.  It also can create equivalent FHIR Patient Resources.  Instructions on how to use this tool to populate your test system with patients for Connectathon testing are found in pre-Connectathon test Preload_Connectathon_Test_Patients.
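    The same assigning authority is encoded differently depending on the transaction family. A minimal sketch (the Patient ID value 'RED-0001' is invented for illustration):

    ```python
    # Sketch showing how one Red-domain Patient ID is represented for
    # HL7v2-based profiles (PIX/PDQ/XDS.b) versus FHIR-based profiles
    # (PIXm/PDQm/MHD). The ID value 'RED-0001' is invented for illustration.
    RED_OID = "1.3.6.1.4.1.21367.13.20.1000"
    patient_id = "RED-0001"

    # HL7v2 CX data type: id^^^&assigningAuthorityOID&ISO
    cx = f"{patient_id}^^^&{RED_OID}&ISO"

    # FHIR Patient.identifier: the OID becomes a urn:oid: system URI
    fhir_identifier = {"system": f"urn:oid:{RED_OID}", "value": patient_id}

    print(cx)                         # RED-0001^^^&1.3.6.1.4.1.21367.13.20.1000&ISO
    print(fhir_identifier["system"])  # urn:oid:1.3.6.1.4.1.21367.13.20.1000
    ```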

    Explanatory resources:

    • Three-domain-assigning-authority: XDS.b Registries and XC*Gateways are assigned to one of the 3 affinity domains for the entire connectathon week.
      • These assignments are in the spreadsheet linked above.
      • Registries accept a patient identity feed and register documents with Patient IDs in their assigned domain (i.e., with their designated Patient ID assigning authority)
      • Likewise, Gateway actors represent communities with documents in their one assigned domain.
      • PIX Managers, and some PDQ Suppliers, can operate across domains.
    • XC* testing schedule during Connectathon week - contains the Tues/Wed/Thur schedule for XC testing for the current Connectathon.   

    PCC 'Do/Read This First' Tests

    This is an index of Do This First and Read This First tests defined for profiles in the IHE Patient Care Coordination (PCC) domain.

     

    IPS: Read This First

    Introduction

    At the Connectathon, the IPS Content Creator is required to provide documents that meet the requirements of the International Patient Summary (IPS) Profile.  The IPS Content Creator is required to accept / retrieve documents and process them using one or more of the options defined by the IPS Profile.

    This page provides a general overview of the IPS testing process.

    Cross-Profile Considerations

    • Value Sets for some of the data elements / resources are coordinated with testing the SVCM profile in the ITI domain.
    • Content of the documents is coordinated with testing the QEDm profile in the PCC domain. We intend to reuse patients and content across the IPS and QEDm profiles to reduce work for systems that create / host the clinical test data.

    Testing Overview

    1. A fixed set of patients with demographics and defined clinical content are specified. See those details below, including consideration for changes needed to support regional testing.
    2. The Content Creator enters the data with the defined clinical content into their system. See details below.
    3. As part of testing, monitors will test IPS documents with a software tool. See details below.
    4. Interoperability tests are executed for both CDA and FHIR versions of the IPS documents. Content Creator and Content Consumer actors are welcome to test either flavor.
    5. The IHE IPS profile says that the Content Creator makes the IPS document available through the PCC-1 Document Sharing transaction. This transaction takes on several forms, but the key aspect of this is that the Content Creator actively exports the document. The Content Consumer does not query for an IPS document using a FHIR search transaction.

    Fixed Patients

    The table below lists the patients defined for testing. Demographic information can be found in the Connectathon Artifacts GitHub repository (see the IPS-QEDm README.md) or in the Gazelle Patient Manager. The Optionality column indicates if the patient data is required for IPS testing (R for Required, O for Optional).

    • It is in your best interests to use the patient names and demographics that are listed as this will simplify testing. IPS tests do not examine patient address, so you are welcome to substitute values that are defined for your region. If you have issues with the specific patient name, you can also use a different patient. Be clear in your communication with your test partners.
    Name | DOB | Trillium Bridge ID | IHERED ID | Optionality
    Charles Merlot | 1966.04.04 | EUR01P0008 | IHERED-3158 | R
    Mary Gines | 1963.09.09 | EUR01P0020 | IHERED-3159 | R
    Annelise Black | 1988 | EUR01P0002 | IHERED-3160 | O
    Marco Peroni | 1995.07.28 | EUR01P0011 | IHERED-3163 | O
    Allen Perot | 1963.02.18 | EUR01P0013 | IHERED-3161 | O

     

    Defined Clinical Content

    As mentioned above, a set of patients is defined for IPS and QEDm testing. Clinical content should be extracted from the files described here: https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/PCC/IPS-QEDm. The README.md file in the GitHub repository provides an index to files but does not describe the clinical content. Further notes:

    • The IPS profile does not place requirements on the method for entering the resource data into the Content Creator. For IPS testing, the Content Creator is allowed to use any method of extraction / translation. That method will not be examined / reviewed.
    • The table below indicates the number of resources that should be extracted and included in the IPS document.
    • The Connectathon Artifacts repository has sample IPS documents for the patients listed above. Most of the files correspond to the previous DSTU3 version that were created as part of the Trillium Bridge project. One file follows the R4 format. You will need to extract the clinical content and place it in your system.
    • The two required patients (Merlot, Gines) exercise required sections in the IPS document and include encodings where values are not known. The optional patients include other IPS sections.

     

     

    Section | Merlot (DSTU3) | Gines (DSTU3) | Black (DSTU3) | Peroni (R4) | Perot (DSTU3)
    Required sections:
    Medication Summary | 2 | No information | 2 | 2 | 3
    Allergies and Intolerances | NKA | 1 | NKA | NKA | 1
    Problem List (Condition) | 2 | No known | 3 | 4 | 5
    Recommended sections:
    Immunizations |  | 1 | 1 | 2 | 2
    History of Procedures |  |  |  |  |
    Medical Devices |  | No known |  |  |
    Diagnostic Results |  |  |  |  |
    Optional sections:
    Vital Signs |  |  |  |  |
    Past History of Illness |  |  |  |  |
    Pregnancy |  |  |  |  |
    Social History |  |  |  |  |
    Functional Status |  |  |  |  |
    Plan of Care |  |  |  |  |
    Advance Directives |  |  |  |  |

     

    You will find that clinical content in this folder https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/PCC/IPS-QEDm in the Connectathon Artifacts GitHub repository. Find the data you need by matching the patients listed in the table above with the README.md file in that folder.

    Please do not enter less or more content than is defined for each patient. You might add content to an individual resource, but do not add or subtract resources. Validation of test results is difficult when you do not include the expected data.

    You might discover that the data in the patient record is contradictory and/or might generate alerts because medications do not go with diagnoses. Please contact the owner of the test data to help resolve the issue and make corrections.

    The document that you create/export is a self-contained document. If you are creating a FHIR Bundle, the resources that are referenced in the document must also exist in the document. The FHIR Bundle does not refer to resources that exist on a FHIR server. 
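    The self-containment rule above can be sketched as a simple check that every reference in the Bundle resolves to an entry in the same Bundle. This is an illustrative sketch, not a validation tool used at the Connectathon, and the Bundle content is invented:

    ```python
    # Minimal sketch of the self-containment rule: every Reference inside a
    # FHIR document Bundle should resolve to an entry in the same Bundle.
    # The resource contents below are invented for illustration.

    def collect_references(resource):
        """Recursively collect all 'reference' values from a resource dict."""
        refs = []
        if isinstance(resource, dict):
            for key, value in resource.items():
                if key == "reference" and isinstance(value, str):
                    refs.append(value)
                else:
                    refs.extend(collect_references(value))
        elif isinstance(resource, list):
            for item in resource:
                refs.extend(collect_references(item))
        return refs

    def unresolved_references(bundle):
        """Return references that match no entry fullUrl or ResourceType/id."""
        known = set()
        for entry in bundle.get("entry", []):
            if "fullUrl" in entry:
                known.add(entry["fullUrl"])
            res = entry.get("resource", {})
            if "resourceType" in res and "id" in res:
                known.add(f'{res["resourceType"]}/{res["id"]}')
        return [r for r in collect_references(bundle) if r not in known]

    bundle = {
        "resourceType": "Bundle",
        "type": "document",
        "entry": [
            {"fullUrl": "urn:uuid:pat-1",
             "resource": {"resourceType": "Patient", "id": "pat-1"}},
            {"fullUrl": "urn:uuid:obs-1",
             "resource": {"resourceType": "Observation", "id": "obs-1",
                          "subject": {"reference": "urn:uuid:pat-1"}}},
        ],
    }
    print(unresolved_references(bundle))  # → []
    ```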

    IPS Data Exchange

    We use the Samples area of Gazelle Test Management to exchange IPS documents (CDA or FHIR format) between Content Creator and Content Consumer systems.  There is a separate Preparatory test containing instructions.  See:

    In addition, the IHE IPS Profile says that the Content Creator transmits the IPS document to the Content Consumer using the PCC-1 transaction. That transaction contains a number of options:

    • XDS
    • XDM
    • XDR
    • XCA
    • MPQ
    • MHD
    • RFD
    • "others as appropriate"

    During the Connectathon, we will communicate with test participants and work with them to agree on the mechanism for exchanging the document.  The Content Creator creates the document and actively exports it. This is not a FHIR search request to retrieve a summary document. The HL7 IPS Implementation Guide does not forbid FHIR search/read requests, but the IHE profile uses a document push model.

    For Preparatory testing purposes, we are more concerned with document content and reliable data import and less concerned with the mechanics of creating/exporting the document.

    Software Test Tools

    We use the Gazelle External Validation Service (EVS) tool (https://gazelle.ihe.net/evs/home.seam) to validate IPS content.  There is a separate Preparatory test containing instructions.  See:

    IPS: Create CDA and FHIR Documents for Connectathon Testing

    Introduction

    At the Connectathon, the IPS Content Creator is required to provide documents that meet the requirements of the International Patient Summary (IPS) Profile. This test tells you where to find the documentation that describes what is expected for testing.

    Instructions

    First, create your IPS Documents:

    These instructions are for the IHE IPS Content Creator.

    The page IPS Read This First describes the set of patients and where to locate the clinical information for each patient. Please refer to that page prior to creating your IPS content below.

    Required - CDA Option

    Create the clinical content in your system that you need to produce an IPS CDA document for these two patients:

      • Merlot
      • Gines

    Required - FHIR Option

    Create the clinical content in your system that you need to produce an IPS FHIR document for these two patients:

      • Merlot
      • Gines

    Optional Content - CDA Complete Option

    Create the clinical content in your system that you need to produce an IPS CDA document for one or more of these patients:

      • Black
      • Peroni
      • Perot

    Optional Content - FHIR Complete Option

    Create the clinical content in your system that you need to produce an IPS FHIR document for one or more of these patients:

      • Black
      • Peroni
      • Perot

    Next, upload your IPS content into Gazelle Test Management:

     

     

    QEDm: Read This First

    Introduction

    At the Connectathon, the QEDm Clinical Data Source is required to respond to search requests for one or more FHIR Resources as defined by the QEDm profile. The QEDm Clinical Data Consumer is required to search for one or more FHIR Resources as defined by the QEDm profile. The resources used in the search depend on the QEDm options declared by a system.

    This page provides a general overview of the QEDm testing process.

    Cross Profile Considerations

    • Value Sets for some of the data elements / resources are coordinated with testing the SVCM profile in the ITI domain.
    • Clinical content is coordinated with testing the IPS profile in the PCC domain. We intend to reuse patients and content across the IPS and QEDm profiles to reduce work for systems that create / host the clinical test data.
    • Provenance testing for the QEDm Provenance Option is coordinated with testing of the ITI mXDE profile.

    Testing Overview

    1. A fixed set of patients with demographics and defined clinical content are specified. See those details below, including consideration for changes needed to support regional testing.
    2. The Clinical Data Source enters the resources with the defined clinical content into their system. See details below.
    3. As part of conformance testing, monitors will query the Clinical Data Source with a software tool. See details below.
    4. Interoperability tests are executed for those resources where there is overlap between the Clinical Data Source and Clinical Data Consumer. Overlap means that both the Source and Consumer support one or more of the same options.
    5. Interoperability tests focus on the search functions defined in the QEDm profile. Search functions and query parameters are specific to each FHIR resource. You can see the list of search parameters below.
    6. Testing of the QEDm Provenance option is related to testing of the ITI mXDE Profile. You can read about that testing environment here: mXDE Read This First

    Fixed Patients

    The table below lists the patients defined for testing. Demographic information can be found in the Connectathon Artifacts GitHub repository (see the IPS-QEDm README.md) or in the Gazelle Patient Manager. The Optionality column indicates if the patient data is required for QEDm testing (R for Required, O for Optional).

    • It is in your best interests to use the patient names and demographics that are listed as this will simplify testing. QEDm tests do not examine patient address, so you are welcome to substitute values that are defined for your region. If you have issues with the specific patient name, you can also use a different patient. Be clear in your communication with your test partners.
    Name | DOB | Trillium Bridge ID | IHERED ID | Optionality
    Charles Merlot | 1966.04.04 | EUR01P0008 | IHERED-3158 | R
    Mary Gines | 1963.09.09 | EUR01P0020 | IHERED-3159 | R
    Chadwick Ross | 1960.02.29 |  | IHERED-3162 | R
    Annelise Black | 1988 | EUR01P0002 | IHERED-3160 | O
    Marco Peroni | 1995.07.28 | EUR01P0011 | IHERED-3163 | O
    Allen Perot | 1963.02.18 | EUR01P0013 | IHERED-3161 | O

     

    Defined Clinical Content

    As mentioned above, a set of patients is defined for QEDm testing. Clinical content should be extracted from the files described here: https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/PCC/IPS-QEDm. The README.md file in the GitHub repository provides an index to files but does not describe the clinical content. Further notes:

    • The QEDm profile does not place requirements on the method for entering the resource data into the Clinical Data Source. For QEDm testing, the Clinical Data Source is allowed to use any method of extraction / translation. That method will not be examined / reviewed.
    • The table below indicates the number of resources that should be extracted and made available for query.
    • The IPS files for patients Merlot and Gines provide a simple source of data for FHIR resources as these files contain FHIR Bundles.
    • The files for Ross (CCD, Procedure, Diag Imaging) are Consolidated CDA documents and provide some challenges:
      • Note that the CCD document has 20 observations. These are scattered throughout the document. If you already have CDA software that automatically extracts observation data, please use it. If you do not have automated software, extract 5 observations by hand and move on.
      • We will provide some flexibility when looking at the FHIR resources you have defined by looking at the CDA documents. We hope that a future version of the test environment will be more explicit.

     

     

    Resource | Merlot (IPS) | Gines (IPS) | Ross (CCD) | Ross (Procedure) | Ross (Diag Imaging)
    Observation |  |  | 4 (3*) | 1 |
    AllergyIntolerance | 1 | 1 | 2 |  |
    Condition | 2 |  |  |  |
    DiagnosticReport |  |  |  | 1 | 1
    MedicationStatement | 2 | 1 | 2 |  |
    MedicationRequest |  |  |  |  |
    Immunization |  | 1 | 5 |  |
    Procedure |  |  | 3 | 1 |
    Encounter |  |  | 1 |  |

     

    You will find that clinical content in this folder https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/PCC/IPS-QEDm in the Connectathon Artifacts GitHub repository. Find the data you need by matching the patients listed in the table above with the README.md file in that folder.

    Please do not enter less or more content than is defined for each patient. You might add content to an individual resource, but do not add or subtract resources. Validation of test results is difficult when you do not include the expected data.

    You might discover that the data in the patient record is contradictory and/or might generate alerts because medications do not go with diagnoses. Please contact the owner of the test data to help resolve the issue and make corrections.

    *Two of the observations are blood pressure observations (systolic, diastolic) which need to be combined in FHIR according to the Vital Signs profile.

     

    Software Test Tools

     

    QEDm Search Functions

    The search parameters in QEDm depend on the FHIR resource. The summary table below shows the combinations of query parameters defined for the various resources. The last row for Provenance is special because you do not search directly for a Provenance resource. You search for a base resource and ask the server to include Provenance resources in the response.

     

    Resource | Patient | Patient + Category | Patient + Category + Code | Patient + Category + Date | Patient + Category + Code + Date | Patient + Clinical Status | Patient + Date | _include
    Observation |  | x | x | x | x |  |  |
    AllergyIntolerance | x |  |  |  |  |  |  |
    Condition | x | x |  |  |  | x |  |
    DiagnosticReport |  | x | x | x | x |  |  |
    MedicationStatement | x |  |  |  |  |  |  | x
    MedicationRequest | x |  |  |  |  |  |  | x
    Immunization | x |  |  |  |  |  |  |
    Procedure | x |  |  |  |  |  | x |
    Encounter | x |  |  |  |  |  | x |
    Provenance |  |  |  |  |  |  |  |
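    A couple of these combinations can be illustrated as concrete FHIR search URLs. This is a sketch only: the base URL and patient ID value are placeholders, and the second request shows the reverse-include mechanism for retrieving Provenance resources along with a base resource:

    ```python
    # Sketch of building QEDm-style search requests for a few of the
    # parameter combinations in the table above. The base URL and patient
    # ID are invented for illustration.
    from urllib.parse import urlencode

    BASE = "https://example.org/fhir"   # hypothetical Clinical Data Source

    def qedm_search(resource, **params):
        return f"{BASE}/{resource}?{urlencode(params)}"

    # Patient + Category + Code for Observation
    url1 = qedm_search("Observation", patient="IHERED-3158",
                       category="vital-signs", code="85354-9")

    # Base-resource search asking the server to return Provenance as well
    url2 = qedm_search("Observation", patient="IHERED-3158",
                       _revinclude="Provenance:target")

    print(url1)
    print(url2)
    ```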

     

     

    QEDm: Create CDA and FHIR Content

    Introduction

    This page provides instructions for the Clinical Data Source when testing the QEDm profile. You should read the overview at QEDm: Read This First and then follow the instructions below.

    Instructions

    Find clinical content in IPS bundles and CDA documents in the IPS-QEDm folder of the Connectathon Artifacts repository. For each content option you support (e.g., Observation, Allergy/Intolerance), load FHIR resources into your system per the subsections below.

    • Please do not enter less or more content than is defined for each patient. You might add content to an individual resource, but do not add or subtract resources. Validation of test results is difficult when you do not include the expected data.
      • (One exception: see the separate note under Observation below.)
    • You might discover that the data in the patient record is contradictory and/or might generate alerts because medications do not go with diagnoses. Please contact the owner of the test data to help resolve the issue and make corrections.
    • The QEDm profile does not place requirements on the method for entering the resource data into the Clinical Data Source. For QEDm testing, the Clinical Data Source is allowed to use any method of extraction / translation. That method will not be examined / reviewed.
    • The IPS files for patients Merlot and Gines provide a simple source of data for FHIR resources as these files contain FHIR Bundles.
    • The files for Ross (CCD, Procedure, Diag Imaging) are Consolidated CDA documents and provide some challenges. You are welcome to use automated software to extract the clinical data or extract by hand.
    • We will allow some flexibility when looking at the FHIR resources you have defined by looking at the CDA documents. We hope that a future version of the test environment will be more explicit. 

     

    Observation

    The Observation resource is more difficult than the other clinical items as there are 20 observations in the Ross CCD file and one observation in the Ross Procedure File.

    • Extract and enter the one observation from the Ross Procedure file.
    • Extract and enter at least 5 observations from the Ross CCD file. As mentioned above, this can be a manual or automated process.

     

    AllergyIntolerance

    Extract 4 clinical data items for Allergy/Intolerance from these sources and enter them into your system:

    Merlot / IPS 1
    Gines / IPS 1
    Ross / CCD 2

     

    Condition

    Extract 2 clinical data items for Condition from this source and enter them into your system:

    Merlot / IPS 2

     

     

    Diagnostic Report

    Extract 2 clinical data items for DiagnosticReport from these sources and enter them into your system:

    Ross / Procedure 1
    Ross / Diagnostic Imaging 1

     

     

    MedicationStatement

    The documents listed below include medications for three patients. Extract the medications from the documents and convert to MedicationStatement resources with the related Medication resources.

    Merlot / IPS 2
    Gines / IPS 1
    Ross / CCD 2

     

     

    MedicationRequest

    The section above tells you to create MedicationStatement resources from medications found in three documents. Follow the same guidance to create MedicationRequest resources for these three patients. That is, you can assume that each medication is also described by a MedicationRequest.

    Merlot / IPS 1
    Gines / IPS 1
    Ross / CCD 2

     

     

    Immunization

    Extract 5 clinical data items for Immunization from these sources and enter them into your system:

    Gines / IPS 1
    Ross / CCD 4

     

     

    Procedure

    Extract 2 clinical data items for Procedure from these sources and enter them into your system:

    Ross / CCD 1
    Ross / Procedure 1

     

    Encounter

    Extract 1 clinical data item for Encounter from this source and enter it into your system:

    Ross / CCD 1

     

    Provenance

    You might be expecting to find directions for Provenance resources here. Testing for Provenance resources is coupled with mXDE testing. You can read about that testing environment on the mXDE Read This First page.

     

    RAD 'Do This First' Tests

    This is an index of Do This First tests defined for profiles in the IHE Radiology (RAD) domain.

     

    AIR_EC_Capabilities

    Overview

    The AI Results (AIR) Profile specifies how AI Results are encoded as DICOM Structured Reports (SRs).  Depending on the AI algorithms implemented on the AIR Evidence Creator (EC) actor, the EC will create/encode one or more of the different result primitives in its SRs, e.g. qualitative findings, measurements, locations, regions, parametric maps, tracking identifiers, image references.

    For the Connectathon, there is a set of no-peer tests to evaluate how the Evidence Creator encodes its AI results; the tests follow the naming pattern AIR_Content_*. Each of these tests aligns with a different result primitive included in an AI results SR.  We have created separate tests for the different result primitives to make test execution and evaluation more manageable.  The Evidence Creator will perform the Connectathon tests that are applicable to the SRs and primitives it has implemented.

    The purpose of this Preparatory test is to have the Evidence Creator describe in narrative form the nature of its AI results implementation.  Reading this description will give the Connectathon monitor the context needed to evaluate your Evidence Creator application, the AI results you produce, and the result primitives included in your AI SR instances.

    Instructions for Evidence Creators

    For this test you (the Evidence Creator) will produce a short document describing your implementation in the context of the AI Results Profile specification.  The format of the document is not important.  It may be a PDF, a Word or Google doc, or some other narrative format.

    Your document shall include the following content:

    1. Your system name in Gazelle Test Management (e.g., OTHER_XYZ-Medical)
    2. AI Algorithm Description - this should be a sentence or two describing what your algorithm does (e.g. detect lung nodules)
    3. DICOM IODs implemented (one or more of:  Comprehensive 3D SR Storage IOD, Segmentation Storage IOD, Parametric Map Storage IOD, Key Object Selection (KOS) Document Storage IOD)
    4. Result primitives encoded in the AI Result SR. (one or more of: qualitative findings, measurements, locations, regions, parametric maps, tracking identifiers, image references)
    5. If you encode measurements, indicate whether your measurements reflect a planar region of an image (i.e. use TID 1410), a volume (TID 1411), or are measurements that are not tied to a planar region or volume (TID 1501). (Refer to RAD TF-3: 6.5.3.3 in the AIR TI Supplement for details.)
    6. If you encode regions, indicate whether they are contour-based regions (i.e. use TID 1410 or 1411) or pixel/voxel-based regions (i.e. use the DICOM Segmentation Storage SOP Class) (Refer to RAD TF-3: 6.5.3.5 for details).
    7. Please add any additional information (e.g. screen shots) that would help the reader understand your algorithm, and output.
    8. REPEAT 2 - 7 for each AI Algorithm that produces result(s) on your Evidence Creator
    9. Finally, in order to share it with your test partners, upload your document as a Sample in Gazelle Test Management.  On the 'List of Samples' page, use the dropdowns to find your test system, and on the 'Samples to share' tab, add a new "AIR_EC_Capabilities" sample and upload your document there.   When you save your sample, it will be visible to your test partners.

     

    Instructions for AIR 'consumer' systems

      1. Each Evidence Creator should have uploaded a description of its AI algorithms, inputs, and outputs into Gazelle Test Management (under Testing --> Sample exchange, on the 'Samples available for rendering' tab, under the AIR_EC_Capabilities entry).  That page will evolve as your partners add samples, so be patient.
      2. Retrieve the uploaded documents. The purpose is to give you an understanding of the types of AI content that your Image Display, Image Manager or Imaging Doc Consumer will store/display/process.  Refer to these help pages for details on this task.

    Evaluation

    There is no "pass/fail" for this test.  However, you must complete it because it is a prerequisite for several Connectathon tests.  The Connectathon monitor will be looking for the document you produce here and use it when s/he is evaluating your AI result content.

    AIR_EC_Content_Test_Overview

    Overview

    This Preparatory test is informational.  It is intended to prepare the AIR Evidence Creator for Connectathon tests that will be used to evaluate the AI result SRs produced by the Evidence Creator.

    Another Preparatory test, AIR_Sample_Exchange, instructs the Evidence Creator to upload AI result SRs into the Samples area of Gazelle Test Management.  In that test, the Evidence Creator will also use the Pixelmed DICOM validator to perform DICOM validation of its SRs.   The Pixelmed validator checks the baseline requirements of the DICOM SR, including the requirements of the Template IDs (TIDs) within the SR.   The tool does not, however, check the requirements and constraints that are part of the content specification in the AIR Profile.

    In Gazelle Test Management, on your Test Execution page, you will find a set of no-peer Connectathon tests used to evaluate encoding of AI results; these Connectathon tests follow the naming pattern AIR_Content_*.  The different tests align with different result primitives that are included in an AI results SR, e.g. qualitative findings, measurements, locations, regions, parametric maps, tracking identifiers, image references.

    Depending on the AI algorithms it implements, we expect an Evidence Creator to create/encode one or more of these types of result primitives.  We have created separate tests for the different result primitives to make test execution and evaluation more manageable during the Connectathon.

    Instructions

    Prior to the start of the Connectathon, we highly recommend that the Connectathon participant who will test the Evidence Creator actor read each AIR_Content_* Connectathon test.

    >>Note:  There is a Content test for each of the AI result primitives.   The AI algorithm(s) on your Evidence Creator may not include all of the defined result primitives (e.g. you may not produce parametric maps).  For the Connectathon, you will only be required to perform the AIR_EC_Content* and AIR_Display* tests that are applicable to your system.  (This separation of capabilities into separate tests results in some redundant test steps, but one large test for all primitives would have been difficult for testers and monitors to manage.)

    In each AIR_Content_* test, you will find test steps and evaluation criteria for specific encoding requirements for the different result primitives.  We recommend that you examine your AI result SR content using these test steps.    If you find discrepancies, you may need to update your software to be compliant with the AIR content requirements.   If you disagree with any of the tests or test steps, you should contact the IHE Radiology Domain Technical Project Manager to resolve your concern.

    If you use the tests to review the SRs during the Preparatory phase, you can be confident that the Connectathon monitor will find no errors when s/he evaluates your SRs during the Connectathon.

    Evaluation

    There is no result file to submit into Gazelle Test Management for this informational test.

    AIR_Test_Data

    Overview

    The AI Results (AIR) Profile requires the Image Display to demonstrate specific display capabilities when rendering AI Result SRs.  These requirements are in Display Analysis Result [RAD-136].

    At the Connectathon, a monitor will sit down at your AIR Image Display and run through a set of tests to evaluate the display requirements in [RAD-136].

    In this preparatory test, we are providing you with some test data in advance of the Connectathon that you will use to demonstrate AIR display requirements.   The test data includes:

    • binary DICOM Structured Reports (SRs) that encode the AI result examples documented in RAD TF-3, Appendix A Example Analysis Result Encodings (currently in the AIR Trial Implementation Supplement).
    • vendor samples from prior IHE Connectathons

    NOTE:  During the Connectathon, the Image Display will be required to perform tests with AI Result IODs from the Evidence Creator test partners at that Connectathon.  The Image Display may also be asked to use AI Result IODs in this test data, especially where this sample data contains DICOM object types or AIR primitives that the 'live' test partners do not produce.

    Instructions

     For AIR IMAGE DISPLAY systems:

    1. Access the test data and load the SRs and accompanying images onto your Image Display.  See: https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/RAD/AIR
    2. Review requirements in the Connectathon tests listed below
    3. Use the test data in your own lab to prepare to demonstrate those display requirements to a monitor during Connectathon.

    >> AIR_Display_Analysis_Result

    >> AIR_Display_Parametric_Maps

    >> AIR_Display_Segmentation_IOD

    >> AIR_Display_*  (etc...)
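    Before loading the downloaded samples, a quick local sanity check can confirm that each file is a DICOM Part 10 object (128-byte preamble followed by the magic bytes "DICM").  This stdlib-only sketch is not a substitute for real DICOM validation:

```python
def is_dicom_part10(data: bytes) -> bool:
    """Return True if `data` starts with a DICOM Part 10 file header:
    a 128-byte preamble followed by the magic bytes b'DICM'."""
    return len(data) >= 132 and data[128:132] == b"DICM"

# Typical use against a downloaded sample file:
#   with open("sample_sr.dcm", "rb") as f:
#       ok = is_dicom_part10(f.read(132))
```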

     

    For ALL OTHER AIR ACTORS:

    It is OPTIONAL for non-Image-Display actors to access the samples, but we recognize the value of test data to all developers, so you are welcome to access the samples.

    1. To access the test data, see: https://github.com/IHE/connectathon-artifacts/tree/main/profile_test_data/RAD/AIR.  Samples are arranged in sub-directories by Connectathon and then by vendor
    2. To download a file in one of the sub-directories, click on the individual file name, then use the links on the right side of the page to download the 'Raw' file.

    Evaluation

    IMAGE DISPLAY SYSTEMS:  Create a text file that briefly describes your progress in using the SRs with your Image Display. Upload that file into Gazelle Test Management as the result file for this test. There is no pass/fail for this preparatory test. We want to make sure you're making progress toward what is expected during evaluation of your Image Display at the Connectathon.

    AIW-I_Task_Performer_Capabilities

    Overview

    The AI Workflow for Imaging (AIW-I) Profile specifies how to request, manage, perform, and monitor AI Inference on digital image data.  

    Both the sequence of transactions in AIW-I and the content of the workitem(s) created by the Task Requester depend on the AI inferences and workflows implemented on the AIW-I Task Performer actor.  Therefore, the purpose of this Preparatory test is to gather information from the Task Performer which will influence how it will interact with its test partners during the Connectathon.   The Task Performer will describe:

    • AIW-I workflows it supports. See AIW-I Section 50.1.1.3, Section 50.4.1.5, and Section 50.4.2 .  One or more of:
      • Pull workflow
      • Triggered pull workflow
      • Push workflow
    • the AI algorithms (inferences) it has implemented
    • the inputs each algorithm needs when it is triggered.  See AIW-I Section 50.4.1.1 and 50.4.1.2.

    This description will help the Task Requester ensure that the workitems it creates are adequately populated for you, and that you test the workflow(s) you support with your partners at the Connectathon.

    Instructions for Task Performers

    For this test you (the Task Performer) will produce a short document describing your implementation in the context of the AIW-I Profile specification.  According to AIW-I, Section 50.4.1.1, a DICOM Conformance Statement is the ideal home for these details.  If you have one, great!  But, for the purpose of this preparatory test, the format of the document is not important.  It may be a PDF, a Word or Google doc, or some other narrative format.

    Your document shall include the following content:

    1. Your system name in Gazelle Test Management (eg. OTHER_XYZ-Medical)
    2. Technical Contact name/email - this is someone who can be contacted if there are questions about what you provide below
    3. The AIW-I Workflow(s) you support:   Pull Workflow,  Triggered-pull Workflow, and/or Push Workflow
    4. AI Algorithm Description - this should be a sentence or two describing what your algorithm does (e.g. detect lung nodules)
    5. The Workitem Code that will trigger this AI Algorithm - refer to AIW-I Section 50.4.1.2.   You may use a code in Table 50.4.1.2-1 if there is one that applies.  Otherwise, suggest a code that is appropriate for your AI algorithm
    6. The Input Parameters  & Values required by your AI Algorithm - you will need to be very specific in answering this.  Please refer to AIW-I Section 50.4.1.1 and Table 50.4.1.1-1.  Identify the UPS attribute(s) your algorithm relies on, and the value(s) you expect for each.
    7. The Input Information Sequence content your AI algorithm requires - Please refer to Section 50.4.1.3 and identify the DICOM images (if any) that your AI algorithm expects to see in the Input Information Sequence (0040,4021) of the UPS Workitem that triggers your AI algorithm.
    8. Any other information that will help the reader understand your algorithm and how it is triggered.
    9. REPEAT 3 - 8 for each AI Algorithm that can be triggered on your Task Performer
    10. Finally, in order to share it with your test partners, upload your document as a Sample in Gazelle Test Management.  On the 'List of Samples' page, use the dropdowns to find your test system, and on the 'Samples to share' tab, find the "AIW-I_Performer_Capabilities" entry and upload your document there.   When you save your sample, it will be visible to your test partners.
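    As a sketch of the information being gathered, the items above could be captured in a simple structured record.  All field names and values here are illustrative placeholders, not anything mandated by AIW-I:

```python
# Hypothetical record mirroring items 1-8 of the capabilities document.
# Every name and value is a placeholder for a Task Performer to fill in.
performer_capabilities = {
    "system_name": "OTHER_XYZ-Medical",                      # item 1
    "technical_contact": "jane.doe@example.com",             # item 2
    "workflows": ["Pull", "Triggered pull"],                 # item 3
    "algorithms": [{
        "description": "Detects lung nodules on chest CT",  # item 4
        "workitem_code": "EXAMPLE-CODE",                     # item 5 (placeholder)
        "input_parameters": {                                # item 6 (placeholders)
            "UPS attribute (placeholder)": "expected value (placeholder)",
        },
        "input_information_sequence": "Original chest CT series",  # item 7
        "notes": "Any other triggering details",             # item 8
    }],
}
```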

     

    Instructions for Task Requesters, Managers, Watchers

    You will find and read the document provided by the Task Performer above.

    1. In Gazelle Test Management, on the 'List of Samples' page, use the dropdowns to find your test system, and on the 'Samples available for rendering' tab, find the "AIW-I_Performer_Capabilities" entry with the document provided by your Task Performer partner(s) above.  
    2. Download that document and use it to configure your system for items such as workitem codes, input parameters, etc., that you will need in order to create a UPS workitem [RAD-80] for that Performer.

    Evaluation

    There is no "pass/fail" for this test.  However, you must complete it because it is a prerequisite for several Connectathon tests.  Your AIW-I test partners, plus the Connectathon monitor, will be looking for the document produced here.

    BIR_Test_Data

    Overview

    The Image Display actor in the Basic Image Review (BIR) Profile is unlike other IHE actors in that its requirements are primarily functional and do not require exchange of messages with other actors.  

    At the Connectathon, a monitor will sit down at your system and run through a set of tests to evaluate the requirements in the BIR profile. In this preparatory test, we are providing you with the test plan and the accompanying images in advance of the Connectathon.   To prepare, we expect you to load the test data (images) and run these tests in your lab in preparation for the Connectathon itself.

    Instructions

    1. Find the test plan and test data for BIR in Google Drive in IHE Documents >Connectathon > test_data > RAD-profiles > bir_data_sets .   From that folder download the following:
    • The Connectathon Test Plan for BIR Image Display: BIR_Image_Display_Connectathon_Tests-2023*.pdf
    • The BIR test images in file BIRTestData_2015.tar.bz
    • The index to the BIR test images in _README_BIR_dataset_reference.xls

    After loading the test images onto your Image Display, run the test in the BIR Test Plan document using your display application.

    Evaluation

    Create a text file that briefly describes your progress in running these tests. Upload that file into Gazelle Test Management as the result file for this test. There is no pass/fail for this preparatory test. We want to make sure you're making progress toward what is expected during evaluation of your Image Display at the Connectathon.

    IID_Prepare_Test_Data

    Overview

    To enable Connectathon testing, the Image Display is required to host a set of studies.

    There is one Connectathon test -- IID Invoke Display -- to exercise the Image Display and Image Display Invoker in the IID profile. The 'Special Instructions' for that test ask you to host a set of studies. This preparatory 'test' ensures you have the proper data loaded on your system prior to arriving at the Connectathon.

    We do not provide specific studies for you, but rather define the characteristics of the studies you should bring.

    Instructions

    Come to the Connectathon with:

      • At least 3 studies for the same patient, i.e. they will have the same Patient ID, but the accession number and study date will be different for each of these studies.
      • At least one other study for a different patient
      • A study containing a KOS object identifying at least one image in the study as a 'key image'
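    The first two requirements above can be checked mechanically.  A minimal sketch (the field names are assumptions for illustration, not DICOM attribute names):

```python
def meets_iid_prep_rules(studies):
    """Check that a study list has >= 3 studies sharing one Patient ID
    (with distinct accession numbers and study dates) plus at least one
    study for a different patient."""
    by_patient = {}
    for s in studies:
        by_patient.setdefault(s["patient_id"], []).append(s)
    main_ok = any(
        len(group) >= 3
        and len({s["accession"] for s in group}) == len(group)
        and len({s["study_date"] for s in group}) == len(group)
        for group in by_patient.values()
    )
    return main_ok and len(by_patient) >= 2
```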

    Evaluation

    There are no result files to upload into Gazelle Test Management for this test.  Preloading these prior to the Connectathon is intended to save you precious time during Connectathon week.

    PDI_Prepare_Media

    Overview

    The goal of this “test” is for the Portable Media Creator system to prepare, in advance of the Connectathon, your PDI media that the Portable Media Importer partners will test with during the Connectathon.   Doing this in your home lab will save you valuable time during Connectathon week.

    All PDI Portable Media Creators must support CD media; USB and DVD are optional. The media you create should contain a “representative sample” of the data produced by your system.  Complete and representative data on your media makes for a better interoperability test.

    Special Instructions for Connectathon Online:

    At a Connectathon Online, it is not possible for test partners to exchange physical PDI media.  In that case, we ask the Portable Media Creator (PMC) to:

    1. create an ISO image of your CD media
    2. upload that ISO file into the Sample area of Gazelle Test Management
    • On the 'Samples to share' tab for your test system, find the 'PDI' entry
    • Upload and save your ISO image on that sample page

    Instructions for PDI Portable Media Creators (face-to-face Connectathon):

    Prior to Connectathon, you should create two copies of your media: CD, USB, and/or DVD, depending on what you support.  On the first day of the Connectathon, you will give one copy to the Connectathon monitor who is evaluating PDI tests.  You will keep one copy and use it for your peer-to-peer tests with your Importer partners.

    Use the following guidelines when creating your media:

    1. Modality systems shall put on the media all IOD types they are capable of creating (eg. MG, US, KOS, SR, CAD-SR etc).  If you can create GSPS or KOS objects, these should also be included.
    2. PACS vendors & multi-modality workstations shall put at least 5 different image types on their media.  If they support SR, KOS, etc, they shall also put those types on the media.
    3. Media creators will create two copies of appropriate media containing their images and other DICOM objects.
    4. Label your physical media.  The label should contain:
    • your system name in Gazelle Test Management
    • your table location
    • and the name of a technical contact at your table at the Connectathon

    Note that you may not have the information to make your label until you arrive at Connectathon.

    Optional:

    Starting in 2019, the ITI and Radiology Technical Frameworks contain specifications for including PDI and XDM content on the same media.  If your Portable Media Creator supports both the PDI and XDM Profiles, you should create media with the appropriate content.   For details, see:

    • RAD TF-2: 4.47.4.1.2.3.3 "Content when Grouping with XDM"
    • ITI TF-1: 16.1.1 "Cross profile considerations - RAD Portable Data for Imaging (PDI)"
    • ITI TF-2b: 3.32.4.1.2.2. "Content Organization Overview"
    • Connectathon test "PDI_with_XDM_Create"

    Evaluation

    1. There is no file to upload to Gazelle Test Management for this test.
    2. There is no specific evaluation for this test.  Feedback will come when your partners import the contents of your media during Connectathon week.
    3. Make sure you pack up the media you created and bring it to Connectathon!

     

    REM_Modality_Type_and_Template_Support

    Instructions

    There are no test steps to execute for this test.

    Instead, create a text file which documents the type of DICOM images your modality creates and lists the DICOM Baseline Template your Acquisition Modality uses when creating Dose SRs for the REM profile.

    CT modalities which report on irradiation events shall be capable of producing an SR compliant with TID 10011.

    Actors which report on irradiation events for modalities of type XR, XA, RF, MG, CR, or DX shall be capable of producing an SR compliant with TID 10001.

    Your text file should have the following naming convention: CompanyName_SystemName_REM.txt.
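    The template requirements above amount to a small modality-to-template lookup, sketched here (the modality strings are standard DICOM modality codes):

```python
# CT dose reports use TID 10011; projection X-ray modalities use TID 10001,
# per the REM requirements described above.
REQUIRED_TID = {"CT": "TID 10011"}
for m in ("XR", "XA", "RF", "MG", "CR", "DX"):
    REQUIRED_TID[m] = "TID 10001"

def required_template(modality):
    """Return the required Dose SR baseline template, or None if REM
    does not specify one for this modality type."""
    return REQUIRED_TID.get(modality.upper())
```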

    Evaluation

    Submit the text file into Gazelle Test Management as the result of this test.

    Preload_Codes_for_EBIW

    Introduction

    To prepare for testing the RAD Encounter-based Imaging Workflow (EBIW) Profile, the EBIW actors must prepare to use a common set of DICOM codes. 

    • Encounter Manager:  Please complete the configuration and preparation described below prior to performing any peer-to-peer Connectathon tests for the EBIW profile.
    • Image Manager, Results Aggregator, Modality actors:   There is no work for you to perform, but this test contains a description of the procedures and patients that will be used in peer-to-peer EBIW tests.   You will benefit from reading them prior to the Connectathon. 
    • Modalities:  If you find that the proposed DICOM codes do not adequately match what your application would use, please contact the Connectathon Radiology Technical Manager **well in advance** of the Connectathon so that the set of codes can be expanded to meet your needs.

    Instructions

    The codes you need are identified in the peer-to-peer test that you will perform at the Connectathon.

    1.  In Gazelle Test Management, find the test "EBIW_10_Read_This_First" on your main Test Execution page.

    2.  Read the entire Test Description to understand the test scenario.

    3.  For each of the DICOM attributes listed in the Test Description, the Encounter Manager should configure its system to be able to use the values in the bullet lists. This ensures that consistent values will be returned in modality worklist responses for EBIW tests during the Connectathon.

     

    Evaluation

    There is no file to upload to Gazelle Test Management for this preparatory test.   If you do not load the codes you need on your test system prior to the Connectathon, you may find yourself wasting valuable time on the first day of Connectathon syncing your codes with those of your test partners.

    Preload_Codes_for_HL7_and_DICOM

    Introduction

    To prepare for testing workflow profiles in RAD, CARD, LAB, and EYECARE domains, and also for the ITI PAM Profile, it is helpful for systems that send HL7 messages (eg patient registration and orders) and/or DICOM messages (modality worklist, storage) to work with a common set of codes. 

    We ask ADT, Order Placer, Order Filler and Acquisition Modality actors and PAM and PLT actors to load codes relevant to their system in advance of the Connectathon.

    These codes include, for example:

    • Administrative sex codes in PID-8
    • Doctors sent in PV1
    • Facility codes sent in HL7 PV1-3
    • Universal Service ID (order codes) sent in OBR-4
    • Priority codes sent in OBR-27 or TQ1-9
    • Acquisition Modality code sent in OBR-24 and (0008,0060)
    • ...and more
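    When spot-checking which codes your system actually sends in these fields, a small stdlib-only HL7 v2 field accessor can help.  This is a sketch for inspection only, not a validator; the sample message is illustrative:

```python
def hl7_field(message, segment, index):
    """Return field `index` (1-based) of the first `segment` in a
    pipe-delimited HL7 v2 message. In MSH, the field separator itself
    counts as MSH-1, so the index is shifted by one."""
    for line in message.replace("\r", "\n").splitlines():
        fields = line.split("|")
        if fields[0] == segment:
            if segment == "MSH":
                index -= 1
            return fields[index] if index < len(fields) else ""
    return ""

# Illustrative message (segments separated by carriage returns):
msg = ("MSH|^~\\&|SENDER|FAC|RCVR|FAC|20240101||ADT^A01|1|P|2.5\r"
       "PID|1||PAT123||DOE^JANE||19800101|F")
# hl7_field(msg, "PID", 8) -> "F"  (administrative sex, PID-8)
```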

    Instructions

    The codes that you need depend on the profile/actors you support.  HL7 and DICOM codes used for Connectathon testing are the same set that is used in the Gazelle OrderManager tool. OrderManager contains simulators for some actors in workflow profiles.

    ** HL7 codes ** - are documented here:

    Some of these codes are also mapped into DICOM messages.  Use the spy-glass icon in the right column to view the value set for each code.  (Note that the format of these files is compliant with the IHE SVS Sharing Value Sets profile.)

    • ADT, Order Placer, and Order Filler plus PAM Supplier systems should review the link above and load codes relevant to the HL7 messages they support

    ** DICOM codes ** - Order Filler and Acquisition Modality actors need a mapping between Requested Procedure codes, Scheduled Procedure codes, and Protocol Codes. 

    For RAD and CARD, that hierarchy is here: https://gazelle.ihe.net/common/order-manager/orderHierarchy4Radiology.xml   
    For EYECARE, that hierarchy is here: https://gazelle.ihe.net/common/order-manager/orderHierarchy4Eyecare.xml. (Note that this is documented in excel form here.)

    • An Order Filler system should load codes relevant to the domain(s) it is testing. 
    • An Acquisition Modality system should load codes relevant to the acquisitions it can perform. 

    Evaluation

    There is no result file to upload to Gazelle Test Management for this preparatory test.   If you do not load the codes you need on your test system prior to the Connectathon, you may find yourself wasting valuable time on the first day of Connectathon syncing your codes with those of your test partners.

    DICOM_QR_Test_Data

    Introduction

    This test gives you access to DICOM studies used to test XDS-I Query & Retrieve, and the QIDO-RS Query [RAD-129] transaction that is used by actors in several profiles (WIA, AIR, ...).  The data is also used to test the RAD-14 transaction with the Enterprise Identity option in SWF.b.

    Location of the studies

    There are four DICOM studies available.  The Responder system (e.g. an Image Manager, Imaging Document Source or Imaging Document Responder) must load these four studies onto its system.  

    Summary of the DICOM studies

    The contents of the studies are summarized in the "XDS-I,b XCA-I and WIA studies" google sheet. 

    There are 3 tabs in the sheet:

    1. Tab 1 identifies values in key attributes in the DICOM header for each study.  
    2. You can ignore tabs 2 and 3; they apply to the XDS-I profile that re-uses these studies.
    Patient ID   Procedure Code  Modality   Series Count    Image Count
    ---------------------------------------------------------------------
    C3L-00277           36643-5     DX                 1              1
    C3N-00953           42274-1     CT                 3             11
    TCGA-G4-6304        42274-1     CT                 3             13
    IHEBLUE-199                     CT                 1              1
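    Since this data also exercises the QIDO-RS Query [RAD-129], here is a sketch of how a Requester might query for one of the studies above.  The endpoint URL is hypothetical; the query syntax and JSON attribute keying follow DICOMweb (PS3.18), where PatientID is tag (0010,0020):

```python
import json
from urllib.parse import urlencode

base = "https://imaging.example.com/dicomweb"   # hypothetical Responder endpoint
params = {"PatientID": "C3N-00953", "ModalitiesInStudy": "CT"}
url = f"{base}/studies?{urlencode(params)}"
# A Requester would GET this URL with header: Accept: application/dicom+json

# A matching study in the response is keyed by DICOM tag, e.g. PatientID:
sample = json.loads('[{"00100020": {"vr": "LO", "Value": ["C3N-00953"]}}]')
patient_id = sample[0]["00100020"]["Value"][0]
```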
    

     

     

    Instructions

    Prior to the Connectathon, the Imaging Document Source should:

    1. Load the 4 DICOM studies onto your test system. (See 'Location of the Studies' above.)

     

    Evaluation

    There is no file to upload to Gazelle Test Management for this preparatory test.   If you do not load the studies you need on your test system prior to the Connectathon, you may find yourself wasting valuable time on the first day of Connectathon.

    XDS-I.b_Prepare_Manifests

    Introduction

    This test is for Imaging Document Source actors in the XDS-I.b and XCA-I Profiles that support the "Set of DICOM Instances" option.  (If your Imaging Document Source only supports PDF or Text Reports, then this test does not apply to you.)

    For this test, we ask you to create manifests for 3 studies that Connectathon Technical Managers provide.  This enables us to check both the metadata and manifest for expected values that match data in the images and in the XDS metadata affinity domain codes defined for the Connectathon (i.e. codes.xml).  (For other peer-to-peer tests during Connectathon, you will be able to also test with studies that you provide.)

    The manifests you create for these 3 studies will be used for some XDS-I/XCA-I tests during Connectathon week.

    Prerequisite test

    Before you prepare the manifests using the Instructions below, first load the DICOM Studies in the Test Data.  See Preparatory Test DICOM_QR_Test_Data

    Instructions

    Prior to the Connectathon, the Imaging Document Source should:

    1. Load the 3 DICOM studies onto its test system. (See 'Prerequisite test' above.)
    2. Construct 3 XDS-I Manifests, one for each of the studies.
    3. Submit one DICOM Manifest, for the CT study for patient C3N-00953, as a sample in Gazelle Test Management, and perform DICOM validation:
      • Log in to Gazelle Test Management for the Connectathon
      • Access the samples page:  menu Testing-->Samples exchange
      • On the "Samples to share" tab, find the entry for "XDS-I_Manifest"
      • Upload the .dcm file for your manifest
      • Under "Actions", use the green triangle icon to perform DICOM validation of your manifest using Gazelle EVS.  We expect a validation result with no errors.
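    The XDS-I manifest is a DICOM Key Object Selection (KOS) document whose Current Requested Procedure Evidence Sequence lists every referenced instance grouped by study, then series, then SOP instance.  A plain-data sketch of building that hierarchy (the UIDs in the test are placeholders; a real manifest is a binary DICOM object, typically built with a DICOM toolkit):

```python
def build_evidence_sequence(instances):
    """Group (study_uid, series_uid, sop_class_uid, sop_instance_uid)
    tuples into the nested study > series > instance hierarchy used by
    the KOS Current Requested Procedure Evidence Sequence."""
    studies = {}
    for study, series, sop_class, sop_instance in instances:
        series_map = studies.setdefault(study, {})
        series_map.setdefault(series, []).append(
            {"ReferencedSOPClassUID": sop_class,
             "ReferencedSOPInstanceUID": sop_instance})
    return [
        {"StudyInstanceUID": study,
         "ReferencedSeriesSequence": [
             {"SeriesInstanceUID": series, "ReferencedSOPSequence": refs}
             for series, refs in series_map.items()]}
        for study, series_map in studies.items()]
```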

     

    Evaluation

    During Connectathon, a monitor will examine your Manifest; there are two verifications that Connectathon Monitors will perform:

    (1) examine the DICOM Manifest for the study

    (2) examine the metadata for the submitted manifest

    We do not duplicate the Evaluation details here, but we encourage the Imaging Document Source to read those details now to ensure its manifest will pass verification during Connectathon.  Find those details in Gazelle Test Management on your Test Execution page in Connectathon test "XDS-I.b_Manifest_and_Metadata".

     

     

    Gazelle External Validation Service (EVS) tests

    This section contains test cases performed with the Gazelle External Validation Service (EVS) tool.

    Tool: https://gazelle.ihe.net/evs/home.seam

    Tool user guide:  https://gazelle.ihe.net/gazelle-documentation/EVS-Client/user.html

    14110: XDS-SD Schematron & PDF/A

    In this test, an XDS-SD Content Creator will use Gazelle External Validation Service (EVS Client) to:

    • verify the attributes in the XDS-SD document using schematron
    • verify that the scanned portion of the document is PDF/A

    There are two sets of instructions below.  Pick the instructions appropriate for your situation:

    1. Instructions for validating a sample XDS-SD for Connectathon testing.
    2. Instructions for validating your XDS-SD using EVSClient directly.

    (1) Instructions for validating your sample XDS-SD for Connectathon testing.

    First, upload your XDS-SD document into Gazelle Test Management.  If you support both PDF and Text, you will upload two samples.

    1. Create an XDS-SD document according to the capabilities of your application. Name the file using this convention:

    • SYSTEMNAME_XDS-SD-TEXT.xml, for example
    • EHR_XYZmed_XDS-SD-TEXT.xml
    • Get your system name right; get the document type right.

    2. Upload the document into the Samples area of Gazelle Test Management.

    • Click on the Samples to share tab for your test system
    • Find the XDS-SD entry and upload your document

    3. Next, validate your document:

    • On the samples tab, use the green triangle under "Actions" to launch EVSClient to perform validation.
    • Now, proceed with the EVSClient instructions that follow below.

    (2) Instructions for validating your XDS-SD using EVSClient directly:

    First, run the schematron check:

    1. Select menu: IHE / CDA / Validate
    2. Upload the XDS-SD file you wish to validate
    3. In the Schematron dropdown list, select IHE ITI ... XDS-SD
    4. Select the "Validate" button

    Second, run the Model-based validation:

    1. Access the External Validation Service: https://gazelle.ihe.net/evs
    2. Select menu: IHE / CDA / Validate
    3. Upload the XDS-SD file you wish to validate
    4. In the Model Based Validation dropdown list, select IHE - ITI...XDS-SD
    5. Select the Validate button

    Next, find your results:

    1. Select menu: IHE / CDA / Validation Logs
    2. Use the interface to find your validated document. This will show your results.

    Finally, verify that you have valid PDF/A (if this is a PDF scanned document):

    1. Select menu: IHE / CDA / Validation Logs
    2. Use the interface to find your validated document
    3. Select the link to the document
    4. On the page of your results, scroll all the way down to the section labelled "File Content".
    5. Select the "html" tab.
    6. At the top of the tab, you will see a link to your pdf. Click on the link and the validation will be performed.
    Evaluation

    The EVSClient identifies errors in your content.

    Once you have Passed validation, capture your results:

    1. If you uploaded a sample into Gazelle Test Management, you can copy/paste a permanent link to your sample, which will show Passed results.
    2. If you used EVSClient directly, you can copy/paste the permanent link to your results in EVSClient.

    Finally, paste the Permanent Link into Gazelle Test Management as the results for this test.

    30400 : LPOCT : Accepted Observation Set (for the POCDM)

    This test concerns the POCDM actor of the LPOCT profile. You will need to validate the IHE conformance of your HL7 messages with the EVSClient tool.

    Instructions

    As your system implements the POCDM actor, you will need to test the HL7 messages used in the LAB-32 transaction: "Accepted Observation Set".
    Your system must be able to send HL7 messages (to the Order Filler actor) of types:

    • ORU^R30^ORU_R30, for the LAB-32 transaction
    • ORU^R31^ORU_R30, for the LAB-32 transaction

    To test the IHE conformance of your HL7 messages, go to the EVSClient tool at this location: EVSClient
    If it is your first time with this tool, please read the user manual: EVSClient User Manual
        
    In the EVSClient tool, go to the HL7 menu entry and choose "HL7v2" then click on "Message Validation".
    Paste your message to validate in the box. (You can hit the "Guess" button to preset the Message Profile OID.)
        
    For example, for the ORU^R30^ORU_R30 message :

    1. Copy the ORU^R30^ORU_R30 HL7 message from your system. Paste it in the box in the EVSClient page. 
    2. Select the Message profile OID according to your message. In this example : 
      • Affinity Domain : IHE
      • Domain : Laboratory
      • Actor : Point Of Care Data Manager
      • Transaction : LAB-32
      • Message Type : ORU^R30^ORU_R30
      • HL7 Version : HL7v2.5
    3. Once you have selected the Profile OID, press the "Validate" button. If the Validation Result is not "PASSED", it means your HL7 message does not conform to the LPOCT Profile of the Laboratory Technical Framework. If the Validation Result is "PASSED", copy the "Permanent link" and paste it in Gazelle as the result of this test. For further details, see this tutorial : How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD, LPOCT and LBL Profiles of the Laboratory Domain) ?


    Do this for all messages and don't forget to copy/paste the "Permanent link" of the validation result to Gazelle.
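    Before pasting a message into EVSClient, a quick local check that MSH-9 carries the expected LAB-32 message type can save a validation round-trip.  A stdlib-only sketch (the sample message is illustrative):

```python
def message_type(hl7_message):
    """Return MSH-9 (message type) of an HL7 v2 message. The field
    separator itself counts as MSH-1, so MSH-9 is at split index 8."""
    msh = hl7_message.replace("\r", "\n").splitlines()[0].split("|")
    return msh[8]

# message_type("MSH|^~\\&|POC|LAB|OF|LAB|20240101||ORU^R30^ORU_R30|1|P|2.5")
# -> "ORU^R30^ORU_R30"
```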

    Evaluation

    • The validation status shall be "PASSED" for all messages.
    • The message type shall be the right message type, according to IHE.
    • Each of the two messages must be validated at least once and the permanent link must be pasted into Gazelle.

    30401 : LPOCT : Accepted Observation Set (for the Order Filler)

    This test concerns the Order Filler actor of the LPOCT profile. You will need to validate the IHE conformance of your HL7 messages with the EVSClient tool.

    Instructions

    As your system implements the Order Filler actor, you will need to test the HL7 messages used in the LAB-32 transaction : "Accepted Observation Set".
    Your system must be able to send HL7 message (to the POCDM actor) of type:

    • ACK^R33^ACK, for the LAB-32  transaction

    To test the IHE conformance of your HL7 messages, go to the EVSClient tool at this location: EVSClient
    If it is your first time with this tool, please read the user manual: EVSClient User Manual
        
    In the EVSClient tool, go to the HL7 menu entry and choose "HL7v2" then click on "Message Validation".
    Paste your message to validate in the box. (You can hit the "Guess" button to preset the Message Profile OID.)
        
    For example, for the ACK^R33^ACK message :

    1. Copy the ACK^R33^ACK HL7 message from your system. Paste it in the box in the EVSClient page. 
    2. Select the Message profile OID according to your message. In this example : 
      • Affinity Domain : IHE
      • Domain : Laboratory
      • Actor : Department System Scheduler/Order Filler
      • Transaction : LAB-32
      • Message Type : ACK^R33^ACK
      • HL7 Version : HL7v2.5
    3. Once you have selected the Profile OID, press the "Validate" button. If the Validation Result is not "PASSED", it means your HL7 message does not conform to the LPOCT Profile of the Laboratory Technical Framework. If the Validation Result is "PASSED", copy the "Permanent link" and paste it in Gazelle as the result of this test. For further details, see this tutorial : How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD, LPOCT and LBL Profiles of the Laboratory Domain) ?


    Do this for all messages and don't forget to copy/paste the "Permanent link" of the validation result to Gazelle.

    Evaluation

    • The validation status shall be "PASSED" for all messages.
    • The message type shall be the right message type, according to IHE.
    • The Acknowledgment code of the ACK^R33^ACK messages shall be "AA" (MSA-1).
    • The ACK^R33^ACK message must be validated at least once and the permanent link must be pasted into Gazelle.
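    The evaluation criteria above can also be pre-checked locally before submitting to EVSClient.  A stdlib-only sketch that checks MSH-9 and the MSA-1 acknowledgment code (the sample ACK is illustrative):

```python
def check_lab32_ack(ack):
    """Return True if MSH-9 is ACK^R33^ACK and MSA-1 (acknowledgment
    code) is 'AA'. The MSH field separator counts as MSH-1, so MSH-9
    sits at split index 8; MSA-1 sits at split index 1."""
    segments = {line.split("|")[0]: line.split("|")
                for line in ack.replace("\r", "\n").splitlines()}
    msg_type = segments["MSH"][8]   # MSH-9
    ack_code = segments["MSA"][1]   # MSA-1
    return msg_type == "ACK^R33^ACK" and ack_code == "AA"

ack = "MSH|^~\\&|OF|LAB|POC|LAB|20240101||ACK^R33^ACK|1|P|2.5\rMSA|AA|1"
# check_lab32_ack(ack) -> True
```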

    EVS_ADX_Validation

    This test applies to the Content Data Structure Creator actor in the Aggregate Data Exchange (ADX) profile.

    This test ensures that the DSD file produced by the Content Data Structure Creator actor is conformant to the ADX schematron specification.

    Reference:  QRPH TF-3: 8.2 and Appendix 8A, currently in the ADX Trial Implementation Supplement.

    The Gazelle External Validation Service (aka EVSClient) hosts the schematron file and is used to check the DSD.

    Instructions

      1. Access the Gazelle EVSClient application: https://gazelle.ihe.net/evs
      2. From the EVS Client user interface, select menu IHE-->SDMX-->Validate
      3. Upload the XML file for the document produced by your application
      4. Select the schematron for ADX.
      5. Click on Validate

      Evaluation

        1. The tool reports the results of the evaluation
        2. The EVS Client creates a Permanent link to your results.
        3. Finally, paste the Permanent Link into Gazelle Test Management as the results for this test.
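Before uploading, a quick local well-formedness check can catch basic XML errors that would make the schematron validation fail outright. This is a minimal sketch using only the Python standard library; the element names and namespace in the sample are hypothetical stand-ins, not the real ADX schema:

```python
import xml.etree.ElementTree as ET

def is_well_formed(xml_text: str) -> bool:
    """Return True if the XML parses; EVS rejects malformed XML outright."""
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False

# Hypothetical stand-in document; real element names and namespaces come
# from the ADX specification, not from this sketch.
sample = '<adx xmlns="urn:example:adx"><group><dataValue value="10"/></group></adx>'
print(is_well_formed(sample))  # True
```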

        EVS_APPC_Validation

        This test applies to the Content Creator actor in the Advanced Patient Privacy Consents (APPC) profile.

        This test ensures that the file produced by the Content Creator actor is conformant to the specification.

        Reference:  ITI TF-3: 5.6, currently in the APPC Trial Implementation Supplement.

        The Gazelle External Validation Service (aka EVSClient) hosts the schematron file used to check the APPC documents.

        Instructions

          1. Access the Gazelle EVSClient application: https://gazelle.ihe.net/evs
          2. From the EVS Client user interface, select menu IHE-->APPC-->Validate
          3. Upload the XML file for the document produced by your application
          4. Select the APPC schematron
          5. Click on Validate

          Evaluation

            1. The tool reports the results of the evaluation
            2. The EVS Client creates a Permanent link to your results.
            3. Finally, paste the Permanent Link into Gazelle Test Management as the results for this test.

            EVS_Assertion-related_Message_Validation

            You will use the Gazelle EVSClient to validate messages in SAML- and XACML-related profiles.

            These messages are used in several IHE profiles.  The messages you validate will depend on the profile/actor pairs supported by your test system.

            Instructions

            • Access the Gazelle EVSClient application: https://gazelle.ihe.net/evs
            • From the EVS Client user interface, select menu IHE-->Assertions-->Validate.
            • Paste/upload the XML file for the message produced by your application
            • Select the correct Model Based Validator from the dropdown list:
              • ITI-40 XUA Provide X-User Assertion 
              • ITI-40 XUA Authz-Consent option
              • ITI-40 XUA Purpose of Use option
              • ITI-40 XUA Subject Role option
              • ITI-79 SeR Authorization Decisions Query Request
              • ITI-79 SeR Authorization Decisions Query Response
            • Click on Validate.

            Evaluation

            • The EVS Client creates a Permanent link to your results.
            • Finally, paste the Permanent Link into Gazelle Test Management as the results for this test.
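Before submitting an assertion to one of the ITI-40 validators, you can check locally that the payload is well-formed XML and rooted in a SAML 2.0 Assertion element. A minimal sketch using only the Python standard library; the skeleton below is illustrative, not a valid XUA assertion:

```python
import xml.etree.ElementTree as ET

SAML2_NS = "urn:oasis:names:tc:SAML:2.0:assertion"

def looks_like_assertion(xml_text: str) -> bool:
    """Sanity check that the root element is a SAML 2.0 <Assertion>."""
    root = ET.fromstring(xml_text)
    return root.tag == "{%s}Assertion" % SAML2_NS

# Skeleton only: a real ITI-40 assertion carries Issuer, Subject,
# Conditions, AttributeStatements, and a signature.
skeleton = '<saml2:Assertion xmlns:saml2="%s" Version="2.0"/>' % SAML2_NS
print(looks_like_assertion(skeleton))  # True
```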

            EVS_CDA_Validation

            You will use the Gazelle EVSClient to evaluate CDA documents defined in IHE profiles in several domains.

            The documents you validate will depend upon the profile/actor pairs supported by your test system and the availability of schematron or model-based validation in the EVSClient.

            Instructions

              1. Access the Gazelle EVSClient application: https://gazelle.ihe.net/evs
              2. From the EVS Client user interface, select menu IHE-->CDA-->Validate
              3. Upload the XML file for the document produced by your application
              4. Select a validator (the list is found below)
              5. Click on Validate

              Evaluation

                1. The EVS Client creates a Permanent link to your results.
                2. Paste the Permanent Link into Gazelle Test Management as the results for this test.

                These validators are available in the Schematron dropdown list. Note that the tool is updated over time, so there may be additional validators available in the tool itself.

                    • C-CDA 2.1 All Document Types, templateId@extensions=2015-08-01
                    • IHE - CARD - Registry Content Submission (RCS-C)
                    • IHE - CARD - Registry Content Submission - Electrophysiology (RCS-EP)
                    • IHE - EYE - GEE Progress Note (2015 version)
                    • IHE - EYE - Summary Record (EC-Summary)
                    • IHE - ITI - Basic Patient Privacy Consent (BPPC)
                    • IHE - ITI - Cross-Enterprise Sharing of Scanned Docs (XDS-SD)
                    • IHE - LAB - Sharing Laboratory Reports (XD-LAB)
                    • IHE - PCC - Antepartum Education (APE)
                    • IHE - PCC - Antepartum History & Physical (APHP)
                    • IHE - PCC - Antepartum Summary (APS)
                    • IHE - PCC - Composite Triage & Nursing Note (CTNN)
                    • IHE - PCC - Emergency Dept Referral (EDR)
                    • IHE - PCC - eNursing Summary (ENS)
                    • IHE - PCC - Immunization Content (IC)
                    • IHE - PCC - Labor & Delivery History & Physical (LDHP)
                    • IHE - PCC - Labor & Delivery Summary (LDS)
                    • IHE - PCC - Maternal Discharge Summary (MDS)
                    • IHE - PCC - Newborn  Discharge Summary (NDS)
                    • IHE - PCC - Nursing Note (NN)
                    • IHE - PCC - PHMR
                    • IHE - PCC - Physician Note (PN)
                    • IHE - PCC - Post partum Visit Summary (PPVS)
                    • IHE - PCC - Triage Note (TN)
                    • IHE - PCC - XDS-MS Discharge Summary
                    • IHE - PCC - XDS-MS Referral Note
                    • IHE - PCC - XPHR Extract
                    • IHE - PCC - XPHR Update
                    • IHE - PHARM - Pharmacy Dispense (DIS)
                    • IHE - PHARM - Pharmacy Pharmaceutical Advice (PADV)
                    • IHE - PHARM - Pharmacy Prescription (PRE)
                    • IHE - QRPH - Clinical Research Document (CRD)
                    • IHE - QRPH - Labor and Delivery Summary - Vital Records (LDS-VR)
                    • IHE - QRPH - Physician Reporting to Public Health-Cancer Registry (PRPH-Ca)

                  These validators are available in the Model Based Validator dropdown list. Note that the tool is updated over time, so there may be additional validators available in the tool itself.

                      • HL7 - CDA Release 2
                      • HL7 - CDA Release 2 (strict)
                      • IHE - CARD - Cath Report Content (CRC)
                      • IHE - CARD - Registry Content Submission (RCS-C)
                      • IHE - CARD - Registry Content Submission - Electrophysiology (RCS-EP)
                      • IHE - EYE - GEE Progress Notes (2015 version)
                      • IHE - EYE - Summary Record (EC Summary)
                      • IHE - ITI - Basic Patient Privacy Consent (BPPC)
                      • IHE - ITI - Cross-Enterprise Sharing of Scanned Docs (XDS-SD)
                      • IHE - LAB - Sharing Laboratory Reports (XD-LAB)
                      • IHE - PATH - Anatomic Pathology Structured Report (APSR)
                      • IHE - PCC - Antepartum Profiles
                      • IHE - PCC - Common templates
                      • IHE - PCC - many more...
                      • IHE - PHARM - Community Medication Administration (CMA)
                      • IHE - PHARM - Community Medication List (PML)
                      • IHE - PHARM - Community Medication Treatment Plan (MTP)
                      • IHE - PHARM - Pharmacy Dispense (DIS)
                      • IHE - PHARM - Pharmacy Pharmaceutical Advice (PADV)
                      • IHE - PHARM - Pharmacy Prescription (PRE)
                      • IHE - RAD - CDA document wrapper (XDS-I.b)
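Before submitting a document to one of these validators, you can check locally that it is well-formed XML and rooted in the CDA namespace. The sketch below assumes only the Python standard library; the skeleton is illustrative, not a valid CDA document:

```python
import xml.etree.ElementTree as ET

CDA_NS = "urn:hl7-org:v3"

def is_cda_root(xml_text: str) -> bool:
    """Check that the document root is <ClinicalDocument> in the CDA namespace."""
    root = ET.fromstring(xml_text)
    return root.tag == "{%s}ClinicalDocument" % CDA_NS

# Skeleton only; a real CDA document has a header, templateIds, and a body.
skeleton = '<ClinicalDocument xmlns="%s"/>' % CDA_NS
print(is_cda_root(skeleton))  # True
```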

                    EVS_DICOM_Object_Evaluation

                    Overview

                    We use DICOM validator tools hosted in the Gazelle External Validation Service (EVS) to evaluate your DICOM objects.

                    In this test, Acquisition Modality, Lightweight Modality, or Evidence Creator systems evaluate samples of Composite Objects that you create using the DICOM evaluation tools available in the Gazelle External Validation Service (EVS).   This test also applies to actors such as Importers that modify objects originally created by other actors.

                    The number of evaluations you run depends on the types of images or SRs that you produce. We will not list specific requirements, but ask you to apply good judgment. For example, a CT scanner that produces Localizer and Axial images would evaluate samples from both of those image types. A CR device may evaluate an AP chest, a lateral chest and an image of a limb.  A Lightweight Modality might create a VL Photographic Image IOD or a Video Photographic Image IOD.

                    You must evaluate and provide the output for at least one DICOM Composite Object using one of the available validation tools. If you support multiple profiles and create different DICOM IODs, you should validate each type.

                    One or more of these tools may be available from within the Gazelle EVS.  (Note: The links below are for resource pages for each tool.)  (For some testing events, the list of DICOM validators may be smaller or larger.)

                    • dicom3tools from David Clunie
                    • PixelMed  used for DICOM SRs, including SRs for the Evidence Documents, AI Results, REM, and REM-NM Profiles
                    • dcmcheck from OFFIS
                    • dcm4che
                      • NOTE:  The label for the PixelMed validator may say "Dose SR", but it can be used for all DICOM SR objects

                    Evaluating your objects with several of the available tools, and evaluating several different objects, can only help your implementation.

                    There are two ways to access the validators: (1) in the Samples area of Gazelle Test Management, or (2) directly in the EVS tool.  Either method may be used.

                    (1) Instructions for accessing the DICOM validators from the Samples area of Gazelle Test Management:

                    1. Access Gazelle Test Management and log in.
                    2. Select menu Testing-->Sample exchange
                    3. Select your test system from the dropdown list
                    4. On the 'Samples to share' tab, find the entry for 'DICOM_OBJECTS'
                    5. Select the '+' and proceed to upload a DICOM file(s)
                    6. Once the file is uploaded, you will be able to use the green triangle icon to call the DICOM validator in EVS.

                     

                    (2) Instructions for accessing the DICOM validators in the Gazelle EVS Tool:

                    1. Access the Gazelle EVS: https://gazelle.ihe.net/evs/home.seam
                    2. Select menu IHE-->DICOM-->Validate
                    3. Upload the file you want to validate.
                    4. Then, select one of the DICOM validator tools from the dropdown list.  To get the best coverage, we encourage you to try multiple validators, as applicable.
                    5. Click Execute.
                    6. You will see your results. (To find them later, select menu IHE-->DICOM-->Validation logs.)
                    7. The tool identifies errors in your content.  Examine your results.
                      -->You are finished with this test when you have resolved all of the errors detected by the tool. If you disagree with the results of the tool, you can instead write a short (1- or 2-line) description of why your object is correct and the DICOM tool software is in error.
                    8. Find the Permanent link to the results

                    Finally, capture your results:

                    1. Copy either the permanent link to your sample or the permanent link to your results in the EVS and paste the link as the result for this test in Gazelle Test Management. (This is also where you can comment on the results of the validation.)  
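As a quick local sanity check before uploading, you can verify that a file is at least a DICOM Part 10 file: such files begin with a 128-byte preamble followed by the magic bytes "DICM". The validators above check far more than this; the sketch below is only a pre-flight check using the Python standard library:

```python
import os
import tempfile

def has_dicm_magic(path: str) -> bool:
    """DICOM Part 10 files: 128-byte preamble, then the 'DICM' magic bytes."""
    with open(path, "rb") as f:
        header = f.read(132)
    return len(header) == 132 and header[128:132] == b"DICM"

# Tiny stand-in file to demonstrate the check (NOT a valid DICOM object).
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"\x00" * 128 + b"DICM")
result = has_dicm_magic(path)
os.remove(path)
print(result)  # True
```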

                    EVS_DSG_Validation

                    This test applies to the Content Creator actor in the Document Digital Signature (DSG) profile.

                    This test ensures that the file produced by the Content Creator actor is conformant to the specification.

                    Reference:  ITI TF-3: 5.5.

                    Instructions

                      1. Instructions are in the EVS Client User Manual here: https://gazelle.ihe.net/gazelle-documentation/EVS-Client/user.html#validation-of-digital-signatures

                      Evaluation

                        1. The tool reports the results of the evaluation
                        2. The EVS Client creates a Permanent link to your results.
                        3. Paste the Permanent Link into Gazelle Test Management as the results for this test.

                        EVS_DSUB_Message_Validation

                        You will use the Gazelle EVSClient to validate messages in the DSUB profile.

                        The messages you validate will depend upon the profile/actor pairs supported by your test system.

                        Instructions

                        • Access the Gazelle EVSClient application: https://gazelle.ihe.net/evs
                        • From the EVS Client user interface, select menu IHE-->DSUB-->Validate
                        • Upload the XML file for the message produced by your application
                        • Select the correct Model Based Validator from the dropdown list:
                          • ITI-52 Document Metadata Subscribe Request
                          • ITI-52 Document Metadata Subscribe Response
                          • ITI-52 Document Metadata Unsubscribe Request
                          • ITI-52 Document Metadata Unsubscribe Response
                          • ITI-53 Document Metadata Notify Request
                          • ITI-53 Document Metadata Publish Request
                        • Click on Validate.

                        Evaluation

                        • The EVS Client creates a Permanent link to your results.
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.

                        EVS_FHIR_IG-based_Validation

                        Test Summary

                        The Gazelle External Validation Service (EVS) performs validation using FHIR StructureDefinitions contained in IHE profiles published in FHIR Implementation Guide (IG) format (e.g. the ITI Mobile Access to Health Documents (MHD) IG, and many others).  The StructureDefinitions used by EVS are found on the "Artifacts" page of the IG. 

                        In this test, you will use the Gazelle EVS tool to validate:

                        • FHIR®© Resources created by your application and/or returned by your application in response to a query
                        • IHE transactions based on FHIR:  outbound messages and responses with constraints defined in IHE transaction definitions.

                        The validators are available in EVS under menu IHE-->FHIR IG-based -->Validate.

                         

                        Who performs this test

                        This test appears in Gazelle Test Management as a Preparatory test, so... 

                        • If your actor creates FHIR resources and/or initiates a transaction containing one or more FHIR Resources in XML or JSON format, you should perform this test, using the EVS tool to ensure the Resource(s) and transactions you create pass validation.
                        • If your actor responds to a query and returns FHIR Resource(s) in XML or JSON format, you should perform this test.
                        • If your actor audits a transaction using an ATNA ATX FHIR Feed (AuditEvent), you should validate your audit messages.

                          • How do you know which resources and transactions to test?  IHE profiles that exchange FHIR-based Resources identify the actor(s) and the applicable transactions and resources.  We will not restate the requirements for each actor here, but here are some examples:
                            • A PDQm Patient Demographics Supplier responds to an [ITI-78] query by sending a Bundle of one or more Patient Resources.  The Patient Resource is not constrained in the PDQm profile.
                            • An IPS Content Creator creates a FHIR bundle with IPS content.
                            • An MHD Document Source sends [ITI-65].  Depending on the type of submission, ITI-65 sends a Transaction Bundle with a combination of List, DocumentReference, Binary and Patient Resources, with constraints defined in MHD.
                            • An ATNA Secure Node/Application that supports the ATX: FHIR Feed option for [ITI-20] sends AuditEvent Resources to an Audit Record Repository; some IHE transactions have constraints on the AuditEvent Resource.
                          • You should validate Resources that your system has created.  For some actors (e.g. PDQm Supplier, mCSD Supplier), Connectathon technical managers provide pre-defined test data used in Connectathon tests.  You should not validate these resources; instead, we assume that your application is also able to create Patient, Practitioner, Organization, etc. Resources.  These are the ones you should test.

                        Instructions

                        • Identify the FHIR Resources and FHIR-based transactions supported by your application.
                        • Access the Gazelle EVS: https://gazelle.ihe.net/evs
                        • From the EVS user interface, select menu IHE-->FHIR IG-based-->Validate 
                        • Upload the XML or JSON file produced by your application
                        • Select the correct Validator from the dropdown list.
                        • Click on Validate.
                        • Repeat as needed for other resources or transactions.
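The EVS performs the full StructureDefinition validation, but a trivial local sanity check can save round trips, e.g. confirming that the payload is valid JSON and declares a resourceType. A minimal sketch follows; the sample Patient is illustrative only, not a profile-conformant resource:

```python
import json

def basic_fhir_checks(payload: str) -> list:
    """Cheap pre-upload checks; EVS does the real profile validation."""
    try:
        res = json.loads(payload)
    except ValueError:
        return ["not valid JSON"]
    problems = []
    if "resourceType" not in res:
        problems.append("missing resourceType")
    return problems

# Illustrative minimal Patient; real PDQm/MHD resources carry more elements.
patient = json.dumps({"resourceType": "Patient", "id": "example",
                      "name": [{"family": "Doe", "given": ["Jane"]}]})
print(basic_fhir_checks(patient))  # []
```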

                        Evaluation

                        • The EVS creates a Permanent link to your results under IHE-->FHIR IG-based-->Validation Log
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.
                        • Many systems will have more than one result to report.

                        RAD_IMR_IG-based_Validation

                         

                        Test Summary

                        The Gazelle External Validation Service (EVS) performs validation using FHIR StructureDefinitions contained in the IHE Radiology Interactive Multimedia Report (IMR) Profile. The StructureDefinitions used by EVS are found on the "Artifacts" page of the IMR IG. 

                        In this test, you will use the Gazelle EVS tool to validate:

                        • FHIR®© Resources created by your application and/or returned by your application in response to a query
                        • IHE transactions based on FHIR:  outbound messages and responses with constraints defined in IHE transaction definitions.

                        The validators are available in EVS under menu IHE-->FHIR IG-based -->Validate.

                         

                        Who performs this test

                        This test appears in Gazelle Test Management as a Preparatory test for several IMR actors.

                        • If your actor initiates a transaction containing one or more FHIR Resources, or if your actor responds to a query and returns FHIR Resource(s), you should perform this test, using the EVS tool to ensure the Resource(s) and transactions you support pass validation.

                        This table identifies IMR actors (Col 1), the content they create (Col 2), and which validators in EVS to use to validate that content (Col 3).

                        IMR actor | Content that is validated | RAD IMR Validator in EVS
                        ----------|---------------------------|-------------------------
                        Report Creator | create/store a report with RAD-141 | Transaction Bundle
                        Report Creator | in the IMR bundle | DiagnosticReport
                        Report Creator | may include in IMR bundle | ServiceRequest
                        Report Creator | may include in IMR bundle | ImagingStudy
                        Report Creator | in the ImagingStudy | ImagingStudy Endpoint
                        Report Repository, Report Reader, Rendered Report Reader | response to a RAD-141 transaction | Bundle Response
                        Report Repository | response to a RAD-143 transaction | Find Multimedia Report Response
                         | | Imaging Observation (experimental)

                         

                        Instructions

                        • Using the table above, identify the FHIR Resources and FHIR-based transactions supported by your application
                        • Access the Gazelle EVS: https://gazelle.ihe.net/evs
                        • From the EVS user interface, select menu IHE-->FHIR IG-based-->Validate
                        • Upload the XML or JSON file produced by your application
                        • Select the correct Validator from the dropdown list.
                        • Click on Validate.
                        • Repeat as needed for other resources or transactions.
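As an illustration of the first table row (a Report Creator storing a report via RAD-141), the overall shape of a FHIR transaction Bundle carrying a DiagnosticReport can be sketched as below. This is a hypothetical skeleton only; the IMR StructureDefinitions enforced by EVS require many elements not shown here:

```python
import json

# Hypothetical skeleton of a RAD-141 submission: a FHIR transaction Bundle
# carrying a DiagnosticReport. Real IMR content must satisfy the profile's
# StructureDefinitions (codes, presentedForm, references, etc.).
bundle = {
    "resourceType": "Bundle",
    "type": "transaction",
    "entry": [{
        "fullUrl": "urn:uuid:00000000-0000-0000-0000-000000000000",
        "resource": {"resourceType": "DiagnosticReport", "status": "final"},
        "request": {"method": "POST", "url": "DiagnosticReport"},
    }],
}
# Serialize to JSON; this is the kind of payload you would upload to EVS.
payload = json.dumps(bundle, indent=2)
```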

                         

                        Evaluation

                        • The EVS creates a Permanent link to your results under IHE-->FHIR IG-based-->Validation Log
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.
                        • Many systems will have more than one result to report.

                        RAD_IRA_IG-based_Validation

                         

                        Test Summary

                        The Gazelle External Validation Service (EVS) performs validation using FHIR StructureDefinitions contained in the IHE Radiology Integrated Reporting Applications (IRA) Profile. The StructureDefinitions used by EVS are found on the "Artifacts" page of the IRA IG. 

                        In this test, you will use the Gazelle EVS tool to validate:

                        • FHIR®© Resources created by your application and/or returned by your application in response to a query
                        • IHE transactions based on FHIR:  outbound messages and responses with constraints defined in IHE transaction definitions.

                        The validators are available in EVS under menu IHE-->FHIR IG-based -->Validate.

                         

                        Who performs this test

                        This test appears in Gazelle Test Management as a Preparatory test for IRA actors.

                        • If your actor initiates a transaction containing one or more FHIR Resources, you should perform this test, using the EVS tool to ensure the Resource(s) and transactions you support pass validation.

                        This table identifies IRA actors (Col 1), the content they create (Col 2), and which validators in EVS to use to validate that content (Col 3).

                        IRA actor | Content that is validated | RAD IRA Validator in EVS
                        ----------|---------------------------|-------------------------
                        Image Display, Report Creator, Worklist Client | Open Report Context [RAD-148] | DiagnosticReport Context
                        Image Display, Report Creator, Worklist Client | Open Report Context [RAD-148] | Patient Context
                        Image Display, Report Creator, Worklist Client | Open Report Context [RAD-148] | ImagingStudy Context
                        Content Creator (see Note) | Update Report Content [RAD-150] | DiagnosticReport Update
                        Content Creator | Update Report Content [RAD-150] | ImagingSelection Content
                        Content Creator | Update Report Content [RAD-150] | Observation Content
                        Content Creator | may include in DiagnosticReport | DiagnosticReport associated study

                         Note: The Content Creator in Column 1 is grouped with one of these actors: Report Creator, Evidence Creator, Image Display, Stateless Evidence Creator. The type of 'content' it creates (i.e. the FHIR Resource validated in Column 3) depends on this grouped actor.

                         

                        Instructions

                        • Using the table above, identify the FHIR Resources and FHIR-based transactions supported by your application
                        • Access the Gazelle EVS: https://gazelle.ihe.net/evs
                        • From the EVS user interface, select menu IHE-->FHIR IG-based-->Validate
                        • Upload the XML or JSON file produced by your application
                        • Select the correct Validator from the dropdown list.
                        • Click on Validate.
                        • Repeat as needed for other resources or transactions.

                         

                        Evaluation

                        • The EVS creates a Permanent link to your results under IHE-->FHIR IG-based-->Validation Log
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.
                        • Many systems will have more than one result to report.

                        EVS_FHIR_Validation

                        This test has been deprecated as of the 2022 IHE Connectathon

                        Test Summary

                        You will use the Gazelle EVSClient tool to validate:

                        • FHIR®© Resources created by your application and/or returned by your application in response to a query
                        • IHE transactions based on FHIR:  outbound messages and responses with constraints defined in IHE transaction definitions.

                        Scope of testing available in Gazelle EVSClient

                        1. Validation of Resources using the baseline requirements defined in FHIR.   This is available in EVSClient under menu FHIR®©-->Validate.
                        2. Validation of Resources and transactions based on constraints defined in IHE profiles.  This is available in EVSClient under menu IHE-->FHIR®©-->Validate.  The constraints tested by EVSClient are defined in FHIR conformance resources published by IHE (e.g., StructureDefinition and ValueSet resources)
                          • For IHE profiles published in FHIR Implementation Guide (IG) format (e.g. MHD, mCSD, and more), conformance resources are found on the "Artifacts" page of the IG.
                          • For IHE profiles published in PDF format, conformance resources are found on https://github.com/IHE/fhir

                        Who this test applies to

                        This test appears in Gazelle Test Management as a preparatory test for actors in all FHIR-based IHE profiles, so... 

                        • If your actor initiates a transaction containing one or more FHIR Resources in XML or JSON format, you should perform this test, using the EVSClient tool to ensure the Resource(s) and transactions you create pass validation.
                        • If your actor responds to a query and returns FHIR Resource(s) in XML or JSON format, you should perform this test.
                        • If your actor produces/sends FHIR Resources, how do you know which resources and transactions to test?  IHE profiles that exchange FHIR-based Resources identify the actor(s) and the applicable transactions and resources.  We will not restate the requirements for each actor here, but here are some examples:
                            • A PDQm Patient Demographics Supplier responds to an [ITI-78] query by sending a Bundle of one or more Patient Resources.  The Patient Resource is not constrained in the PDQm profile.
                            • An MHD Document Source sends [ITI-65].  Depending on the type of submission, ITI-65 sends a Transaction Bundle with a combination of List, DocumentReference, Binary and Patient Resources, with constraints defined in MHD.
                            • A mACM Alert Reporter sends an [ITI-84] alert request that contains a CommunicationRequest Resource.  The Alert Aggregator's [ITI-84] response contains a CommunicationRequest Resource and Communication Resource(s).
                            • An ATNA Secure Node/Application that supports the ATX: FHIR Feed option for [ITI-20] sends AuditEvent Resources to an Audit Record Repository; some IHE transactions have constraints on the AuditEvent Resource.
                          • You should validate Resources that your system has created.  For some actors (e.g. PDQm Supplier, mCSD Supplier), Connectathon technical managers provide pre-defined test data used in Connectathon tests.  You should not validate these resources; instead, we assume that your application is also able to create Patient, Practitioner, Organization, etc. Resources.  These are the ones you should test.
                        • If your actor only consumes FHIR Resources (i.e., it only receives them from other actors), then you can skip this test.  (**However** you may be interested in testing with Resources submitted by others and collected in the tool.  In the EVSClient, select menu IHE-->FHIR®©-->Validation logs and use the filters on the page to find Resources of interest to you.)

                        Instructions

                        • Identify the FHIR Resources and FHIR-based transactions supported by your application.
                        • Access the Gazelle EVSClient application: https://gazelle.ihe.net/evs
                        • From the EVSClient user interface, select menu IHE-->FHIR®©-->Validate (to test IHE constraints) or FHIR®©-->Validate (to test baseline requirements for FHIR Resources)
                        • Upload the XML or JSON file produced by your application
                        • Select the correct Model Based Validation from the dropdown list.
                        • Click on Validate.
                        • Repeat as needed for other resources or transactions.

                        Evaluation

                        • The EVSClient creates a Permanent link to your results under IHE-->FHIR®©-->Validation Log or FHIR®©-->Validation Log
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.
                        • Many systems will have more than one result to report.

                        EVS_HL7v2.x_Message_Validation

                        You will use the Gazelle EVSClient to validate HL7v2.x based messages.  

                        These messages are applicable across many IHE profiles in several domains.  The messages you validate will depend upon the profile/actor pairs supported by your test system.

                        Instructions

                        • Access the Gazelle EVSClient application:  https://gazelle.ihe.net/evs
                        • From the EVS Client user interface, select menu IHE-->HL7v2.x messages-->Validate
                        • Upload the file (in ER7 format) for the HL7 message produced by your application
                        • From the Transaction column, select the applicable transaction (eg RAD-1)
                        • From the HL7 version column, select 2.3.1, 2.5, or 2.5.1
                        • Some transactions require actors to support multiple messages (e.g., new order, cancel order...). If multiple messages are listed, test them all.
                        • The following validations are available.  Note that the tool is updated over time, so there may be additional validators available in the tool itself.
                          • CARD-7 Encapsulated Report Submission
                          • CARD-8 Report Reference Submission
                          • ITI-8 v2.3.1 Patient Identity Feed
                          • ITI-9 v2.3.1 PIX Query
                          • ITI-10 v2.3.1 PIX Update Notification
                          • ITI-21 v2.5 Patient Demographics Query
                          • ITI-22 v2.5 Patient Demographics and Visit Query
                          • ITI-30 v2.5 Patient Identity Management
                          • ITI-31 v2.5 Patient Encounter Management
                          • ITI-64 v2.5 Notify XAD-PID Link Change
                          • LAB-1 Lab Placer Order Management
                          • LAB-2 Lab Filler Order Management
                          • LAB-3 Order Results Management
                          • LAB-4 Work Order Management
                          • LAB-5 Test Result Management
                          • LAB-21 WOS Download
                          • LAB-22 WOS Query
                          • LAB-23 AWOS Status Change
                          • LAB-26 SWOS Status Change
                          • LAB-27 Query for AWPS
                          • LAB-28 AWOS Broadcast
                          • LAB-29 AWOS Status Change
                          • LAB-32 POCT observation accepted
                          • LAB-35 Sub-order Management
                          • LAB-36 Sub-order Results Delivery
                          • LAB-51 Laboratory Code Set Management
                          • LAB-61 Label delivery request
                          • LAB-62 Query for label delivery instructions
                          • LAB-63 Labels and Containers Delivered
                          • PHARM-H1 Prescription Order
                          • PHARM-H2 Validated Order
                          • PHARM-H3 Medication Preparation Report
                          • PHARM-H4 Administration Report
                          • PHARM-H5 Advance Prescription Notification
                          • PHARM-H6 Validated Order Confirmation
                          • RAD-1 v2.3.1 Patient Registration
                          • RAD-2 v2.3.1 and v2.5.1 Placer Order Management
                          • RAD-3 v2.3.1 and v2.5.1 Filler Order Management
                          • RAD-4 v2.3.1 and v2.5.1 Procedure Scheduled
                          • RAD-12 v2.3.1 Patient Update
                          • RAD-13 v2.3.1 and v2.5.1 Procedure Update
                          • RAD-28 v2.5 Structured Report Export
                          • RAD-35 v2.3.1 Charge Posted
                          • RAD-36 v2.3.1 Account Management
                          • RAD-48 v2.3.1 and v2.4 Appointment Notification
                        • Click on Validate.
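
                        Before uploading, a quick local sanity check of the ER7 file can catch structural problems early. The sketch below is illustrative only (the sample is a hypothetical ADT fragment) and is no substitute for the tool's validation:

```python
# Basic structural checks on an ER7-encoded HL7 v2.x message.
# Illustrative sketch only; the EVSClient performs the real validation.

def er7_sanity_check(message: str) -> list[str]:
    """Return a list of basic structural problems found in an ER7 message."""
    problems = []
    # Segments are separated by carriage returns in ER7 encoding.
    segments = [s for s in message.replace("\n", "\r").split("\r") if s]
    if not segments or not segments[0].startswith("MSH"):
        problems.append("message must start with an MSH segment")
        return problems
    msh = segments[0]
    if len(msh) < 8 or msh[3] != "|":
        problems.append("field separator (MSH-1) should be '|'")
    if msh[4:8] != "^~\\&":
        problems.append("encoding characters (MSH-2) should be '^~\\&'")
    for seg in segments[1:]:
        if len(seg) < 3 or not seg[:3].isalnum():
            problems.append(f"malformed segment ID: {seg[:3]!r}")
    return problems

# Hypothetical RAD-1-style fragment, for illustration only:
sample = ("MSH|^~\\&|HIS|HOSP|RIS|HOSP|202401011200||ADT^A04|0001|P|2.3.1\r"
          "PID|||PATID1234||DOE^JOHN")
print(er7_sanity_check(sample))  # an empty list means the basic checks pass
```

                        A clean result here only means the file is plausibly ER7; the validators listed above check the actual transaction semantics.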

                        Evaluation

                        • The EVS Client creates a Permanent link to your results.
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.

                        EVS_HL7v3_Message_Validation

                        You will use the Gazelle EVSClient to validate HL7v3-based messages.  

                        These messages are applicable across many IHE profiles.  The messages you validate will depend upon the profile/actor pairs supported by your test system.

                        Instructions

                        • Access the Gazelle EVSClient application: https://gazelle.ihe.net/evs
                        • From the EVS Client user interface, select menu IHE-->HL7v3-->Validate
                        • Upload the XML file for the HL7 message produced by your application
                        • You have a choice of a schematron or Model Based Validation for many messages.  We encourage you to perform both, if available.
                        • Some transactions require actors to support multiple messages. If multiple messages are listed, test them all.
                        • The following schematron validations are available.  Note that the tool is updated over time, so there may be additional validators available in the tool itself.
                          • ITI-44  Patient Identity Feed (Register/Admit, Update, and Merge)
                          • ITI-45 PIXv3 Query Request
                          • ITI-45 PIXv3 Query Response
                          • ITI-47 PDQv3 Patient Demographics Query Response
                          • ITI-55 XCPD Cross-Gateway Patient Discovery Request
                          • ITI-55 XCPD Cross-Gateway Patient Discovery Response
                        • The following Model Based Validations are available:
                          • ITI-44 Patient Identity Feed (Add, Revise, and Merge record) 
                          • ITI-44 Patient Identity Feed Acknowledgement
                          • ITI-45 PIXv3 Query Request
                          • ITI-45 PIXv3 Query Response
                          • ITI-46 PIXv3 Update Notification
                          • ITI-46 PIXv3 Update Notification Acknowledgement
                          • ITI-47 PDQv3 Patient Demographics Query (incl continuation, cancellation)
                          • ITI-47 PDQv3 Patient Demographics Query Response
                          • PDQv3 Accept Acknowledgement
                          • ITI-55 XCPD Cross-Gateway Patient Discovery Query Request & deferred option 
                          • ITI-55 XCPD Cross-Gateway Patient Discovery Query Response
                          • ITI-56 XCPD Patient Location Query Request
                          • ITI-56 XCPD Patient Location Query Response
                          • XCPD Accept Acknowledgement
                        • Click on Validate.
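
                        Before uploading, it can be useful to confirm locally that the file is well-formed XML in the HL7 v3 namespace. A minimal sketch (the sample is a skeletal ITI-44 interaction element with content elided):

```python
# Well-formedness probe for an HL7 v3 message prior to upload.
# Illustrative only; the EVSClient's model-based validation is far deeper.
import xml.etree.ElementTree as ET

HL7V3_NS = "urn:hl7-org:v3"  # namespace used by HL7 v3 interaction payloads

def check_hl7v3(xml_text: str) -> str:
    """Parse the XML and return the root element's local name,
    raising if the root is not in the HL7 v3 namespace."""
    root = ET.fromstring(xml_text)
    ns, _, local = root.tag[1:].partition("}")  # tag looks like '{ns}local'
    if ns != HL7V3_NS:
        raise ValueError(f"unexpected root namespace: {ns!r}")
    return local

# Skeletal ITI-44 'Patient Registry Record Added' wrapper, payload elided:
sample = '<PRPA_IN201301UV02 xmlns="urn:hl7-org:v3" ITSVersion="XML_1.0"/>'
print(check_hl7v3(sample))  # -> PRPA_IN201301UV02
```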

                        Evaluation

                        • The EVS Client creates a Permanent link to your results.
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.

                        EVS_HPD_Message_Validation

                        You will use the Gazelle EVSClient to validate messages in the HPD profile.

                        The messages you validate will depend upon the profile/actor pairs supported by your test system.

                        Instructions

                        • Access the Gazelle EVSClient application: https://gazelle.ihe.net/evs
                        • From the EVS Client user interface, select menu IHE-->HPD-->Validate
                        • Upload the XML file for the message produced by your application
                        • Select the correct Model Based Validator from the dropdown list:
                          • ITI-58 Provider Information Query Request
                          • ITI-58 Provider Information Query Response
                          • ITI-59 Provider Information Feed Request
                          • ITI-59 Provider Information Feed Response
                          • DSMLv2 Batch Request
                          • DSMLv2 Batch Response
                        • Click on Validate.
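
                        The ITI-58/ITI-59 payloads are DSMLv2 batch requests and responses, so a quick local probe can confirm the root element is in the DSMLv2 core namespace before you upload. A minimal sketch (sample content elided):

```python
# Probe that an HPD message body is a DSMLv2 batch document.
# Illustrative only; the Model Based Validators check the real content.
import xml.etree.ElementTree as ET

DSML_NS = "urn:oasis:names:tc:DSML:2:0:core"

def dsml_root(xml_text: str) -> str:
    """Return the root element's local name, raising if it is not DSMLv2."""
    root = ET.fromstring(xml_text)
    ns, _, local = root.tag[1:].partition("}")
    if ns != DSML_NS:
        raise ValueError(f"not a DSMLv2 document (namespace {ns!r})")
    return local  # e.g. 'batchRequest' or 'batchResponse'

sample = f'<batchRequest xmlns="{DSML_NS}"/>'  # skeletal, content elided
print(dsml_root(sample))  # -> batchRequest
```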

                        Evaluation

                        • The EVS Client creates a Permanent link to your results.
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.

                        EVS_QIDO-RS_Query

                        This test uses the Gazelle EVS Tool to validate a QIDO-RS Query Request or Response message.

                        Actors tested:   An initiator or responder in the [RAD-129] QIDO-RS Query transaction

                        Instructions

                        • Access the Gazelle EVS application:  https://gazelle.ihe.net/evs/home.seam
                        • From the EVS user interface, select menu IHE-->DICOM Web - QIDO-RS Query-->Validate or IHE-->DICOM Web - QIDO-RS Response-->Validate
                        • Upload a file with the QIDO-RS Query request or response message produced by your application.
                        • From the dropdown list, select a validator that matches your message; the tool provides several validators for different messages in the WIA or AIR profile.
                        • Click on Validate.
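
                        As background, a QIDO-RS query is an HTTP GET against the DICOMweb search service. The sketch below shows how such a study-level query URL is composed; the base URL and parameter values are hypothetical placeholders, not the tool's expected input:

```python
# Composing a DICOMweb QIDO-RS search-for-studies URL, for comparison
# with the request your Query initiator emits.  Illustrative sketch only.
from urllib.parse import urlencode

def qido_studies_url(base: str, **params) -> str:
    """Build a QIDO-RS /studies search URL from query parameters."""
    return f"{base.rstrip('/')}/studies?{urlencode(params)}"

url = qido_studies_url(
    "https://pacs.example.org/dicomweb",  # hypothetical service root
    PatientID="PATID1234",
    StudyDate="20240101-20240131",        # date-range matching
    includefield="StudyDescription",
    limit=10,
)
print(url)
# The request would typically be sent with: Accept: application/dicom+json
```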

                        Evaluation

                        • The EVS Client creates a Permanent link to your results.
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.

                        EVS_Radiation_Dose_SR_Evaluation

                        Actors in the REM profile are required to create DICOM-compliant Radiation Dose SR objects.  This applies to Acquisition Modalities and also to Dose Information Reporters that perform de-identification for transaction [RAD-63] Submit Dose Information.

                        In this test, we validate your sample SRs using PixelMed's DoseUtility tool, one of the DICOM validators hosted in the Gazelle EVS tool.

                        Instructions

                        1. Access the EVS: https://gazelle.ihe.net/evs/home.seam
                        2. Select menu IHE-->DICOM-->Validate
                        3. Upload the Dose SR file you want to validate.
                        4. Then, select Pixelmed from the dropdown list of DICOM validator tools 
                        5. Click Execute
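
                        Before uploading, you can check locally that the file is a DICOM Part 10 file (a 128-byte preamble followed by the 'DICM' magic bytes). This says nothing about SR content or REM conformance; it is only a quick sketch, using a synthetic file in place of a real Dose SR:

```python
# Check for the DICOM Part 10 file signature before uploading a Dose SR.
# Illustrative only; PixelMed's DoseUtility does the real validation.

def is_part10_dicom(path: str) -> bool:
    """True if the file has a 128-byte preamble followed by 'DICM'."""
    with open(path, "rb") as f:
        header = f.read(132)
    return len(header) == 132 and header[128:132] == b"DICM"

# Synthetic probe file (a real Dose SR would come from your modality):
with open("probe.dcm", "wb") as f:      # hypothetical file name
    f.write(b"\x00" * 128 + b"DICM")
print(is_part10_dicom("probe.dcm"))     # -> True
```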
                        Evaluation
                         
                        Capture your results:
                        1. Select menu IHE / DICOM / Validation logs
                        2. Examine your results and find the Permanent link to the results
                        3. Finally, paste the Permanent Link into Gazelle Test Management as the results for this test.
                        • You are finished with this test when you have resolved all of the errors detected by the tool. If you disagree with the tool's results, you can write a short (1- or 2-line) description of why your object is correct and the DICOM tool software is in error.

                        EVS_SOLE_Event_Report_Check

                        Overview of the test

                        This test has two parts:

                        (1) First, the Event Reporter identifies the SOLE events that it supports.  This helps your Connectathon test partners and the Connectathon monitors understand the capabilities of your system.

                        (2)  Then, you will use the Gazelle EVSClient tool to test the content of the event record(s) you create against the DICOM schema & SOLE Event Requirements. This tool does not test the transport of the event record.

                        (1) Identify your SOLE Event Reports

                        • We use a Google Doc to collect this information from all SOLE Event Reporters.  Click HERE to open it.
                        • The rows in the table list the Baseline SOLE events from the SOLE Profile, Vol 3, Table 6.3.2-1.
                        • The columns hold information for each Connectathon test system.  Column C is an example.
                        • In the next available column, add your company and test system name, then mark the Events you support with an 'X'.

                        (2) Test your SOLE Event Reports

                         

                        1. In the Gazelle EVSClient, select menu IHE-->ATNA Audit messages-->Validate
                        2. Select the Add button, and upload the XML file for your event report
                        3. From the Model based validation dropdown list, select the entry that matches the event, eg "IHE - RAD - SOLE101 - Order Filled"
                        4. Select the Validate button.  
                        5. You should validate at least 3 event records; of course, we encourage you to do more.
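
                        Since the validator is reached via the ATNA Audit messages menu, your event record should be an audit-message-style XML document whose root element is AuditMessage. A minimal local probe (the sample is a skeletal, hypothetical record; the model-based validator checks the actual SOLE event requirements):

```python
# Structural probe of a SOLE event record before upload: check that the
# root element is AuditMessage.  Illustrative sketch only.
import xml.etree.ElementTree as ET

def audit_root_ok(xml_text: str) -> bool:
    """True if the XML root element's local name is AuditMessage."""
    root = ET.fromstring(xml_text)
    return root.tag.rpartition("}")[2] == "AuditMessage"

# Skeletal sample with most required content elided:
sample = ('<AuditMessage>'
          '<EventIdentification EventActionCode="E" '
          'EventDateTime="2024-01-01T12:00:00Z" EventOutcomeIndicator="0"/>'
          '</AuditMessage>')
print(audit_root_ok(sample))  # -> True
```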

                        Evaluation

                        • The EVS Client creates a Permanent link to your results.
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.
                        • You may report more than one result.

                        EVS_SVS_Message_Validation

                        You will use the Gazelle EVSClient to validate messages in the SVS profile.

                        The messages you validate will depend upon the profile/actor pairs supported by your test system.

                        Instructions

                        • Access the Gazelle EVSClient application: https://gazelle.ihe.net/evs
                        • From the EVS Client user interface, select menu IHE-->SVS-->Validate
                        • Upload the XML file for the message produced by your application
                        • Select the correct Model Based Validator from the dropdown list:
                          • ITI-48 Retrieve Value Set Request
                          • ITI-48 Retrieve Value Set Response
                          • ITI-60 Retrieve Multiple Value Sets Request
                          • ITI-60 Retrieve Multiple Value Sets Response
                        • Click on Validate.

                        Evaluation

                        • The EVS Client creates a Permanent link to your results.
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.

                        EVS_WADO-RAD-55

                        This tests the request in the WADO Retrieve [RAD-55] transaction.

                        In XDS-I or IOCM, the Imaging Document Consumer is required to support one or more of RAD-55, RAD-68 or RAD-16 (or other C-MOVE).

                        • So, if your Imaging Document Consumer does not support RAD-55, you should skip this test.

                        The aim of this test is to verify that the Imaging Document Consumer is able to create a valid WADO request (the list of parameters is well specified, there are no inconsistencies among them, etc.). To do so, we use the WADO request validator in the EVSClient tool.

                        In this test, we do not verify whether the server side handles the request correctly.

                        Instructions

                        • From the Gazelle EVS user interface, select menu IHE-->RAD-55 WADO Request-->Validate
                        • Paste/upload your request, select the correct validator, and Validate. (You can use any WADO back-end and any data set to create your request.)

                        Evaluation

                        • The EVS creates a Permanent link to your results.
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.

                        EVS_XD*_XC*_Message_Validation

                        You will use the Gazelle EVSClient to validate messages for the Cross-Enterprise (XD*) and Cross-Community (XC*) family of profiles.

                        The messages you validate will depend upon the profile/actor pairs supported by your test system.

                        Instructions

                        • Access the Gazelle EVSClient application: https://gazelle.ihe.net/evs
                        • From the EVS Client user interface, select menu IHE-->XD* Metadata-->Validate
                        • Upload the XML file for the message produced by your application
                        • Select the correct Model Based Validator from the dropdown list:
                          • ITI-18 Registry Stored Query Request
                          • ITI-18 Registry Stored Query Response
                          • ITI-32 Distribute Document Set on Media (metadata)
                          • ITI-38 Cross-Gateway Query Request
                          • ITI-38 Cross-Gateway Query Response
                          • ITI-39 Cross-Gateway Retrieve Request
                          • ITI-39 Cross-Gateway Retrieve Response
                          • ITI-41 Provide & Register Document Set Request
                          • ITI-41 XDR Provide & Register Document Set Limited Metadata Request
                          • ITI-41 Provide & Register Document Set Response
                          • ITI-42 Register Document Set Request
                          • ITI-42 Register Document Set Response
                          • ITI-43 Retrieve Document Set Request
                          • ITI-43 Retrieve Document Set Response
                          • ITI-51 Multi-Patient Query Request
                          • ITI-51 Multi-Patient Query Response
                          • ITI-61 Register On-Demand Document Entry Request
                          • ITI-61 Register On-Demand Document Entry Response
                          • ITI-62 Delete Document Set Request
                          • ITI-62 Delete Document Set Response
                          • ITI-63 Cross-Gateway Fetch Request
                          • ITI-63 Cross-Gateway Fetch Response
                          • ITI-79 Authorization Decisions Query Request
                          • ITI-79 Authorization Decisions Query Response
                          • PHARM-1 request
                          • PHARM-1 response
                          • RAD-68 Provide & Register Imaging Doc Set-MTOM/XOP Request
                          • RAD-68 Provide & Register Imaging Doc Set-MTOM/XOP Response
                          • RAD-69 Retrieve Imaging Document Set Request
                            • If your Imaging Doc Consumer does not support RAD-69, then skip this.
                          • RAD-69 Retrieve Imaging Document Set Response
                          • RAD-75 Cross-Gateway Retrieve Imaging Document Set Request
                          • RAD-75 Cross-Gateway Retrieve Imaging Document Set Response
                        • Click on Validate.
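
                        As background for the ITI-18 validators above, a Registry Stored Query request is an ebXML AdhocQueryRequest whose query parameters are carried as rim:Slot elements. The sketch below lists those slots; the sample is a skeletal FindDocuments query with hypothetical parameter values:

```python
# List the query-parameter slots in an ITI-18 AdhocQueryRequest.
# Illustrative sketch only; the EVSClient validates the real semantics.
import xml.etree.ElementTree as ET

QUERY_NS = "urn:oasis:names:tc:ebxml-regrep:xsd:query:3.0"
RIM_NS = "urn:oasis:names:tc:ebxml-regrep:xsd:rim:3.0"

def list_query_slots(xml_text: str) -> list[str]:
    """Return the names of all rim:Slot elements in the request."""
    root = ET.fromstring(xml_text)
    return [s.get("name") for s in root.iter(f"{{{RIM_NS}}}Slot")]

# Skeletal FindDocuments query; patient ID and OID are hypothetical.
sample = f'''<query:AdhocQueryRequest xmlns:query="{QUERY_NS}" xmlns:rim="{RIM_NS}">
  <query:ResponseOption returnType="LeafClass" returnComposedObjects="true"/>
  <rim:AdhocQuery id="urn:uuid:14d4debf-8f97-4251-9a74-a90016b0af0d">
    <rim:Slot name="$XDSDocumentEntryPatientId">
      <rim:ValueList><rim:Value>'pat1^^^&amp;1.2.3.4&amp;ISO'</rim:Value></rim:ValueList>
    </rim:Slot>
    <rim:Slot name="$XDSDocumentEntryStatus">
      <rim:ValueList><rim:Value>('urn:oasis:names:tc:ebxml-regrep:StatusType:Approved')</rim:Value></rim:ValueList>
    </rim:Slot>
  </rim:AdhocQuery>
</query:AdhocQueryRequest>'''
print(list_query_slots(sample))
```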

                        Evaluation

                        • The EVS Client creates a Permanent link to your results.
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.

                        EVS_XDW

                        Content Creator creates a Workflow Document and validates it

                        The Content Creator will create a workflow document produced by your application.  This could be a document as defined in the base XDW profile, or one of the specialized workflow documents such as XTHM, XBeR, and others.

                        Then you will use the XDW Validator in Gazelle EVS Client to verify your Workflow Document.

                        Finally, you will upload your sample document into Gazelle Test Management so that Content Consumers can access your sample and test with it prior to the Connectathon. 

                        Instructions

                        For this test, you will create a Workflow Document for a task that is relevant to your product/application; the contents of the task you complete should be in the context of the clinical functions your product performs.

                        For this test, you do not have to submit your workflow document to an XDS Repository/Registry.

                        (1) Once you have created/updated the workflow document, upload it into the Samples area of Gazelle Test Management:

                        • Select the Samples to share tab for your test system
                        • For the XDW entry (or XTHM, or XBeR... entry) select the 'Add sample' (+) icon.
                        • Upload the XML for your XDW document.  Name the XML file using this convention: <your-system-name>-<profile>.xml, eg EHR_XYXMedicalCo-XTHM.xml
                        • Select the "Validate" icon.

                        Alternative:  Access XDW Validator via the Gazelle External Validation Service (EVS):

                        • Access the External Validation Service: https://gazelle.ihe.net/evs
                        • Select menu:  IHE / XDW / Validate
                        • Upload the XML for your XDW document.
                        • Select the appropriate Validator from the 'Model Based Validation' drop-down list.
                        • Select the Validate button
                        • Next, find your results.  Select menu:  IHE / XDW / Validated XDW documents
                        • Use the interface to find your validated document.  This will show your results.

                        Evaluation

                        • The validation service creates a Permanent link to your results
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.

                        Your test partners that are Content Consumers will also be able to access your sample in Gazelle Test Management.

                        EVS_XDW-update

                        Content Updater updates a Workflow Document

                        In this test you will access a sample workflow made by a Content Creator; then you will update that document using your application, and use the XDW Validator provided in Gazelle EVS Client to validate your updated document.  

                        Finally, you will upload your 'updated' Workflow Document into the Samples area of Gazelle Test Management.  

                        Instructions

                        For this test, you will update a workflow document 

                        (1) For the base XDW profile, access this sample workflow document: XDW-Document-for-Updater.xml

                        --  If you are testing XBeR or XTHM, we do not have a generic sample for you to start from.  You may use a sample submitted by a 'Creator' vendor in your profile.

                        (2) Download the Workflow Document into your local environment, then update the document and add a task that is relevant to your product/application.

                        • Your new task should contain an input document, an output document, or both.

                        For this test, you do not have to submit your workflow document to an XDS Repository/Registry

                        (3) Once you have created the updated workflow document, upload your document into the Samples area of Gazelle Test Management:

                        • Select the Samples to share tab for your test system
                        • For the XDW entry, select the 'Add sample' (+) icon.
                        • Upload the XML for your XDW document.  Name the XML file using this convention:
                          • SYSTEMNAME_XDW.xml, where "SYSTEMNAME" is your test system's name in Gazelle, eg: EHR_XYZmed_XDW.xml
                        • Select "Pre-Connectathon" from the dropdown list on the summary of the sample.
                        • Select the "Validate" button.

                        Alternative:  Access XDW Validator via the Gazelle External Validation Service (EVS):

                        • Access the External Validation Service: https://gazelle.ihe.net/evs
                        • Select menu:  IHE / XDW / Validate
                        • Upload the XML for your XDW document.
                        • Select the Validate button
                        • Next, find your results.  Select menu:  IHE / XDW / Validated XDW documents
                        • Use the interface to find your validated document.  This will show your results.

                        Evaluation

                        • The EVS Client creates a Permanent link to your results.
                        • Paste the Permanent Link into Gazelle Test Management as the results for this test.

                        Your test partners that are Content Consumers will also be able to access your sample in Gazelle Test Management.

                        Attachment: XDW-Document-for-Updater.xml (3.9 KB)

                        HPD Simulator tests

                        This section contains test cases performed with the Healthcare Provider Directory (HPD) Simulator tool.

                        Tool: https://gazelle.ihe.net/HPDSimulator/home.seam

                        Tool information page: https://gazelle.ihe.net/content/hpd-simulator

                        16501: Pointer to HPD Simulator

                        We use this 'test' to inform you of the Gazelle HPD Simulator & Validator tool available for your testing. 

                        HPD actors simulated:

                        • Provider Information Consumer
                        • Provider Information Directory
                        • Provider Information Source

                        Location of the tool: https://gazelle.ihe.net/HPDSimulator/home.seam

                        Tool user manual: https://gazelle.ihe.net/content/hpd-simulator

                        We encourage you to test with the simulator prior to the Connectathon.

                        There are no results to upload into Gazelle Test Management for this test.

                        LBL Simulator tests

                        This section contains test cases performed with the LBL Simulator.

                        Tool: http://gazelle.ihe.net/LBLSimulator

                        Tool information page: http://gazelle.ihe.net/content/lbl-simulator

                        31001 : LBL Request Mode For Labeling Instruction (for Label Broker)

                        This test concerns the LB (Label Broker) actor. You will use the Order Manager tool to send a request to your system under test.

                        Instructions

                        Access the LBL Simulator tool at this location: Order Manager
                        If this is your first time with this tool, please read the user manual: Order Manager user manual
                        Please note that if you are logged in, your configurations will be private.

                        In this test, the SUT (System Under Test) must receive the labeling instructions from the LBL Simulator.

                        As the SUT implements the LB (the LB acts as a responder in this test):

                        1. Create or update the configuration corresponding to the SUT.
                        2. Then, go to the "Laboratory" menu entry and choose Label Information Provider > [LAB-61] page to begin the test. 
                        3. In the SUT Configurations drop-down list, select the SUT configuration.
                        4. Once you have selected the patient, hit the "Send" button. The simulator will send the message to the SUT with the labeling instructions.
                        5. Check that the simulator has properly received your acknowledgement message.

                        The messages exchanged between the simulator and the SUT can be found in the message table on the same page or from the HL7 Message menu entry.
                        Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (See the hl7v2 report tutorial for more details.)
                        If the validation report status is "passed" for both the message and the response, copy the "test result link" and paste it in Gazelle as the result of this test.
                        For further details, see this tutorial: How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?

                        Evaluation

                        • The validation status must be "passed" for both messages of the transaction, and each must have the correct message type according to IHE.
                        • The Acknowledgment code (MSA-1) must be "AA" in the acknowledgment message.
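
                        The Evaluation above requires MSA-1 to be "AA". A quick way to check this on an acknowledgment you have captured is to extract MSA-1 from the ER7 text; a minimal sketch (the sample ACK is hypothetical):

```python
# Extract the acknowledgment code (MSA-1) from an ER7-encoded ACK.
# Illustrative sketch; the simulator's validation report is authoritative.

def msa_1(ack: str) -> str:
    """Return field MSA-1 from the first MSA segment of the message."""
    for segment in ack.replace("\n", "\r").split("\r"):
        if segment.startswith("MSA"):
            return segment.split("|")[1]
    raise ValueError("no MSA segment found")

# Hypothetical acknowledgment message:
ack = ("MSH|^~\\&|LB|LAB|LIP|LAB|202401011200||ACK|42|P|2.5\r"
       "MSA|AA|41")
print(msa_1(ack))  # -> AA
```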

                        31002 : LBL Request Mode For Labeling Instruction (for Label Information Provider)

                        This test concerns only the LIP (Label Information Provider) actor. Your system under test will request the Order Manager tool (acting as Label Broker) to deliver the labels (LAB-61 transaction).

                        Instructions

                        Access the LBL Simulator tool at this location: Order Manager
                        If this is your first time with this tool, please read the user manual: Order Manager user manual
                        Please note that if you are logged in, your configurations will be private.

                        In this test, the SUT (System Under Test) must send the labeling instructions to the LBL Simulator.

                        As your system under test implements the LIP (the LIP acts as an initiator in this test):

                        1. You do not need to create a SUT configuration for your system. Just go to the "Laboratory" menu entry and choose Label Broker > Configuration page to begin your test.
                        2. Configure your system under test to send the messages to the IP address and port displayed on screen.
                        3. Don't forget to hit the "Refresh List" button after you have sent your message.
                        4. Check that the simulator has properly received and acknowledged your message.

                        The messages exchanged between the simulator and the SUT can be found in the message table on the same page or from the HL7 Message menu entry.
                        Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (See the hl7v2 report tutorial for more details.)
                        If the validation report status is "passed" for both the message and the response, copy the "test result link" and paste it in the comment section of the pre-connectathon log return page as the result for this test.
                        For further details, see this tutorial: How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?

                        Evaluation

                        • The validation status must be "passed" for both messages of the transaction, and each must have the correct message type according to IHE.
                        • The Acknowledgment code (MSA-1) must be "AA" in the acknowledgment message.

                        31003 : LBL Query Mode For Labeling Instruction (for Label Information Provider)

                        This test concerns only the LIP (Label Information Provider) actor. You will need to communicate with the LBL Simulator, in order to simulate the LAB-62 transaction of the LBL Profile.

                        Instructions


                        Access the LBL Simulator tool at this location: Order Manager
                        If this is your first time with this tool, please read the user manual: Order Manager User Manual
                        Please note that if you are logged in, your configurations will be private.

                        As your system implements the LIP (the LIP acts as a responder in this test):

                        1. Create or update the configuration corresponding to the SUT (System Under Test).
                        2. Then, go to the "Laboratory" menu entry and choose Label Broker > [LAB-62] page to begin your test.
                        3. In the SUT Configurations drop-down list, select your LB system.
                        4. In order to construct your request, you will need to fill in the request parameters table. You will find all of the information in the SUT system.

                        In this test, you must use the LBL Simulator to query the SUT. Send several messages using different parameters. All (required) possibilities are defined in the steps below:

                        • Step 1 : Label Broker (Simulator) queries the Label Information Provider (SUT) using the patient ID.
                        • Step 2 : Label Broker (Simulator) queries the Label Information Provider (SUT) using the Placer Visit Number.
                        • Step 3 : Label Broker (Simulator) queries the Label Information Provider (SUT) using the Placer Group Number.
                        • Step 4 : Label Broker (Simulator) queries the Label Information Provider (SUT) using the Placer Order Number.
                        • Step 5 : Label Broker (Simulator) queries the Label Information Provider (SUT) using the Filler Order Number.
                        • Step 6 : Label Broker (Simulator) queries the Label Information Provider (SUT) using the search period and patient id.
                        • Step 7 : Label Broker (Simulator) queries the Label Information Provider (SUT) using the search period and patient visit number.
                        • Step 8 : Label Broker (Simulator) queries the Label Information Provider (SUT) using a patient ID which is unknown by the LIP. (This is an error case, to test the answer of your LIP in this situation.)

                        For example, for step 1:

                        1. In the request parameters table, fill the Patient ID field with a patient ID from your SUT.
                        2. Then hit the "Send" button.
                        3. The LBL Simulator will send the message to your system and will get a response (if the SUT answers).
                        4. Check that the simulator has properly received your acknowledgement message.
                        5. The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry. Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (See the hl7v2 report tutorial for more details.)
                          If the validation report status is passed for the message and the response, copy the "test result link" and paste it in the comment section of the pre-connectathon log return page as the result for this test. For further details, see this tutorial : How enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages) ?

                        Repeat this for all steps, and don't forget to copy/paste the "test result link" for each one, noting which step it belongs to.

                        Evaluation

                        The points below must be verified for each step:

                        • The validation status must be "passed" for both messages of the transaction, and each message must have the correct message type according to IHE.
                        • The acknowledgment code (MSA-1) must be "AA" in the acknowledgment message.
                        • All steps must have been completed.
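                        The MSA-1 check in the evaluation above can also be automated on your side before submitting results. The sketch below is illustrative only; it assumes the default '|' field separator and '\r' segment separators, and the sample ACK content is hypothetical, not a conformant IHE message.

```python
def msa1_code(hl7_message):
    """Return the MSA-1 acknowledgment code of an HL7v2 message, or None."""
    # HL7v2 separates segments with \r; tolerate \n from copy/paste.
    for segment in hl7_message.replace("\n", "\r").split("\r"):
        fields = segment.split("|")
        if fields[0] == "MSA" and len(fields) > 1:
            return fields[1]          # MSA-1: acknowledgment code
    return None

# Hypothetical ACK used only to exercise the helper:
ack = "MSH|^~\\&|SIM|IHE|SUT|LAB|20240101000000||ACK|123|P|2.5\rMSA|AA|123"
print(msa1_code(ack))  # AA
```

A validation report with MSA-1 other than "AA" (for example "AE" or "AR") indicates the transaction would fail the evaluation criteria above.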

                        31004 : LBL Query Mode For Labeling Instruction (for Label Broker)

                        This test concerns the LB (Label Broker) actor. You will communicate with the LBL Simulator to test the LAB-62 transaction of the LBL Profile.

                        Instructions

                        Access the LBL Simulator tool at this location: Order Manager
                        If it is your first time with this tool, please read the user manual: Order Manager User Manual
                        Please note that if you are logged in, your configurations will be private. Requirements:

                        As your system implements the LB (the LB acts as an initiator in this test):

                        1. You do not need to register your SUT (System Under Test) configuration in the simulator. Just go to the "Laboratory" menu entry and choose the Label Information Provider > Configuration page to begin your test.
                        2. Configure your SUT to send messages to the IP address and port of the simulator.
                        3. Go to the Messages > HL7v2 messages page.

                        In this test, the SUT must query the LBL Simulator. Send several messages using different parameters. All required possibilities are defined in the steps below:

                        • Step 1 : Label Broker (SUT) queries the Label Information Provider (Simulator) using the patient ID.
                        • Step 2 : Label Broker (SUT) queries the Label Information Provider (Simulator) using the Placer Visit Number.
                        • Step 3 : Label Broker (SUT) queries the Label Information Provider (Simulator) using the Placer Group Number.
                        • Step 4 : Label Broker (SUT) queries the Label Information Provider (Simulator) using the Placer Order Number.
                        • Step 5 : Label Broker (SUT) queries the Label Information Provider (Simulator) using the Filler Order Number.

                        For example, for step 1:

                        1. The SUT queries the LBL Simulator using a patient ID (the SUT sends a message to the LBL Simulator). See the "Patient Information available for your Request to the simulator" panel to get the patient identifier. This panel gives information about the patients known to the LBL Simulator.
                        2. The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.
                        3. Check the simulator has properly received and acknowledged your message.
                        4. Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (See the hl7v2 report tutorial for more details.) If the validation report status is passed for the message and the response, copy the "test result link" and paste it in the comment section of the pre-connectathon log return page as the result for this test. For further details, see this tutorial : How enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages) ?

                        Repeat this for all steps, and don't forget to copy/paste the "test result link" for each one, noting which step it belongs to.

                        Evaluation

                        The points below must be verified for each step:

                        • The validation status must be "passed" for both messages of the transaction, and each message must have the correct message type according to IHE.
                        • The acknowledgment code (MSA-1) must be "AA" in the acknowledgment message.
                        • All steps must have been completed.
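                        When your Label Broker initiates the connection, the HL7v2 query is normally carried over MLLP (Minimal Lower Layer Protocol), which frames each message between a 0x0B start byte and a 0x1C 0x0D trailer. Below is a minimal framing/sending sketch, assuming a plain TCP endpoint; the message content is a placeholder, not a conformant LAB-62 query.

```python
import socket

MLLP_START, MLLP_END = b"\x0b", b"\x1c\x0d"

def mllp_wrap(message: str) -> bytes:
    """Frame an HL7v2 message for MLLP transport."""
    return MLLP_START + message.encode("utf-8") + MLLP_END

def send_message(host: str, port: int, message: str) -> bytes:
    """Send one framed message and return the raw framed response."""
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(mllp_wrap(message))
        return sock.recv(65536)

# Placeholder content only; a real query must follow the LAB-62 definition.
framed = mllp_wrap("MSH|^~\\&|SUT|LAB|SIM|IHE|20240101000000||QBP^Q11|42|P|2.5")
```

The host and port to use are the ones displayed on the simulator's Configuration page.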

                        31005 : LBL Labels And Containers Delivered (for Label Information Provider)

                        This test concerns the LIP (Label Information Provider) actor. Your SUT will receive the label delivered notification from the Order Manager tool acting as Label Broker.

                        In this test, the Label Broker notifies the Label Information Provider of the effective label printing and labeled-container production.
                        The LBL Simulator therefore needs to know the labeling instructions before it can send the notification message to the LIP.
                        Two steps are necessary in this test:

                        • First, the SUT (System Under Test) sends the labeling instructions to the LBL Simulator (see test LBL Request Mode For Labeling Instruction).
                        • Second, the LBL Simulator notifies the SUT of the effective label printing and labeled-container production (this test).

                        Instructions


                        Access the LBL Simulator tool at this location: Order Manager
                        If it is your first time with this tool, please read the user manual: Order Manager User Manual
                        Please note that if you are logged in, your configurations will be private.

                         

                        1. Create or update the configuration corresponding to your SUT from the SUT Configurations > HL7v2 responders page.
                        2. Go to the "Laboratory" menu entry and choose Label Broker > [LAB-63] page to begin your test.
                        3. In the SUT Configurations drop-down list, select your LIP system.
                        4. Then, in the "Table on the tubes label by the LB simulator" table, select the tube corresponding to the labeling instruction sent to the LBL Simulator in the first step of this test. You can use the filter option to find it easily.
                        5. Once you have selected the right tube to confirm, hit the "Send the confirmation of the selected labeled tubes" button. The LBL Simulator will send the message to your system.
                        6. Check that the simulator has properly received your acknowledgement message.
                        7. The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry. Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (See the hl7v2 report tutorial for more details.) If the validation report status is passed for the message and the response, copy the "test result link" and paste it in the comment section of the pre-connectathon log return page as the result for this test. For further details, see this tutorial : How enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages) ?

                        Evaluation

                        • The validation status must be "passed" for both messages of the transaction.
                        • Each message must have the correct message type according to IHE.
                        • The acknowledgment code (MSA-1) must be "AA" in the acknowledgment message.

                        31006 : LBL Labels And Containers Delivered (for Label Broker)

                        This test concerns the LB (Label Broker) actor. You will use the Order Manager tool to simulate the Label Information Provider actor.

                        In this test, the Label Broker notifies the Label Information Provider of the effective labeled-container production.
                        The SUT (System Under Test) therefore needs to know the labeling instructions before labeling the tubes and sending the notification message to the LIP.
                        Two steps are necessary in this test:

                        • First, the LBL Simulator sends the labeling instructions to the SUT (see test "LBL Request Mode For Labeling Instruction").
                        • Second, the SUT notifies the LBL Simulator of the effective label printing and labeled-container production.

                        Instructions


                        Access the LBL Simulator tool at this location: Order Manager
                        If it is your first time with this tool, please read the user manual: Order Manager User Manual
                        Please note that if you are logged in, your configurations will be private.

                         

                        1. Go to the "Laboratory" menu entry and choose Label Information Provider > Configuration page to begin your test.
                        2. Configure your SUT to send messages to the IP Address and Port displayed on screen.
                        3. Send a message to the LBL Simulator for the effective labels printing and labeled containers production.
                        4. Access the messages exchanged between your SUT and the simulator under Messages > HL7v2 messages.
                        5. Check that the simulator has properly received and acknowledged your message.
                        6. The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry. Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (See the hl7v2 report tutorial for more details.) If the validation report status is passed for the message and the response, copy the "test result link" and paste it in Gazelle as the result of this test. For further details, see this tutorial : How enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages) ?

                        Evaluation


                        • The validation status must be "passed" for both messages of the transaction, and each message must have the correct message type according to IHE.
                        • The acknowledgment code (MSA-1) must be "AA" in the acknowledgment message.

                        LCSD Simulator tests

                        This section contains test cases performed with the LCSD Simulator.

                        Tool: https://gazelle.ihe.net/LCSDSimulator

                        Tool information page: https://gazelle.ihe.net/content/lcsd-simulator

                        30801 : LCSD Code Set Exchange (for Code Set Consumer)

                        This test concerns the CSC (Code Set Consumer) actor. You will communicate with the LCSD Simulator to test the LAB-51 transaction of the LCSD Profile.

                        Instructions

                        Access the LCSD Simulator tool at this location: LCSD Simulator
                        If it is your first time with this tool, please read the user manual: LCSD Simulator User Manual
                        Please note that if you are logged in, your configurations will be private.
                           
                        As your system implements the CSC (the CSC acts as a responder in this test):

                        1. Create or update the configuration corresponding to the SUT (System Under Test).
                        2. Then, go to the "Simulators" menu entry and choose "CSM" page to begin your test.
                        3. In the SUT Configurations drop-down list, select your CSC system.
                        4. You will need to select the Code Set Category, then the Code Sets to send. Hit the "Send" button and the LCSD Simulator will send the message with selected code sets to your system.

                        Send at least one code set for each code set category (Battery, Calculated, Not Numeric and Numeric). Note that batch mode is not yet available in the LCSD Simulator.
                        All required possibilities are defined in the steps below.

                        • Step 1 : The CSM (Simulator) sends a Battery code set to the CSC (SUT).
                        • Step 2 : The CSM (Simulator) sends a Calculated code set to the CSC (SUT).
                        • Step 3 : The CSM (Simulator) sends a Not Numeric code set to the CSC (SUT).
                        • Step 4 : The CSM (Simulator) sends a Numeric code set to the CSC (SUT).

                        How to run and log these steps?
                        For example, for step 1:

                        1. Choose the "Battery" code set category and select one code set among the available code sets.
                        2. Then hit the "Send" button. The LCSD Simulator will send the message to your system and will get a response (if the SUT answers).
                        3. Check that the simulator has properly received your acknowledgement message.

                        The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.
                        Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (See the hl7v2 report tutorial for more details.)
                        If the validation report status is passed for the message and the response, copy the "test result link" and paste it in Gazelle.
                        For further details, see this tutorial : How enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages) ?

                        Repeat this for all steps, and don't forget to copy/paste the "test result link" for each one, noting which step it belongs to.

                        Evaluation

                        • The validation status shall be "passed" for both messages of the transaction.
                        • Each message shall have the correct message type according to IHE.
                        • The acknowledgment code (MSA-1) shall be "AA" in the acknowledgment message.

                        30802 : LCSD Code Set Exchange (for Code Set Master)

                        This test concerns only the CSM (Code Set Master) actor. You will communicate with the LCSD Simulator to test the LAB-51 transaction of the LCSD Profile.

                        Instructions


                        Access the LCSD Simulator tool at this location: LCSD Simulator
                        If it is your first time with this tool, please read the user manual: LCSD Simulator User Manual
                        Please note that if you are logged in, your configurations will be private.
                           
                        As your system implements the CSM (the CSM acts as an initiator in this test):

                        1. You do not need to create a SUT (System Under Test) configuration for your system. Just go to the "Simulators" menu entry and choose the "CSC" page to begin your test.
                        2. In the charset drop-down list, select the desired charset.
                        3. Use the IP address and port linked to this charset to send your message to the LCSD Simulator.
                        4. Don't forget to hit the "Refresh List" button after sending your message.

                        Send at least one code set for each code set category (Battery, Calculated, Not Numeric and Numeric). Note that batch mode is not yet available in the LCSD Simulator.
                        All required possibilities are defined in the steps below.

                        • Step 1 : The CSM (SUT) sends a Battery code set to the CSC (Simulator).
                        • Step 2 : The CSM (SUT) sends a Calculated code set to the CSC (Simulator).
                        • Step 3 : The CSM (SUT) sends a Not Numeric code set to the CSC (Simulator).
                        • Step 4 : The CSM (SUT) sends a Numeric code set to the CSC (Simulator).

                        How to run and log these steps?
                        For example, for step 1:

                        1. The SUT must send a code set from the "Battery" category.
                        2. The LCSD Simulator will respond with an acknowledgment.
                        3. Check the simulator has properly received and acknowledged your message. 

                        The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.
                        Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (See the hl7v2 report tutorial for more details.)
                        If the validation report status is passed for the message and the response, copy the "test result link" and paste it in Gazelle as the result of this test.
                        For further details, see this tutorial : How enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages) ?  

                        Evaluation

                        • The validation status shall be "passed" for both messages of the transaction.
                        • Each message shall have the correct message type according to IHE.
                        • The acknowledgment code (MSA-1) shall be "AA" in the acknowledgment message.
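                        When your system acts as a responder (as the CSC does in test 30801), it must return an acknowledgment with MSA-1 set to "AA" on success. The sketch below shows a simplified way to build such an ACK from the inbound MSH segment; the inbound sample is hypothetical, the timestamp is copied from the inbound message for brevity, and a production ACK must follow the full HL7/IHE requirements.

```python
def build_ack(inbound: str) -> str:
    """Build a minimal positive ACK for an inbound HL7v2 message (sketch)."""
    # Assumes '|' field separators; MSH-10 is the message control ID.
    msh = inbound.replace("\n", "\r").split("\r")[0].split("|")
    ack_msh = "|".join([
        "MSH", "^~\\&",
        msh[4], msh[5],   # we answer as the original receiver (MSH-5/6)
        msh[2], msh[3],   # back to the original sender (MSH-3/4)
        msh[6], "",       # timestamp copied from inbound (simplification)
        "ACK", msh[9], "P", msh[11],
    ])
    return ack_msh + "\r" + "MSA|AA|" + msh[9]

# Hypothetical inbound message, for illustration only:
inbound = "MSH|^~\\&|CSM|IHE|CSC|LAB|20240101000000||MFN^M08|77|P|2.5\rMFI|"
ack = build_ack(inbound)
```

Note how the sending and receiving application/facility fields are swapped in the ACK, and how MSA-2 echoes the inbound message control ID (MSH-10).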

                        Manual Scrutiny tests

                        This section contains test cases where sample messages and objects are:

                        • exchanged between test partners in advance of the Connectathon
                        • manually verified in the absence of a tool

                        EYECARE-15_Manual_Evaluation

                        EYECARE-15 is a Patient Registration message

                        In this test, we ask you to provide a sample message produced by your system, and we will validate your message by manual evaluation using the requirements in the Eye Care Technical Framework.

                        This is the same evaluation we will perform during the Connectathon.  This test enables you to prepare in advance.

                        Instructions
                        1. Capture an ADT^A04 message as produced by your system.  Create a file containing the contents of that message.
                        2. In Gazelle, select menu Connectathon-->List of samples.
                        3. On the "systems" dropdown list, select your system name
                        4. On the "Samples to share" tab, find the entry for "EYECARE-15"
                        5. Upload the file containing your HL7 message.  If you need help managing samples, refer to this help page.
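                        One pitfall when creating the sample file: HL7v2 uses the carriage return (\r) as its segment separator, but messages copied from logs or terminals often arrive with \n line endings. A small normalization sketch (the filename and the message fragment are placeholders, not a complete conformant ADT^A04):

```python
def save_hl7_sample(raw: str, path: str) -> None:
    """Normalize segment separators to \\r and write the message to a file."""
    normalized = raw.replace("\r\n", "\r").replace("\n", "\r").rstrip("\r") + "\r"
    with open(path, "wb") as f:
        f.write(normalized.encode("utf-8"))

# Hypothetical fragment captured with \n endings:
raw = "MSH|^~\\&|EYE|CLINIC|REG|HOSP|20240101000000||ADT^A04|1|P|2.5\nPID|1||PAT1\n"
save_hl7_sample(raw, "eyecare15_sample.hl7")
```

Uploading a file with consistent \r separators avoids spurious parsing complaints during manual evaluation.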

                        Finally, update the status of this pre-Connectathon test in Gazelle to signal that your message is ready for evaluation:

                        1. In Gazelle, select menu Connectathon-->Pre-Connectathon testing 
                        2. Find the entry for this test instance, and change the status to "Verified by Vendor"
                         
                        We examine your message using the same evaluation defined in the Connectathon test for this message:
                         
                        1. In Gazelle, select menu Connectathon-->Connectathon.
                        2. Find test "EYECARE-15_Scrutiny"
                        3. In the Evaluation section of that test, you will find the message segments and fields that we will look for in your message.
                         
                         

                        EYECARE-16_Manual_Evaluation

                        EYECARE-16 is an Appointment Scheduling Management message

                        In this test, we ask you to provide a sample message produced by your system, and we will validate your message by manual evaluation using the requirements in the Eye Care Technical Framework.

                        This is the same evaluation we will perform during the Connectathon.  This test enables you to prepare in advance.

                        Instructions
                        1. There are multiple SIU messages supported in the EYECARE-16 transaction.  You will capture these messages as produced by your system.  Create a file containing the contents of each message:
                          1. SIU^S12 New Appointment
                          2. SIU^S14 Modify Appointment
                          3. SIU^S15 Cancel Appointment
                          4. SIU^S17 Delete Appointment
                          5. SIU^S26 Patient No Show
                        2. In Gazelle, select menu Connectathon-->List of samples.
                        3. On the "systems" dropdown list, select your system name
                        4. On the "Samples to share" tab, find the entry for "EYECARE-16"
                        5. Upload the files containing your HL7 messages.  If you need help managing samples, refer to this help page.

                        Finally, update the status of this pre-Connectathon test in Gazelle to signal that your message is ready for evaluation:

                        1. In Gazelle, select menu Connectathon-->Pre-Connectathon testing 
                        2. Find the entry for this test instance, and change the status to "Verified by Vendor"
                         
                        We examine your message using the same evaluation defined in the Connectathon test for this message:
                         
                        1. In Gazelle, select menu Connectathon-->Connectathon.
                        2. Find test "EYECARE-16_Scrutiny"
                        3. In the Evaluation section of that test, you will find the message segments and fields that we will look for in your message.
                         
                         

                        EYECARE-17_Manual_Evaluation

                        EYECARE-17 is a Charge Posting message

                        In this test, we ask you to provide a sample message produced by your system, and we will validate your message by manual evaluation using the requirements in the Eye Care Technical Framework.

                        This is the same evaluation we will perform during the Connectathon.  This test enables you to prepare in advance.

                        Instructions
                        1. Capture a DFT^P03 message as produced by your system.  Create a file containing the contents of that message.
                        2. In Gazelle, select menu Connectathon-->List of samples
                        3. On the "systems" dropdown list, select your system name
                        4. On the "Samples to share" tab, find the entry for "EYECARE-17"
                        5. Upload the file containing your HL7 message.  If you need help managing samples, refer to this help page.

                        Finally, update the status of this pre-Connectathon test in Gazelle to signal that your message is ready for evaluation:

                        1. In Gazelle, select menu Connectathon-->Pre-Connectathon testing 
                        2. Find the entry for this test instance, and change the status to "Verified by Vendor"
                         
                        We examine your message using the same evaluation defined in the Connectathon test for this message:
                         
                        1. In Gazelle, select menu Connectathon-->Connectathon.
                        2. Find test "EYECARE-17_Scrutiny"
                        3. In the Evaluation section of that test, you will find the message segments and fields that we will look for in your message.
                         
                         

                        EYECARE-21_Manual_Evaluation

                        EYECARE-21 is a Procedure Scheduled message

                        In this test, we ask you to provide a sample message produced by your system, and we will validate your message by manual evaluation using the requirements in the Eye Care Technical Framework.

                        This is the same evaluation we will perform during the Connectathon.  This test enables you to prepare in advance.

                        Instructions
                        1. Capture an OMG^O19 message as produced by your system.  Create a file containing the contents of that message.
                        2. In Gazelle, select menu Connectathon-->List of samples
                        3. On the "systems" dropdown list, select your system name
                        4. On the "Samples to share" tab, find the entry for "EYECARE-21"
                        5. Upload the file containing your HL7 message.  If you need help managing samples, refer to this help page.

                        Finally, update the status of this pre-Connectathon test in Gazelle to signal that your message is ready for evaluation:

                        1. In Gazelle, select menu Connectathon-->Pre-Connectathon testing 
                        2. Find the entry for this test instance, and change the status to "Verified by Vendor"
                         
                        We examine your message using the same evaluation defined in the Connectathon test for this message:
                         
                        1. In Gazelle, select menu Connectathon-->Connectathon.
                        2. Find test "EYECARE-21_Scrutiny"
                        3. In the Evaluation section of that test, you will find the message segments and fields that we will look for in your message.
                         
                         

                        EYECARE-23-24_Manual_Evaluation

                        EYECARE-23 is XML for Refractive Measurement (no Pat ID)

                        EYECARE-24 is XML for Refractive Measurement (valid Pat ID)

                        In this test, we ask you to provide a sample message produced by your system, and we will validate your message by manual evaluation using the requirements in the Eye Care Technical Framework and in the JOIA 1.5 specification.

                        This is the same evaluation we will perform during the Connectathon.  This test enables you to prepare in advance.

                        Instructions
                        1. Capture the XML message as produced by your system.  Create an XML file containing the contents of that message.
                        2. In Gazelle,select menu Connectathon-->List of samples
                        3. On the "systems" dropdown list, select your system name
                        4. On the "Samples to share" tab, find the entry for "EYECARE-23" or "EYECARE-24" as applicable
                        5. Upload the file containing your XML message.  If you need help managing samples, refer to this help page.

                        Finally, update the status of this pre-Connectathon test in Gazelle to signal that your message is ready for evaluation:

                        1. In Gazelle, select menu Connectathon-->Pre-Connectathon testing 
                        2. Find the entry for this test instance, and change the status to "Verified by Vendor"
                         
                        We examine your message using the same evaluation defined in the Connectathon test for this message:
                         
                        1. In Gazelle, select menu Connectathon-->Connectathon.
                        2. Find test "EYECARE-23-24_Scrutiny"
                        3. In the Evaluation section of that test, you will find the content that we will look for in your XML.
                         
                         

                        MRRT_Report_Template_Evaluation

                        In this test, you will examine your sample MRRT Report Template using manual evaluation according to the requirements in the MRRT Profile.
                         

                        This is the same evaluation that a monitor will perform during the Connectathon.  This test enables you to prepare in advance.

                        Instructions
                        1. First, perform test MRRT_Sample_Exchange to ensure you have submitted an MRRT Report Template sample into Gazelle Test Management.
                        2. Next, find the manual evaluation criteria:  on the Test Execution page in Gazelle Test Management, find test MRRT_Report_Template_Structure.
                        3. In the Evaluation section of that test, you will find the content that we will look for in your template.  You should examine your template using those criteria and make repairs if necessary
                         
                        There is no result file to upload into Gazelle Test Management for this test.  The evaluation will occur during the Connectathon.
                         

                         
                         

                        RAD-2_with_CDS_Manual_Evaluation

                        In this test, we will evaluate a sample RAD-2 OMG message produced by your system.

                        Since CDS-OAT is a new profile, we do not yet have a tool to evaluate the specific requirements for RAD-2 in CDS-OAT, so we will validate your message using:

                        • the Gazelle EVS Client tool to validate the base requirements for RAD-2
                        • manual scrutiny of the message for requirements added for compliance with the CDS-OAT profile.

                        This is the same evaluation we will perform during the Connectathon.  This test enables you to prepare in advance.

                        Instructions
                        1. Capture a [RAD-2] OMG^O19 message as produced by your system.  Create a file containing the contents of that message.
                        2. In the EVS Client tool, select menu IHE-->HL7v2.x
                        3. Paste a copy of your OMG message into the tool and run the validator for RAD-2 OMG, HL7 v2.5.1.
                        4. Find the Permanent link to your validation result
                        5. Paste the Permanent Link into Gazelle Test Management as the results for this test.
                        Evaluation
                         
                        Evaluation will be performed during Connectathon.
                        • A monitor will examine the results from EVS Client (using the Permanent link you captured).  These are the baseline requirements for RAD-2.
                        • A monitor will also perform manual scrutiny of your OMG message using the requirements specific to CDS-OAT.  To see the evaluation criteria, find test "RAD-2_with_CDS_Scrutiny" on your Test Execution page in Gazelle Test Management.  In the Evaluation section of that test, you will find the message segments and fields that we will look for in your message.
                         
                         

                        RAD-3_with_CDS_Manual_Evaluation

                        In this test, we will evaluate a sample RAD-3 OMG message produced by your system.

                        Since CDS-OAT is a new profile, we do not yet have a tool to evaluate the specific requirements for RAD-3 in CDS-OAT, so we will validate your message using:

                        • the Gazelle EVS Client tool to validate the base requirements for RAD-3
                        • manual scrutiny of the message for requirements added for compliance with the CDS-OAT profile.

                        This is the same evaluation we will perform during the Connectathon.  This test enables you to prepare in advance.

                        Instructions
                        1. Capture a [RAD-3] OMG^O19 message as produced by your system.  Create a file containing the contents of that message.
                        2. In the EVS Client tool, select menu IHE-->HL7v2.x
                        3. Paste a copy of your OMG message into the tool and run the validator for RAD-3, HL7 v2.5.1.
                        4. Find the Permanent link to your validation result
                        5. Paste the Permanent Link into Gazelle Test Management as the results for this test.
                        Evaluation
                         
                        Evaluation will be performed during Connectathon.
                        • A monitor will examine the results from EVS Client (using the Permanent link you captured).  These are the baseline requirements for RAD-3.
                        • A monitor will also perform manual scrutiny of your OMG message using the requirements specific to CDS-OAT.  To see the evaluation criteria, find test "RAD-3_with_CDS_Scrutiny" on your Test Execution page in Gazelle Test Management.  In the Evaluation section of that test, you will find the message segments and fields that we will look for in your message.
                         
                         

                        RAD-4_with_CDS_Manual_Evaluation

                        In this  test, we will evaluate a sample RAD-4 OMI message produced by your system.

                        Since CDS-OAT is a new profile, we do not yet have a tool to evaluate the specific requirements for RAD-4 in CDS-OAT, so we will validate your message using:

                        • the Gazelle EVS Client tool to validate the base requirements for RAD-4
                        • manual scrutiny of the message for requirements added for compliance with the CDS-OAT profile.

                        This is the same evaluation we will perform during the Connectathon.  This test enables you to prepare in advance.

                        Instructions
                        1. Capture a [RAD-4] OMI message as produced by your system.  Create a file containing the contents of that message.
                        2. In the EVS Client tool, select menu IHE-->HL7v2.x
                        3. Paste a copy of your OMI message into the tool and run the validator for RAD-4 OMI, HL7 v2.5.1.
                        4. Find the Permanent link to your validation result
                        5. Paste the Permanent Link into Gazelle Test Management as the results for this test.
                        Evaluation
                         
                        Evaluation will be performed during Connectathon.
                        • A monitor will examine the results from EVS Client (using the Permanent link you captured).  These are the baseline requirements for RAD-4.
                        • A monitor will also perform manual scrutiny of your OMI message using the requirements specific to CDS-OAT.  To see the evaluation criteria, find test "RAD-4_with_CDS_Scrutiny" on your Test Execution page in Gazelle Test Management.  In the Evaluation section of that test, you will find the message segments and fields that we will look for in your message.
                         
                         

                        NIST FHIR Toolkit tests

                        This section contains pre-Connectathon test cases performed with the NIST FHIR Toolkit (aka "Asbestos") prior to IHE Connectathons.  Follow the link below.

                        FHIR_Toolkit_Tests_Per_Actor

                        The NIST FHIR Toolkit is used by developers before and during IHE North American and European Connectathons.  

                        The FHIR Toolkit is used by participants and monitors during Connectathon week, so participants should prepare by using the tools in their home lab prior to Connectathon.

                        Location of tool and associated documentation

                        Home page for documentation:  https://github.com/usnistgov/asbestos/wiki

                        Source code and release notes for FHIR Toolkit are found here:  https://github.com/usnistgov/asbestos/releases/

                        Installation Guide:  https://github.com/usnistgov/asbestos/wiki/Installation-Guide

                        IHE profiles/actors tested:

                        The NIST FHIR Toolkit contains tests for these actors:

                            • MHD - Mobile access to Health Documents:
                              • Document Source 
                              • Document Recipient

                        Test instructions for each actor:

                        The list of tests and detail test instructions for each actor reside within the toolkit package. 

                        --> You should perform the tests for the actor(s) and options you have implemented, e.g., support for Comprehensive metadata vs. Minimal metadata.

                        Preparatory test list in Gazelle Test Management:

                        In Gazelle Test Management, your preparatory test list will contain one test per actor you are testing with the FHIR Toolkit.  The naming convention is FHIR_Toolkit_<profile>-<actor>, e.g., FHIR_Toolkit_MHD-Doc_Source.

                        Because this tool is new, you will not upload any logs as part of preparatory testing.  Instead, you may add a note in the test instance that says, "I have successfully performed testing with the FHIR Toolkit."

                        Stay informed

                        Join the MHD implementer google group to receive information on updates to the tools and to follow Q&A within the developer community.  Users of FHIR Toolkit should subscribe.

                         

                         

                        NIST PCD Test tool

                        NIST maintains a tool known as the PCD Test Tool. It offers a suite of preparatory Connectathon tests for the HL7v2-based transactions of many profiles in the Devices domain.

                        As a preparation for the Connectathon, you are requested to execute the test cases relevant to your system under test.

                        When done, post evidence in Gazelle Test Management (e.g. screen shots) that the tests have been run.

                        The tool is available here: https://ihe-pcd.nist.gov/pcdtool

                        When you run tests in the "Context-based" tab, make sure you select the test plan for the latest development cycle.

                        NIST PIX/PDQ tool tests

                        January 2024:  Note that the PIX/PDQ tool has been retired by NIST.  IHE is grateful to NIST for their efforts to develop and support this tool.  It was used for over 10 years by participants testing their PIX/PDQ and XPID implementations at IHE Connectathons.

                        This section contains pre-Connectathon test cases performed with the NIST PIX/PDQ tools prior to IHE Connectathons.  Follow the link below. 

                        PIX-PDQ_Tool_Tests_Per_Actor

                        January 2024:  Note that the PIX/PDQ tool used in this test has been retired by NIST.  IHE is grateful to NIST for their efforts to develop and support this tool.  It was used for over 10 years by participants testing their PIX/PDQ and XPID implementations at IHE Connectathons.

                        The test description below is now DEPRECATED, but kept here for historical purposes.

                        ========

                        The NIST PIX/PDQ tool is used by developers before and during IHE North American and European Connectathons.  The tools enable developers to test actors in several IHE profiles listed below.

                        Location of tool and associated documentation: 

                        The tool is found here: https://pixpdqtests.nist.gov/pixpdqtool/

                        For help using the tool, see the Documentation tab in the tool

                        IHE profiles/actors tested:

                        The NIST PIX/PDQ Tool contains tests for these actors:

                            • PDQ - Patient Demographics Query (HL7v2)
                              • Patient Demographics Consumer
                              • Patient Demographics Consumer with Pediatric Demographics Option
                              • Patient Demographics Supplier
                              • Patient Demographics Supplier with Pediatric Demographics Option
                            • PDQv3 - Patient Demographics Query (HL7v3)
                              • Patient Demographics Consumer
                              • Patient Demographics Consumer with Pediatric Demographics Option
                              • Patient Demographics Supplier
                              • Patient Demographics Supplier with Pediatric Demographics Option
                            • PIX - Patient Identifier Cross Referencing (HL7v2)
                              • Patient Identifier (PIX) Source
                              • Patient Identifier (PIX) Consumer
                              • Patient Identifier (PIX) Cross-Reference Manager
                            • PIXv3 - Patient Identifier Cross Referencing (HL7v3)
                              • Patient Identifier (PIX) Source
                              • Patient Identifier (PIX) Consumer
                              • Patient Identifier (PIX) Cross-Reference Manager
                            • XPID - XAD-PID Change Management
                              • Patient Identifier Cross-Reference Manager (for ITI-64 only)

                        Test instructions for each actor:

                        The list of tests and the test instructions for each actor reside on the tool website.  

                        • Access the tool at the link above
                        • Select the Tests tab
                        • Select  HL7v2 or HL7v3
                        • From the "Actor" Dropdown list, select the profile/actor that you want to test
                        • After selecting the actor, you will be presented with a list of tests to perform.   When you select an individual test, the instructions will be displayed.

                        Evaluation

                        In Gazelle Test Management, your preparatory test list will contain one test per actor you are testing with the tools.  The naming convention is NIST_<profile>-<actor>, e.g., NIST_PIX-Manager

                        When you have successfully finished testing your actor, capture evidence of success (e.g., a screenshot).  

                        Upload that file into Gazelle Test Management as the result for this actor.

                         

                         

                        NIST XDS Toolkit tests

                        This section contains pre-Connectathon test cases performed with the NIST XDS Toolkit prior to IHE Connectathons.  Follow the link below.

                        XDS_Toolkit_Tests_Per_Actor

                        The NIST XDS Toolkit is used by developers before and during IHE North American and European Connectathons.

                        The XDS Tools will be used during Connectathon week, so participants should prepare by using the tools in their home lab prior to Connectathon.

                        Location of tool and associated documentation

                        Refer to the home page for the NIST Document Conformant Test Tools here: https://ihexds.nist.gov/

                        Source code and release notes for XDS Toolkit are found here: https://github.com/usnistgov/iheos-toolkit2/releases

                        IHE profiles/actors tested:

                        The NIST XDS Toolkit contains tests for these actors:

                            • XDS.b - Cross-Enterprise Document Sharing:
                              • Document Registry
                              • Document Registry for Metadata Update (ITI-57)
                              • Document Repository
                              • Integrated Source/Repository??
                            • XDR - Cross-Enterprise Document Reliable Interchange:
                              • Document Recipient
                            • XCA - Cross Community Access:
                              • Initiating Gateway
                              • Responding Gateway
                            • MPQ - Multi-patient Query
                              • Document Registry
                            • RMD - Remove Metadata and Documents:
                              • Document Registry
                              • Document Repository
                            • RMU - Restricted Metadata Update:
                              • Update Responder (XDS Persistence Option & XCA Persistence Option)

                        Test instructions for each actor:

                        The list of tests for each actor now resides within the toolkit package.  Likewise, the test definitions are distributed in the package.

                        --> You should perform the tests for the actor(s) you have implemented.  You will be instructed to create a 'test session' within the tool to represent the actor you are testing.  When you do this, all of your results are collected within one directory.

                        Evaluation:

                        In Gazelle Test Management, your preparatory test list will contain one test per actor you are testing with the XDS Toolkit.  The naming convention is XDS_Toolkit_<profile>-<actor>, e.g., XDS_Toolkit_XDS.b-Doc_Repository.

                        When you have successfully finished testing your actor, create a zip file of the result files located in toolkit in the {ExternalCache}/TestLogCache directory.  

                        Upload that zip file into Gazelle Test Management as the result for this actor.

                        Stay informed

                        Join the XDS implementer google group to receive information on updates to the tools and to follow Q&A within the developer community.  Users of XDS Toolkit should subscribe.

                         

                         

                        Patient Manager tests

                        Pre-Connectathon testing for systems implementing the PAM (Patient Administration Management) integration profile is performed against the Patient Manager simulator available at http://gazelle.ihe.net/PatientManager

                        Before starting your tests, please set up your system properly and/or give the correct information to the simulator so that it can access your system under test. We also strongly recommend reading the documentation located at http://gazelle.ihe.net/content/patient-manager-user-manual

                        Patient Demographic Suppliers

                        Read the configuration parameters of the Patient Demographic Consumer part of the simulator and configure your system to send messages to this part of the simulator. You will find this information following the menu: Patient Identification Management/Patient Demographic Consumer/Configuration and messages. Be careful to select the right character encoding before checking the receiving port. 

                        The messages you will send to the simulator will also be available on that page.

                        The pre-connectathon test dedicated to your system is located here.

                        Patient Demographic Consumers

                        Register your system under test into the Gazelle simulator following the menu SUT Configurations, then click on "Create a configuration". Select the SUT actor as "PDC" and select the encoding character set expected by your SUT otherwise your system will not be able to decode the messages. Make sure your system is available from the Internet and no firewall prevents Gazelle to access your tool.

                        The pre-connectathon test dedicated to your system is located here.

                        Patient Encounter Suppliers

                        Read the configuration parameters of the Patient Encounter Consumer part of the simulator and configure your system to send messages to this part of the simulator. You will find this information by following the menu: Patient Encounter Management/Patient Encounter Consumer/Configuration and messages. Be careful to select the right character encoding before checking the receiving port.

                        The messages you will send to the simulator will also be available on that page.

                        The pre-connectathon test dedicated to your system is located here.

                        Patient Encounter Consumers

                        Register your system under test into the Gazelle simulator following the menu SUT Configurations, then click on "Create a configuration". Select the SUT actor as "PEC" and select the right encoding character set otherwise you may receive messages your system will not be able to decode. Make sure your system is available from the Internet and no firewall prevents Gazelle to access your tool.

                        The pre-connectathon test dedicated to your system is located here.

                        12122: Patient Encounter Management

                        This test will be performed against the PAMSimulator tool. The goal of this test is to check the capability of your system to send/receive the messages defined within the ITI-31 (Patient Encounter Management) transaction. This test deals only with the basic set of trigger events defined for ITI-31: patient admission, registration, discharge, and their corresponding cancellations.

                         

                        You will retrieve the patients the simulator has sent or received under the "All patients" menu; for each patient, the list of related encounters is available under the tab entitled "Patient's encounters". You may want to use the filter to facilitate your search. If you are using the simulator as a PES, you can log onto the application using the CAS mechanism (use your Gazelle credentials) and easily retrieve the patients you have created within the application by checking the "see only patients created by me" checkbox. If you use the simulator as a PEC, the creator of the patients/encounters received by the simulator is the sending facility_sending application of your system under test.  Once you have found the right patient, click on the magnifying glass to get the permanent link to this patient; copy and paste it into the comment box of your pre-connectathon test instance. 

                        Before starting your test, please read the instructions at http://gazelle.ihe.net/content/pre-connectathon-tests/pam

                         

                        This test requires three patients, which we will name Patient1, Patient2 and Patient3. According to the PAM profile, there is no need for the consumer to be aware of these patients before receiving encounter notifications for them.

                        This test is divided into two parts:

                        Patient Encounter Supplier

                        You will use the PAM Simulator as a Patient Encounter Consumer. Go to the Patient Encounter/Management/Patient Encounter Consumer page in order to retrieve the configuration (IP address, port, receiving facility/application) of the simulator.

                        1. Admit patient

                        1. Within your system, create Patient1 and Patient2. The patient class for both patients needs to be "inpatient" (I). If you need inspiration for creating patient demographics, you can use the Demographic Data Server Gazelle application, or, if your system is also a Patient Demographic Consumer, use the PAM Simulator for receiving patients.
                        2. For each patient, send an ADT^A01^ADT_A01 message to the simulator in order to admit them as inpatients.
                        3. Check the simulator has properly received and acknowledged your messages. Copy and paste the permanent link to the test reports into Gazelle Test Management.
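                        The admit step above can be sketched in Python: HL7 v2 messages are sent to the simulator over MLLP (Minimal Lower Layer Protocol) framing. The host/port, application/facility names, and patient data below are hypothetical placeholders; take the real values from the simulator's Configuration page.

```python
import socket

# MLLP framing bytes: start block, end block, carriage return
SB, EB, CR = b"\x0b", b"\x1c", b"\x0d"

def mllp_frame(message: str) -> bytes:
    """Wrap an HL7 message in an MLLP start-block/end-block envelope."""
    return SB + message.encode("utf-8") + EB + CR

def send_hl7(host: str, port: int, message: str, timeout: float = 10.0) -> bytes:
    """Send one framed message and return the raw (still framed) response."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(mllp_frame(message))
        return sock.recv(65536)

# Hypothetical ADT^A01 admit message; replace the application/facility
# names and patient data with what your system actually produces.
adt_a01 = "\r".join([
    "MSH|^~\\&|MY_APP|MY_FACILITY|PAMSimulator|IHE|20240101120000||ADT^A01^ADT_A01|MSG0001|P|2.5",
    "EVN||20240101120000",
    "PID|1||PATID0001^^^MY_FACILITY&1.2.3.4&ISO||PATIENT^ONE||19800101|F",
    "PV1|1|I|WARD^ROOM^BED",
]) + "\r"

framed = mllp_frame(adt_a01)
# ack = send_hl7("simulator-host", 10010, adt_a01)  # use the simulator's real host/port
```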

                        2. Register patient

                        1. Within your system, create Patient3. Patient class for this patient must be "outpatient" (O). 
                        2. Send an ADT^A04^ADT_A01 message to the simulator to register this patient.
                        3. Check the simulator has properly received and acknowledged your message. Copy and paste the permanent link to the test report into Gazelle Test Management.

                        3. Cancel admission

                        1. Within your system, cancel the admission of Patient1.
                        2. Send an ADT^A11^ADT_A09 message to notify the simulator of the cancellation of admission for Patient1.
                        3. Check the simulator has properly received and acknowledged your message. Copy and paste the permanent link to the test report into Gazelle Test Management. 

                        4. Discharge patient

                        1. Within your system, end the encounter of Patient2.
                        2. Send an ADT^A03^ADT_A03 message to notify the simulator of the ending of the encounter for Patient2.
                        3. Check the simulator has properly received and acknowledged your message. Copy and paste the permanent link to the test report into Gazelle Test Management.

                        5. Cancel discharge

                        1. Within your system, cancel the discharge of Patient2.
                        2. Send an ADT^A13^ADT_A01 message to notify the simulator of the cancellation of discharge for Patient2.
                        3. Check the simulator has properly received and acknowledged your message. Copy and paste the permanent link to the test report into Gazelle Test Management.
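                        Each step above asks you to check that the simulator received and acknowledged your message. The acknowledgment code lives in MSA-1 of the returned HL7 ACK (AA = application accept, AE = application error, AR = application reject). A minimal sketch, using a hypothetical ACK as the simulator might return it:

```python
def ack_code(ack_message: str) -> str:
    """Return MSA-1 (the acknowledgment code) from an HL7 ACK message."""
    for segment in ack_message.split("\r"):
        if segment.startswith("MSA"):
            return segment.split("|")[1]
    raise ValueError("no MSA segment found in ACK")

# Hypothetical ACK; field values are placeholders
sample_ack = "\r".join([
    "MSH|^~\\&|PAMSimulator|IHE|MY_APP|MY_FACILITY|20240101120001||ACK^A01^ACK|ACK0001|P|2.5",
    "MSA|AA|MSG0001",
])
print(ack_code(sample_ack))  # -> AA
```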

                        In order to help the connectathon manager with checking this test, go to "All patients" page and retrieve Patient1, Patient2, Patient3. For each of those patients, copy the permanent link and paste it in Gazelle Test Management.

                         


                         

                         

                        Patient Encounter Consumer

                        You will use the PAM Simulator as a Patient Encounter Supplier. You may want to log onto the application to easily retrieve the patients/encounter you will create. Go to the Patient Encounter Management/Patient Encounter Supplier page.

                        1. Admit patient

                        In this step, you are going to create Patient1 and Patient2 and to admit them as inpatients.

                        1. Select your system under test in the drop-down menu.
                        2. Set the category of event to "Admit  inpatient" and the action to perform to "INSERT (A01)"
                        3. Select the "Generate a patient" option and in the "Patient generation with DDS" panel, select a country and click on the "Generate patient" button.
                        4. Fill out the encounter showing up below the patient information, do not forget to choose "Inpatient" as patient class.
                        5. Hit the "Send" button at the bottom of the page and check the simulator receives an acknowledgment from your system.
                        6. Copy and Paste the permanent link to the test report into Gazelle Test Management. 
                        7. Hit the "Perform another test" button and redo steps 1 to 6 for admitting Patient2.
                        8. Take a screenshot of your application that shows a proof of admission of those two patients and upload it into Gazelle Test Management.

                        2. Register patient

                        In this step, you are expected to create Patient3 and to register them as outpatient.

                        1. Select your system under test in the drop-down menu.
                        2. Set the category of event to "Register  outpatient" and the action to perform to "INSERT (A04)"
                        3. Select the "Generate a patient" option and in the "Patient generation with DDS" panel, select a country and click on the "Generate patient" button.
                        4. Fill out the encounter showing up below the patient information, do not forget to choose "Outpatient" as patient class.
                        5. Hit the "Send" button at the bottom of the page and check the simulator receives an acknowledgment from your system.
                        6. Copy and Paste the permanent link to the test report into Gazelle Test Management. 
                        7. Take a screenshot of your application that shows a proof of the registration of this patient and upload it into Gazelle

                        3. Cancel admission

                        In this step, you are expected to cancel the admission of Patient1.

                        1. Select your system under test in the drop-down menu.
                        2. Set the category of event to "Admit  inpatient" and the action to perform to "CANCEL (A11)"
                        3. Select Patient1 from the list displayed and cancel his/her admission.
                        4. Copy and Paste the permanent link to the test report into Gazelle Test Management
                        5. Take a screenshot of your application that shows a proof of the cancellation of this admission.

                        4. Discharge patient

                         

                        In this step, you are expected to discharge Patient2.

                        1. Select your system under test in the drop-down menu.
                        2. Set the category of event to "Discharge patient" and the action to perform to "INSERT (A03)"
                        3. Select Patient2 from the list displayed, then, select the encounter to end. Do not forget to fill out the discharge date/time  before clicking on the "Send" button.
                        4. Copy and Paste the permanent link to the test report into Gazelle Test Management
                        5. Take a screenshot of your application that shows a proof that the patient is discharged.

                        5. Cancel discharge

                        In this step, you are expected to cancel the discharge of Patient2.

                        1. Select your system under test in the drop-down menu.
                        2. Set the category of event to "Discharge  patient" and the action to perform to "CANCEL (A13)"
                        3. Select Patient2 from the list displayed and select the encounter to reopen. Cancel the discharge.
                        4. Copy and Paste the permanent link to the test report into Gazelle Test Management
                        5. Take a screenshot of your application that shows a proof that the encounter has been reopened.

                        In order to help the connectathon manager with checking this test, go to "All patients" page and retrieve Patient1, Patient2, Patient3. For each of those patients, copy the permanent link and paste it in Gazelle Test Management.

                         

                        12104: Patient Identification Management

                        The aim of this pre-connectathon test is to check that your system under test is able to receive/send the messages exchanged within the ITI-30 (Patient Identification Management) transaction. Here, we are testing your capability to create a new patient, update his/her demographics and identifiers and, depending on the set of options you support, to merge and/or link patient demographic information.

                        Both actors (Patient Demographic Consumer and Patient Demographic Supplier) will be asked to test against the PAM Simulator. For each step of this test, you are expected to provide the permanent link to the patient sent or received by the simulator and the permanent link to the test report.

                        You will retrieve the patients the simulator has sent or received under the "All patients" menu. You may want to use the filter to facilitate your search. If you are using the simulator as a PDS or PES, you can log onto the application using the CAS mechanism (use your Gazelle credentials) and easily retrieve the patients you have created within the application by checking the "see only patients created by me" checkbox. If you use the simulator as a PDC or PEC, the creator of the patients received by the simulator is the sending facility_sending application of your system under test.  Once you have found the right patient, click on the magnifying glass to get the permanent link to this patient; copy and paste it into the comment box of your pre-connectathon test instance. 

                        The test report is also available through a permanent link. Go to the "HL7 Messages" page and select the message related to the current step; there you will find the link to the test report.

                        Before starting your test, please read the instructions at http://gazelle.ihe.net/content/pre-connectathon-tests/pam

                        Test definition for Patient Demographic Consumer actor

                        Test definition for Patient Demographic Supplier actor

                        Patient Demographic Consumer

                        You will use the PAM Simulator as a Patient Demographic Supplier. You may want to log onto the application to easily retrieve the patients you will create. You will have to switch among the pages available under the Patient Identification Management/Patient Demographic Supplier menus.

                        1. Patient creation

                        In this step, you are expected to send to your simulator two new patients (ADT^A28^ADT_A05 messages). Go to Patient Identification Management/Patient Demographic Supplier/Create new Patient. 

                        1. Select your system under test in the drop-down menu.
                        2. Select "generate a patient" option. 
                        3. Within the "Patient Generation with DDS" panel, set the creation criteria of your choice and click the "Generate patient"  button. If you need your patient to have a specific identifier, then click on the "Modify patient data"  button and go to the Patient's identifiers tab and update the identifiers in a way your system under test will accept the patient.
                        4. Finally, hit the "Send ADT^A28^ADT_A05" button. 
                        5. Make sure the simulator receives an acknowledgment from your system and that the patient is properly registered in your system.
                        6. Take a screenshot of your application or your database as proof that the patient was properly registered. Retrieve the permanent links to the test report and to the patient and copy them into Gazelle Test Management with the comment "Patient creation". This patient will be referred to as Patient1 below.
                        7. Hit the "Create another patient" button and redo steps 2 to 5. The aim of this second run is to create another patient who will be used later for the merge or link steps. You only need to enter into Gazelle Test Management the permanent link to this new patient. This patient will be referred to as Patient2 below.
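                        The creation step above relies on your PDC receiving an ADT^A28 over MLLP and returning an acknowledgment. As an illustration only (not a requirement of the tool), a minimal sketch of building that ACK might look like the following; the sample message content, timestamps, and application names are invented for the example.

```python
# Illustrative sketch only: parse an inbound ADT^A28 and build the HL7 v2
# ACK the simulator expects back. On the wire, MLLP framing wraps the
# payload: <VT> message <FS><CR>.
VT, FS, CR = b"\x0b", b"\x1c", b"\x0d"

def build_ack(message: str, ack_code: str = "AA") -> str:
    """Build an ACK whose MSA-2 echoes the inbound MSH-10 control id."""
    msh = message.split("\r")[0].split("|")
    # Swap sender/receiver and echo the HL7 version (MSH-12, index 11 here).
    ack_msh = "|".join(["MSH", "^~\\&", msh[4], msh[5], msh[2], msh[3],
                        "20240101120000", "", "ACK^A28^ACK",
                        "ACK" + msh[9], "P", msh[11]])
    return ack_msh + "\r" + "MSA|%s|%s" % (ack_code, msh[9])

# Sample inbound message (demographics trimmed for brevity)
inbound = ("MSH|^~\\&|PAMSIM|IHE|MYAPP|MYFAC|20240101||"
           "ADT^A28^ADT_A05|MSG0001|P|2.5\r"
           "PID|1||DDS-1234^^^DDS&1.3.6.1&ISO||DOE^JANE")
ack = build_ack(inbound)
# ack ends with "MSA|AA|MSG0001"
```

A production consumer would of course use a real HL7 library and persist the patient before acknowledging.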

                        2. Update patient information

                        In this step, you are expected to update the first name of Patient1 and send the notification to your system under test. Go to Patient Identification Management/Patient Demographic Supplier/ Update patient information.

                        1. Select your system under test in the drop-down menu.
                        2. Retrieve Patient1 and hit the "update" button in order to edit his/her demographics. Update the first name of this patient and click on the "Send ADT^A31^ADT_A05" button.
                        3. Make sure the simulator receives an acknowledgment from your system and that the patient is properly updated in your system.
                        4. Take a screenshot of your application or your database as proof that the patient was properly updated. Retrieve the permanent links to the test report and to the patient and copy them into Gazelle Test Management with the comment "Update patient information".

                        3. Change patient's identifier list

                        In this step, you are expected to change one of the identifiers of Patient1. Go to Patient Identification Management/Patient Demographic Supplier/Change patient identifier list.

                        1. Select your system under test in the drop-down menu
                        2. Retrieve the patient you have just edited and hit the "update patient identifier list" button.
                        3. Change one of the identifiers of the patient, for example replace "DDS" by "NW" in the ID part of the identifier (DDS-1234 becomes NW-1234). Click on the "Send ADT^A47^ADT_A30" button.
                        4. Make sure the simulator receives an acknowledgment from your system and that the patient identifier is properly updated in your system.
                        5. Take a screenshot of your application or your database as proof that the patient identifier was properly updated. Retrieve the permanent links to the test report and to the patient and copy them into Gazelle Test Management with the comment "Change patient identifier list".

                        4. Merge patients (if option supported by your SUT)

                        In this step, you will reuse Patient1 and Patient2 and merge them. Go to Patient Identification Management/Patient Demographic Supplier/Merge patients.

                        1. Select your system under test in the drop-down menu.
                        2. Retrieve the two patients to merge. 
                        3. Drag and drop Patient2 to the first box (Patient with incorrect identifier)
                        4. Drag and drop Patient1 to the second box (Target patient) and hit the "Send ADT^A40^ADT_A39"  button.
                        5. Make sure the simulator receives an acknowledgment from your system and that the patients are properly merged within your system (Patient1 must remain).
                        6. Take a screenshot of your application or your database as proof that the patients were properly merged. Retrieve the permanent links to the test report and to Patient2 and copy them into Gazelle Test Management with the comment "Merge patients".
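                        On the consumer side, the merge notification in the step above arrives as an ADT^A40^ADT_A39 in which PID-3 identifies the surviving patient (Patient1) and MRG-1 the patient being retired (Patient2). A minimal parsing sketch, with sample identifiers invented for illustration:

```python
# Illustrative sketch: extract the surviving and retired identifiers from
# an inbound ADT^A40^ADT_A39 merge notification. PID-3 carries the patient
# to keep (Patient1); MRG-1 carries the identifier to retire (Patient2).
def parse_merge(message: str) -> dict:
    result = {}
    for seg in message.split("\r"):
        parts = seg.split("|")
        if parts[0] == "PID":
            result["keep"] = parts[3].split("^")[0]
        elif parts[0] == "MRG":
            result["retire"] = parts[1].split("^")[0]
    return result

a40 = ("MSH|^~\\&|PAMSIM|IHE|MYAPP|MYFAC|20240101||ADT^A40^ADT_A39|M1|P|2.5\r"
       "EVN|A40|20240101\r"
       "PID|1||DDS-0001^^^DDS&1.3.6.1&ISO||DOE^JANE\r"
       "MRG|DDS-0002^^^DDS&1.3.6.1&ISO")
# parse_merge(a40) -> {'keep': 'DDS-0001', 'retire': 'DDS-0002'}
```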

                        5. Link patients (if option supported by your SUT)

                        In this step, you will reuse Patient1 and Patient2 and link them. If your SUT supports both the merge and link options and you have already performed step 4, please create a third patient to replace Patient2 in this test. Go to Patient Identification Management/Patient Demographic Supplier/Link Unlink patients.

                        1. Select your system under test in the drop-down menu.
                        2. Retrieve the patients to link.
                        3. Drag and drop Patient1 to the first box (Patient One)
                        4. Drag and drop Patient2 to the second box (Patient Two) and hit the "Send ADT^A24^ADT_A24" button.
                        5. Make sure the simulator receives an acknowledgment from your system and that the patients are properly linked within your system.
                        6. Take a screenshot of your application or your database as proof that the patients were properly linked. Retrieve the permanent links to the test report and to Patient2 and copy them into Gazelle Test Management with the comment "Link patients".

                         

                        Patient Demographic Supplier

                        You will use the PAM Simulator as a Patient Demographic Consumer. The creator of the patients you will send to the simulator is derived from the sending facility and sending application values contained in the received HL7 messages. The configuration of this part of the simulator is available under Patient Identification Management/Patient Demographic Consumer/Configuration and messages.

                        1. Patient creation

                        1. Within your system, create two patients. If you need sample demographics, you can use the Demographic Data Server application to generate them. The first name of the second patient (referred to as Patient2 below; the other one is Patient1) shall be "Wrong". Note that the simulator accepts patient identifiers from any assigning authority.
                        2. Send an ADT^A28^ADT_A05 message to the simulator for each of the patients.
                        3. In the simulator, hit the "Refresh List" button; you should see the messages you have sent to the simulator. Open those messages and copy and paste the permanent links to the test reports into Gazelle Test Management.
                        4. Go to the "All patients" page and retrieve the patients you have sent to the simulator. Copy and paste their permanent links into Gazelle Test Management.
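                        For reference, the admission message in step 2 can be assembled and sent over MLLP along the lines of the sketch below. This is an illustration only, not the tool's required content: the host, port, application/facility names, and identifier assigning authority are placeholders you must replace with the values from the simulator's configuration page.

```python
import socket

VT, FS, CR = b"\x0b", b"\x1c", b"\x0d"   # MLLP framing bytes

def build_a28(pid: str, family: str, given: str) -> str:
    """Assemble a minimal ADT^A28^ADT_A05 (segments trimmed for brevity)."""
    return "\r".join([
        "MSH|^~\\&|MYAPP|MYFAC|PAMSimulator|IHE|20240101120000||"
        "ADT^A28^ADT_A05|MSG0001|P|2.5",
        "EVN|A28|20240101120000",
        "PID|1||%s^^^MYAUTH&1.2.3.4&ISO||%s^%s" % (pid, family, given),
        "PV1|1|N",
    ])

def send_mllp(host: str, port: int, message: str) -> str:
    """Frame the message with MLLP bytes, send it, and return the raw ACK."""
    with socket.create_connection((host, port), timeout=10) as s:
        s.sendall(VT + message.encode("utf-8") + FS + CR)
        return s.recv(65535).strip(VT + FS + CR).decode("utf-8")

msg = build_a28("NW-1234", "DOE", "JANE")
# ack = send_mllp("pam-simulator-host", 10101, msg)  # placeholder endpoint
```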

                        2. Update patient information

                        1. Within your system, change the first name of Patient1 to "Updated".
                        2. Send an ADT^A31^ADT_A05 message to the simulator to notify it of the update of Patient1.
                        3. In the simulator, hit the "Refresh List" button; you should see the message you have sent to the simulator. Open the message and copy and paste the permanent link to the test report into Gazelle Test Management.
                        4. Go to the "All patients" page and retrieve the patient you have updated. Copy and paste its permanent link into Gazelle Test Management.

                        3. Change patient's identifier list

                        1. Within your system, update the identifier of Patient1 by prefixing the ID with NW.
                        2. Send an ADT^A47^ADT_A30 message to the simulator to notify it of the update of Patient1.
                        3. In the simulator, hit the "Refresh List" button; you should see the message you have sent to the simulator. Open the message and copy and paste the permanent link to the test report into Gazelle Test Management.
                        4. Go to the "All patients" page and retrieve the patient you have updated. Copy and paste its permanent link into Gazelle Test Management.

                        4. Merge patients (if option supported by your SUT)

                        1. Within your system, merge Patient1 and Patient2 in such a way that Patient1 remains.
                        2. Send an ADT^A40^ADT_A39 message to the simulator to notify it of the merging of those patients.
                        3. In the simulator, hit the "Refresh List" button; you should see the message you have sent to the simulator. Open the message and copy and paste the permanent link to the test report into Gazelle Test Management.
                        4. Go to "All patients" page and retrieve Patient2. Copy and paste its permanent link into Gazelle Test Management.

                        5. Link patients (if option supported by your SUT)

                        1. Within your system, link Patient1 with Patient2.
                        2. Send an ADT^A24^ADT_A24 message to the simulator to notify it of the linking of these patients.
                        3. In the simulator, hit the "Refresh List" button; you should see the message you have sent to the simulator. Open the message and copy and paste the permanent link to the test report into Gazelle Test Management.
                        4. Go to Patient Identification Management/Patient Demographic Consumer/Received Patient links and check that the link has been properly created.

                        12123: Inpatient/Outpatient Encounter Management

                        This test deals with the subset of trigger events defined for the Inpatient/Outpatient Encounter Management option of the ITI-31 transaction. As a reminder, here is the list of events your system must support to fulfill the requirements of this option:

                        • Pre-admit patient (A05/A38)
                        • Transfer patient (A02/A12)
                        • Change inpatient to outpatient (A07)
                        • Change outpatient to inpatient (A06)

                        This test is written for both the Patient Encounter Supplier and the Patient Encounter Consumer; refer to the relevant section below according to the actor(s) your system under test supports.

                        Patient Encounter Supplier

                        In this test, we check the capability of your system under test to send messages notifying the PEC actor of new events. You are asked to test against the PAM Simulator. Your first task is to configure your system under test to send its messages to the PAM Simulator. To do so, retrieve the configuration of the PEC part of the simulator under Patient Encounter Management/Patient Encounter Consumer/Configuration. Do not forget to select the right character encoding before specifying the port to your system.

                        1. Initialize test

                        In this first step, you will feed the PAMSimulator with a new patient and encounter.

                        1. Create a new patient and a new encounter (patient class = I) within your system.
                        2. Send an ADT^A01^ADT_A01 message to admit the patient.
                        3. Retrieve this patient in the simulator (use the All patients menu) and copy and paste the permanent link to the patient in your pre-connectathon logs.

                        2. Transfer patient and cancel movement

                        Once the patient is admitted, we will transfer him/her to a new bed.

                        1. Within your system, update the bed (PL-3) in the patient location.
                        2. Send an ADT^A02^ADT_A02 message to the simulator to transfer the patient to this new bed.
                        3. Check that the simulator acknowledges the message with MSA-1 = AA.
                        4. Tell your system under test that you have made a mistake and want to cancel the transfer.
                        5. Send an ADT^A12^ADT_A12 message to cancel the transfer.
                        6. Check that the simulator acknowledges the message with MSA-1 = AA.
                        7. Go to HL7 messages menu of the simulator and retrieve the two messages you have sent to the simulator. Copy the permanent links to the test report and paste them in your pre-connectathon logs.
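                        Several steps in this test ask you to check that the simulator's acknowledgment carries MSA-1 = AA. If your system logs raw ACKs, a small helper like the following sketch can extract that field (the sample ACK is invented for illustration):

```python
# Illustrative helper: extract MSA-1 from an HL7 v2 acknowledgment.
def ack_code(ack: str) -> str:
    for seg in ack.replace("\n", "\r").split("\r"):
        if seg.startswith("MSA|"):
            return seg.split("|")[1]   # "AA" = accepted, "AE"/"AR" = error
    return ""

sample_ack = ("MSH|^~\\&|PAMSIM|IHE|MYAPP|MYFAC|20240101||ACK^A02^ACK|A1|P|2.5\r"
              "MSA|AA|MSG0042")
# ack_code(sample_ack) -> "AA"
```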

                        3. Change inpatient to outpatient

                        In this step, we will change the patient class to outpatient.

                        1. Within your system, change the patient class to outpatient (code = O).
                        2. Send an ADT^A07^ADT_A06 message to the simulator to update the patient class.
                        3. Check that the simulator acknowledges the message with MSA-1 = AA.
                        4. Go to the HL7 messages menu of the simulator and retrieve the message you have sent to the simulator. Copy the permanent link to the test report and paste it in your pre-connectathon logs.

                        4. Change outpatient to inpatient

                        In this step, we will change back the patient class to inpatient

                        1. Within your system, set the patient class back to inpatient (code = I).
                        2. Send an ADT^A06^ADT_A06 message to the simulator to update the patient class.
                        3. Check that the simulator acknowledges the message with MSA-1 = AA.
                        4. Go to the HL7 messages menu of the simulator and retrieve the message you have sent to the simulator. Copy the permanent link to the test report and paste it in your pre-connectathon logs.

                        5. Pre-admit the patient for a new encounter and cancel it

                        This last step is used to check the ability of your system to send a pre-admission notification and its cancellation

                        1. Within your system, pre-admit the patient for a new encounter planned on May 20th, 2012 at noon.
                        2. Send an ADT^A05^ADT_A05 message to the simulator to notify it of this new encounter.
                        3. Tell your system under test that you have made a mistake and want to cancel the pre-admission.
                        4. Send an ADT^A38^ADT_A38 message to cancel the pre-admission.
                        5. Check that the simulator acknowledges the message with MSA-1 = AA.
                        6. Go to the HL7 messages menu of the simulator and retrieve the two messages you have sent to the simulator. Copy the permanent links to the test reports and paste them in your pre-connectathon logs.

                        Patient Encounter Consumer

                        In this step, we check the capability of your system under test to act as a Patient Encounter Consumer for the PAM profile, and for the Inpatient/Outpatient Encounter Management option in particular. We want you to demonstrate that your system is able to integrate the notifications received from the PAM Simulator and to correctly acknowledge them.

                        1. Initialize test

                        1. Access the PAMSimulator
                        2. You can log in using CAS if you want to be set as the creator of the patients and encounters you will generate.
                        3. You may have already registered your system under test within the simulator when performing test #12122.
                        4. Go to Patient Encounter Management/Patient Encounter Supplier section of the PAM Simulator
                        5. Create a new patient and a new inpatient encounter (class code = I/Inpatient)  for this patient. Patient's location must be the following:
                        6. Point of care: Nursing station 1-East
                           Room: Room 10 (1-East)
                           Bed: Aisle
                           Facility: British Hospital
                        7. Send an ADT^A01^ADT_A01 message to your system under test (refer to test #12122 for detailed information).
                        8. Take a screenshot of your application as proof of the admission of the patient (ensure that the assigned location of the patient is visible).
                        9. Retrieve this patient in the simulator (use the All patients menu) and copy and paste the permanent link to the patient in your pre-connectathon logs.

                        2. Transfer patient to new bed and cancel the movement

                        In this step, we will transfer the patient to a new bed.

                        1. Go to Patient Encounter Management/Patient Encounter Supplier section of the PAM Simulator
                        2. From the "category of event" drop-down menu, select Transfer patient
                        3. From the "action to perform" drop-down list, select INSERT (A02)
                        4. Select the patient you have previously admitted and update his/her bed. The new patient location must be the following:
                           Point of care: Nursing station 1-East
                           Room: Room 10 (1-East)
                           Bed: Middle
                           Facility: British Hospital
                        5. Click on the send button and check that your system returns an acknowledgement with MSA-1=AA
                        6. Take a screenshot of your application as a proof of the transfer of the patient (we must see the new location)
                        7. Click on "Perform another test"
                        8. From the "action to perform" drop-down list, select CANCEL(A12) and select the patient's encounter you are working on
                        9. Confirm the movement cancellation to send the ADT^A12^ADT_A12 message to your system under test.
                        10. Take a screenshot of your application as a proof of the cancellation of the transfer (we must see the previous location)
                        11. Go to the HL7 messages section of the simulator, retrieve the two previous messages, copy the permanent links to the test report and paste them in your pre-connectathon logs.

                        3. Change patient class to outpatient

                        This step is used to change the patient class to outpatient (code = O)

                        1. Go back to  Patient Encounter Management/Patient Encounter Supplier section of the PAM Simulator
                        2. From the "category of event" drop-down menu, select Change patient class to outpatient
                        3. Select the patient/encounter you have previously admitted
                        4. Edit the information about the new outpatient encounter and click on "Send message"
                        5. Check that your system returns an acknowledgement with MSA-1=AA
                        6. Take a screenshot of your application as a proof of the modification of the patient class
                        7. Go to the HL7 messages section of the simulator, retrieve the message you have just sent, copy the permanent link to the test report and paste it in your pre-connectathon logs.

                        4. Change patient class to inpatient

                        This step is used to change the patient class to inpatient (code = I)

                        1. Go back to  Patient Encounter Management/Patient Encounter Supplier section of the PAM Simulator
                        2. From the "category of event" drop-down menu, select Change patient class to inpatient
                        3. Select the patient/encounter you have previously admitted
                        4. Edit the information about the new inpatient encounter and click on "Send message"
                        5. Check that your system returns an acknowledgement with MSA-1=AA
                        6. Take a screenshot of your application as a proof of the modification of the patient class
                        7. Go to the HL7 messages section of the simulator, retrieve the message you have just sent, copy the permanent link to the test report and paste it in your pre-connectathon logs.

                        5. Pre-admit patient and cancel the pre-admission

                        In this step, we test the capability of your system to pre-admit a patient and to cancel this pre-admission.

                        1. Go to Patient Encounter Management/Patient Encounter Supplier section of the PAM Simulator
                        2. From the "category of event" drop-down menu, select Pre-admit patient
                        3. From the "action to perform" drop-down list, select INSERT (A05)
                        4. Select the patient you have previously admitted and create a new encounter for him/her.
                        5. Click on the "Send message" button and check that your system returns an acknowledgement with MSA-1=AA
                        6. Take a screenshot of your application as a proof of the pre-admission of this patient
                        7. Click on "Perform another test"
                        8. From the "action to perform" drop-down list, select CANCEL(A38) and select the patient's encounter you are working on
                        9. Confirm the movement cancellation to send the ADT^A38^ADT_A38 message to your system under test.
                        10. Take a screenshot of your application as a proof of the cancellation of the pre-admission
                        11. Go to the HL7 messages section of the simulator, retrieve the two previous messages, copy the permanent links to the test report and paste them in your pre-connectathon logs.

                         

                         

                        PM_ITI-10-Receive: PIX Update Notification

                        This tests the ability of your application to receive an ITI-10 PIX update notification message.  This applies only to PIX Consumers that support the PIX Update Notification option.

                        This test is performed with the Patient Manager simulator http://gazelle.ihe.net/PatientManager sending ITI-10 messages to your system under test.

                        Tool documentation is located at http://gazelle.ihe.net/content/patient-manager-user-manual

                        Test Setup

                        1. Before starting the test steps, configure the tool to send to your application using the SUT Configurations menu in the tool.
                        2. Go to menu All Patients and, in the drop-down box "Simulated actor", select Patient Identity Cross-Reference Manager.  The page then lists the patients known by the PIX Manager.
                        3. Choose one of the available patients to use for this test. You should then manually create that patient in your system (because this test will send you an update for it).

                        Test Steps

                        In these steps, you will use the Patient Manager tool to send a PIX update notification to your application.

                        1. Go to menu PIX --> Patient Identity Cross-Reference Manager --> Cross-references management
                        2. Check the box entitled "Send update notifications"
                        3. Select your System under Test in the drop-down menu.
                        4. Select the list of domains for which you will receive update notifications.
                        5. From the list of all patients, select the patient you have previously entered within your system. (Use the magnifying glass in the Action column)
                        6. The "Selected patient" is displayed at the bottom of the page.
                        7. Next, drag and drop an identifier from the list of patients above onto the 'Selected patient'. This creates a 'cross-reference' between the two patients. A message is displayed to inform you that a notification was sent to your system.
                        8. Make sure the simulator receives an acknowledgment from your system and that the patient is properly updated in your system.
                        9. Take a screenshot of your application or your database as proof that the update was properly processed. Retrieve the permanent links to the test report and to the patient and copy them into Gazelle Test Management with a comment such as "Patient update".
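                        For context, the ITI-10 notification your consumer receives is an ADT^A31 whose PID-3 field repeats (separated by `~`), one repetition per cross-referenced identifier domain. A minimal sketch of splitting those repetitions, with sample identifiers invented for illustration:

```python
# Illustrative sketch: split the PID-3 repetitions of an ITI-10 update
# notification (ADT^A31) into (identifier, assigning-domain) pairs.
def pid3_identifiers(message: str) -> list:
    for seg in message.split("\r"):
        if seg.startswith("PID|"):
            reps = seg.split("|")[3].split("~")
            return [(r.split("^")[0], r.split("^")[3].split("&")[0])
                    for r in reps]
    return []

a31 = ("MSH|^~\\&|PIXMGR|IHE|MYAPP|MYFAC|20240101||ADT^A31^ADT_A05|M9|P|2.5\r"
       "PID|1||abc123^^^DOM1&1.2.3&ISO~xyz789^^^DOM2&4.5.6&ISO||DOE^JOHN")
# pid3_identifiers(a31) -> [('abc123', 'DOM1'), ('xyz789', 'DOM2')]
```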

                         

                        Evaluation

                        The screen shots demonstrate that you have successfully processed the received message(s).

                        PM_PDQ_Query-Patient_Demographics_Consumer

                        This test applies to Patient Demographic Consumers in the PDQ or PDQv3 Profiles.

                        This test is performed with the Patient Manager simulator http://gazelle.ihe.net/PatientManager acting as a Patient Demographic Supplier responding to these queries. Note that the test steps are the same regardless of the transaction; you will choose the transaction(s) supported by your Consumer.

                        • ITI-21 and ITI-22 - Patient Demographics (and Visit) Query (HL7v2) 
                        • ITI-47 - Patient Demographics Query (HL7v3)

                        Tool documentation is located at http://gazelle.ihe.net/content/patient-manager-user-manual

                        Test Setup

                        1. Go to menu All Patients and, in the drop-down box "Simulated actor", select PDS - Patient Demographic Supplier.  The page then lists the patients known by the PDQ Supplier simulator.
                        2. Choose one or more of the available patients to use as the target for your query in  this test.  

                        Test Steps

                        In these steps, you will use the Patient Manager as a Patient Demographics Supplier (PDS) Simulator to respond to your PDQ Query.

                        1. Access the Patient Manager tool: http://gazelle.ihe.net/PatientManager
                        2. Go to menu PDQ-->Patient Demographic Supplier
                        3. Next, select HL7v2 Configuration or HL7v3 Configuration
                        4. The tool will now display the configuration details you will need to query the PDS Simulator.  Ensure the status of the Simulator is "Running"
                        5. Perform your query.
                        6. You can use menu HL7 messages to find the query & response captured by the tool.
                        7. Take a screenshot of your application or your database as a proof of receipt of the query response. Retrieve the permanent link to the transaction instance, and paste that as evidence for this test.
                        8. If you support more than one transaction (ITI-21, ITI-22, ITI-47), you should repeat these steps for all you support.
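                        As background for step 5, an ITI-21 query is a QBP^Q22 message whose QPD-3 field carries the demographic search parameters. The sketch below assembles a minimal name-based query; the application/facility names and control ids are placeholders invented for the example, not values the tool requires.

```python
# Illustrative sketch: assemble a minimal ITI-21 QBP^Q22 query by name.
# QPD-3 carries the demographic parameters; RCP-1 = "I" requests an
# immediate response. App/facility names and ids are placeholders.
def build_q22(family: str, given: str) -> str:
    return "\r".join([
        "MSH|^~\\&|MYAPP|MYFAC|PDSSIM|IHE|20240101120000||"
        "QBP^Q22^QBP_Q21|Q0001|P|2.5",
        "QPD|Q22^Find Candidates^HL7|QRY0001|"
        "@PID.5.1.1^%s~@PID.5.2^%s" % (family, given),
        "RCP|I",
    ])

q = build_q22("DOE", "JANE")
```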

                        Evaluation

                        The permanent link captures the message exchange. The screen shot demonstrates that you have successfully processed the received query response(s).

                        PM_PDQ_Query-Patient_Demographics_Supplier

                        This test applies to Patient Demographics Suppliers in the PDQ or PDQv3 Profiles.

                        This test is performed with the Patient Manager simulator http://gazelle.ihe.net/PatientManager acting as a Patient Demographic Consumer to initiate these queries. Note that the test steps are the same regardless of the transaction; you will choose the transaction(s) supported by your Supplier.

                        • ITI-21 and ITI-22 - Patient Demographics (and Visit) Query (HL7v2) 
                        • ITI-47 - Patient Demographics Query (HL7v3)

                        Tool documentation is located at http://gazelle.ihe.net/content/patient-manager-user-manual

                        Test Setup

                        1. Before starting the test steps, configure the tool to query your Patient Demographics Supplier application: use the SUT Configurations menu in the tool to enter your configuration parameters.

                        Test Steps

                        In these steps, you will use the Patient Manager as a Patient Demographic Consumer (PDC) Simulator to initiate a PDQ Query to your Supplier.

                        1. Access the Patient Manager tool: http://gazelle.ihe.net/PatientManager
                        2. Go to menu PDQ-->Patient Demographic Consumer
                        3. Next, select ITI-21/ITI-22, or ITI-47.
                        4. In the 'System under test' drop-down list, select the entry for your PDQ Supplier
                        5. Next, enter 'Demographic information' to build a query that will match patient(s) in your Patient Demographic Supplier database.   When you are ready to initiate the query, select the 'Send message' button.
                        6. You can use menu HL7 messages to find the query & response captured by the tool.
                        7. Take a screenshot of your application or your database as a proof of receipt of the query response. Retrieve the permanent link to the transaction instance, and paste that as evidence for this test.
                        8. If you support more than one transaction (ITI-21, ITI-22, ITI-47), you should repeat these steps for all you support.

                        Evaluation

                        The permanent link captures the message exchange. The screen shots demonstrate that you have successfully processed the received message(s).

                        PM-PDQm_Query-Patient_Demographics_Consumer

                        This test applies to initiators of the [ITI-78] Patient Demographics Query for Mobile transaction:  Patient Demographics Consumers in the PDQm or PMIR Profiles.

                        This test is performed with the Patient Manager simulator https://gazelle.ihe.net/PatientManager acting as a Patient Demographic Supplier to respond to PDQm queries.

                        Test setup

                        The list of patients available on the supplier side is available under the Patients menu. Select "simulated actor" = "PDS" to see which ones can be returned in a PDQm query response.

                        The endpoint to contact is available under menu PDQ* > Patient Demographics Supplier> FHIR configuration.

                        Evaluation

                        Verify the conformance of each query issued by your system (blue play icon in the "query" column) and copy the permanent link to the message in your pre-connectathon test in Gazelle Test Management (available from the magnifying glass icon). 

                        The messages received by the simulator are available under HL7 Messages. To restrict the search, either access the page using the "history" icon on the FHIR Configuration page, or set the following filters in the search criteria panel:

                        • Simulated actor = PDS - Patient Demographic Supplier
                        • Transaction = ITI-78 - Mobile Patient Demographics Query

                        Steps to execute

                        Not all of the following test cases may be relevant to your system under test. The purpose of this test is to make sure you correctly implement the portions of the specifications which are of interest for your system (based on the use cases it supports).

                        Query for a patient pick list

                        For this first step, we assume that the operator wants to retrieve a list of patients based on some demographic traits. You might want to repeat this step with various combinations of parameters among the following ones to see how the supplier understands your query:

                        • given and/or family: Partial or complete (exact) patient name (you might have the ability to provide the various given names of the patient)
                        • identifier: One or more Patient identifiers
                        • birthdate: Date of birth or age range (your system might allow you to input the exact birth date or a date interval)
                        • telecom: One or more contact points (phone number, email etc)
                        • address (or address-city, address-country, address-postalcode, address-state): Partial or complete address
                        • gender: The gender of the patient
                        • active: Whether you want to fetch inactive patients as well - By default the simulator will only send back active records if not otherwise requested in the query
                        • _id: Patient resource's id - The simulator uses the UUID field displayed in the GUI as resource's ID
                        • mothersMaidenName: Mother's maiden name, if your system supports the Pediatric Demographics Option.

                        For each query, your system should at least display the number of retrieved entries and some of the demographic traits for each entry in the list.

                        Note that queries with no search parameter will return no entry at all.
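                        The pick-list query above is an ordinary FHIR search on the Patient resource. As a minimal sketch (the base URL and parameter values are illustrative; use the endpoint shown on the tool's FHIR Configuration page):

```python
from urllib.parse import urlencode

# Illustrative base URL -- substitute the endpoint from the FHIR Configuration page.
BASE = "https://example.org/fhir"

def pdqm_query_url(base, **params):
    """Build an ITI-78 Patient search URL from demographic parameters."""
    return f"{base}/Patient?{urlencode(params)}"

# Combine any of the search parameters listed above.
url = pdqm_query_url(BASE, family="MOORE", gender="female", birthdate="1980-05-01")
print(url)
```

Each combination of parameters from the list above simply becomes another set of keyword arguments in a query of this shape.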

                        Response encoding

                        If your system supports both JSON and XML encodings, repeat at least one of the previous queries with the second encoding so that you can verify that your system correctly sets the requested encoding. You might use the HTTP Accept header or the _format query parameter.
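                        For example, the requested encoding can be expressed either way; the endpoint below is illustrative, and the media types are the standard FHIR ones:

```python
from urllib.parse import urlencode

base = "https://example.org/fhir/Patient"  # illustrative endpoint

# Option 1: request JSON via the _format query parameter
url_json = f"{base}?{urlencode({'family': 'MOORE', '_format': 'application/fhir+json'})}"

# Option 2: request XML via the HTTP Accept header (the header pair you would send)
accept_header = ("Accept", "application/fhir+xml")

print(url_json)
```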

                        Retrieve patient

                        In addition to the query feature, your system shall support the retrieve patient feature. Choose one patient out of the list and directly retrieve the associated resource by performing the retrieve operation.

                        Example: https://gazelle.ihe.net/PatientManager/fhir/Patient/UUID.

                        No Match

                        Send a new query from your system under test, making sure the query parameters do not match any patient in the tool. Your system is expected to inform the end user that no match has been found.

                        Query for patient with identifiers in other domains

                        You must execute the steps below if your system is able to constrain the domains from which patient identifiers are returned by the Patient Demographics Supplier. 

                        Query with no domain

                        If you can turn off the domain restriction in your system, do so. First, choose a combination of parameters that will return at least one patient with identifiers in several domains. Under the Connectathon > Patient Demographics menu, you will find the patients that will be pre-loaded by suppliers during the Connectathon; they are also known by the Patient Demographics Supplier implemented in the tool. Each patient has an identifier in at least four domains.

                        Your system shall receive the patient(s) with identifiers in the IHEBLUE, IHEFACILITY, IHEREF and IHEGREEN domains at least.

                        Query with a known domain

                        Reuse the same query parameters as for the previous test but restrict the domain to the IHEBLUE (urn:oid:1.3.6.1.4.1.21367.13.20.3000) domain.

                        If your query is correctly formatted, the returned patient(s) should only have identifiers with system = urn:oid:1.3.6.1.4.1.21367.13.20.3000.

                        Query with more than one domain

                        If your system supports more than one domain, repeat the operation above: constrain the identification domains to IHEBLUE and IHERED (urn:oid:1.3.6.1.4.1.21367.13.20.1000).

                        Once again, if your query is correctly formatted, the returned patient(s) should only have identifiers with system urn:oid:1.3.6.1.4.1.21367.13.20.3000 and urn:oid:1.3.6.1.4.1.21367.13.20.1000.
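                        One way to express the domain restriction in the query string, assuming (per ITI-78) that domains are requested with the identifier parameter as a "system|" token and that a comma expresses OR across domains, is sketched below; the base URL is illustrative:

```python
from urllib.parse import quote

BASE = "https://example.org/fhir"  # illustrative endpoint

# IHEBLUE and IHERED assigning-authority OIDs from this test description.
domains = [
    "urn:oid:1.3.6.1.4.1.21367.13.20.3000",  # IHEBLUE
    "urn:oid:1.3.6.1.4.1.21367.13.20.1000",  # IHERED
]

# "system|" with an empty value means "an identifier in this domain";
# joining with "," requests identifiers from either domain.
token = ",".join(f"{d}|" for d in domains)
url = f"{BASE}/Patient?family=MOORE&identifier={quote(token)}"
print(url)
```

Consult ITI TF-2: 3.78 for the normative encoding; this sketch only shows the general shape of the token.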

                        Query for an unknown domain

                        Reuse the same query parameters once again and restrict the domain to urn:oid:1.3.6.1.4.1.21367.13.20.9999999. This domain is unknown to the Patient Demographics Supplier.

                        If your query is correctly formatted, you should receive an HTTP 404 error code with an OperationOutcome resource that identifies the unknown domain. You might or might not surface such an error to the end user. No entry shall be displayed to the user (none will be returned by the Patient Demographics Supplier).

                        Paging

                        Execute this step if your system supports the paging mechanism, meaning that it can add the _count parameter to the query. In this step we assume that the user is able to set the number of records to fetch at one time. If your system does not provide this ability (default quantity, or quantity to choose from a list), simply adapt the test data below.

                        Set search parameters in a way that they will select at least 3 entries (usually given=rob is a good candidate) and ask for only two records at a time. If your query is correctly formatted, the received Bundle should contain only 2 entries and: 

                        • A link to the next results (which will contain 2 or fewer entries);
                        • A total number of results higher than 2.

                        Ask for the next batch of results. You should be able to see at least one more patient.
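                        The paging mechanism can be sketched as follows: read the entries from each searchset Bundle and follow the "next" link until it disappears. The fetch() function below is a stand-in for a real HTTP GET, and the page contents are illustrative:

```python
# Two hypothetical pages, as returned for a query with _count=2.
PAGES = {
    "page1": {"resourceType": "Bundle", "type": "searchset", "total": 3,
              "link": [{"relation": "next", "url": "page2"}],
              "entry": [{"resource": {"resourceType": "Patient", "id": "p1"}},
                        {"resource": {"resourceType": "Patient", "id": "p2"}}]},
    "page2": {"resourceType": "Bundle", "type": "searchset", "total": 3,
              "link": [],
              "entry": [{"resource": {"resourceType": "Patient", "id": "p3"}}]},
}

def fetch(url):
    return PAGES[url]          # replace with a real HTTP GET in practice

def all_patients(start_url):
    """Collect patient ids across pages by following 'next' links."""
    url, ids = start_url, []
    while url:
        bundle = fetch(url)
        ids += [e["resource"]["id"] for e in bundle.get("entry", [])]
        url = next((l["url"] for l in bundle.get("link", [])
                    if l["relation"] == "next"), None)
    return ids

print(all_patients("page1"))
```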

                        PM-PDQm_Query-Patient_Demographics_Supplier

                        This test applies to responders to the [ITI-78] Patient Demographics Query for Mobile transaction:  Patient Demographics Suppliers in the PDQm Profile or the Patient Identity Registry in the PMIR Profile.

                        This test is performed with the Patient Manager simulator https://gazelle.ihe.net/PatientManager acting as a Patient Demographic Consumer to initiate these queries.  

                        Test set up

                        There is no prerequisite in terms of data to be loaded into your system. Instead, choose relevant values for the various parameters based on the patient demographics known by your system under test so that matches are returned.

                        First of all, register your system under test within the tool as a FHIR Responder under SUT Configurations > FHIR Responders. Make sure to select IHE => ITI-78 (PDQm Consumer) in the list of usages.

                        Access the Patient Demographics Consumer for PDQm in Patient Manager from menu PDQ* > Patient Demographics Consumer > [ITI-78] Patient Demographics Query FHIR.

                        Evaluation

                        Verify the conformance of each response issued by your system (blue play icon in the "response" column) and copy the permanent link to the message in your pre-connectathon test in Gazelle Test Management (available from the magnifying glass icon).  Also verify that the response format expected in the query matches the response format (XML vs JSON) returned by your system. 

                        Steps to execute

                        Capability Statement

                        Upload the capability statement of your system (showing at least the PDQm Supplier features) in the pre-connectathon test in Gazelle Test Management.

                        Query for a patient pick list

                        For this first step, we assume that the consumer actor wants to retrieve a list of patients based on some demographics traits. You might want to repeat this step with various combinations of parameters among the following ones to test the behavior of your system:

                        • Partial or complete patient name (given and/or family)
                        • One or more Patient identifiers
                        • Date of birth or age range
                        • One or more contact points (phone number, email etc)
                        • Partial or complete address
                        • Gender
                        • Whether you want to fetch inactive patients as well

                        Note that for parameters of type string, the "exact" modifier will be added to the query if no wildcard is used in the field (when valued). If you want to search for patients with a given name starting with Ro, enter Ro* in the form.

                        If your system supports the Pediatric Demographics Option, you might also want to make sure that your system supports the mothersMaidenName search extension.

                        For this step, you are asked not to modify the default parameters in the "Additional information" section of the page.

                        Once you have filled out the form, push the "Send message" button. If matches are returned by your system, they will be displayed at the bottom of the page.

                        After you have retrieved a first batch of patients, you should also be able to use the resource ID search parameter (_id). You can find the value to use in the response returned by your system in Bundle.entry.resource.Patient.id.value.

                        Response encoding

                        Your system under test shall support both JSON and XML encodings. The previous step was executed using "XML" as the format to be returned. Rerun the step above at least once, but select Response format = json before sending the message to your system.

                        Retrieve patient

                        Access the detail of the response content and for one of the entries, access the URL displayed in field entry.fullUrl.value. You should retrieve the content of the Patient resource.

                        No Match

                        Send a new query to your system under test, making sure the query parameters do not match any patient in your database. Your system is expected to send back a Bundle resource with Bundle.total.value = 0.
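                        In JSON, the expected no-match response is simply an empty searchset Bundle; a minimal sketch (FHIR R4 field names, values illustrative):

```python
# Minimal no-match searchset Bundle: total is 0 and there is no "entry" element.
empty_bundle = {
    "resourceType": "Bundle",
    "type": "searchset",
    "total": 0,
}

print(empty_bundle["total"])
```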

                        Query for patient with identifiers in other domains

                        In this step, we focus on the domain restriction feature. We assume that your system manages at least one domain for patient identification.

                        Query with no domain

                        First, choose a combination of parameters that will return at least one patient. If your system supports multiple identifier domains, make sure the returned patients will show at least identifiers from two different domains. DO NOT restrict the search to a particular domain. We are interested in knowing what are the identifiers known for this patient.

                        Query with a known domain

                        Repeat the search above, but in the "Additional information" section add the identifier of one of the domains for which the returned patient has a PID assigned. Click on "Add domain to list" for the tool to take the value into account.

                        Your system is expected to return the patients that have a PID assigned in this domain; only one PID shall appear for each of them.

                        Query with more than one domain

                        If your system supports more than one domain, repeat the operation above, adding a second domain to the list of domains to be returned.

                        Once again, your system is expected to return the patients with PID in the mentioned domains. No other PID shall be returned.

                        Query for an unknown domain

                        Repeat the test, but first clean up the list of domains to return and add "1.3.6.1.4.1.21367.13.20.9999" instead (this domain should not be known by your system; otherwise choose another value).

                        No entry shall be returned. Refer to section 3.78.4.1.3 / Case 4 for details on the expected response (Code HTTP 404 with OperationOutcome or HTTP 200 with no content).
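                        A sketch of the 404 case is shown below; the exact issue code and diagnostics text are up to the supplier, so this shape is only illustrative:

```python
# Illustrative OperationOutcome body accompanying the HTTP 404 response
# for an unknown identifier domain (codes and wording are supplier-specific).
operation_outcome = {
    "resourceType": "OperationOutcome",
    "issue": [{
        "severity": "error",
        "code": "not-found",
        "diagnostics": "Unknown identifier domain: 1.3.6.1.4.1.21367.13.20.9999",
    }],
}

print(operation_outcome["issue"][0]["code"])
```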

                        Paging

                        The Patient Demographics Supplier shall represent the incremental responses as specified in FHIR Paging. 

                        Empty the query form and enter parameters that will allow your system to match more than two patients. 

                        In the "Additional information" section, check the box for option "Limit number of responses" and set the limit value to "1".

                        Click on "Send message"; the tool shall display the single entry that is returned by your system and the following message: "More results are available on supplier side". As for the other steps, copy the link to this message in Gazelle Test Management.

                        Click on "Get next results".

                        In the "Returned patients" section, the number of pages increases. A single patient is displayed.

                        PM_PIX_Query-PIX_Consumer

                        This test applies to Patient Identity Consumers who initiate queries in the PIX, PIXv3, PIXm, or PMIR Profiles.

                        This test is performed with the Patient Manager simulator http://gazelle.ihe.net/PatientManager acting as a Patient Identity Cross-Reference Manager (PIX Manager) responding to these queries.  Note that the test steps are the same no matter the transaction...you will choose the transaction(s) supported by your Consumer.

                        • ITI-9 - PIX Query (HL7v2) 
                        • ITI-45 - PIX Query (HL7v3)
                        • ITI-83 - PIXm Query (HL7 FHIR®©)

                        Tool documentation is located at http://gazelle.ihe.net/content/patient-manager-user-manual

                        Test Setup

                        1. Go to menu All Patients and, in the drop-down box "Simulated actor", select PAT_IDENTITY_X_REF_MANAGER.  The page then lists the patients known by the PIX Manager simulator.
                        2. Choose one or more of the available patients to use as the target for your query in  this test.  

                        Test Steps

                        In these steps, you will use the Patient Manager as a PIX Manager Simulator to respond to your PIX Query.

                        1. Access the Patient Manager tool: http://gazelle.ihe.net/PatientManager
                        2. Go to menu PIX-->Patient Identity Cross-Reference Manager
                        3. Next, select HL7v2 Configuration, HL7v3 Configuration, or FHIR Configuration
                        4. The tool will now display the configuration details you will need to query the PIX Manager Simulator.  Ensure the status of the Simulator is "Running"
                        5. Perform your query.
                        6. You can use menu HL7 messages to find the query & response captured by the tool.
                        7. Take a screenshot of your application or your database as a proof of receipt of the query response. Retrieve the permanent link to the transaction instance, and paste that as evidence for this test.
                        8. If you support more than one transaction (ITI-9, ITI-45, ITI-83), you should repeat these steps for each transaction you support.
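                        For reference, the FHIR flavor of the query (ITI-83) is a GET on the $ihe-pix operation, keyed by a source identifier written as system|value; the endpoint and identifier below are illustrative:

```python
from urllib.parse import quote

BASE = "https://example.org/fhir"  # illustrative PIX Manager endpoint

# ITI-83: resolve a source identifier (system|value) to its cross-referenced IDs.
source = "urn:oid:1.3.6.1.4.1.21367.13.20.1000|IHERED-1"
url = f"{BASE}/Patient/$ihe-pix?sourceIdentifier={quote(source)}"
print(url)
```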

                        Evaluation

                        The permanent link captures the message exchange. The screen shot demonstrates that you have successfully processed the received query response(s).

                        PM_PIX_Query-PIX_Manager

                        This test applies to Patient Identifier Cross-Reference Managers (PIX Managers) in the PIX, PIXv3, PIXm, or PMIR Profiles.

                        This test is performed with the Patient Manager simulator http://gazelle.ihe.net/PatientManager acting as a PIX Consumer to initiate these queries.  Note that the test steps are the same no matter the transaction...you will choose the transaction(s) supported by your PIX Manager.

                        • ITI-9 - PIX Query (HL7v2) 
                        • ITI-45 - PIX Query (HL7v3)
                        • ITI-83 - PIXm Query (HL7 FHIR®©)

                        Tool documentation is located at http://gazelle.ihe.net/content/patient-manager-user-manual

                        Test Setup

                        1. Before starting test steps, please configure the tool to query your PIX Manager application using the SUT Configurations menu in the tool to enter your configuration parameters.

                        Test Steps

                        In these steps, you will use the Patient Manager as a PIX Consumer Simulator to initiate a PIX Query to your PIX Manager

                        1. Access the Patient Manager tool: http://gazelle.ihe.net/PatientManager
                        2. Go to menu PIX-->Patient Identity Consumer
                        3. Next, select ITI-9, or ITI-45, or ITI-83
                        4. In the 'System under test' drop down list, select the entry for your PIX Manager
                        5. Next, enter values for the query parameters (e.g., Patient ID and assigning authority) to build a query that will match patient(s) in your PIX Manager database.  When you are ready to initiate the query, select the 'Send message' button.
                        6. You can use menu HL7 messages to find the query & response captured by the tool.
                        7. Retrieve the permanent link to the transaction instance, and paste that as evidence for this test.
                        8. If you support more than one transaction (ITI-9, ITI-45, ITI-83), you should repeat these steps for each transaction you support.

                        Evaluation

                        The permanent link captures the message exchange. 

                        PM_RAD-1_RAD-12-Receive: Register new patient and update

                        This tests the ability of your application to receive RAD-1 and RAD-12 patient registration and update messages.

                        This test is performed with the Patient Manager simulator http://gazelle.ihe.net/PatientManager sending messages to your system under test.

                        Tools documentation is located at http://gazelle.ihe.net/content/patient-manager-user-manual

                        Before starting your tests, please configure the tool to send to your application using the SUT Configurations menu in the tool.

                        Test Steps


                        1. Patient creation

                        In this step, you are expected to send a new patient to your application.

                        1. Go to menu ADT-->[RAD-1] Patient Registration
                        2. Select your System under Test in the drop-down menu.
                        3. Select the Category of Event and Action to perform to initiate an A01, A04, or A05
                        4. Select the "generate patient" option. 
                        5. Within the "Patient Generation with DDS" panel, set the creation criteria of your choice and click the "Generate patient"  button. If you need your patient to have a specific identifier, then click on the "Modify patient data"  button and go to the Patient's identifiers tab and update the identifiers in a way your system under test will accept the patient.
                        6. Finally, hit the "Send" button. 
                        7. Make sure the simulator receives an acknowledgment from your system and that the patient is properly registered in your system.
                        8. Take a screenshot of your application or your database as proof that the patient was registered correctly. Retrieve the permanent links to the test report and to the patient and copy them into Gazelle Test Management with the comment "Patient creation". 

                        2. Update patient information

                        In this step, you are expected to update the first name of the new patient and send the notification to your system under test.

                        1. Go to menu ADT-->[RAD-12] Patient Update
                        2. Select your system under test in the drop-down menu.
                        3. Select the Category of event to be Update Patient Information
                        4. Select the patient you just created 
                        5. Change the patient's first name.  Then hit the Send  button
                        6. Make sure the simulator receives an acknowledgment from your system and that the patient is properly updated in your system.
                        7. Take a screenshot of your application or your database as proof that the patient was updated correctly. Retrieve the permanent links to the test report and to the patient and copy them into Gazelle Test Management with the comment "Update patient information".

                        Evaluation

                        The screen shots demonstrate that you have successfully handled the received ADT messages.

                        PM-XCPD_InitGW_DeferredResponse

                        This test applies to the XCPD Initiating Gateway actor that supports the Deferred Response Option.  See ITI TF-1: 27.2.2 and ITI TF-2b: 3.55.6.2.

                        This test is performed with the Gazelle Patient Manager simulator: https://gazelle.ihe.net/PatientManager 

                        See also the User Manual for testing XCPD with the Patient Manager here.

                        Instructions

                        The instructions for this test are detailed in the User Manual for the Patient Manager tool, so they are not repeated here. See:  https://gazelle.ihe.net/gazelle-documentation/Patient-Manager/user.html#deferred-response-on-initiating-gateway

                        Evaluation

                        After you perform the test, the Patient Manager tool will produce a Test Report.  Copy the Permanent Link to the Test Report, then paste that link into Gazelle Test Management as the result for this test.

                         

                        PM-XCPD_InitGW_with_XUA

                        The baseline requirements of the XCPD profile do not require the actors to also implement XUA; however, this test enables you to test your XCPD Initiating Gateway actor that also supports the XUA X-Service User.

                        This test is performed with the Gazelle Patient Manager simulator: https://gazelle.ihe.net/PatientManager 

                        See also the User Manual for testing XCPD with the Patient Manager here.

                        Instructions

                        The instructions for this test are detailed in the User Manual for the Patient Manager tool, so they are not repeated here. See the instructions for XUA over XCPD for the X-Service User here: https://gazelle.ihe.net/gazelle-documentation/Patient-Manager/user.html#xua-over-xcpd

                        Evaluation

                        After you perform the test, find your result in the Patient Manager tool under menu XUA > X-Service User logs.  Copy the URL for your result and paste it into Gazelle Test Management as the result for this test.

                        PM-XCPD_RespGW_DeferredResponse

                        This test applies to the XCPD Responding Gateway actor that supports the Deferred Response Option.  See ITI TF-1: 27.2.2 and ITI TF-2b: 3.55.6.2.

                        This test is performed with the Gazelle Patient Manager simulator: https://gazelle.ihe.net/PatientManager 

                        See also the User Manual for testing XCPD with the Patient Manager here.

                        Instructions

                        The instructions for this test are detailed in the User Manual for the Patient Manager tool, so they are not repeated here. See: https://gazelle.ihe.net/gazelle-documentation/Patient-Manager/user.html#deferred-response-on-responding-gateway

                        Evaluation

                        After you perform the test, the Patient Manager tool will produce a Test Report.  Copy the Permanent Link to the Test Report, then paste that link into Gazelle Test Management as the result for this test.

                         

                        PM-XCPD_RespGW_with_XUA

                        The baseline requirements of the XCPD profile do not require the actors to also implement XUA; however, this test enables you to test your XCPD Responding Gateway actor that also supports the XUA X-Service Provider.

                        This test is performed with the Gazelle Patient Manager simulator: https://gazelle.ihe.net/PatientManager 

                        See also the User Manual for testing XCPD with the Patient Manager here.

                        Instructions

                        The instructions for this test are detailed in the User Manual for the Patient Manager tool, so they are not repeated here. See the instructions for XUA over XCPD for the X-Service Provider here:  https://gazelle.ihe.net/gazelle-documentation/Patient-Manager/user.html#xua-over-xcpd

                        Evaluation

                        After you perform the test, find your result in the Patient Manager tool under menu XUA > X-Service Provider logs.  Copy the URL for your result and paste it into Gazelle Test Management as the result for this test.

                        Order Manager tests

                        This section contains test cases performed with the Order Manager tool.

                        Tool: http://gazelle.ihe.net/OrderManager

                        Tool user manual: https://gazelle.ihe.net/gazelle-documentation/Order-Manager/user.html

                        30001: LTW Order Placer sends notifications

                        This test will be performed against the Order Filler part of the OrderManager Gazelle simulator. Here, we are only checking the initiator part of the Order Placer; that means that, in this test, your system is only asked to send messages to the simulator to create and cancel orders. The receiver part of the Order Placer (used to receive notifications from the Order Filler) is tested in test #30002.

                        First step to perform: retrieve the configuration of the Order Filler to which you will send messages: 

                        1. Go to the OrderManager simulator
                        2. Log onto the application using the CAS login (it uses your Gazelle -European instance- credentials)
                        3. Select the Laboratory domain (bottom-right drop-down menu)
                        4. Go to Order Management --> Order Filler --> Order Filler configuration
                        5. Select the character set encoding you will use to send messages to the simulator and enter the given configuration within your Order Placer.

                        Pre-requisites

                        The LAB-1 (Lab Placer Order Management) transaction defines three message structures (OML^O21^OML_O21, OML^O33^OML_O33, OML^O35^OML_O35). As an initiator in this test, you are free to use whichever structure your system under test supports. Please add a comment to the pre-connectathon test instance you have started telling us which structures your system uses.

                        Your Order Placer is assumed to be coupled with a PAM actor, which means that you should be able either to enter a new patient/encounter into your system or to receive a new patient/encounter from a PAM/PDS or PAM/PES actor. If you need to populate your system with a patient and encounter using an external system, you are free to use the PAMSimulator tool to do so.

                        Order creation

                        1. Enter a new order within your system under test
                        2. Send a message to the Order Filler part of the simulator. ORC-1 must be valued with "NW".
                        3. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        4. Copy and Paste the "permanent link to the test report" into Gazelle
                        5. Go to Browse Data/All orders and retrieve the order you have sent. Copy and paste its permanent link into Gazelle

                        Order cancellation

                        1. Within your system, cancel the previous order and send the notification to the simulator. ORC-1 must be valued with "CA".
                        2. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        3. Copy and Paste the "permanent link to the test report" into Gazelle
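                        As a quick reference for the order control codes used in the two steps above, the sketch below shows only the first fields of the ORC segment; the placer order number is a placeholder, and real OML messages carry many more segments and fields:

```python
# ORC-1 order control codes used in this test (other fields are placeholders).
orc_new    = "ORC|NW|PLACER-123"   # new order (order creation step)
orc_cancel = "ORC|CA|PLACER-123"   # cancel order (order cancellation step)

# ORC-1 is the second pipe-delimited field of the segment.
assert orc_new.split("|")[1] == "NW"
assert orc_cancel.split("|")[1] == "CA"
```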

                        30002: LTW Order Placer receives notifications

                        The purpose of this test is checking that the Order Placer is able to integrate messages received from the Order Filler in the context of the LAB-2 (creation of new orders) and LAB-1 (cancellation and status updates) transactions. We will also check that the acknowledgements are properly built.

                        Pre-requisites

                        Before beginning this test, do not forget to check that the configuration (IP address, port, application/facility names) of your system under test is entered in the simulator. 

                        Your Order Placer is assumed to be coupled with a PAM actor, which means that you should be able either to enter a new patient/encounter into your system or to receive a new patient/encounter from a PAM/PDS or PAM/PES actor. If you need to populate your system with a patient and encounter using an external system, you are free to use the PAMSimulator tool to do so. The Order Manager must also know this patient/encounter; to share the patient between the PAMSimulator and the Order Manager tools, read the tutorial available here. If you have already performed test #30001, you can use the same patient.

                        We strongly recommend reading the tutorial available here.

                        1. Assign a number to an order created by the Order Filler (LAB-2)

                        In this step, you show the ability of your Order Placer to accept the creation of orders by the laboratory (ORC-1 = "SN") and to assign placer order numbers to these orders. The acknowledgement message (ORL of the corresponding message structure, with ORC-1="NA") must carry the placer order number (ORC-2).

                        As a receiver in this test, your Order Placer shall be able to integrate all of the three message structures defined in the technical framework. As a consequence, you are asked to perform this step three times.

                        1. Go to Order management --> Order Filler --> Notify order placer of orders
                        2. Select your system under test in the drop-down list
                        3. Select "Create a new order" as action to perform
                        4. Select one of the three message structures
                        5. Select the patient for whom you want to create a new laboratory order
                        6. Fill the order and click on the "send message" button
                        7. Retrieve the permanent link to the test report and paste it in Gazelle.
                        8. Take a screenshot of your application as a proof of the good integration of the message. Upload it in Gazelle.
                        9. Play this step again with another message structure
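                        As a hedged sketch (not tool or reference code), the acknowledgement logic of this step can be illustrated with plain HL7 v2 field splitting: read ORC-1 from the incoming ORC segment and, when it is "SN", answer with ORC-1="NA" and a newly assigned placer order number in ORC-2. All identifiers below are made up for illustration.

```python
def fields_of(segment):
    """Split a pipe-delimited HL7 v2 segment into fields (index 0 = segment name)."""
    return segment.split("|")

def ack_new_filler_order(incoming_orc, placer_order_number):
    """Answer ORC-1='SN' with ORC-1='NA', carrying the assigned placer number in ORC-2."""
    f = fields_of(incoming_orc)
    if f[1] != "SN":
        raise ValueError("expected ORC-1='SN', got %r" % f[1])
    filler_number = f[3] if len(f) > 3 else ""  # ORC-3, echoed back unchanged
    return "ORC|NA|%s|%s" % (placer_order_number, filler_number)

print(ack_new_filler_order("ORC|SN||F123^LIS", "P456^OP"))
# -> ORC|NA|P456^OP|F123^LIS
```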

                        2. Integrate the cancellation of an order on the laboratory initiative (LAB-1)

                        In this step, you show the ability of your Order Placer to integrate the cancellation of orders by the laboratory (ORC-1="OC") and to acknowledge it (ORL of the corresponding message structure, with ORC-1="OK").

                        1. Go to Order management --> Order Filler --> Notify order placer of orders
                        2. Select your system under test in the drop-down list
                        3. Select "Cancel an existing order" as action to perform
                        4. Select the message structure to use
                        5. Select the laboratory order to cancel and, when the pop-up appears, click on the "Yes" button.
                        6. Retrieve the permanent link to the test report and paste it in Gazelle
                        7. Take a screenshot of your application as a proof of the good integration of the message. Upload it in Gazelle.

                        3. Integrate an order status change notified by the Order Filler (LAB-1)

                        In this step, you show the ability of your Order Placer to integrate a change of status of an order (OBR-25 must change value; this must be visible in your application screenshot), notified by the Order Filler (ORC-1="SC"), and to acknowledge it (ORL of the corresponding message structure, with ORC-1="OK")

                        1. Go to Order management --> Order Filler --> Notify order placer of orders
                        2. Select your system under test in the drop-down list
                        3. Select "Update order status" as action to perform
                        4. Select the message structure to use
                        5. Select the laboratory order to update and, when the pop-up appears, select the new status and click on the "Send update notification" button.
                        6. Retrieve the permanent link to the test report  and paste it in Gazelle
                        7. Take a screenshot of your application as a proof of the good integration of the message. Upload it in Gazelle
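                        The status-change check in this step can be sketched as follows: an illustrative helper (not tool code) that accepts the notification only when ORC-1 is "SC" and the OBR-25 value actually differs from the stored one.

```python
def get_field(segment, index):
    """Return field `index` of a pipe-delimited HL7 v2 segment, or '' if absent."""
    f = segment.split("|")
    return f[index] if index < len(f) else ""

def updated_status(stored_obr25, orc_segment, obr_segment):
    """Return the new OBR-25 value carried by a status-change notification."""
    if get_field(orc_segment, 1) != "SC":
        raise ValueError("not a status-change notification (ORC-1 != 'SC')")
    new_status = get_field(obr_segment, 25)
    if new_status == stored_obr25:
        raise ValueError("OBR-25 did not change")
    return new_status

obr = "OBR" + "|" * 24 + "|F"              # minimal segment with OBR-25 = 'F'
print(updated_status("P", "ORC|SC", obr))  # -> F
```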

                        30003: LTW Order Filler receives notifications

                        This test is dedicated to the LAB-1 transaction from the point of view of the Order Filler. In this test, we will check that the Order Filler is able to integrate the notifications of creation and cancellation of orders received from the Order Placer. In this test, you are asked to use the Order Manager tool as an Order Placer to send messages to your system under test.

                        Pre-requisite

                        Before beginning this test, do not forget to check that the configuration (IP address, port, application/facility names) of your system under test is entered within the simulator. 

                        Your Order Filler is assumed to be coupled with a PAM actor; that means you should be able either to enter a new patient/encounter into your system or to receive a new patient/encounter from a PAM/PDS or PAM/PES actor. If you need to populate your system with a patient and encounter using an external system, you are free to use the PAMSimulator tool to do so. The Order Manager must also know this patient/encounter; to share the patient between the PAMSimulator and the Order Manager tools, read the tutorial available here. If you have already performed test #30004, you can use the same patient.

                        We strongly recommend reading the tutorial available here.

                        1. Laboratory Order creation

                        As a receiver in this test, your Order Filler shall be able to integrate all of the three message structures defined in the technical framework. As a consequence, you are asked to perform this step three times.

                        1. Go to Order management --> Order Placer --> Notify order filler of orders
                        2. Select your system under test in the drop-down list
                        3. Select "Create a new order" as action to perform
                        4. Select one of the three message structures
                        5. Select the patient for whom you want to create a new laboratory order
                        6. Fill the order and click on the "send message" button
                        7. Retrieve the permanent link to the test report and paste it in Gazelle.
                        8. Take a screenshot of your application as a proof of the good integration of the message. Upload it in Gazelle.
                        9. Play this step again with another message structure

                        2. Cancellation of a laboratory order

                        In this step, your Order Filler shall prove its ability to accept and integrate an order cancellation sent by the Order Placer (ORC-1="CA") and to acknowledge it (ORC-1="CR").

                        1. Go to Order management --> Order Placer --> Notify order filler of orders
                        2. Select your system under test in the drop-down list
                        3. Select "Cancel an existing order" as action to perform
                        4. Select the message structure to use
                        5. Select the laboratory order to cancel and, when the pop-up appears, click on the "Yes" button.
                        6. Retrieve the permanent link to the test report and paste it in Gazelle
                        7. Take a screenshot of your application as a proof of the good integration of the message. Upload it in Gazelle.
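                        Taken together, the ORC-1 values exercised by tests 30002 and 30003 and the acknowledgement codes they expect can be summarized in a small lookup table. The values are taken from the test descriptions above; the code itself is only an illustrative sketch.

```python
# ORC-1 of the notification -> ORC-1 expected in the ORL acknowledgement
EXPECTED_ACK = {
    "SN": "NA",  # filler-created order: placer number assigned (30002, step 1)
    "OC": "OK",  # filler-initiated cancellation accepted       (30002, step 2)
    "SC": "OK",  # status change accepted                       (30002, step 3)
    "CA": "CR",  # placer-initiated cancellation accepted       (30003, step 2)
}

def ack_code(orc1):
    return EXPECTED_ACK[orc1]

print(ack_code("CA"))  # -> CR
```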

                        30004: LTW Order Filler sends notifications

                         

                        This test will be performed against the OrderPlacer part of the OrderManager Gazelle simulator. Here, we are only checking the initiator part of the Order Filler; that means that, in this test, your system is only asked to send messages to the simulator to create and cancel orders and to update the status of orders. The receiver part of the Order Filler (used to receive notifications from the Order Placer) is tested in the test #30003.

                        First step to perform: retrieve the configuration of the Order Placer to which you will send messages: 

                        1. Go to the OrderManager simulator
                        2. Log onto the application using the CAS login (it uses your Gazelle -European instance- credentials)
                        3. Select the Laboratory domain (bottom-right drop-down menu)
                        4. Go to OrderManagement--> Order Placer --> Order Placer configuration
                        5. Select the character set encoding you will use to send messages to the simulator and enter the given configuration within your Order Filler.

                        Pre-requisites

                        The LAB-1 (Lab Placer Order Management) and LAB-2 (Lab Filler Order Management) transactions define three structures of messages (OML^O21^OML_O21, OML^O33^OML_O33, OML^O35^OML_O35). As an initiator in this test, you are free to use the structure your system under test supports. Please add a comment into the pre-connectathon test instance you have started to tell us which structures your system uses.

                        As described in the Technical Framework (TF-LAB volume 1), your system under test is assumed to implement actors from the PAM or PDQ profile in addition to the Order Filler actor from the LTW integration profile. That means that your system is able either to create a new patient and encounter or to receive that information from an external system. If your system under test implements PEC and/or PDC actors from the PAM profile, feel free to use the PAMSimulator tool to receive a new patient and encounter for initializing this test.

                        Order creation

                        As a first step for this test, you will have to create two new orders (the status of the first one will be updated in step 2 and the second one will be cancelled in step 3).

                        1. Enter a new order within your system under test
                        2. Send a message to the Order Placer part of the simulator. ORC-1 must be valued with "SN".
                        3. Check the acknowledgment received by your system (ORC-1="NA") with a placer order number provided in ORC-2.
                        4. Go to the "HL7messages" part of the simulator and retrieve the message you have sent. Click on the messageid(left-hand column) to get its permanent link and thevalidationresult.
                        5. Copy and Paste the "permanent link to the test report" into Gazelle
                        6. Go to Browse Data/All orders and retrieve the order you have sent. Copy and paste its permanent link into Gazelle.
                        7. Perform steps 1 to 6 a second time in order to create a second order (could be for the same patient).
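                        A skeleton of the new-order message sent in this step might look as follows. This is an assumption-laden illustration (placeholder application names, patient, and order codes), showing only how the segments fit together and where ORC-1="SN" sits; real messages must carry the full content required by the transaction.

```python
SEP = "\r"  # HL7 v2 segment separator

# Placeholder identifiers throughout; not a complete, profile-conformant message.
segments = [
    "MSH|^~\\&|OF_APP|OF_FAC|OP_APP|OP_FAC|20240101120000||OML^O21^OML_O21|MSG0001|P|2.5",
    "PID|1||PAT123^^^HOSP||DOE^JOHN",
    "ORC|SN||F123^OF_APP",  # ORC-1='SN': request assignment of a placer number
    "OBR|1||F123^OF_APP|24331-1^Lipid panel^LN",
]
message = SEP.join(segments)
print(message.split(SEP)[2].split("|")[1])  # -> SN
```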

                        Update order status

                        In this second step, the status of the first placed order will be updated to "A" (Some but not all results available).

                        1. Update the status of the previous order to "A".
                        2. Send a notification to the Order Placer part of the simulator. ORC-1 must be valued with "SC", ORC-5="A", OBR-25="P".
                        3. Refresh the permanent page of the order and check its new status.
                        4. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        5. Copy and Paste the "permanent link to the test report" into Gazelle
                        6. Do not drop your database after this test; you may want to use this order to send results to the Order Result Tracker (test #30006)
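                        Setting the field values this step asks for (ORC-1="SC", ORC-5="A", OBR-25="P") can be sketched with a small padding helper; illustrative only, not tool code.

```python
def set_field(segment, index, value):
    """Set field `index` of a pipe-delimited HL7 v2 segment, padding as needed."""
    f = segment.split("|")
    f += [""] * (index + 1 - len(f))
    f[index] = value
    return "|".join(f)

orc = set_field(set_field("ORC", 1, "SC"), 5, "A")  # ORC-1='SC', ORC-5='A'
obr = set_field("OBR", 25, "P")                     # OBR-25='P'
print(orc)  # -> ORC|SC||||A
```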

                        Order cancellation on Order Filler's initiative

                        This third test is dedicated to the cancellation of the second order you have sent.

                        1. Within your system, cancel the order and send a notification (ORC-1="SC", ORC-5="CA", OBR-25="X") to the Order Placer part of the simulator.
                        2. Refresh the permanent page of the order and check it has been cancelled.
                        3. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        4. Copy and Paste the "permanent link to the test report" into Gazelle

                        30005: LTW Work order management

                        This test is used to test the capability of the Order Filler and Automation Manager actors to manage work orders. For both actors, the test will be performed against the Order Manager Gazelle tool.

                         

                        Order Filler

                        First step to perform: retrieve the configuration of the Automation Manager to which you will send messages: 

                        1. Go to the OrderManager simulator
                        2. Log onto the application using the CAS login (it uses your Gazelle EU-CAT credentials)
                        3. Select the Laboratory domain (bottom-right drop-down menu)
                        4. Go to Order Management --> Automation Manager --> Automation Manager configuration
                        5. Select the character set encoding you will use to send messages to the simulator and enter the given configuration within your Order Filler.

                        Pre-requisites

                        The LAB-4 (Work Order Management) transaction defines three structures of messages (OML^O21^OML_O21, OML^O33^OML_O33, OML^O35^OML_O35). As an initiator in this test, you are free to use the structure your system under test supports. Please add a comment into the pre-connectathon test instance you have started to tell us which structures your system uses.

                        Your Order Filler is assumed to be coupled with a PAM actor; that means you should be able either to enter a new patient/encounter into your system or to receive a new patient/encounter from a PAM/PDS or PAM/PES actor. If you need to populate your system with a patient and encounter using an external system, you are free to use the PAMSimulator tool to do so.

                        1. Order creation

                        1. Enter a new work order within your system under test
                        2. Send a message to the Automation Manager part of the simulator. ORC-1 must be valued with "NW".
                        3. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        4. Copy and Paste the permanent link to the test report into Gazelle
                        5. Go to Browse Data/All work orders and retrieve the order you have sent. Copy and paste its permanent link into Gazelle

                        2. Order cancellation

                        1. Within your system, cancel the previous work order and send the notification to the simulator. ORC-1 must be valued with "CA".
                        2. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        3. Copy and Paste the permanent link to the test report into Gazelle

                        Automation Manager

                        In this test, you will use the Order Manager tool to create a work order to send to your system under test. Before starting this test, make sure you have properly registered your system in the "SUT Configurations" section of the tool and that your system under test is reachable from the Internet (no firewall prevents it from receiving messages from our tools).

                        Pre-requisites

                        We strongly recommend reading the tutorial available here.

                        1. Work Order creation

                        As a receiver in this test, your Automation Manager shall be able to integrate all of the three message structures defined in the technical framework. As a consequence, you are asked to perform this step three times (if your system does not support all of the three messages, please leave a comment in Gazelle to explain which one it supports, and why it does not support all of them).

                        1. Go to Order management --> Order Filler --> Notify Automation Manager of work orders
                        2. Select your system under test in the drop-down list
                        3. Select "Create a new order" as action to perform
                        4. Select one of the three message structures
                        5. Select the patient for whom you want to create a new work order
                        6. Fill the order and click on the "send message" button
                        7. Retrieve the permanent link to the test report and paste it in Gazelle.
                        8. Take a screenshot of your application as a proof of the good integration of the message. Upload it in Gazelle.
                        9. Play this step again with another message structure

                        2. Cancellation of a work order

                        In this step, you will cancel the first work order received by your Automation Manager.

                        1. Go to Order management --> Order Filler --> Notify Automation Manager of work orders
                        2. Select your system under test in the drop-down list
                        3. Select "Cancel an existing order" as action to perform
                        4. Select the message structure to use
                        5. Select the work order to cancel and, when the pop-up appears, click on the "Yes" button.
                        6. Retrieve the permanent link to the test report and paste it in Gazelle
                        7. Take a screenshot of your application as a proof of the good integration of the message. Upload it in Gazelle.

                        30006: LTW Test result management

                        The aim of this test is to prove the capability of the Order Filler, Automation Manager and Order Result Tracker to manage laboratory test results. In other words, we check that your system is able to send, receive and/or integrate the messages defined in Order Result Management (LAB-3) and Test Result Management (LAB-5). 

                        Those tests have to be performed against the Order Manager Gazelle tool, which will play the role of Order Result Tracker, Order Filler or Automation Manager according to the cases.

                        Order Filler

                        The Order Filler actor is involved in both LAB-3 (as initiator) and LAB-5 (as receiver) transactions. In this test, we check that your system is able to integrate the test results sent by the Automation Manager (role played by the Order Manager Gazelle tool) and to send order results to the Order Result Tracker (role played by the Order Manager tool).

                        Order Result Management (LAB-3)

                        This part of the test will use the Order Manager as an Order Result Tracker. First step to perform: retrieve the configuration of the Order Result Tracker to which you will send messages: 

                        1. Go to the OrderManager simulator
                        2. Log onto the application using the CAS login (it uses your Gazelle EU-CAT credentials)
                        3. Select the Laboratory domain (bottom-right drop-down menu)
                        4. Go to Results Management --> Order Result Tracker
                        5. Select the character set encoding you will use to send messages to the simulator and enter the given configuration within your Order Filler.

                        Pre-requisites

                        The LAB-3 (Order Result Management) transaction defines two structures of messages (OUL^R22^OUL_R22 and ORU^R01^ORU_R01). As an initiator in this transaction, you are free to use the structure your system under test supports.

                        You are assumed to perform this test after working on test #30004 so that you can reuse the order previously placed in the Order Placer. If you have followed the instructions of test #30004, you shall have the values ORC-5="A" and OBR-25="P".

                        1. The Order Filler notifies the Order Result Tracker of the clinical validation of the results

                        1. Retrieve the laboratory order to use (or create a new one with the previously given characteristics) and add two observations relative to the order (OBX-11 = "P" or OBX-11 = "F")
                        2. Send a notification to the Order Result Tracker 
                        3. Follow menu Browse data --> Order Result Tracker's data and retrieve the order you have just sent to the tool
                        4. Copy and paste the permanent link to this order in the pre-connectathon instance started in Test Management
                        5. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        6. Copy and Paste the "permanent link to the test report" into Gazelle

                        2. The Order Filler notifies the Order Result Tracker of the deletion of the test

                        1. Cancel the previous order from your Order Filler.
                        2. Send a notification to the Order Result Tracker. OBR-25 must be valued with "X", OBX-11 with "X" and ORC-5 with "CA"
                        3. Check that the tool has received the message and properly integrated it.
                        4. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        5. Copy and Paste the "permanent link to the test report" into Gazelle

                        Test Results Management (LAB-5)

                        This part of the test will use the Order Manager tool as an Automation Manager. If you have already performed tests #30003 and #30004, your system under test is already registered within the tool. 

                        Pre-requisites

                        The LAB-5 (Test Results Management) transaction defines two structures of messages (OUL^R22^OUL_R22 and OUL^R23^OUL_R23). As a responder in this transaction, your system under test must be able to integrate both. Go to Results management --> Automation Manager.
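                        Since a LAB-5 responder must accept both structures, a receiver typically dispatches on the message structure component of MSH-9. A minimal sketch under that assumption (the handler names are placeholders):

```python
def structure_of(msh_segment):
    """Return the message structure (third component of MSH-9)."""
    msh9 = msh_segment.split("|")[8]  # MSH-1 is the separator itself, so MSH-9 is index 8
    return msh9.split("^")[2]

def dispatch(msh_segment):
    handlers = {
        "OUL_R22": "specimen-centered result handler",
        "OUL_R23": "container-centered result handler",
    }
    return handlers[structure_of(msh_segment)]

msh = "MSH|^~\\&|AM|LAB|OF|LAB|20240101||OUL^R22^OUL_R22|M1|P|2.5"
print(dispatch(msh))  # -> specimen-centered result handler
```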

                        1. The Automation Manager transmits multiple results related to a specimen to the Order Filler

                        1. Select your system under test in the drop-down list located at the top of the page
                        2. Select "Specimen centered result reporting" as Message structure to use
                        3. Click on "Create a new result from scratch". The list of available patients is displayed, pick one known by your Order Filler (You may have already used patients for previous tests)
                        4. Fill the specimen, mark it as available (tick the "specimen available?" checkbox) and ask for the creation of only one container. Then click on "Create an order for specimen".
                        5. Fill the new work order. Result status must be equal to "Results stored; not yet verified" and the order status to "Some but not all results available".
                        6. Click on "Add to current specimen". Then, click on "Add observations".
                        7. Add an observation relative to the work order. Observation result status must be equal to "Results entered - not verified".
                        8. Finally, send the message and check that it is properly received and integrated by your system under test.
                        9. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        10. Copy and Paste the "permanent link to the test report" into Gazelle
                        11. As a proof of the correct integration of the message within your system, take a screenshot of your application (showing the order and observation) and upload it in Test Management.

                        2. The Automation Manager transmits multiple results related to one or more specific containers with one specimen to the Order Filler

                        1. Click on "Perform another test"
                        2. Select "Reporting results related to one or more containers for a specific specimen" as Message structure to use
                        3. Select the same patient as for the previous test
                        4. Fill the specimen; it must be available (tick "specimen available ?"). Then click on "create a container for specimen"
                        5. For the created container, click on "Add an order for this container"
                        6. Fill the new order. Result status must be equal to "Results stored; not yet verified" and the order status to "Some but not all results available".
                        7. Click on "Add to current specimen" and then click on "Add observations"
                        8. Add an observation relative to the work order. Observation result status must be equal to "Results entered - not verified".
                        9. Finally, send the message and check that it is properly received and integrated by your system under test.
                        10. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        11. Copy and Paste the "permanent link to the test report" into Gazelle
                        12. As a proof of the correct integration of the message within your system, take a screenshot of your application (showing the order and observation) and upload it in Test Management.

                        Automation Manager

                        The Automation Manager is involved in LAB-5 transaction as an initiator. In this test, we check the capability of your system to send messages to the Order Filler part of the Order Manager Gazelle tool. LAB-5 transaction defines two message structures (OUL^R22^OUL_R22 and OUL^R23^OUL_R23); as an initiator for this transaction, your system under test must support one out of these two structures. If your system supports both, please repeat this test twice so that we can check the conformance of the messages produced by your system.

                        Pre-requisites

                        This part of the test will use the Order Manager as an Order Filler. First step to perform: retrieve the configuration of the Order Filler to which you will send messages: 

                        1. Go to the OrderManager simulator
                        2. Log onto the application using the CAS login (it uses your Gazelle EU-CAT credentials)
                        3. Select the Laboratory domain (bottom-right drop-down menu)
                        4. Go to Results Management --> Order Filler --> Order Filler configuration
                        5. Select the character set encoding you will use to send messages to the simulator and enter the given configuration within your Order Filler.

                        The Automation Manager transmits multiple results to the Order Filler

                        1. Within your application, create a new work order with the following information OBR-25 = "R" and ORC-5="A"
                        2. Add an observation to this order with OBX-11 = "R"
                        3. Send the appropriate message to the Order Filler part of the Order Manager
                        4. Follow menu Browse data --> Order Filler's data and retrieve the work order you have just sent to the tool
                        5. Copy and paste the permanent link to this order in the pre-connectathon instance started in Test Management
                        6. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        7. Copy and Paste the "permanent link to the test report" into Gazelle
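                        The required field values of this step (ORC-5="A", OBR-25="R", OBX-11="R") can be verified with a simple pre-send check. The segments below are minimal illustrations, not complete LAB-5 content.

```python
def field(segment, i):
    """Return field i of a pipe-delimited HL7 v2 segment, or '' if absent."""
    f = segment.split("|")
    return f[i] if i < len(f) else ""

def check_lab5_fields(orc, obr, obx):
    """True when the segments carry the values this test step requires."""
    return (field(orc, 5) == "A"
            and field(obr, 25) == "R"
            and field(obx, 11) == "R")

orc = "ORC|SC||||A"                                   # ORC-5='A'
obr = "OBR" + "|" * 24 + "|R"                         # OBR-25='R'
obx = "OBX|1|NM|718-7^Hemoglobin^LN||13.5|g/dL|||||R" # OBX-11='R'
print(check_lab5_fields(orc, obr, obx))  # -> True
```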

                         

                        Order Result Tracker

                        The Order Result Tracker is involved in LAB-3 transaction as a responder. In this test, we check the capability of your system to integrate the messages received from an Order Filler (role played by the Order Manager Gazelle tool). This transaction defines two message structures (OUL^R22^OUL_R22 and ORU^R01^ORU_R01); as a responder your system must support both of them.

                        Pre-requisites

                        This test uses the Order Manager tool as an Order Filler. In order to tell the simulator which system to send messages to, your first action will be to create a new configuration for your system under test within the Order Manager. Go to the "SUT configurations" section to do so.

                        Then, go to Results management --> Order Filler --> Send test results to your SUT to start the test.

                        1. The Order Filler transmits specimen centered results to the ORT

                        1. Select your system under test in the drop-down list located at the top of the page
                        2. Select "Specimen centered result reporting" as Message structure to use
                        3. Click on "Create a new result from scratch". The list of available patients is displayed, select an existing one or create a new one.
                        4. Fill the specimen, mark it as available (tick the "specimen available?" checkbox) and ask for the creation of only one container. Then click on "Create an order for specimen".
                        5. Fill the new order. Result status must be equal to "Results stored; not yet verified" and the order status to "Some but not all results available".
                        6. Click on "Add to current specimen". Then, click on "Add observations".
                        7. Add an observation relative to the order. Observation result status must be equal to "Results entered - not verified".
                        8. Finally, send the message and check that it is properly received and integrated by your system under test.
                        9. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        10. Copy and Paste the "permanent link to the test report" into Gazelle
                        11. As a proof of the correct integration of the message within your system, take a screenshot of your application (showing the order and observation) and upload it in Test Management.

                        2. The Order Filler transmits order centered results to the ORT

                        1. Click on "Perform another test"
                        2. Select "Order centered result reporting" as Message structure to use
                        3. Select the same patient as for the previous test
                        4. Fill the new order. Result status must be equal to "Results stored; not yet verified" and the order status to "Some but not all results available". Then click on "Create a specimen for order".
                        5. Fill the specimen; it must be available (tick "specimen available?") and ask for the creation of only one container. Then click on "Add to current order"
                        6. Finally click on "Add observations"
                        7. Add an observation relative to the order. Observation result status must be equal to "Results entered - not verified".
                        8. Finally, send the message and check that it is properly received and integrated by your system under test.
                        9. Go to the "HL7 messages" part of the simulator and retrieve the message you have sent. Click on the message id (left-hand column) to get its permanent link and the validation result.
                        10. Copy and Paste the "permanent link to the test report" into Gazelle
                        11. As a proof of the correct integration of the message within your system, take a screenshot of your application (showing the order and observation) and upload it in Test Management.

                        30601 : LAW Normal Process (for Analyzer Manager)

                        This test concerns the Analyzer Manager actor. You will need to validate the IHE conformance of your HL7 messages with the EVSClient tool.

                        Instructions

                        As your system implements the Analyzer Manager actor, you will need to test the HL7 messages used in the LAB-27, LAB-28 and LAB-29 transactions.
                        Your system must be able to send HL7 messages (to the Analyzer actor) of types :

                        • RSP^K11^RSP_K11, for the LAB-27 transaction (Query for AWOS)
                        • OML^O33^OML_O33, for the LAB-28 transaction (Analytical Work Order Step Broadcast)
                        • ACK^R22^ACK_R22, for the LAB-29 transaction (AWOS Status Change)   

                        To test the IHE conformance of your HL7 messages, go to the EVSClient tool at this location : EVSClient
                        If it is your first time with this tool, please read the user manual : EVSClient User Manual
                            
                        In the EVSClient tool, go to the HL7 menu entry and choose "HL7v2" then click on "Message Validation".
                        Paste your message to validate in the box. (You can hit the "Guess" button to preset the Message Profile OID.)
                            
                        For example, for the RSP^K11^RSP_K11 message :

                        1. Copy the RSP^K11^RSP_K11 HL7 message from your system. Paste it in the box in the EVSClient page. 
                        2. Select the Message profile OID according to your message. In this example : 
                          • Affinity Domain : IHE
                          • Domain : Laboratory
                          • Actor : Analyzer Manager
                          • Transaction : LAB-27
                          • Message Type : RSP^K11^RSP_K11
                          • HL7 Version : HL7v2.5.1  
                        3. Once you have selected the Profile OID, press the "Validate" button. If the Validation Result is not "PASSED", your HL7 message does not conform to the LAW Profile of the Laboratory Technical Framework. If the Validation Result is "PASSED", copy the "Permanent link" and paste it in Gazelle as the result of this test. For further details, see this tutorial : How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain)?


                        Do this for all messages and don't forget to copy/paste the "Permanent link" of the validation result to Gazelle.

                        Evaluation

                        • The validation status shall be "PASSED" for all messages.
                        • The message type shall be the correct message type, according to IHE.
                        • The Acknowledgment code of the RSP^K11^RSP_K11 and the ACK^R22^ACK_R22 messages shall be "AA" (MSA-1).
                        • Each of the three messages must be validated at least once, and the permanent link must be pasted into Gazelle.
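                        The MSA-1 acknowledgment check above can also be scripted when debugging. Below is a minimal, illustrative Python sketch (no HL7 library assumed; the sample message is a placeholder, not a conformant LAW message) that extracts the acknowledgment code from an ER7-encoded acknowledgement:

```python
# Illustrative helper: extract the acknowledgment code (MSA-1) from an
# HL7v2 ER7-encoded acknowledgement. The field separator is read from the
# character after "MSH" (normally '|'). The sample message is a placeholder.

def msa_ack_code(message: str) -> str:
    """Return MSA-1 from an HL7v2 acknowledgement message."""
    field_sep = message[3]  # MSH-1 is the character right after "MSH"
    for segment in message.replace("\n", "\r").split("\r"):
        if segment.startswith("MSA"):
            return segment.split(field_sep)[1]
    raise ValueError("No MSA segment found")

sample_ack = (
    "MSH|^~\\&|ANALYZER_MGR|LAB|ANALYZER|LAB|20240101120000||"
    "ACK^R22^ACK_R22|MSG0001|P|2.5.1\r"
    "MSA|AA|MSG0001"
)

print(msa_ack_code(sample_ack))  # AA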

                        30602 : LAW Normal Process (for bi-directional Analyzer)

                        This test concerns the bi-directional Analyzer actor. You will need to validate the IHE conformance of your HL7 messages with the EVSClient tool.

                        Instructions



                        As your system implements the Analyzer actor and supports the bi-directional communication option, you will need to test the HL7 messages used in the LAB-27, LAB-28 and LAB-29 transactions.
                        Your system must be able to send HL7 messages (to the Analyzer Manager actor) of types :

                        • QBP^Q11^QBP_Q11, for the LAB-27 transaction (Query for AWOS)
                        • ORL^O34^ORL_O34, for the LAB-28 transaction (Analytical Work Order Step Broadcast)
                        • OUL^R22^OUL_R22, for the LAB-29 transaction (AWOS Status Change)

                        To test the IHE conformance of your HL7 messages, go to the EVSClient tool at this location : EVSClient
                        If it is your first time with this tool, please read the user manual : EVSClient User Manual
                           
                        In the EVSClient tool, go to the HL7 menu entry and choose "HL7v2" then click on "Message Validation".
                        Paste your message to validate in the box, and hit the "Guess" button to preset the Profile OID.
                           
                        For example, for the QBP^Q11^QBP_Q11 message :

                        1. Copy the QBP^Q11^QBP_Q11 HL7 message from your system. Paste it in the box in the EVSClient page.
                        2. Select the Message profile OID according to your message. In this example : 
                          • Affinity Domain : IHE
                          • Domain : Laboratory
                          • Actor : Analyzer
                          • Transaction : LAB-27
                          • Message Type : QBP^Q11^QBP_Q11
                          • HL7 Version : HL7v2.5.1
                        3. Once you have selected the Profile OID, press the "Validate" button. If the Validation Result is not "PASSED", your HL7 message does not conform to the LAW Profile of the Laboratory Technical Framework. If the Validation Result is "PASSED", copy the "Permanent link" and paste it in Gazelle as the result of this test. For further details, see this tutorial : How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain)?

                        Do this for all messages and don't forget to copy/paste the "Permanent link" of the validation result to Gazelle.

                        Evaluation

                        • The validation status shall be "PASSED" for all messages.
                        • The message type shall be the correct message type, according to IHE.
                        • The Acknowledgment code of the ORL^O34^ORL_O34 message shall be "AA" (MSA-1).
                        • Each of the three messages must be validated at least once, and the permanent link must be pasted into Gazelle.

                        30603 : LAW Normal Process (for Analyzer)

                        This test concerns the Analyzer actor. You will need to validate the IHE conformance of your HL7 messages with the EVSClient tool.

                        Instructions

                        As your system implements the Analyzer actor, you will need to test the HL7 message used in the LAB-29 transaction.
                        Your system must be able to send HL7 messages (to the Analyzer Manager actor) of type :    
                        OUL^R22^OUL_R22, for the LAB-29 transaction (AWOS Status Change)
                           
                        To test the IHE conformance of your HL7 messages, go to the EVSClient tool at this location : EVSClient
                        If it is your first time with this tool, please read the user manual : EVSClient User Manual
                           
                        In the EVSClient tool, go to the HL7 menu entry and choose "HL7v2" then click on "Message Validation".
                        Paste your message to validate in the box, and hit the "Guess" button to preset the Profile OID.
                           
                        For example :

                        1. Copy the OUL^R22^OUL_R22 HL7 message from your system. Paste it in the box in the EVSClient page.
                        2. Select the Message profile OID according to your message. In this example :
                          • Affinity Domain : IHE
                          • Domain : Laboratory
                          • Actor : Analyzer
                          • Transaction : LAB-29
                          • Message Type : OUL^R22^OUL_R22
                          • HL7 Version : HL7v2.5.1
                        3. Once you have selected the Profile OID, press the "Validate" button. If the Validation Result is not "PASSED", your HL7 message does not conform to the LAW Profile of the Laboratory Technical Framework. If the Validation Result is "PASSED", copy the "Permanent link" and paste it in Gazelle as the result of this test. For further details, see this tutorial : How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain)?

                        Evaluation

                        • The validation status shall be "PASSED" for all messages.
                        • The message type shall be the correct message type, according to IHE.

                        30604 : LAW : Query for AWOS (for Analyzer Manager)

                        This test concerns only the Analyzer Manager actor. You will need to communicate with the LAW Simulator (which is included in the Order Manager Simulator), in order to simulate the LAB-27 transaction of the LAW Profile.

                        Instructions

                        Access the LAW Simulator tool at this location : Order Manager

                        If it is your first time with this tool, please read the user manual : Order Manager User Manual

                        Please be reminded that if you are logged in, your configurations will be private; otherwise they will be public.

                        In this test, the LAW Simulator plays the role of the "Analyzer". It is used to query the SUT (System Under Test) for an AWOS related to the specimen. 

                         

                        The SUT implements the Analyzer Manager (Analyzer Manager acts as a responder in this test) :

                        1. First of all, check that the selected "Domain for testing" (at the bottom right of the page) is "Laboratory".
                        2. Create or update the configuration corresponding to the SUT.
                        3. Then, go to the "Order Management" menu entry, choose "Analyzer", and click on the "Query Analyzer Manager for AWOS" page to begin the test.
                        4. In the "SUT Configurations" drop-down list, select the configuration matching the SUT.
                        5. Select the "Query Mode" and fill the required parameters values (See the description of the LAB-27 transaction in the LAW Technical Framework Supplement for further details about the usage of the parameters). 
                        6. Then, hit the "Send Query" button. The LAW Simulator will send the query to the SUT.
                        7. Check that the simulator has properly received your acknowledgement message.

                        The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.

                        Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (You can consult the HL7v2 report tutorial for more details.)

                        If the validation report status is passed for the message and the response, copy the "test result link" and paste it in Gazelle as the result of this test.

                        For further details, see this tutorial : How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?

                        Evaluation

                        • The validation status must be passed for the two messages of the transaction. The message type must be the correct message type, according to IHE.
                        • The Acknowledgment code must be "AA" (MSA-1) in the acknowledgment message.
                        • The QPD segment of the RSP^K11^RSP_K11 message must contain the same information as the QPD segment of the QBP^Q11^QBP_Q11 message.
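                        The QPD echo requirement above can be checked by comparing the raw QPD segments of the query and the response. A minimal, illustrative Python sketch (the message skeletons are placeholders, not conformant LAB-27 messages):

```python
# Illustrative check: the RSP^K11^RSP_K11 response must echo the QPD segment
# of the QBP^Q11^QBP_Q11 query. Both message skeletons below are placeholders,
# not conformant LAB-27 messages.

def get_segment(message: str, seg_id: str) -> str:
    """Return the first segment whose ID matches seg_id."""
    for segment in message.replace("\n", "\r").split("\r"):
        if segment.startswith(seg_id):
            return segment
    raise ValueError(f"No {seg_id} segment found")

qbp = (
    "MSH|^~\\&|ANALYZER|LAB|ANALYZER_MGR|LAB|20240101120000||"
    "QBP^Q11^QBP_Q11|Q0001|P|2.5.1\r"
    "QPD|WOS^Work Order Step^IHE|Q0001|SPEC123\r"
    "RCP|I"
)
rsp = (
    "MSH|^~\\&|ANALYZER_MGR|LAB|ANALYZER|LAB|20240101120005||"
    "RSP^K11^RSP_K11|R0001|P|2.5.1\r"
    "MSA|AA|Q0001\r"
    "QAK|Q0001|OK\r"
    "QPD|WOS^Work Order Step^IHE|Q0001|SPEC123"
)

print(get_segment(qbp, "QPD") == get_segment(rsp, "QPD"))  # True
```
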

                        30605 : LAW Query for AWOS (for Analyzer)

                         

                        This test concerns only the Analyzer actor. You will need to communicate with the LAW Simulator (which is included in the Order Manager Simulator), in order to simulate the LAB-27 transaction of the LAW Profile.

                        Instructions

                        Access the LAW Simulator tool at this location : Order Manager

                        If it is your first time with this tool, please read the user manual : Order Manager User Manual

                        In this test, the LAW Simulator will be used to respond to the SUT (System Under Test) query.

                         

                        As the SUT implements the Analyzer (Analyzer acts as an initiator in this test) :

                        1. First of all, check that the selected "Domain for testing" (at the bottom right of the page) is "Laboratory".
                        2. You must be logged in to continue this test.
                        3. Go to the "Order Management" menu entry and choose "Analyzer Manager", then "Analyzer Manager Configuration" page to begin the test.
                        4. In the charset drop-down list, select the desired charset. 
                        5. Use the IP address and the port linked to this charset to send your message to the LAW Simulator.
                        6. Don't forget to hit the "Refresh List" button after sending your message. In this test, the Analyzer SUT queries the Analyzer Manager simulator for an AWOS related to a specimen.
                        7. Check that the simulator has properly received and acknowledged your message.
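                        When sending your message to the simulator's IP address and port (steps 5 and 6 above), HL7v2 messages are conventionally exchanged over MLLP (Minimal Lower Layer Protocol) framing. A minimal, illustrative Python sketch of the framing and a send helper (host and port are placeholders for the simulator's configuration values):

```python
# Illustrative MLLP client: HL7v2 messages over TCP are conventionally framed
# with <VT> (0x0B) at the start and <FS><CR> (0x1C 0x0D) at the end.
# Host and port are placeholders for the simulator's configuration values.
import socket

VT, FS, CR = b"\x0b", b"\x1c", b"\x0d"

def mllp_frame(message: str, encoding: str = "utf-8") -> bytes:
    """Wrap an ER7-encoded HL7 message in an MLLP frame."""
    return VT + message.encode(encoding) + FS + CR

def send_hl7(host: str, port: int, message: str) -> bytes:
    """Send one framed message and return the unframed raw response."""
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(mllp_frame(message))
        response = b""
        while FS + CR not in response:
            chunk = sock.recv(4096)
            if not chunk:
                break
            response += chunk
    return response.strip(VT + FS + CR)

# Framing only (no network call):
frame = mllp_frame("MSH|^~\\&|ANALYZER|LAB|ANALYZER_MGR|LAB|20240101||QBP^Q11^QBP_Q11|1|P|2.5.1")
print(frame.startswith(VT) and frame.endswith(FS + CR))  # True
```

                        The charset selected in the simulator determines which encoding to pass to mllp_frame.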

                        The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.

                        Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (You can consult the HL7v2 report tutorial for more details.)

                        If the validation report status is passed for the message and the response, copy the "test result link" and paste it in the comment section of the pre-connectathon log return page as the result for this test.

                        For further details, see this tutorial : How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?

                        Evaluation

                        • The validation status must be passed for the two messages of the transaction. The message type must be the correct message type, according to IHE.
                        • The Acknowledgment code must be "AA" (MSA-1) in the acknowledgment message.

                        30606 : LAW Analytical Work Order Step Broadcast (for Analyzer Manager)

                        This test concerns only the Analyzer Manager actor. You will need to communicate with the LAW Simulator (which is included in the Order Manager Simulator), in order to simulate the LAB-28 transaction of the LAW Profile.

                        Instructions

                        Access the LAW Simulator tool at this location : Order Manager

                        If it is your first time with this tool, please read the user manual : Order Manager User Manual

                        In this test, the LAW Simulator will receive the AWOS from the SUT, save all the information, and respond with an acknowledgment.

                         

                        As the SUT implements the Analyzer Manager (Analyzer Manager acts as an initiator in this test) :

                        1. First of all, check that the selected "Domain for testing" (at the bottom right of the page) is "Laboratory".
                        2. You must be logged in to complete this test.
                        3. Then, go to the "Order Management" menu entry and choose "Analyzer", then "Analyzer Configuration" page to begin the test.
                        4. In the "SUT Configurations" drop-down list, select the configuration matching the SUT.
                        5. Use the IP address and the port linked to this charset to send your message to the LAW Simulator.
                        6. There are two steps in this test.
                          • First Step : Send a "new AWOS" to the Analyzer simulator.
                            1. Send a "new AWOS" message to the Analyzer. This message is used to broadcast the AWOS from your SUT to the Analyzer simulator.
                            2. Don't forget to hit the "Refresh List" button after sending your message.
                            3. Check that the simulator has properly received and acknowledged your message.
                          • Second Step : Cancel the AWOS sent in the first step.
                            1. Send a "cancel AWOS" message to the Analyzer to cancel the AWOS sent in the first step.
                            2. Don't forget to hit the "Refresh List" button after sending your message.
                            3. Check that the simulator has properly received and acknowledged your message.

                        The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.

                        Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (You can consult the HL7v2 report tutorial for more details.)

                        If the validation report status is passed for the message and the response, copy the "test result link" and paste it in the comment section of the pre-connectathon log return page as the result for this test.

                        For further details, see this tutorial : How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?

                        Evaluation

                         

                        • For all steps :
                          • The validation status must be passed for the two messages of the transaction. The message type must be the correct message type, according to IHE.
                          • The Acknowledgment code must be "AA" (MSA-1) in the acknowledgment message.
                        • For the First Step :
                          • In the OML^O33^OML_O33 message, the ORC-1 must be filled with "NW".
                          • In the ORL^O34^ORL_O34 message, the ORC-1 must be filled with "OK" and ORC-5 with "SC" or "IP".
                        • For the Second Step :
                          • In the OML^O33^OML_O33 message, the ORC-1 must be filled with "CA".
                          • In the ORL^O34^ORL_O34 message, the ORC-1 must be filled with "CR" and ORC-5 with "CA".
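                        When debugging, the ORC-1/ORC-5 expectations above can be checked mechanically. A minimal, illustrative Python sketch (the ORC segment shown is a placeholder fragment, not a complete OML^O33^OML_O33 or ORL^O34^ORL_O34 message):

```python
# Illustrative check of the ORC-1 (order control) / ORC-5 (order status)
# values listed in the evaluation criteria. The ORC segment below is a
# placeholder fragment, not a complete message.

EXPECTED = {
    # step: (OML ORC-1, ORL ORC-1, allowed ORL ORC-5 values)
    "new AWOS":    ("NW", "OK", {"SC", "IP"}),
    "cancel AWOS": ("CA", "CR", {"CA"}),
}

def orc_fields(segment: str):
    """Return (ORC-1, ORC-5) from an ER7-encoded ORC segment."""
    fields = segment.split("|")
    return fields[1], fields[5]

orc1, orc5 = orc_fields("ORC|OK|PLACER123||GROUP456|SC")
_, expected_orc1, allowed_orc5 = EXPECTED["new AWOS"]
print(orc1 == expected_orc1 and orc5 in allowed_orc5)  # True
```
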

                        30607 : LAW Analytical Work Order Step Broadcast (for Analyzer)

                        This test concerns only the Analyzer actor. You will need to communicate with the LAW Simulator (which is included in the Order Manager Simulator), in order to simulate the LAB-28 transaction of the LAW Profile.

                        Instructions

                        Access the LAW Simulator tool at this location : Order Manager

                        If it is your first time with this tool, please read the user manual : Order Manager User Manual

                        Please be reminded that if you are logged in, your configurations will be private; otherwise they will be public.

                        In this test, the LAW Simulator will be used to send to the SUT (System Under Test) an AWOS related to a specimen.

                         

                        As the SUT implements the Analyzer (Analyzer acts as a responder in this test) :

                        1. First of all, check that the selected "Domain for testing" (at the bottom right of the page) is "Laboratory".
                        2. Create or update the configuration corresponding to the SUT.
                        3. Then, go to the "Order Management" menu entry, choose "Analyzer Manager", and click on the "Analytical Work Order Step Broadcast" page to begin the test.
                        4. In the SUT Configurations drop-down list, select the SUT configuration.
                        5. There are two steps in this test.
                          • First Step : Create and send a new order.
                            1. For the "Action to perform", select "Create a new order".
                            2. Select the patient to begin the test. Of course, you can create a new patient and a new encounter. (See the Order Manager user manual for more information.)
                            3. Once the patient and the encounter have been selected, go to the "Create specimen" panel and fill all required fields. It is possible to fill some fields randomly by hitting the "Fill with random values" button.
                            4. Hit the "create an order for specimen" button to save the specimen.
                            5. Then, in the "Create order for current specimen" panel, fill all required fields. Once again, it is possible to fill some fields randomly by hitting the "Fill with random values" button.
                            6. For the Result status, choose the value : "Final results: results stored and verified. Can only be changed with a corrected result.".
                            7. Finally, hit the "Add to current specimen and send message" button to send the order to the selected SUT.
                            8. Check that the simulator has properly received your acknowledgement message.
                          • Second Step : Cancel the AWOS sent in the first step.
                            1. If you are still in the "Test Report" panel, hit the "Perform another test" button; otherwise return to the "Order Management" menu entry, choose "Analyzer Manager", and click on the "Analytical Work Order Step Broadcast" page.
                            2. For the "Action to perform", select "Cancel an existing order".
                            3. In the "Order Selection" panel, select the order sent in the first step. You can use the filter option to search for a specific order. (See the Order Manager user manual for more information.)
                            4. A modal panel will ask you to confirm the order cancellation. Click "Yes". The LAW Simulator will send the cancellation order to the SUT.
                            5. Check that the simulator has properly received your acknowledgement message.

                        The messages exchanged between the simulator and the SUT can be found in the message table on the very same page (see the "Test report" panel) or from the HL7 Message menu entry.

                        Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (You can consult the HL7v2 report tutorial for more details.)

                        If the validation report status is passed for the message and the response, copy the "test result link" and paste it in Gazelle as the result of this test.

                        For further details, see this tutorial : How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?

                        Evaluation

                        • For all steps :
                          • The validation status must be passed for the two messages of the transaction. The message type must be the correct message type, according to IHE.
                          • The Acknowledgment code must be "AA" (MSA-1) in the acknowledgment message.
                        • For the First Step :
                          • In the ORL^O34^ORL_O34 message, the ORC-1 must be filled with "OK" and ORC-5 with "SC" or "IP".
                        • For the Second Step :
                          • In the ORL^O34^ORL_O34 message, the ORC-1 must be filled with "CR" and ORC-5 with "CA".

                        30608 : LAW AWOS Status Change (for Analyzer Manager)

                         

                        This test concerns only the Analyzer Manager actor. You will need to communicate with the LAW Simulator (which is included in the Order Manager Simulator), in order to simulate the LAB-29 transaction of the LAW Profile.

                        Instructions

                        Access the LAW Simulator tool at this location : Order Manager

                        If it is your first time with this tool, please read the user manual : Order Manager User Manual

                        Please be reminded that if you are logged in, your configurations will be private; otherwise they will be public.

                        In this test, the LAW Simulator will be used to send the test results to the SUT (System Under Test).

                         

                        As the SUT implements the Analyzer Manager (Analyzer Manager acts as a responder in this test) :

                        1. Create or update the configuration corresponding to your system under test; go to SUT Configurations > HL7 Responders
                        2. Then, go to Laboratory > Analyzer > [LAB-29] AWOS Status Change
                        3. In the "SUT Configurations" drop-down list, select the configuration matching the SUT.
                        4. Select a specimen to work on. You can use the filter option to search for a specific specimen. (See the Order Manager user manual for more information.)
                        5. In the "Specimen details" panel, you can add an observation to this specimen (this is optional in this test). You will need to fill all required fields. For the "Observation result status" choose : "Final results". Then, hit the "Add observation to specimen" button.
                        6. Now, in the "Related orders" panel, add an observation for each order. Hit the "related observations" link, then hit the "Add an observation to this order" button. You will need to fill all required fields. For the "Observation result status" choose : "Final results". Then, hit the "Add observation to order" button.
                        7. Still in the "Related orders" panel, for each order, in the Status and Result Status columns : update the "Status" to "Order is completed" and the "Result status" to "Final results" using the "Update" button.
                        8. Finally, hit the "Send Message" button. The LAW Simulator will send the message to the SUT.
                        9. Check that the simulator has properly received your acknowledgement message.

                        The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.

                        Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (You can consult the HL7v2 report tutorial for more details.)

                        If the validation report status is passed for the message and the response, copy the "test result link" and paste it in Gazelle as the result of this test.

                        For further details, see this tutorial : How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?

                        Evaluation

                        • The validation status must be passed for the two messages of the transaction. The message type must be the correct message type, according to IHE.
                        • The Acknowledgment code must be "AA" (MSA-1) in the acknowledgment message.

                         

                        30609 : LAW AWOS Status Change (for Analyzer)

                        This test concerns only the Analyzer actor. You will need to communicate with the LAW Simulator (which is included in the Order Manager Simulator), in order to simulate the LAB-29 transaction of the LAW Profile.

                        Instructions

                        Access the LAW Simulator tool at this location : Order Manager

                        If it is your first time with this tool, please read the user manual : Order Manager User Manual

                        In this test, the LAW Simulator will be used to receive and acknowledge the test results sent by the SUT (System Under Test).

                         

                        As the SUT implements the Analyzer (Analyzer acts as an initiator in this test) :

                        1. First of all, check that the selected "Domain for testing" (at the bottom right of the page) is "Laboratory".
                        2. You must be logged in to continue this test.
                        3. Go to the "Result Management" menu entry and choose the "Analyzer Manager Configuration" page to begin the test.
                        4. In the charset drop-down list, select the desired charset. 
                        5. Use the IP address and the port linked to this charset to send your message to the LAW Simulator.
                        6. In this test, the Analyzer SUT needs to send test results related to a specimen.
                        7. Don't forget to hit the "Refresh List" button after sending your message.
                        8. Check that the simulator has properly received and acknowledged your message.
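                        For orientation, the shape of an OUL^R22 result message can be sketched as a list of ER7 segments. The Python sketch below is purely illustrative: all field values are placeholders, and the real required fields, segment groups, and cardinalities are defined by the LAB-29 transaction in the LAW Technical Framework Supplement.

```python
# Illustrative skeleton of an OUL^R22 result message built as a list of ER7
# segments. All field values are placeholders; consult the LAB-29 transaction
# in the LAW Technical Framework Supplement for the actual requirements.

def build_oul_r22(msg_id: str, specimen_id: str, obs_code: str, value: str) -> str:
    segments = [
        f"MSH|^~\\&|ANALYZER|LAB|ANALYZER_MGR|LAB|20240101120000||OUL^R22^OUL_R22|{msg_id}|P|2.5.1",
        "PID|1||PAT001||DOE^JOHN",
        f"SPM|1|{specimen_id}||SER",
        f"OBR|1|||{obs_code}",
        "ORC|SC",
        f"OBX|1|NM|{obs_code}||{value}|mg/dL|||||F",
    ]
    return "\r".join(segments)

msg = build_oul_r22("MSG0001", "SPEC123", "GLU^Glucose", "90")
print(msg.split("\r")[0].split("|")[8])  # OUL^R22^OUL_R22
```
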

                        The messages exchanged between the simulator and the SUT can be found in the message table on the very same page or from the HL7 Message menu entry.

                        Hit the link on the left side of the row (first column of the table) to display the Permanent Link to the test report for the corresponding HL7 message. (You can consult the HL7v2 report tutorial for more details.)

                        If the validation report status is passed for the message and the response, copy the "test result link" and paste it in the comment section of the pre-connectathon log return page as the result for this test.

                        For further details, see this tutorial : How to enter your pre-connectathon test result in Gazelle (only for the LAW, LCSD and LBL Profiles of the Laboratory Domain which send HL7v2 messages)?

                        Evaluation

                        • The validation status must be passed for the two messages of the transaction. The message type must be the correct message type, according to IHE.
                        • The Acknowledgment code must be "AA" (MSA-1) in the acknowledgment message.

                        OM_MWL

                        In this test, the OrderManager tool is a DICOM Modality Worklist SCP, and your application (most commonly an Acquisition Modality actor) is the MWL SCU.

                        The test is a sanity check of your worklist query capabilities for pre-Connectathon testing.   In some cases, the Order Manager is used as a MWL SCP during a Connectathon, and this test helps prepare you for that.

                        Prerequisite

                        Refer to the OrderManager User Manual for instructions.

                        Test Steps

                        As a MWL SCU in this test, your application will query the OrderManager (MWL SCP) for a worklist.

                        1. First create a worklist entry in the Order Manager tool:
                          1. TO CREATE A WORKLIST ENTRY FROM AN ORDER ALREADY IN THE TOOL: In the OrderManager, select menu Radiology-->Order Filler-->Create a DICOM worklist (or Eye care-->Order Filler-->Create a DICOM worklist).  Then select one order from the table of existing order entries
                          2. TO CREATE A NEW ORDER FOR A WORKLIST ENTRY: In the OrderManager, start by selecting  Radiology-->Order Filler-->[RAD-3] Create...orders (or Eye care-->Order Filler-->[RAD-3] Create...orders).  Then proceed through the steps to create the order, schedule it, then create the worklist for your new order.
                        2. You will be prompted to provide your application's "Station AE Title" prior to selecting the "Generate worklist" button
                        3. Capture the "Permanent link" to your worklist details (ie the URL)
                        4. Use your application to query the OrderManager for modality worklist 
                        5. Create a screen capture from your application showing the worklist received.
                        6. For your pre-Connectathon test results in Gazelle, upload the screen capture file and copy the permanent link into the comment box as the results for this test.

                        Evaluation

                        The link to the worklist entry & the screen shot demonstrate that you have successfully received the worklist.
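The query in step 4 is a DICOM C-FIND against the Modality Worklist SOP class. Below is a sketch of the matching and return keys an MWL SCU typically sends, using a plain dict as a stand-in for a real toolkit dataset (the attribute selection is illustrative; a toolkit such as pydicom/pynetdicom or DCMTK's findscu builds and sends the actual identifier):

```python
from datetime import date

def build_mwl_query(station_aet: str, modality: str = "") -> dict:
    """Sketch of a Modality Worklist C-FIND identifier.

    Keys mirror common DICOM attribute names; an empty string means
    'return this attribute', a value means 'match on it'.
    """
    today = date.today().strftime("%Y%m%d")
    return {
        "PatientName": "",        # (0010,0010) return key
        "PatientID": "",          # (0010,0020) return key
        "AccessionNumber": "",    # (0008,0050) return key
        "ScheduledProcedureStepSequence": [{
            "ScheduledStationAETitle": station_aet,    # (0040,0001) match key
            "Modality": modality,                      # (0008,0060) match key
            "ScheduledProcedureStepStartDate": today,  # (0040,0002) match key
        }],
    }

query = build_mwl_query("MY_MODALITY", "CT")
```

The "Station AE Title" you enter in the OrderManager (step 2) must match the Scheduled Station AE Title your SCU sends, or the worklist query will return no entries.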

                         

                        OM_RAD-2-Receive

                        This tests the RAD-2 transaction from the point of view of the Order Filler as system under test.

                        In this test, we will check that your Order Filler is able to integrate the creation and cancellation of orders received from the OrderManager tool playing the role of the Order Placer actor. 

                        Prerequisite

                        1. CONFIGURATION: Before beginning this test, do not forget to check that the configuration (IP address, port, application/facility names) of your system under test is entered within the OrderManager tool.  Use the "SUT Configurations" menu. 
                        2. TEST PATIENT: Your Order Filler is assumed to be coupled with an ADT actor; that means you should be able to receive a new patient from an ADT or Patient Demographics Supplier actor.  To populate your system with a patient, use the PatientManager tool to create a patient and send an ADT message to your system.  In the steps below, you will use that patient to create an order in the OrderManager.  Follow these instructions for using PatientManager to create a patient that is then used by the OrderManager.

                        Refer to the OrderManager user manual and these details about sending RAD-2.

                        Test Steps

                        As a receiver in this test, your Order Filler shall be able to integrate all three message structures defined in the technical framework. As a consequence, you are asked to perform three steps.

                        1. Create a new order

                        In this step, your Order Filler proves its ability to accept and integrate an order creation sent by the Order Placer (ORC-1="NW") 

                        1. Go to menu 'Radiology' or 'Eye care', then --> Order Placer --> [RAD-2] Create/Cancel orders
                        2. Select your system under test in the drop-down list
                        3. Select "Create a new order" as the action to perform.  By default, you will get an HL7 v2.3.1 ORM.  If you want an HL7 v2.5.1 OMG message, check that box.
                        4. Select the patient for whom you want to create an order
                        5. Fill in the order and click on the "Send message" button
                        6. Retrieve the permanent link to the test report and paste it into Gazelle Test Management as the results for this test.
                        7. Take a screenshot of your application as proof that the message was correctly integrated. Upload it in Gazelle.

                        2. Cancel an existing order

                        In this step, your Order Filler proves its ability to accept and integrate an order cancellation sent by the Order Placer (ORC-1="CA") and to acknowledge it (with an ACK message).

                        1. Go to Radiology --> Order Placer --> [RAD-2] Create/Cancel orders
                        2. Select your system under test in the drop-down list
                        3. Select "Cancel an existing order" as action to perform
                        4. Select the order to cancel and, when the pop-up appears, click on the "Yes" button.
                        5. Retrieve the permanent link to the test report and paste it in Gazelle.
                        6. Take a screenshot of your application as proof that the message was correctly integrated. Upload it in Gazelle.

                        3. Stop the fulfillment of an order (DC)

                        In this step, your Order Filler proves its ability to accept the discontinuation of an ongoing order sent by the Order Placer (ORC-1="DC") and to acknowledge it (with an ACK message).

                        1. Repeat step 1. Create a new order
                        2. On your Order Filler, start processing the order. The Order Placer shall be notified that the order is "in progress": send an "Update Order Status" (RAD-3) message to the Order Placer actor of the tool (you will find its connection information under Radiology > Order Placer > Configuration).
                        3. Go to Radiology --> Order Placer --> [RAD-2] Create/Cancel orders
                        4. Select your system under test in the drop-down list
                        5. Select "Stop the fulfillment of an order (DC)" as action to perform
                        6. Select the order to discontinue and, when the pop-up appears, click on the "Yes" button.
                        7. Retrieve the permanent link to the test report and paste it in Gazelle.
                        8. Take a screenshot of your application as proof that the message was correctly integrated. Upload it in Gazelle.

                        Evaluation

                        The permanent links to the test report & the screen shots demonstrate that you have successfully handled the received order messages.
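The three steps above exercise the ORC-1 order control codes NW, CA and DC; on the receiving side an Order Filler typically dispatches on that field. A minimal sketch (the segment parsing and handler descriptions are illustrative, not part of the tool):

```python
# ORC-1 order control codes exercised by the RAD-2 steps above
HANDLERS = {
    "NW": "create the order locally",
    "CA": "cancel the existing order",
    "DC": "discontinue the in-progress order",
}

def order_control(order_message: str) -> str:
    """Return ORC-1 from an HL7 v2.x order message (segments split on CR)."""
    for segment in order_message.strip().split("\r"):
        fields = segment.split("|")
        if fields[0] == "ORC":
            return fields[1]  # ORC-1 = order control code
    raise ValueError("no ORC segment found")

# Illustrative cancel message fragment (not generated by the OrderManager)
msg = "MSH|^~\\&|OM|GAZELLE|OF|HOSP|20240101||ORM^O01|2|P|2.3.1\rORC|CA|PL123|FN456"
assert order_control(msg) == "CA"
action = HANDLERS[order_control(msg)]
```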

                        OM_RAD-3-Receive

                        This tests the RAD-3 transaction from the point of view of the Order Placer as system under test.

                        In this test, we will check that your Order Placer is able to integrate the creation and cancellation of orders received from the Order Manager tool playing the role of the Order Filler actor. 

                        Prerequisite

                        1. CONFIGURATION: Before beginning this test, do not forget to check that the configuration (IP address, port, application/facility names) of your system under test is entered within the Order Manager tool.  Use the "SUT Configurations" menu. 
                        2. TEST PATIENT: Your Order Filler is assumed to be coupled with an ADT actor; that means you should be able to receive a new patient from an ADT or Patient Demographics Supplier actor.  To populate your system with a patient, use the Patient Manager tool to create a patient and send an ADT message to your system.  In the steps below, you will use that patient to create an order in the Order Manager.  Follow these instructions for using Patient Manager to create a patient that is then used by the Order Manager.

                        Refer to the Order Manager user manual.

                        Test Steps

                        As a receiver in this test, your Order Placer shall be able to integrate all three message structures defined in the technical framework. As a consequence, you are asked to perform three steps.

                        1. Create a new order

                        In this test, your Order Placer receives an ORM (v2.3.1) or OMG (v2.5.1) from the Order Manager acting as Order Filler.  You respond with an ORR (v2.3.1) or ORG (v2.5.1).

                        1. Go to menu 'Radiology' or 'Eye care', then --> Order Filler --> [RAD-3] Create/Update/Cancel orders
                        2. Select your system under test in the drop-down list
                        3. Select "Create a new order" as the action to perform.  By default, you will get an HL7 v2.3.1 ORM.  If you want an HL7 v2.5.1 OMG message, check that box.
                        4. Select the patient for whom you want to create an order
                        5. Fill in the order and click on the "Send message" button
                        6. Retrieve the permanent link to the test report and paste it into Gazelle Test Management as the results for this test.
                        7. Take a screenshot of your application as proof that the message (a new order) was correctly integrated. Upload it in Gazelle.

                        2. Cancel an existing order

                        In this test, your Order Placer receives an ORM-Cancel (v2.3.1) or OMG-Cancel (v2.5.1) from the Order Manager acting as Order Filler.  You respond with an ACK.

                        1. Go to menu 'Radiology' or 'Eye care', then --> Order Filler --> [RAD-3] Create/Update/Cancel orders
                        2. Select your system under test in the drop-down list
                        3. Select "Cancel an existing order" as action to perform
                        4. Select the order to cancel and, when the pop-up appears, click on the "Yes" button.
                        5. Retrieve the permanent link to the test report and paste it in Gazelle.
                        6. Take a screenshot of your application as proof that the message (a cancelled order) was correctly integrated. Upload it in Gazelle.

                        3. Update order status

                        In this test, your Order Placer receives an ORM-Status update (v2.3.1) or OMG-Status update (v2.5.1) from the Order Manager acting as Order Filler.  You respond with an ACK.

                        1. Go to menu 'Radiology' or 'Eye care', then --> Order Filler --> [RAD-3] Create/Update/Cancel orders
                        2. Repeat step 1. Create a new order to place a new order in your Order Placer
                        3. Select your system under test in the drop-down list
                        4. Select "Update order status" as action to perform
                        5. Select the order to update and, when the pop-up appears, click on the "Yes" button.
                        6. Retrieve the permanent link to the test report and paste it in Gazelle.
                        7. Take a screenshot of your application as proof that the message (an updated order) was correctly integrated. Upload it in Gazelle.

                         Evaluation

                        The permanent links to the test report & the screen shots demonstrate that you have successfully handled the received order messages.
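Per the three steps above, a new order is answered with an application-level response (ORR/ORG) while a cancellation or status update is answered with a plain ACK. A sketch of that response selection (the ORC-1 codes here reflect typical HL7 usage and are illustrative):

```python
def expected_response(orc_code: str, hl7_version: str) -> str:
    """Response the Order Placer returns for an inbound RAD-3 message.

    Per the steps above: a new order (ORC-1 = "NW") gets an application
    response, ORR (v2.3.1) or ORG (v2.5.1); a cancellation ("CA") or a
    status update gets a plain ACK.
    """
    if orc_code == "NW":
        return "ORR" if hl7_version == "2.3.1" else "ORG"
    return "ACK"

assert expected_response("NW", "2.3.1") == "ORR"
assert expected_response("CA", "2.5.1") == "ACK"
```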

                         

                        OM_RAD-4_Rad-13-Receive

                        This tests the RAD-4 (Procedure Scheduled) and RAD-13 (Procedure Updated) transactions from the point of view of the Image Manager as system under test.

                        In this test, we will check that your Image Manager is able to integrate the scheduling and cancellation of procedures received from the Order Manager tool playing the role of the Order Filler actor.

                        You may use the Order Manager to send

                        • RAD-4 and RAD-13:  HL7 v2.3.1 orders (SWF), or
                        • RAD-4 and RAD-13:  HL7 v2.5.1 orders (SWF.b)

                        Prerequisite

                        1. CONFIGURATION: Before beginning this test, do not forget to check that the configuration (IP address, port, application/facility names) of your system under test is entered within the Order Manager tool.  Use the "SUT Configurations" menu. 
                        2. TEST PATIENT:   In the steps below, you will use a test patient to create and schedule an order in the Order Manager.  Follow these  instructions for using Patient Manager to create a patient that is then used by the Order Manager.

                        Refer to the Order Manager user manual.

                        Test Steps

                        As a receiver in this test, your Image Manager shall be able to integrate all of the message structures defined in the technical framework. As a consequence, you are asked to perform these five steps.

                        1. Procedure Scheduled [RAD-4]

                        In this step, your Image Manager proves its ability to accept and integrate a new scheduled procedure sent by the Order Filler.

                        1. Go to menu 'Radiology' or 'Eye care', then --> Order Filler --> [RAD-4/13] Schedule/update procedures
                        2. Select your system under test in the drop-down list
                        3. Select "Procedure Scheduled" as action to perform.  
                          1. By default, you will get an HL7 v2.3.1 ORM.  If you want an HL7 v2.5.1 OMI message, check that box.
                        4. Select an existing order in the tool, or create an order from scratch
                        5. Complete the "Scheduled Procedure Step" information and then select the "Send message" button
                        6. Retrieve the permanent link to the test report and paste it into Gazelle Test Management as the results for this test.
                        7. Take a screenshot of your application as proof that the message was correctly integrated (eg. this may be evidence of a database entry). Upload it in Gazelle.

                        2. Procedure update:  cancel order request (CA) [RAD-13]

                        In this step, your Image Manager proves its ability to accept and integrate an order cancellation sent by the Order Filler (ORC-1="CA") and to acknowledge it.

                        1. Go to menu 'Radiology' or 'Eye care', then --> Order Filler --> [RAD-4/13] Schedule/update procedures
                        2. Select your system under test in the drop-down list
                        3. Select "Procedure update: cancel order request" as the action to perform.
                          1. By default, you will get an HL7 v2.3.1 ORM.  If you want an HL7 v2.5.1 OMI message, check that box.
                        4. Select the procedure to cancel and, when the pop-up appears, click on the "Yes" button.
                        5. Retrieve the permanent link to the test report and paste it in Gazelle.
                        6. Take a screenshot of your application as proof that the message was correctly integrated. Upload it in Gazelle.

                        3. Procedure update: discontinue order request (DC) [RAD-13]

                        In this step, your Image Manager proves its ability to accept the discontinuation of an ongoing order sent by the Order Filler (ORC-1="DC") and to acknowledge it.

                        1. Go to menu 'Radiology' or 'Eye care', then --> Order Filler --> [RAD-4/13] Schedule/update procedures
                        2. Select your system under test in the drop-down list
                        3. Select "Procedure update: discontinue order request" as the action to perform.
                          1. By default, you will get an HL7 v2.3.1 ORM.  If you want an HL7 v2.5.1 OMI message, check that box.
                        4. Select the procedure to discontinue and, when the pop-up appears, click on the "Yes" button.
                        5. Retrieve the permanent link to the test report and paste it in Gazelle.
                        6. Take a screenshot of your application as proof that the message was correctly integrated. Upload it in Gazelle.

                        4. Procedure update: change order request (XO) [RAD-13]

                        In this step, your Image Manager proves its ability to accept the procedure update/change order request (order still scheduled or in progress) sent by the Order Filler (ORC-1="XO") and to acknowledge it.

                        1. Go to menu 'Radiology' or 'Eye care', then --> Order Filler --> [RAD-4/13] Schedule/update procedures
                        2. Select your system under test in the drop-down list
                        3. Select "Procedure update: change order request" as the action to perform.
                          1. By default, you will get an HL7 v2.3.1 ORM.  If you want an HL7 v2.5.1 OMI message, check that box.
                        4. Select the procedure to change and, when the pop-up appears, click on the "Yes" button.
                        5. Retrieve the permanent link to the test report and paste it in Gazelle.
                        6. Take a screenshot of your application as proof that the message was correctly integrated. Upload it in Gazelle.

                        5. Procedure update: order has been completed (XO) [RAD-13]

                        In this step, your Image Manager proves its ability to accept the procedure update/order completed sent by the Order Filler (ORC-1="XO") and to acknowledge it.

                        1. Go to menu 'Radiology' or 'Eye care', then --> Order Filler --> [RAD-4/13] Schedule/update procedures
                        2. Select your system under test in the drop-down list
                        3. Select "Procedure update: order has been completed" as the action to perform.
                          1. By default, you will get an HL7 v2.3.1 ORM.  If you want an HL7 v2.5.1 OMI message, check that box.
                        4. Select the completed procedure and, when the pop-up appears, click on the "Yes" button.
                        5. Retrieve the permanent link to the test report and paste it in Gazelle.
                        6. Take a screenshot of your application as proof that the message was correctly integrated. Upload it in Gazelle.

                        Evaluation

                        The permanent links to the test report & the screen shots demonstrate that you have successfully handled the received order messages.

                        PDI tests

                        Pre-connectathon testing for systems implementing the PDI (Portable Data for Images) Profile as a Portable Media Creator is performed using the PDI Media Tester tool and associated test plans originally developed by Northwestern University.

                        Location of tool and test plan documentation:

                        Specific instructions for the Portable Media Creator actor are in the test cases at the link above.

                         

                        PDI-Media_Tester_PMC

                        We use the Portable Media Tester application and test plans developed by Northwestern to test PDI media created by a Portable Media Creator.

                        Connectathon-related Considerations

                        When you prepare your media for Connectathon testing, you should include on your media DICOM objects that represent the range of images, structured reports, GSPS objects, Key Image Notes, etc that can be produced by your application.   Including a full set (rather than one image) enhances the interoperability testing with your Portable Media Importer test partners.

                        Testing Instructions

                        1. Install the Portable Media Tester application available on Google Drive in:  IHE Documents > Connectathon > tools > RAD-PDI-Media-Tester
                        2. Access ihe_pdi_testplan_2019.doc and TestsCases.xls from the IHE_PDI_Document_TestCases.zip file in that directory.
                        3. Follow the instructions in the test plan document to install the Portable Media Tester application.
                        4. Follow the instructions in the test plan document to execute all test cases (1901, 1903...1915).
                        Evaluation
                         
                        Capture and submit your results:
                        1. When you complete the test, look at the bottom left hand corner of the application GUI to see the location where the log files are written. Go to that directory and retrieve two files: grade_pdi_media.txt and error_pdi_media.txt.
                        2. Upload the two .txt files along with a screen capture of RSNA PDI Media Tester as your final result into Gazelle Test Management as the results for this pre-Connectathon test.
                        3. Change the status of the test to "Verified by Vendor".

                        Sample Exchange tests

                        Gazelle Test Management has a feature that allows participants in a testing session to share sample objects with other participants.

                        In Gazelle Test Management, a "sample" is any object or message that an application creates and is used by another application. Typical samples include: 

                        • DICOM objects
                        • CDA documents
                        • FHIR resources
                        • ...others...

                        Gazelle Test Management uses profiles and actors selected during System Registration to determine which systems are 'creators' of samples and which systems are 'consumers' of samples.

                        Creators upload a file containing their sample into Gazelle Test Management (Testing-->Sample exchange).  Consumers find samples uploaded by their Creator peers.  Consumers download the samples and are able to test them with their application.

                        Test cases follow below...

                        AIR_Sample_Exchange

                        Overview

                        The AIR Evidence Creator will submit sample AI Results (SR objects) produced by its system and the associated DICOM images. The goal of this test is to prepare Consumer actors so they are not surprised during Connectathon.

                        In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual preparatory test deadlines

                        Sample: DICOM SR(s); DICOM Segmentation, Parametric Map or KOS IOD(s); DICOM image(s)

                        Creator(s): AIR Evidence Creator

                        Consumer(s): AIR Image Mgr, Image Display, Imaging Doc Consumer

                         

                        Instructions for Evidence Creators:

                        FIRST, prepare the sample(s)-->

                        The Evidence Creator system must provide a sample that includes both DICOM SRs and image objects.  Although it is likely that an Acquisition Modality (i.e. not the Evidence Creator) will be the source of images for which AIR SR objects are created, we ask the Evidence Creator to submit them together as a 'sample'.

                        The AI result SR object(s) from the Evidence Creator should be representative of what the system would create during expected clinical use when an AI algorithm is run.

                        1. You will upload sample SR object(s) into Gazelle Test Management under Testing-->Sample exchange in the "AIR" entry.   If your Evidence Creator implements more than one AI algorithm (i.e. produces different AI result SRs depending on the algorithm), then the Evidence Creator shall create separate Sample entries for each AI algorithm.
                        2. In the sample entry, identify the IOD and primitives for the sample.  You will see "attributes" you can select; choose all that apply to this sample:
                        • IOD - Comprehensive 3D SR Storage (1.2.840.10008.5.1.4.1.1.88.34)
                        • IOD - Segmentation Storage (1.2.840.10008.5.1.4.1.1.66.4)
                        • IOD - Parametric Map Storage (1.2.840.10008.5.1.4.1.1.30)
                        • IOD - Key Object Selection Document Storage (1.2.840.10008.5.1.4.1.1.88.59)
                        • Primitive - Qualitative Findings
                        • Primitive - Measurements
                        • Primitive - Locations
                        • Primitive - Regions
                        • Primitive - Tracking Identifiers
                        • Primitive - Parametric Maps

                        3. Validate the SR using Gazelle EVS (green arrow on the sample page).  You must select the DICOM validator from Pixelmed.  During the Connectathon, we will be looking for a validation result with no errors (warnings are OK).

                        4. The Evidence Creator of the AI result SR object should also upload the images of the study to which the SR pertains.

                        5. If you have this ability, render the objects you produced as a reference. The goal during Connectathon is for an Image Display partner to examine your rendered images and SR objects and compare that to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format.  Upload the screenshot onto the sample page in Gazelle Test Management.

                        Repeat Steps 1 - 5 for each AI algorithm and resulting AI sample.

                        Finally, create a short txt file indicating you have completed the upload step. Upload that txt file into Gazelle Test Management as the result file for this preparatory test.

                        There is no specific evaluation for this test.  Feedback comes as your Image Display and Image Manager test partners test with your images in their lab.
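To cross-check which IOD attributes apply to a sample, you can read the SOP Class UID from the file (tag (0008,0016)) and map it against the list in step 2 above. A small lookup sketch (the function name is illustrative; the UIDs are from the attribute list):

```python
# SOP Class UIDs from the AIR sample-attribute list above
IOD_UIDS = {
    "1.2.840.10008.5.1.4.1.1.88.34": "Comprehensive 3D SR Storage",
    "1.2.840.10008.5.1.4.1.1.66.4": "Segmentation Storage",
    "1.2.840.10008.5.1.4.1.1.30": "Parametric Map Storage",
    "1.2.840.10008.5.1.4.1.1.88.59": "Key Object Selection Document Storage",
}

def iod_attribute(sop_class_uid: str) -> str:
    """Return the Gazelle sample attribute name for a SOP Class UID."""
    try:
        return "IOD - " + IOD_UIDS[sop_class_uid]
    except KeyError:
        raise ValueError(f"UID {sop_class_uid} is not in the AIR sample list")

assert iod_attribute("1.2.840.10008.5.1.4.1.1.88.34") == "IOD - Comprehensive 3D SR Storage"
```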

                         

                        Instructions for Consumers:

                        1. Find sample SR objects and images uploaded by other vendors in Gazelle Test Management under Testing-> Sample exchange.    On the Samples available for rendering tab under the AIR entry, you will find samples and screen captures submitted by your Evidence Creator test partners.  This page will evolve as vendors add samples, so be patient.
                        2. Retrieve the DICOM or zip files created by the other vendors.  Refer to these help pages for details on this task.
                        3. Examine/import/render them so that you are confident your software understands the content.
                        4. When you are finished, create a .txt or other file that lists the samples you tested with, and a brief comment on your result (eg "ok", or "could not render"). Upload that txt file into Gazelle Test Management as the result file for this preparatory test.
                        5. If you find issues with the images, send an email to the Connectathon Technical Manager.
                        6. The evaluation of this test is performed by examining the text you provided to make sure you made a good faith effort to review the sample images. If you find problems with some samples, we will give your feedback to the Evidence Creator test partner.
                        7. The goal is no surprises.

                        Finally, for both Creators & Consumers

                        1. In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor".  This is a signal to the Technical Manager that you have completed this task.

                         

                        APPC_Sample_Privacy_Consent_Docs

                        Overview

                        The APPC profile enables creating Patient Privacy Policy Consent documents of many, many variations.  For Connectathon testing, we have defined tests based on use cases documented in the APPC Profile.   

                        It will help APPC Content Creators & Content Consumers to be familiar with the APPC tests prior to arriving at Connectathon.

                        (Note:  We currently have no tool for evaluating APPC documents)

                         

                        Instructions 

                        Before Connectathon -- for APPC Content Creators:

                        (1) Read Connectathon test APPC_10_Read_This_First.  Find this on your main Connectathon page in Gazelle (menu Connectathon-->Connectathon).  This test specifies policies, organizations, facilities, and providers that you will need to encode in the APPC Policy Consent documents that you will create for Connectathon testing.

                        (2) Read the APPC Connectathon tests for Case1, Case5, Case6, and "Other".   For Connectathon, you will be required to create 3 documents.  

                        • You must perform any 2 of these 3 tests: Case1, Case5, Case6.

                        • You must perform test APPC_Other_Use_Case.

                        (3) We highly encourage Content Creators to create these documents prior to arriving at Connectathon.

                        • Create the APPC documents according to the instructions in the test cases.
                        • Upload your documents into the samples area of Gazelle Test Management under menu Testing-->Sample exchange. 
                          • On the 'Samples to share' tab, find the entry for APPC.   Upload your APPC sample.
                          • Under "Actions", use the green triangle icon to validate your APPC document using Gazelle EVSClient.  We expect a validation result with no errors.
                        • Providing your APPC policy consent document early enables Content Consumers to test with your document in advance.

                         

                        Before Connectathon -- for APPC Content Consumers:

                        (1) Read Connectathon test APPC_10_Read_This_First.  Find this on your Test Execution page in Gazelle Test Management.  This test specifies policies, organizations, facilities, and providers that Content Creators will use in the policy consent documents they create.

                        (2) Above, we asked Content Creators to provide their sample APPC documents in advance of Connectathon, so...

                        (3) Check for sample APPC documents provided by Content Creators:

                        • In Gazelle Test Management, look under menu Testing-->Sample exchange.  
                        • Check the 'Samples available for rendering' tab under the 'APPC' entry.

                         

                        Evaluation

                        Finally, for both Creators & Consumers:

                        1. In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor".  This is a signal to the Technical Manager that you have completed the task associated with your actor.

                         

                        CDA_Document_Sample_Exchange

                        Overview

                        This is a family of tests. Each of the tests (40180-01, 02, 03, ...) is for a specific document type, but the instructions are the same.  We have different test numbers to let us keep track of the document type in Gazelle Test Management.

                        Please refer to the list of document types in the table below.

                        • Content Creators: You will create a document defined in an IHE profile -- CDA format and other formats -- and provide that as a sample for other participants to examine.
                        • Content Consumers:  You will import the samples into your system and test with them prior to the Connectathon to ensure that you can process the document.

                        The goal is 'No Surprises' at the Connectathon

                         

                        Instructions for Content Creators

Creators....please upload your samples two weeks before the normal due date for Preparatory tests. This allows other participants time to review your samples.

                        1. Create a document according to the test number (see table below). Name the file using this convention:

                        • SYSTEMNAME_TESTNUMBER_type.xml, for example
                        • EHR_XYZmed_40180-01_XDSMS_Referral.xml
                        • Get your system name right, get the test number right, use common sense to express the type.
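The naming convention above can be checked mechanically. Here is a small Python sketch; the regular expression is an assumption inferred from the example filename, and `check_sample_filename` is a hypothetical helper, not part of any Gazelle tooling:

```python
import re

# Hypothetical helper -- not part of Gazelle tooling. The pattern below is an
# assumption inferred from the example EHR_XYZmed_40180-01_XDSMS_Referral.xml:
# a system name, the 40180-xx test number, and a free-text type, separated by
# underscores, with an .xml extension.
PATTERN = re.compile(r"^(?P<system>.+)_(?P<test>40180-\d{2,3})_(?P<type>.+)\.xml$")

def check_sample_filename(name):
    """Return the parsed name parts if the file follows the convention, else None."""
    match = PATTERN.match(name)
    return match.groupdict() if match else None

print(check_sample_filename("EHR_XYZmed_40180-01_XDSMS_Referral.xml"))
# {'system': 'EHR_XYZmed', 'test': '40180-01', 'type': 'XDSMS_Referral'}
```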

                        2. Upload the document into Gazelle Test Management under menu Testing-->Sample exchange.

                        • Click on the Samples to share tab for your system and add the appropriate document.

3. In Gazelle Test Management for this test instance, create a brief note (txt file) indicating this task is done and upload it as the results for this test.

                        4. Finally, change the status of the test to "Verified by Vendor".  This is a signal to the Technical Manager that you have completed the task associated with your actor.

                        5. Repeat these instructions for each document type you can create.

                         

                        Instructions for Content Consumers

                        1. Find samples uploaded by other vendors for test 40180-xx in Gazelle Test Management under menu Testing -> Sample exchange on the Samples available for rendering tab. (When a Content Creator or Form Filler has uploaded a sample, you will see a small triangle in front of their system name.)  This page will evolve as vendors add samples, so be patient. The deadline for Creators to submit samples is typically two weeks prior to the Preparatory test deadline. Technical Managers of each testing event publish the deadline.

                        2. Retrieve the documents created by the other vendors. "Process/render" them so that you are confident your software understands the content.

                        For Content Consumer actors, "Process/render" means to apply one or more of the options:

                        • View
                        • Document Import
                        • Section Import
                        • Discrete Data Import

                        For Form Manager actors, "Process/render" means to take the prepop data and make it part of the form.

                        3. You will perform one or more of those actions on the sample documents and then provide evidence that you have performed this action. That evidence will be some screen capture or database dump from your system.  Upload that evidence into Gazelle Test Management as the results for this Preparatory test.

                         

                        Document Types for this sample-sharing test:

Preparatory test number  IHE Profile / Document type
                        40180-01  XDS-MS Referral Document
                        40180-02  XDS-MS Discharge Summary
                        40180-03  ED Referral
                        40180-04  XPHR Extract
                        40180-05  XPHR Update
                        40180-06  Antepartum History and Physical
                        40180-07  Antepartum Summary
                        40180-08  Antepartum Laboratory Report
                        40180-09  Antepartum Education
                        40180-10  Triage Note
                        40180-11  ED Nursing Note
                        40180-12  Composite Triage and ED Nursing Note
40180-13  ED Physician Note
                        40180-14  Immunization Content
                        40180-15  Sharing Lab Report (XD-LAB)
                        40180-16  ITI - Basic Patient Privacy Consent acknowledgement (not CDA, but BPPC)
                        40180-17  ITI - XDS-SD Scanned Document
                        40180-18  Labor/Delivery Admission History and Physical
                        40180-19  Labor/Delivery Summary
                        40180-20  Maternal Discharge Summary
                        40180-21  EMS Transfer of Care
                        40180-22  Patient Plan of Care (PPOC)
                        40180-26  eNursing Summary
                        40180-27  Newborn Discharge Summary
                        40180-28  Postpartum Visit Summary
                        40180-29  EMS Transport Summary
                        40180-30  Interfacility Transport Summary
                        40180-31  RECON
                        40180-32  Patient Care Plan (PtCP)
                        40180-33  RIPT
                        40180-34  International Patient Summary (IPS) - CDA, CDA Complete, and CDA Occ Data Health Options
                        40180-100  QRPH - CRD: Clinical Research Document
                        40180-101  QRPH - DSC: Drug Safety Content
                        40180-106  QRPH - PRPH-Ca: Physician Reporting to a Public Health Repository-Cancer Registry
40180-108  QRPH - BFDR-E: Birth and Fetal Death Reporting - Enhanced
40180-109  QRPH - EHDI - HPoC (UV Realm): Hearing Plan of Care
40180-110  QRPH - EHDI - HPoC (US Realm): Hearing Plan of Care
                        40180-111  QRPH - HW: Healthy Weight 
                        40180-113  QRPH - QME-EH: Quality Measure Execution - Early Hearing
                        40180-114  QRPH - VRDR: Vital Records Death Reporting 
40180-200  CARD - CIRC: Cardiology Imaging Report Content
                        40180-201  CARD - CRC: Cath Report Content
                        40180-202  CARD - RCS-C: Registry Content Submission - Cardiology
                        40180-203  CARD - EPRC-I/E: Electrophysiology Report Content - Implant/Explant
                        40180-300  EYECARE - GEE: General Eye Evaluation Content
                        40180-301  EYECARE - EC-Summary: Eye Care Summary Record Content

                        DICOM_Object_Sample_Exchange

                        Overview

                        In this “test”, Creators of DICOM objects submit a sample data set that will be reviewed by other Consumer test partners. The goal of this test is to prepare Consumer actors (Image Manager, Image Display) so they are not surprised during Connectathon events.

                        In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadline

Sample  Creator(s)  Consumer(s)
DICOM image(s)  Acquisition Modality in various profiles  Image Mgr, Image Display, Dose Reporter or Consumer
DICOM SR(s)  Evidence Creator or Modality in various profiles  Image Mgr, Image Display, Dose Reporter or Consumer

                         

                        Instructions for DICOM Image or SR Creators

                          1. Determine a representative set of image(s), or a Structured Report, that would help other actors understand your content. We do not expect you to generate images for every combination of parameters; pick one set of parameters. However, if your system supports more than one SOP class, or can use more than one transfer syntax, we expect you to upload representative samples for each.  Create file(s) of these objects (i.e. .dcm files).
                          2. If you have this ability, render the images or SR that you produced as a reference. The goal is for an Image Display actor to examine your rendering and compare to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format.
                          3. Upload sample objects and the screen capture snapshot into Gazelle Test Management under menu Testing-> Sample exchange.
• On the Samples to share tab, upload your sample object(s) under the DICOM_OBJECTS entry. 
                          • On the "summary" tab for the sample, you will see a list of Attributes.  Choose the attribute that best describes your sample (e.g. "NMI" for a Nuc Med image, "SWF.b" for a general sample for that profile, etc)  Refer to these help pages for details on managing samples in Gazelle Test Management.
• Under "Actions", use the green triangle icon to validate your DICOM object(s) using Gazelle EVSClient.  We expect a validation result with no errors (warnings are OK).  During the Connectathon, the monitor will be looking for this result.

                        Some image sets are too large to upload, and Gazelle reports an error. If you encounter this, please contact the Connectathon Technical Project Manager for instructions on alternatives.

Create a short txt file indicating you have completed the upload step. Upload that txt file into Gazelle Test Management as the result file for this preparatory test.

                        You may submit more than one set.

                        Instructions for DICOM Consumers

                        1. Find sample images uploaded by other vendors in Gazelle Test Management under Testing -> Sample exchange.    On the Samples available for rendering tab you will find DICOM_OBJECTS submitted by other vendors.  (When a Creator has uploaded a sample, you will see a small triangle in front of their system name.)  This page will evolve as vendors add samples, so be patient. 
                        2. Find the samples that are relevant for your storage or display application.
                        3. Retrieve the DICOM or zip files created by the other vendors.  (Refer to these help pages for details on this task in Gazelle.)
                        4. Examine/import/render them so that you are confident your software understands the content.
5. When you are finished, create a .txt or other file that lists the samples you tested with, and a brief comment on your result (e.g. "ok", or "could not render"). Upload that txt file into Gazelle Test Management as the result file for this preparatory test.
                        6. The evaluation of this test is performed by examining the text you provided to make sure you made a good faith effort to review the sample images. If you find problems with some samples, we will give your feedback to the Modality test partner.

                        Evaluation

                        Creators and Consumers:  In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor".  This is a signal to the Technical Manager that you have completed this task.

There is no specific evaluation for this test. Feedback comes as Consumer partners test with DICOM objects in their lab.  The goal is no surprises.

                        FHIR_Resource_Sample_Exchange

                        Overview

In this “test”, Creators of FHIR Resources submit a sample data set that will be reviewed by other Consumer test partners. The goal of this test is to prepare Consumer actors so they are not surprised during Connectathon events.

                        In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadline

Sample  Creator(s)  Consumer(s)
FHIR Resource (JSON or XML format)  Content Creator  Content Consumer

                         

                        Instructions for Content Creators

                        These are generic instructions for uploading samples into Gazelle Test Management.  Individual IHE Profiles and FHIR IGs contain definitions and constraints for specific content types.

                          1. Determine a representative set of your content according to the profile/IG requirements
                          2. If you have this ability, render the content that you produced as a reference. The goal is for a Content Consumer actor to examine your rendering and compare to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format.
                          3. Upload sample objects and the screen capture snapshot into Gazelle Test Management under menu Testing-> Sample exchange.
• On the Samples to share tab, upload your sample object(s) under the profile-specific entry (e.g. "IPS-FHIR" for the IHE International Patient Summary (IPS) profile).
                          • On the "summary" tab for some samples, you will see a list of Attributes.  As appropriate, select the applicable attribute entries. 
                          • Refer to these help pages for details on managing samples in Gazelle Test Management.
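For orientation, a minimal sample FHIR resource in JSON might look like the following. This is a hypothetical illustration only; an actual sample must conform to the constraints of the specific IHE profile or FHIR IG you registered to test:

```json
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{ "family": "Doe", "given": ["Jane"] }],
  "gender": "female",
  "birthDate": "1970-01-01"
}
```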

Create a short txt file indicating you have completed the upload step. Upload that txt file into Gazelle Test Management as the result file for this preparatory test.

                        You may submit more than one set.

                        Instructions for Content Consumers

1. Find samples uploaded by other vendors in Gazelle Test Management under Testing -> Sample exchange.    On the Samples available for rendering tab you will find profile-specific entries for samples submitted by other vendors.  (When a Creator has uploaded a sample, you will see a small triangle in front of their system name.)  This page will evolve as vendors add samples, so be patient. 
2. Find the samples that are relevant for your application.
3. Retrieve the sample files created by the other vendors.  (Refer to these help pages for details on this task in Gazelle.)
4. Examine/import/render them so that you are confident your software understands the content.
5. When you are finished, create a .txt or other file that lists the samples you tested with, and a brief comment on your result (e.g. "ok", or "could not render"). Upload that txt file into Gazelle Test Management as the result file for this preparatory test.
6. The evaluation of this test is performed by examining the text you provided to make sure you made a good faith effort to review the samples. If you find problems with some samples, we will give your feedback to the Content Creator test partner.

                        Evaluation

                        Creators and Consumers:  In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor".  This is a signal to the Technical Manager that you have completed this task.

There is no specific evaluation for this test. Feedback comes as Consumer partners test with your FHIR resources in their lab.  The goal is no surprises.

                        EYECARE_Sample_Exchange

                        Overview

Prior to the Connectathon, it is very beneficial for participants to have access to sample DICOM objects, HL7 messages, CDA documents, etc. produced by their test partners' applications.  We use Gazelle as an intermediary to exchange samples.  In Gazelle Test Management, a "sample" is any object or message that an application creates and is used by another application.

                        This table lists samples from IHE Eye Care domain profiles U-EYECARE, GEE, and EC-Summary.  Beneath the table are instructions for Creators to upload samples into Gazelle prior to the Connectathon, and instructions for Consumers who want to download samples to test in advance of Connectathon.  Note that due to the size of some DICOM objects, we have separate instructions for exchange of DICOM samples.

                        The deadline for sharing samples is typically 2-3 weeks prior to the Connectathon.  The due date will be announced by the Technical Project Manager.

Sample  Type  Creator(s)  Consumer(s)
                        EYECARE-15 Patient Registration ADT^A04 Pat Registration Src Pat Registration Cons, DSS/OF in Model I & III
                        EYECARE-16 Appt Scheduling SIU^S* Appt Scheduler Appt Consumer
                        EYECARE-17 Charge Posting DFT^P03 DSS/OF w/ Chg Posting option Chg Processor
                        EYECARE-19 Pat Demog Update ADT^A08  DSS/OF in Model I & III Img Mgr in Model I & III
                        EYECARE-20 Merge Pat IDs ADT^A40 Pat Registration Src w/ Merging option Pat Reg Cons, DSS/OF, Img Mrg w/ Merging option
                        EYECARE-21 Procedure Scheduling OMG^O19 DSS/OF in Model I & III   Img Mgr in Model I & III 
                        EYECARE-22 Procedure Status Upd OMG^O19 Img Mgr w/ Proc Status Upd HL7 option DSS/OF w/ Proc Status Upd HL7 option
                        EYECARE-23 Refractive Meas (no PatID) XML Refractive Measurement Source (RMS) Refractive Measurement Consumer (RMC)
                        EYECARE-24 Refractive Meas (valid PatID) XML RMSI, RMS w/ Pat Device List option RMC w/ Pat Device List option 
                        GEE Document CDA Content Creator Content Consumer
                        EC Summary Document CDA Content Creator Content Consumer

                         

                        Instructions for Creators

                        1. Capture the sample content as produced by your system.  Create a file containing the contents of that message.
                        2. In Gazelle Test Management, select menu Testing--> Sample exchange
                        3. On the "systems" dropdown list, select your system name
                        4. On the "Samples to share" tab, find the entry for your sample, matching Column 1 above
                        5. Upload the file containing your sample.  You may refer to this help page if you have never used Gazelle to upload samples

                        Instructions for Consumers

                        1. In Gazelle Test Management, select menu Testing-->Sample exchange
                        2. On the "systems" dropdown list, select your system name
                        3. On the "Samples available for rendering" tab, you will see an entry for each sample type for which you are a Consumer.
                        4. The contents of this tab changes over time, as Creators upload samples.  Entries with a "+" sign contain samples from Creators. 
                        5. Download available samples.  You may refer to this help page if you have never downloaded samples from Gazelle.

                        Evaluation

                        1. In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor".  This is a signal to the Technical Manager that you have completed this task.

                         

                        GSPS_Sample_Exchange

                        Overview

                        The goal of this “test” is to provide samples for other vendors to display. You should submit a “representative sample” of the data produced by your system.

GSPS objects are supported in the Consistent Presentation of Images (CPI) profile. CPI Evidence Creator actors create GSPS objects (a requirement) and may optionally produce images. Likewise, CPI Acquisition Modality actors create images and GSPS objects.

                        Each system (Modality or Evidence Creator) should submit samples of the Image and/or GSPS objects. The goal of this test is to prepare Consumer actors (Image Manager, Image Display) so they are not surprised during Connectathon.

In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the Preparatory test deadline.

Sample  Creator(s)  Consumer(s)
DICOM image(s) and GSPS object(s)  CPI Acquisition Modality and Evidence Creator  Image Mgr, Image Display

                         

                        Instructions for DICOM Image and GSPS Creators:

                        FIRST, prepare the samples-->

                        Both Modality and Evidence Creator systems must provide samples that include both images and GSPS objects.  Acquisition Modalities will be the source of images for which GSPS objects will be created. 

                        The GSPS objects and images created by the Modality or Evidence Creator should be of the same genre as those normally generated by the Evidence Creator during expected clinical use.

To ensure adequate testing of capabilities, the set of GSPS objects you create should include at least 15 elements drawn from the following GSPS capabilities:

                        1. a selection of display area (i.e., zoom)
                        2. a spatial transformation (rotation and/or flip)
                        3. a discernable adjustment to the Modality LUT, if appropriate to the image object
                        4. a discernable adjustment to the VOI LUT
                        5. a discernable adjustment to the Presentation LUT
                        6. a setting of display shutters (extra worthless brownie points to people who do bitmap shutters)
                        7. text annotation boxes
                        8. an anchor line associating a text box located in display coordinates with an anchor point in image coordinates
                        9. graphical objects in image coordinates
                        10. graphical objects in screen coordinates
                        11. specification of multiple graphical layers with different suggested colors
                        12. frame selection, if the associated image object is multi-frame
                        13. specification of mask and contrast frames for subtraction, if appropriate for the image object
                        14. a bitmap overlay
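As a rough reference, the capabilities above map to the following GSPS modules and attributes. Attribute names and tags are drawn from DICOM PS3.3; verify them against the current standard and the CPI profile before relying on them:

```
GSPS SOP Class UID: 1.2.840.10008.5.1.4.1.1.11.1
  (Grayscale Softcopy Presentation State Storage)

Capability                        GSPS module / attribute (tag)
zoom / display area selection     Displayed Area Selection Sequence (0070,005A)
rotation / flip                   Image Rotation (0070,0042), Image Horizontal Flip (0070,0041)
VOI LUT adjustment                Softcopy VOI LUT Sequence (0028,3110)
Presentation LUT adjustment       Presentation LUT Shape (2050,0020) or Presentation LUT Sequence (2050,0010)
display shutters                  Shutter Shape (0018,1600) and related shutter attributes
text annotations / anchor lines   Text Object Sequence (0070,0008) within Graphic Annotation Sequence (0070,0001)
graphic objects                   Graphic Object Sequence (0070,0009)
graphic layers                    Graphic Layer Sequence (0070,0060)
bitmap overlay                    Overlay Plane attributes (group 60xx)
```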

                        SECOND, upload your samples -->

                          • Upload sample image and GSPS objects and the screen capture snapshot(s) into Gazelle Test Management under Testing--> Sample exchange. (Refer to these help pages for details on managing samples in Gazelle).
                              • On the Samples to share tab, upload your sample image(s) under the GSPS entry.  
• Under "Actions", use the green triangle icon to validate your DICOM object(s) using Gazelle EVSClient.  We expect a validation result with no errors.
                          • If you have this ability, render the objects you produced as a reference. The goal is for an Image Display actor to examine your rendered images/GSPS objects and compare to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format.  Upload the screenshot into the sample page in Gazelle Test Management.
• Create a short txt file indicating you have completed the upload step. Upload that txt file into Gazelle Test Management as the result file for this preparatory test.
• There is no specific evaluation for this test.  Feedback comes as your Image Display and Image Manager partners test with your images in their lab.

                        Instructions for DICOM Image and GSPS Consumers:

1. Find sample images & GSPS objects uploaded by other vendors in Gazelle Test Management under Testing -> Sample exchange.    On the Samples available for rendering tab under the GSPS entry, you will find samples and screen captures submitted by other vendors.  This page will evolve as vendors add samples, so be patient. 
                          2. Retrieve the DICOM or zip files created by the other vendors.  Refer to these help pages for details on this task in Gazelle.
                          3. Examine/import/render them so that you are confident your software understands the content.
4. When you are finished, create a .txt or other file that lists the samples you tested with, and a brief comment on your result (e.g. "ok", or "could not render"). Upload that txt file into Gazelle Test Management as the result file for this preparatory test.
                          5. If you find issues with the images, send an email to the Connectathon Technical Manager.
                          6. The evaluation of this test is performed by examining the text you provided to make sure you made a good faith effort to review the sample images. If you find problems with some samples, we will give your feedback to the Modality test partner.
                          7. The goal is no surprises.

                          Finally, for both Creators & Consumers

                          1. In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor".  This is a signal to the Technical Manager that you have completed this task.

                           

                          IMR_Sample_Report_and_Study

                          Overview

                          The purpose of this exchange of IMR samples is to enable testing prior to Connectathon week and reduce 'surprises'.

                          In this “test”, Creators of IMR reports submit a sample data set that will be reviewed by other Consumer test partners. The goal of this test is to prepare Consumer actors (Image Manager, Image Display) so they are not surprised during Connectathon events.

In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadline. (During Connectathon, your Report Creator will also be required to create a report based on a study provided by the Connectathon Technical Manager.)

Sample  Creator(s)  Consumer(s)
IMR Bundle  Report Creator  Report Reader, Rendered Report Reader, Report Repository
DICOM study  Report Creator  Image Manager

The Report Creator must also provide the DICOM study(ies) associated with its report(s).  Each study will include images, and may optionally contain Structured Report (SR), different types of Presentation States (PR), or Segmentation objects.

Providing this sample ensures that the Report Creator has access to images (that the Image Manager will store) that are compatible with its reporting application.

                          Instructions for Report Creators

                            1. Identify a DICOM imaging study(ies) to report on.
                            2. Create diagnostic report and format it as an IMR Bundle (as defined in the Store Multimedia Report [RAD-141] transaction).  
                              --You should include hyperlinks that reference images in the DICOM study.  (We realize that the links won't "work" in the preparatory samples because Report Readers won't have access to your locally-stored study.)
                              --You should create a report that exercises the full features of your reporting application, i.e. if you are able to include the (optional) ServiceRequest, ImagingStudy, and/or ImagingSelection Resources in your Bundle, you should do that.   Robust samples make for better testing!
                            3. Render your report (refer to IMR Section 2:4.141.4.1.2.2.3).  You will submit the diagnostic report in HTML format as a sample.  You may optionally submit a PDF version of the rendering as a sample.
                            4. If you have this ability, perform a screen capture of the rendered report and save it as JPEG or other reasonable format.  The goal is for an Report Reader actor to examine your rendering and compare to what their software produces.
5. Submit your sample files into Gazelle Test Management:
                          • Access Gazelle menu Testing-> Sample exchange.
                            • On the Samples to share tab,  find the IMR entry and upload these:
                              • your IMR Bundle (XML or JSON)
                              • the associated DICOM imaging study (you may upload individual .dcm files or a zip file containing the images)
                              • the rendering of your report in html
                              • (optional) the rendering of your report in PDF
                              • (optional) a screenshot of your rendered report

                          Some image sets are too large to upload, and Gazelle reports an error. If you encounter this, please contact the Connectathon Technical Project Manager for instructions on alternatives.
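For orientation only, a minimal sketch of what an IMR-style report Bundle could look like in JSON follows. This is a hypothetical illustration, not the normative structure; the required content, profiles, and hyperlink format are defined in the Store Multimedia Report [RAD-141] transaction, and the URL shown is a placeholder:

```json
{
  "resourceType": "Bundle",
  "type": "transaction",
  "entry": [
    {
      "resource": {
        "resourceType": "DiagnosticReport",
        "status": "final",
        "code": { "text": "Diagnostic Imaging Report" },
        "text": {
          "status": "generated",
          "div": "<div xmlns=\"http://www.w3.org/1999/xhtml\">Findings with an <a href=\"https://example.org/render/image1\">image hyperlink</a></div>"
        }
      }
    }
  ]
}
```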

                           

                          Instructions for Report Readers/Rendered Report Readers/Image Managers

                          1. Find sample images in Gazelle Test Management under Testing -> Sample exchange.    On the Samples available for rendering tab, you will find IMR entries submitted by other vendors.  (When a Report Creator has uploaded a sample, you will see a small triangle in front of their system name.)  This page will evolve as vendors add samples, so be patient. 
                          2. Find the samples that are relevant for your application:  Report Readers will download the IMR Bundle.  Image Managers will download the DICOM study.
                          3. Retrieve the sample files created by the other vendors.  (Refer to these help pages for details on this task in Gazelle.)
                          4. Examine/import/render them so that you are confident your software understands the content.  (We realize that the links won't "work" in the multimedia report because Report Readers won't have access to the Creator's locally-stored study.  During Connectathon week, the Report Creator will link to images hosted on the Connectathon floor.)
5. When you are finished, create a .txt or other file that lists the samples you tested with, and a brief comment on your result (e.g. "ok", or "could not render"). Upload that txt file into Gazelle Test Management as the result file for this preparatory test.
                          6. The evaluation of this test is performed by examining the text you provided to make sure you made a good faith effort to review the sample report and images. If you find problems with some samples, we will give your feedback to the Report Creator test partner.

                          Evaluation

                          Creators and Consumers:  In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor".  This is a signal to the Technical Manager that you have completed this task.

There is no specific evaluation for this test. Feedback comes as your Consumer partners test with the report & images in their lab.  The goal is no surprises.

                          IRA_Sample_Study_for_Reporting

                          Overview

                           

                          In this “test”, the Image Display in the Integrated Reporting Applications (IRA) profile submits a sample DICOM study that will be the subject of reporting using the interactions defined in IRA. In some cases, a Report Creator may want to report on a specific type of DICOM study. If it has this type of restriction, the Report Creator can follow the instructions below to submit sample images.

                          The goal of this test is to enable Consumer actors (e.g., Report Creator, Evidence Creator) to have access to these studies prior to Connectathon events.

                          In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadline.

                          Sample  Creator(s) Consumer(s)

                          DICOM study

                          - The Image Display must provide at least one DICOM study that it will have available for reporting using the IRA profile.  The study will include images, and may optionally contain Structured Report (SR), different types of Presentation States (PR), or Segmentation objects.
                          - If a Report Creator has a specific DICOM study it wants to use to create its report, it should upload a sample as instructed below.

                          Providing this sample ensures that the Consumer systems have access to studies on the Image Display that are compatible with its reporting-related application.

                          Content Creator (grouped w/ Report Creator, Evidence Creator...)

                          Instructions for Creators:

                            1. Identify at least one DICOM imaging study that can be reported on.
                            2. Submit your sample files for that study into Gazelle Test Management:
                            • Access Gazelle menu Testing --> Sample exchange.
                            • On the Samples to share tab, find the IRA-study entry and upload the associated DICOM imaging study (you may upload individual .dcm files or a zip file containing the DICOM objects)

                          Some image sets are too large to upload, and Gazelle reports an error. If you encounter this, please contact the Connectathon Technical Project Manager for instructions on alternatives.
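If you choose the zip option in step 2 above, bundling a study's DICOM objects into a single archive can be sketched as follows (the directory layout and file names here are illustrative, not mandated by the test):

```python
import zipfile
from pathlib import Path

def zip_study(dcm_dir: str, zip_path: str) -> int:
    """Bundle all .dcm files under dcm_dir into one zip for upload.
    Returns the number of files added."""
    files = sorted(Path(dcm_dir).glob("*.dcm"))
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in files:
            # Store by file name only, so the archive has a flat layout
            zf.write(f, arcname=f.name)
    return len(files)
```

A single flat archive also makes it easier for Consumer partners to retrieve the whole study in one download.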

                           

                          Instructions for Consumers:

                          1. Find sample images uploaded in Gazelle Test Management under Testing -> Sample exchange.    On the Samples available for rendering tab, you will find IRA-study entries submitted by other vendors.  (When an Image Display has uploaded a sample, you will see a small triangle in front of their system name.)  This page will evolve as vendors add samples, so be patient. 
                          2. Retrieve the sample files created by the other vendors.  (Refer to these help pages for details on this task in Gazelle.)
                          3. Examine/import/render them so that you are confident your software understands the content.  (We realize that the links won't "work" in the multimedia report because Report Readers won't have access to the Creator's locally-stored study.  During Connectathon week, the Report Creator will link to images hosted on the Connectathon floor.)
                          4. When you are finished, create a .txt or other file that lists the samples you tested with, and a brief comment on your result (eg "ok", or "could not render"). Upload that txt file into Gazelle Test Management as the result file for this preparatory test.
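The result file in step 4 is just plain text; producing it can be sketched in a few lines (the sample names and comments below are hypothetical):

```python
def write_result_file(path: str, results: dict) -> None:
    """Write a plain-text result file: one 'sample: comment' line per sample tested."""
    with open(path, "w", encoding="utf-8") as f:
        for sample, comment in sorted(results.items()):
            f.write(f"{sample}: {comment}\n")

# Example usage with hypothetical sample names:
# write_result_file("ira_results.txt",
#                   {"VendorA_study": "ok", "VendorB_study": "could not render"})
```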

                          Evaluation

                          Creators and Consumers:  In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor".  This is a signal to the Technical Manager that you have completed this task.

                          There is no specific evaluation for this test.  The goal is no surprises.

                          KOS_Sample_Exchange

                          Overview

                          The goal of this test is to provide samples for other vendors to display. The KOS creator will submit a “representative sample” of the data produced by its system.

                          KOS objects are supported in the Key Image Notes profile. CPI Evidence Creator actors create GSPS objects (a requirement) and may optionally produce images. Likewise, CPI Acquisition Modality actors create images and GSPS objects.

                          Each system (Modality or Evidence Creator) should submit samples of the Images and the KOS objects containing references to some images. The goal of this test is to prepare Consumer actors (Image Manager, Image Display) so they are not surprised during Connectathon.

                          In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual preparatory test deadlines.

                          Sample  Creator(s) Consumer(s)

                          DICOM image(s)

                          DICOM KOS(s)

                          KIN Acquisition Modality and Evidence Creator

                          Image Mgr, Image Display

                           

                          Instructions for DICOM Image and KOS Creators:

                          FIRST, prepare the samples-->

                          Both Modality and Evidence Creator systems must provide samples that include both images and KOS objects.  Acquisition Modalities will be the source of images for which KOS objects will be created. 

                          The KOS objects and images created by the Modality or Evidence Creator should be representative of what the system would create during expected clinical use.

                          Note: In order to identify the creator of the note, it would be most beneficial if the Patient Name chosen reflected the company/product of the system creating the Key Object Note. This is relatively easy for Modalities as they create the original images. This may take more imagination by Evidence Creators as they typically do not create the original images. By having the company/product name in the Patient Name field, other tests will be much easier to perform because your test partners will easily identify your images/KOS.
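DICOM Person Name (PN) values use caret-separated components (Family^Given^Middle^Prefix^Suffix). One way to embed a company/product identifier in the Patient Name, as suggested above, is sketched below; the "company in the family component, product in the given component" convention is just an illustration, not an IHE requirement:

```python
def vendor_patient_name(company: str, product: str) -> str:
    """Build a DICOM PN value that identifies the creating system.
    Components are caret-separated; spaces are replaced with underscores
    to keep the value easy to spot in worklists and browsers."""
    def clean(s: str) -> str:
        return s.strip().replace(" ", "_")
    return f"{clean(company)}^{clean(product)}"
```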

                          • The first KOS should reference a single image
                          • The second KOS should reference 2-3 images
                          • The title of the Key Object Note should be one of the titles drawn from DICOM CID 7010
                          • You will upload sample objects into Gazelle Test Management under Testing-->Sample exchange in the "KOS" entry.
                          • Under "Actions", use the green triangle icon the validate your DICOM object(s) using Gazelle EVSClient.  We expect a validation result with no errors.
                          • The creator of the KOS objects should also store all of the images of the study to which each KOS pertains.
                          • There are extra worthless brownie points available for those who create Key Object Notes that refer to individual frames in a multi-frame image, or that reference GSPS objects associated with an image object, or that reference objects outside the current study.
                          • If you have this ability, render the objects you produced as a reference. The goal during Connectathon is for an Image Display partner to examine your rendered images/GSPS objects and compare to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format.  Upload the screenshot onto the sample page in Gazelle Test Management.
                          • Create a short txt file indicating you have completed the upload step. Upload '''that''' txt file into Gazelle Test Management as the result file for this preparatory test.
                          • There is no specific evaluation for this test.  Feedback comes as your partner Image Display and Image Manager partners test with your images in their lab.

                          Instructions for DICOM Image and GSPS Consumers:

                          1. Find sample images & GSPS objects uploaded by other vendors in Gazelle Test Management under Testing -> Sample exchange.    On the Samples available for rendering tab under the KOS entry, you will find samples and screen captures submitted by other vendors.  This page will evolve as vendors add samples, so be patient. 
                          2. Retrieve the DICOM or zip files created by the other vendors.  Refer to these help pages for details on this task in Gazelle.
                          3. Examine/import/render them so that you are confident your software understands the content.
                          4. When you are finished, create a .txt or other file that lists the samples you tested with, and a brief comment on your result (eg "ok", or "could not render"). Upload that txt file into Gazelle Test Management as the result file for this preparatory test.
                          5. If you find issues with the images, send an email to the Connectathon Technical Manager.
                          6. The evaluation of this test is performed by examining the text you provided to make sure you made a good faith effort to review the sample images. If you find problems with some samples, we will give your feedback to the Modality test partner.
                          7. The goal is no surprises.

                          Finally, for both Creators & Consumers

                          1. In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor".  This is a signal to the Technical Manager that you have completed this task.

                           

                          MRRT_Sample_Exchange

                          Overview

                          The goal of this “test” is to provide MRRT Report Templates that are 'consumed' by other systems.

                          In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadlines.

                          Sample  Creator(s) Consumer(s)

                          MRRT Report Template

                          Report Template Creator

                          Report Template Manager, Report Creator

                           

                          Instructions for Creators

                          FIRST, prepare the samples-->

                          We expect the Report Template Creator to create and provide a “representative sample” of templates that reflects the template-creating capabilities of your application AND that incorporates as many of the template structures defined in the MRRT Profile as possible. The better the samples you provide, the better interoperability testing we will have.

                           Using your Report Template Creator application, create a template using this guidance:

                          1. The report template should contain at least one entry under coded_content
                          2. Include at least one TEXT-type field in the report template
                          3. Include at least one NUMBER-type field in the report template
                          4. Include at least one selection list field in the report template
                          5. Include at least one DATE-type field in the report template
                          6. Include at least one TIME-type field in the report template
                          7. You may optionally include a MERGE-type field in your report template
                          8. You may include other content as long as it follows the format specified for MRRT report templates

                          SECOND, upload your samples -->

                          1. Upload sample report templates and optional screen capture snapshot(s) into Gazelle Test Management under Testing --> Sample exchange. On the Samples to share tab, upload your sample template(s) under the MRRT entry.   Refer to these help pages for details on managing samples in Gazelle.
                          2. Assuming you have this ability, render a sample report using the template(s) you produced.  Perform a screen capture and/or save as JPEG or other reasonable format.  Upload the screenshot into Gazelle Test Management.
                          3. Create a short txt file indicating you have completed the upload step. Upload '''that''' txt file into Gazelle as the result file for this pre-Connectathon test.
                          4. There is no specific evaluation for this test.  Feedback comes as your Consumer partners test with your reports in their lab.

                          Instructions for Consumers

                          1. Find sample MRRT Report Templates uploaded by other vendors in Gazelle Test Management under Testing -> Sample exchange.    On the Samples available for rendering tab under the MRRT entry, you will find samples and screen captures submitted by other vendors.  This page will evolve as vendors add samples, so be patient. 
                          2. Retrieve the files created by the other vendors.  Refer to these help pages for details on this task in Gazelle.
                          3. Examine/import/render them so that you are confident your software understands the content.
                          4. When you are finished, create a .txt or other file that lists the samples you tested with, and a brief comment on your result (eg "ok", or "could not render"). Upload that txt file into Gazelle Test Management as the result file for this preparatory test.
                          5. If you find issues with the templates, send an email to the Connectathon Technical Manager.
                          6. The evaluation of this test is performed by examining the text you provided to make sure you made a good faith effort to use the template in your system. If you find problems with some samples, we will give your feedback to the Creator test partner.
                          7. The goal is no surprises.

                          Evaluation

                          1. In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor". This is a signal to the Technical Manager that you have completed this task.

                           

                          REM_De-identify_Dose_SR_Sample

                          Overview

                          In the REM profile, Dose Information Reporter systems are required to have the capability to de-identify Dose SR objects. See RAD TF-2: 4.63.4.1.2.1.

                          In this test, Dose Information Reporters perform de-identification and then submit the result as a sample into Gazelle Test Management. Dose Registry systems can retrieve and test with those samples.
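Your own application performs the actual de-identification; the sketch below is only a schematic illustration of the idea, using a plain dict in place of a real DICOM dataset and a hypothetical, non-exhaustive attribute list. A real implementation must follow the profile's requirements in RAD TF-2: 4.63.4.1.2.1.

```python
# Hypothetical, non-exhaustive list of identifying attributes to clear.
IDENTIFYING_ATTRS = ["PatientName", "PatientID", "PatientBirthDate", "PatientAddress"]

def deidentify(dataset: dict) -> dict:
    """Return a copy of the dataset with identifying attributes blanked.
    Dose-relevant content (dates, dose values, technique) is left untouched."""
    out = dict(dataset)
    for attr in IDENTIFYING_ATTRS:
        if attr in out:
            out[attr] = ""  # replace with an empty value rather than deleting
    return out
```

Keeping the original dataset unmodified (the function returns a copy) matches the test's requirement to submit both the original and de-identified SRs.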

                          In order to facilitate this testing, please submit your samples 2-3 weeks before the usual Preparatory test deadlines.

                          Sample  Creator(s) Consumer(s)

                          Two files: one Radiation Dose SR (De-identified) *and* the original SR prior to de-identification

                          REM Dose Information Reporter

                          Dose Registry

                           

                          Instructions for Creators

                          FIRST, prepare the samples-->

                          1. Your task is to de-identify a Dose SR. You may have your own original SR to use, or you may choose to start with a sample Radiation Dose SR submitted into Gazelle Test Management by one of the REM Acquisition Modality partners for preparatory test 'REM_Dose_SR_Sample_Exchange'.
                          2. Place both the original and de-identified Dose SRs in DICOM Part 10 files. These are your 'sample' files.
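A quick way to sanity-check that the files you prepared really are DICOM Part 10 is to look for the 128-byte preamble followed by the 'DICM' magic bytes; a minimal sketch:

```python
def is_dicom_part10(path: str) -> bool:
    """Check for the DICOM Part 10 file header: a 128-byte preamble
    followed by the 4-byte magic value b'DICM'."""
    with open(path, "rb") as f:
        header = f.read(132)
    return len(header) == 132 and header[128:132] == b"DICM"
```

This only verifies the file header, not the SR content; use the Gazelle validation tools for full conformance checking.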

                          SECOND, upload your samples -->

                          1. Upload your DICOM Part 10 files into Gazelle Test Management under Connectathon -> List of samples. On the 'Samples to share' tab, find the entry for REM-De-identify-Dose-SR and add your two samples.
                          2. Assuming you have this ability, render your de-identified sample.  Perform a screen capture and/or save as JPEG or other reasonable format.  Upload the screenshot into Gazelle Test Management.
                          3. Create a short txt file indicating you have completed the upload step. Upload '''that''' txt file into Gazelle Test Management as the result file for this pre-Connectathon test.
                          4. There is no specific evaluation for this test.  Feedback comes as your Consumer partners test with your SRs in their lab.

                          Instructions for Consumers

                          1. Find Dose SRs uploaded by other vendors in Gazelle Test Management under Testing -> Sample exchange.    On the Samples available for rendering tab under the REM-De-identify-Dose-SR entry, you will find sample original and de-identified Dose SRs, and (optional) screen captures submitted by other vendors.  This page will evolve as vendors add samples, so be patient. 
                          2. Retrieve the files created by the other vendors.  Refer to these help pages for details on this task in Gazelle.
                          3. Examine/import/render them so that you are confident your software understands the content.
                          4. When you are finished, create a .txt or other file that lists the samples you tested with, and a brief comment on your result (eg "ok", or "could not render"). Upload that txt file into Gazelle Test Management as the result file for this preparatory test.
                          5. If you find issues with the SRs, send an email to the Connectathon Technical Manager.
                          6. The evaluation of this test is performed by examining the text you provided to make sure you made a good faith effort to review the samples in your system. If you find problems with some samples, we will give your feedback to the Creator test partner.
                          7. The goal is no surprises.

                          Evaluation

                          1. In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor". This is a signal to the Technical Manager that you have completed this task.
                          2. The evaluation of this test comes in the form of feedback from vendors who try to import/process/display the contents of your objects. If they find issues, you will be asked to work with those vendors to resolve those issues.

                           

                          REM_Dose_SR_Sample_Exchange

                          Overview

                          For the Connectathon, meaningful interoperability testing for the REM profile between Modalities and Dose Info Consumers, Reporters and Registers will rely largely on the quality of the data sets supplied by the Acquisition Modality vendors. The goal of this “test” is to provide REM-compliant Radiation Dose SRs that are 'consumed' by other systems.

                          In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadlines.

                          Sample  Creator(s) Consumer(s)

                          Radiation Dose SR *plus* image(s) referenced in the SR

                          REM Acquisition Modality

                          Dose Information Consumer, Dose Information Reporter, Image Manager, Dose Registry

                           

                          Instructions for Creators

                          FIRST, prepare the samples-->

                          1. Determine a representative set of Dose SRs for your Acquisition Modality that would help the Consumer actors understand your content. Also, identify the DICOM Image(s) referenced in the Dose SR. (When the Modality creates a Dose SR, it shall cross-reference the source image object(s).)
                          2. Place the SR and image(s) in DICOM Part 10 files. (These are your 'sample' files.)

                          SECOND, upload your samples -->

                          1. Upload sample(s) and optional screen capture snapshot(s) into Gazelle Test Management under Testing --> Sample exchange. On the Samples to share tab, upload your sample image(s) under the REM-Dose-SR-and-images entry.   Refer to these help pages for details on managing samples in Gazelle.
                          2. Note that you can perform DICOM validation on your SR using Gazelle EVS directly from the sample page. This is the same validation performed in preparatory test EVS_Radiation_Dose_SR_Validation.
                          3. Assuming you have this ability, render the sample you produced.  Perform a screen capture and/or save as JPEG or other reasonable format.  Upload the screenshot into Gazelle Test Management.
                          4. Create a short txt file indicating you have completed the upload step. Upload '''that''' txt file into Gazelle as the result file for this Preparatory test.
                          5. There is no specific evaluation for this test.  Feedback comes as your Consumer partners test with your SRs in their lab.

                          Instructions for Consumers

                          1. Find sample Dose SRs uploaded by other vendors in Gazelle Test Management under Testing -> Sample exchange.    On the Samples available for rendering tab under the REM-Dose-SR-and-images entry, you will find samples and screen captures submitted by other vendors.  This page will evolve as vendors add samples, so be patient. 
                          2. Retrieve the files created by the other vendors.  Refer to these help pages for details on this task in Gazelle.
                          3. Examine/import/render them so that you are confident your software understands the content.
                          4. When you are finished, create a .txt or other file that lists the samples you tested with, and a brief comment on your result (eg "ok", or "could not render"). Upload that txt file into Gazelle Test Management as the result file for this preparatory test.
                          5. If you find issues with the samples, send an email to the Connectathon Technical Manager.
                          6. The evaluation of this test is performed by examining the text you provided to make sure you made a good faith effort to review the samples in your system. If you find problems with some samples, we will give your feedback to the Modality test partner.
                          7. The goal is no surprises.

                          Evaluation

                          1. In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor". This is a signal to the Technical Manager that you have completed this task.
                          2. The evaluation of this test comes in the form of feedback from vendors who try to import/process/display the contents of your objects. If they find issues, you will be asked to work with those vendors to resolve those issues.

                           

                          REM_Previous_Samples

                          Overview

                          We have been testing REM for several years, and we've collected sample Dose SRs from Modalities that have tested REM at various connectathons. To expand the number of samples you have to test with, we point you to this collection of SRs.

                          Instructions

                          •  Access samples from past Connectathons stored in Google Drive under IHE Documents > Connectathon > Samples > Rad-Profiles > REM_Samples: https://drive.google.com/drive/folders/1M3OLxdHU25q8vNKQSr-O3Aip__YOYly0
                            Note that these samples are as they were submitted by the modality vendor. They may or may not pass validation.
                          • We encourage you to download them and test with them in your lab. At the Connectathon, you will be required to test with actual test partners, but we hope these samples provide for expanded testing opportunities.

                          Evaluation

                          There is no log to upload for this test.

                          REM-NM_Sample_Exchange

                          Overview

                          For the Connectathon, meaningful interoperability testing for the REM-NM profile between Radiopharmaceutical Activity Suppliers (RAS) and Modalities as creators of DICOM objects, and Image Managers, Dose Info Consumers, Reporters, and Registries as consumers of those objects, will rely largely on the quality of the data sets supplied by the RAS and Modality vendors. In this “test”, you create a sample data set that will be reviewed by other participants. The goal of this test is to prepare other actors (Image Manager, Dose Info Consumer, Dose Info Reporter, Dose Registry) so they are not surprised during Connectathon events.

                          In order to facilitate this testing, Creators....please submit your samples 2-3 weeks before the usual Preparatory test deadlines.

                          Sample  Creator(s) Consumer(s)

                          Radiopharmaceutical Radiation Dose Structured Reports (RRDSR)

                          Images with dose information encoded

                          REM-NM RAS

                          REM-NM Acquisition Modality

                          Dose Information Consumer, Dose Information Reporter, Dose Registry, Image Manager

                           

                          Instructions for Creators:

                          FIRST, prepare the samples-->

                          1. For RAS systems:  Determine a representative set of Radiopharmaceutical Radiation Dose Structured Reports (RRDSR) for your RAS that would help other actors understand your content. Good samples make for meaningful tests between vendors.
                          2. For Modality systems: Determine a representative set of images for your modality that would help other actors understand your content. In particular, we are looking for samples that demonstrate your ability to include dose information in your image objects per the requirements in RAD TF-2:4.8.4.1.2.5 for the REM-NM profile.
                          3. Place the RRDSR(s) / image(s) in DICOM Part 10 files. (These are your 'sample' files.)

                          SECOND, upload your samples -->

                          1. Upload sample(s) into Gazelle Test Management under Testing --> Sample exchange. On the Samples to share tab, upload your sample(s) under the REM-NM entry.   Refer to these help pages for details on managing samples in Gazelle.
                          2. Assuming you have this ability, render the sample(s) you produced.  Perform a screen capture and/or save as JPEG or other reasonable format.  Upload the screenshot into Gazelle Test Management.
                          3. Create a short txt file indicating you have completed the upload step. Upload '''that''' txt file into Gazelle as the result file for this pre-Connectathon test.
                          4. There is no specific evaluation for this test.  Feedback comes as your Consumer partners test with your content in their lab.

                          Instructions for Consumers

                          1. Find sample RRDSRs and images uploaded by other vendors in Gazelle Test Management under Testing -> Sample exchange.    On the Samples available for rendering tab under the REM-NM entry, you will find samples and screen captures submitted by other vendors.  This page will evolve as vendors add samples, so be patient. 
                          2. Retrieve the files created by the other vendors.  Refer to these help pages for details on this task in Gazelle.
                          3. Examine/import/render them so that you are confident your software understands the content.
                          4. When you are finished, create a .txt or other file that lists the samples you tested with, and a brief comment on your result (eg "ok", or "could not render"). Upload that txt file into Gazelle Test Management as the result file for this preparatory test.
                          5. If you find issues with the images, send an email to the Connectathon Technical Manager.
                          6. The evaluation of this test is performed by examining the text you provided to make sure you made a good faith effort to review the samples in your system. If you find problems with some samples, we will give your feedback to the Modality test partner.
                          7. The goal is no surprises.

                          Evaluation

                          1. In Gazelle Test Management, find the entry for this test instance, and change the status to "Verified by Vendor". This is a signal to the Technical Manager that you have completed this task.
                          2. The evaluation of this test comes in the form of feedback from vendors who try to import/process/display the contents of your objects. If they find issues, you will be asked to work with those vendors to resolve those issues.

                           

                          SVS Simulator tests

                          This section contains test cases performed with the Sharing Value Sets Simulator tool.

                          Tool: http://gazelle.ihe.net/SVSSimulator

                          Tool information page: http://gazelle.ihe.net/content/svs-simulator

                          14500: Pointer to SVS Simulator

                          We use this 'test' to inform you of the Gazelle SVS Simulator available for your testing.

                          SVS actors simulated:

                          • Value Set Consumer
                          • Value Set Repository

                          Location of the tool: http://gazelle.ihe.net/SVSSimulator

                          Tool user manual:  https://gazelle.ihe.net/content/svs-simulator

                          We encourage you to test with the simulator prior to the Connectathon.

                          There are no pre-Connectathon results to upload for this 'test'.

                          XDStarClient Simulator tests

                          This section contains test cases performed with the XDStarClient tool.

                          Tool: http://gazelle.ihe.net/XDStarClient

                          Tool information page: http://gazelle.ihe.net/content/xdstarclient

                          XDStarClient-0

                          Test Your Server with Gazelle XDStar Client Simulator

                          We use this test to inform you of the Gazelle XDStar Client simulator tool available for your testing.  It simulates 'client' actors in XDS.b, XDR, XCA, MPQ, DSUB, XCPD, XDS.b On-demand Docs option, XCF, XCA-I and XDS-I.b

                          We encourage you to test with the simulator prior to the Connectathon, but there are no pre-Connectathon test results files to upload for this test.

                          Note that at this time, the CAS login on the XDStarClient only works with usernames/passwords for the European instance of Gazelle (EU-CAT), not with the North American Gazelle (gazelle-na). Until this is implemented, testers will have to create a new user account. This can be done using the CAS login link at the upper-right of the XDStarClient.

                          Instructions

                          1. Access the XDStarClient: http://gazelle.ihe.net/XDStarClient/home.seam and configure your server using the 'SUT Configurations' menu

                          2. Under the SIMU-Initiators menu, find the message you want to receive:

                          • ITI-18 : Register Stored Query (XDS.b)
                          • ITI-38 : Cross Gateway Query (XCA)
                          • ITI-39 : Cross Gateway Retrieve (XCA)
                          • ITI-41 : Provide and Register Set-b (XDS.b / XDR)
                          • ITI-43 : Retrieve Document Set (XDS.b)
                          • ITI-51 : Multi-Patient Stored Query (MPQ)
                          • ITI-52 : Document Metadata Subscribe (DSUB)
                          • ITI-54 : Document Metadata Publish (DSUB)
                          • ITI-55 : Cross Gateway Patient Discovery (XCPD)
                          • ITI-56 : Patient Location Query (XCPD)
                          • ITI-61 : Register On-Demand Document Entry (XDS.b On-Demand Docs option)
                          • ITI-63 : Cross Gateway Fetch (XCF)
                          • PHARM-1
                          • RAD-55 : WADO Retrieve
                          • RAD-68 : Provide and Register Imaging Document Set - MTOM/XOP
                          • RAD-69 : Retrieve Imaging Document Set
                          • RAD-75 : Cross Gateway Retrieve Imaging Document Set

                          3. Follow the instructions on the page to send the selected message to the server you have configured
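For the query transactions above, the simulator sends your server an ebXML AdhocQueryRequest inside a SOAP envelope. A minimal sketch of an ITI-18 Register Stored Query body using the FindDocuments stored query may help you recognize what arrives (the patient identifier is a made-up placeholder, and the SOAP envelope and WS-Addressing headers are omitted):

```python
import xml.etree.ElementTree as ET

QUERY_NS = "urn:oasis:names:tc:ebxml-regrep:xsd:query:3.0"
RIM_NS = "urn:oasis:names:tc:ebxml-regrep:xsd:rim:3.0"
# UUID of the FindDocuments stored query defined by ITI-18.
FIND_DOCUMENTS = "urn:uuid:14d4debf-8f97-4251-9a74-a90016b0af0d"

def build_find_documents(patient_id: str) -> str:
    """Build a minimal ITI-18 AdhocQueryRequest body (SOAP envelope not shown)."""
    ET.register_namespace("query", QUERY_NS)
    ET.register_namespace("rim", RIM_NS)
    req = ET.Element(f"{{{QUERY_NS}}}AdhocQueryRequest")
    ET.SubElement(req, f"{{{QUERY_NS}}}ResponseOption",
                  {"returnType": "LeafClass", "returnComposedObjects": "true"})
    query = ET.SubElement(req, f"{{{RIM_NS}}}AdhocQuery", {"id": FIND_DOCUMENTS})
    # Required query parameters are carried as ebRIM Slots.
    for name, value in [
        ("$XDSDocumentEntryPatientId", f"'{patient_id}'"),
        ("$XDSDocumentEntryStatus",
         "('urn:oasis:names:tc:ebxml-regrep:StatusType:Approved')"),
    ]:
        slot = ET.SubElement(query, f"{{{RIM_NS}}}Slot", {"name": name})
        value_list = ET.SubElement(slot, f"{{{RIM_NS}}}ValueList")
        ET.SubElement(value_list, f"{{{RIM_NS}}}Value").text = value
    return ET.tostring(req, encoding="unicode")

# Placeholder patient identifier in HL7 CX format; not a real patient.
body = build_find_documents("12345^^^&1.3.6.1.4.1.21367.2005.3.7&ISO")
print(body)
```

This is only a sketch under the stated assumptions; the simulator's actual messages may carry additional parameters, so always compare against what the tool really sends.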

                          Evaluation

                          The purpose of this test is to provide sample messages to test with your server. There are no pre-Connectathon test results files to upload for this test.

                          XUA - STS tests

                          This section contains information about testing with the Security Token Service (STS) used in XUA tests.

                           

                          How to enter your Preparatory test result in Gazelle

                          How to upload your preparatory test result in Gazelle:

                          1. Log in to Gazelle Test Management for your test event
                          2. Select menu Testing --> Test execution
                          3. Display your list of "Preparatory" tests
                          4. For an individual test, click the + icon to start a new test run. You will be asked to select peers; you can bypass this screen by clicking the play icon.
                          5. At the bottom of the page, upload log files and other evidence, and use the chat window to write any necessary information concerning your test result.

                          How to retrieve the test report (HL7 v2.x based simulators)

                          This page shows you how to retrieve the test report requested in most of the tests performed against one of the Gazelle HL7v2.x simulators.

                          All of the HL7v2.x messages exchanged between your system under test and a Gazelle simulator are logged in the application's database. In the context of pre-Connectathon testing, you will be asked to provide the test report within Test Management as proof of your success.
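Before searching the simulator's log, it can help to sanity-check the message your system actually sent. A minimal sketch of splitting an HL7 v2.x ER7 message into segments (the ADT^A01 sample below is made up, not a simulator message):

```python
def parse_hl7v2(message: str) -> dict:
    """Split an HL7 v2.x ER7 message into segments keyed by segment ID."""
    segments = {}
    for line in message.strip().split("\r"):  # ER7 segments end with CR
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

# A made-up ADT^A01 example; real messages come from your system under test.
sample = "\r".join([
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|GAZELLE|IHE|20240101120000||ADT^A01|MSG0001|P|2.5",
    "PID|1||12345^^^HOSPITAL&1.2.3&ISO||DOE^JOHN",
])
parsed = parse_hl7v2(sample)
print(parsed["MSH"][0][8])  # MSH-9, the message type
```

Checking fields such as MSH-9 (message type) and MSH-10 (control ID) locally makes it easier to find the matching entry in the simulator's HL7 messages table.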

                          Once you are ready to log your result into Gazelle, go to the HL7 messages section of the simulator you have tested against (look for the "HL7 messages" menu icon). If you were already on that page before sending the message, click the "Refresh List" button. Look for the exchanged message you need (filters are available to narrow and facilitate your search). Note that the most recent messages are at the top of the table.

                          Once you have found the message to log, click on its ID (left-hand column).

                          [Screenshot: permanent link]

                          The permanent page gathering all the information about the selected message will be displayed. Among this information, you will find a link entitled "Permanent link to test report"; this is the link you are asked to provide within your pre-Connectathon test instance.

                          [Screenshot: test report permanent link]