Q: What does this tool cover?

A: The Tool tests your system's ability to discover organizationally-bound (domain-bound) and user-bound (address-bound) certificates following the Certificate Discovery process required by Direct (the Discovery tests). It also tests that your own certificates are discoverable via that same process (the Hosting tests).
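
For illustration, here is a minimal sketch of the DNS half of that process: looking up a CERT record for a Direct address and decoding it as an X.509 certificate. This is not the Tool's own code; it assumes the third-party dnsjava library, the address and DNS name are hypothetical, and a complete Discovery implementation would also perform domain-bound lookups and fall back to LDAP via SRV records.

```java
import java.io.ByteArrayInputStream;
import java.security.cert.CertificateFactory;
import java.security.cert.X509Certificate;

import org.xbill.DNS.CERTRecord;
import org.xbill.DNS.Lookup;
import org.xbill.DNS.Record;
import org.xbill.DNS.Type;

public class CertDiscoverySketch {

    // Resolve a DNS CERT record published at the given name and decode it as an
    // X.509 certificate. For an address-bound certificate the DNS name is formed
    // by replacing the "@" in the Direct address with a "."
    // (e.g. user@direct.example.org -> user.direct.example.org); a domain-bound
    // lookup uses the Direct domain itself.
    public static X509Certificate discover(String dnsName) throws Exception {
        Record[] records = new Lookup(dnsName, Type.CERT).run();
        if (records == null || records.length == 0) {
            return null; // nothing published at this name; fall back to LDAP/SRV
        }
        // A real client would examine every returned CERT record; the first one
        // is enough for this sketch.
        byte[] der = ((CERTRecord) records[0]).getCert();
        CertificateFactory cf = CertificateFactory.getInstance("X.509");
        return (X509Certificate) cf.generateCertificate(new ByteArrayInputStream(der));
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical address-bound name -- substitute your own.
        X509Certificate cert = discover("user.direct.example.org");
        System.out.println(cert == null ? "no certificate found" : cert.getSubjectX500Principal());
    }
}
```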

Q: Is there a User Guide?

A: Yes. The 3.1 User Guide is the latest; previous user guides can be found in the corresponding release's wiki space.

Q: Is there a demo version of the Tool already deployed that I can use to test my Direct instance?

A: Yes, the demo version of the Tool can be found here.

Q: Which browsers are supported by the Tool?

A: The Tool can be used with Chrome, Firefox, and Internet Explorer. Specifically, it has been tested with Chrome versions > 32 and Firefox versions > 24; Internet Explorer may work for versions > 10.

Q: When I try to download the anchor file in Firefox, it asks me if I want to trust it as a Root CA for my local system. Should I do this?

A: No, trusting it in your browser is not what you want to do. Instead, right-click the link, choose "Save Link As...", save the file somewhere on your system, and add it to your Direct instance as a trust anchor.
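
If your Direct instance happens to read its trust anchors from a Java keystore, the following is a minimal sketch of importing the saved anchor file. The file names, alias, and password are hypothetical; many products (including the Java Reference Implementation) instead manage anchors through their own configuration UI, in which case you should use that.

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.security.KeyStore;
import java.security.cert.CertificateFactory;
import java.security.cert.X509Certificate;

public class ImportAnchor {
    public static void main(String[] args) throws Exception {
        // Hypothetical paths and credentials -- substitute your own.
        String anchorPath = "dcdt-anchor.der";      // the file saved via "Save Link As..."
        String keystorePath = "trust-anchors.jks";  // the anchor store your Direct instance reads
        char[] password = "changeit".toCharArray();

        // Parse the downloaded anchor as an X.509 certificate.
        CertificateFactory cf = CertificateFactory.getInstance("X.509");
        X509Certificate anchor;
        try (FileInputStream in = new FileInputStream(anchorPath)) {
            anchor = (X509Certificate) cf.generateCertificate(in);
        }

        // Add it to the keystore under a descriptive alias and save.
        KeyStore ks = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream(keystorePath)) {
            ks.load(in, password);
        }
        ks.setCertificateEntry("dcdt-anchor", anchor);
        try (FileOutputStream out = new FileOutputStream(keystorePath)) {
            ks.store(out, password);
        }
        System.out.println("Imported anchor: " + anchor.getSubjectX500Principal());
    }
}
```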

Q: My System doesn't trust the Tool even after uploading your anchor to my anchor store. Is there something wrong with your anchor?

A: We've noticed that some systems (including the Java Reference Implementation) take 5 minutes or longer to fully integrate new anchors. If you want to shorten this waiting period, restarting your James server should do the trick.

Q: I'm not receiving any response emails, even though my Direct messages are being sent to the demo site.

A: Check your junk mail or spam folders. Sometimes the messages are routed to these folders. Look out for messages coming from: results@dcdt31prod.sitenv.org.

Q: Which tests are required to demonstrate Meaningful Use Stage 2 (MU2) capabilities?

A: The Tool is divided into two types of tests: Hosting and Discovery. All Discovery tests are required for MU2 certification. For the Hosting tests, however, the System Under Test (SUT) only has to take the tests that apply to its implementation, which could be as few as one test (e.g., Address-Bound DNS), two tests (Address-Bound DNS and Domain-Bound DNS), or all four tests (Address- and Domain-Bound for both DNS and LDAP).

In other words, the SUT MUST be able to acquire certificates from any other conformant Direct implementation, regardless of the choices that system made. For Hosting, the SUT only needs to prove at least one hosting method. Systems should test every hosting method they support, so if your product implements all of the optional methods, you MUST pass all four Hosting test cases.

Q: Why do some tests say that I failed because I didn't follow the correct SRV record priorities?

A: The specifications are written such that initiating Direct implementations should take the priorities of the SRV records into account. Here is the relevant quote from the specifications:

"From the list of LDAP services the consumer should attempt to contact them based first on the priority value and, if there is more than one with the same priority value, they should then be ordered based on the weight value."

Please note that this is a SHOULD requirement, not a MUST. The Tool highlights these discrepancies and warns the consumer when the priority values are ignored.
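
As a rough illustration of what honoring that requirement looks like, here is a sketch of an initiating implementation resolving a Direct domain's LDAP SRV records and ordering them by priority, then weight. It uses the JDK's built-in JNDI DNS provider, the SRV name is hypothetical, and RFC 2782 actually calls for weighted random selection among records of equal priority; the plain sort below is a simplification of that.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.Hashtable;
import java.util.List;

import javax.naming.Context;
import javax.naming.NamingEnumeration;
import javax.naming.directory.Attribute;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;

public class SrvOrderingSketch {

    // A parsed SRV record: "priority weight port target" in master-file order.
    record Srv(int priority, int weight, int port, String target) {}

    public static List<Srv> lookupOrdered(String srvName) throws Exception {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.dns.DnsContextFactory");
        DirContext ctx = new InitialDirContext(env);

        Attribute srvAttr = ctx.getAttributes(srvName, new String[] {"SRV"}).get("SRV");
        List<Srv> servers = new ArrayList<>();
        if (srvAttr == null) {
            return servers; // no SRV records published for this service
        }
        for (NamingEnumeration<?> vals = srvAttr.getAll(); vals.hasMore(); ) {
            String[] parts = vals.next().toString().split("\\s+");
            servers.add(new Srv(Integer.parseInt(parts[0]), Integer.parseInt(parts[1]),
                                Integer.parseInt(parts[2]), parts[3]));
        }

        // Contact lower priority values first; among equal priorities, order by
        // weight (RFC 2782 actually specifies weighted random selection here --
        // a plain sort keeps the sketch simple).
        servers.sort(Comparator.comparingInt(Srv::priority)
                               .thenComparing(Comparator.comparingInt(Srv::weight).reversed()));
        return servers;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical SRV name for a Direct domain's LDAP certificate service.
        for (Srv s : lookupOrdered("_ldap._tcp.direct.example.org")) {
            System.out.println(s);
        }
    }
}
```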

Q: What is meant by high priority LDAP instances vs. low priority LDAP instances?

A: If you fail a test because your system initially contacted the wrong LDAP server, we send a warning message stating that you chose the low priority LDAP instance instead of the high priority LDAP instance identified in the SRV records. In technical terms, this means your system chose an LDAP server with a higher priority value (e.g., "2") instead of one with a lower priority value (e.g., "1"); under RFC 2782, the server with the lowest priority value should be contacted first, which is why we refer to it as the "high priority" instance. We phrase our diagnostics in this human-readable way rather than in terms of raw priority values. See RFC 2782 for more information about SRV records and their priorities.

Q: Where does the code live?

A: The code lives in a GitHub repository and can be built from source by following the 3.1 Source Build Guide.

Q: How do I submit a defect?

A: To submit a defect, please enter the issue using our JIRA issue tracker.

Q: If I choose to install the Tool locally, do I have to build it from source?

A: No, you can download the latest WAR file (here) and deploy it to your application server.
