WiFi Information:

Network: Washington Plaza Meeting

On Mac & iPhone, complimentary code: wpevent1

On PC/Android, complimentary code: pool


Please use the panel links below to participate in discussions.

Agenda:

Forum Agenda

Day 1 Panels:

Interoperability Context in 2017: Business Cases and Information Blocking

Semantics: Progress, Challenges, and the Hard Problems Still Yet to Solve

Interoperability Networks and Nationwide Infrastructure

Application Programming Interfaces (APIs) + Patient and Provider Applications (Apps)

Third Party Uses: Data Availability, Aggregation, and Services


Day 2 Panels:

Fit for Purpose? Identifying the Right Standard(s) for the Right Job

Addressing Interoperability Needs Across the Care Continuum

Summary Panel: Reactions and Reflections



8 Comments

  1. Anonymous

    There is marginal value in quantitative assessment of interoperability (measuring transactions fired across an interface or via an API).  What we need is full qualitative assessment (measuring fitness for use).

  2. Anonymous

    When I want to look at what is happening in the market, three-year-old studies and two-year-old trends aren't going to impress me.  What's happening now?  – Keith Boone

  3. Anonymous

    Is there a plan for connecting the small independent providers (non-affiliated) to the health networks and making them interoperable?  Is there any effort/program designed specifically to support them, reduce costs, and motivate them to get connected?

  4. Anonymous

    Interoperability is the Exchange and Use of electronic health data Without Special Effort on the part of the user.

    This speaks not simply to data Exchange transactions, but to the Use (analytics, pop health, etc.) as well as the Usability of data.

  5. Anonymous

    Do We Need an Interoperability Fire Brigade?

    (Presented at Beyond Boundaries: ONC’s 2017 Technical Interoperability Forum, August 16, 2017, 1pm Public Comment, by Gil Alterovitz)

    "An ounce of prevention is worth a pound of cure."  Did you know this axiom came from Benjamin Franklin’s desire to have a firefighting brigade in Philadelphia?  He wrote about it after his experience with fire in his own building.  As he put it, “you may be forced, (as I once was) to leap out of your Windows, and hazard your Necks to avoid being oven-roasted.”

    Traditionally, standards have been developed only after the various parties have created highly non-interoperable systems, essentially attempting to stop an evolving fire.

    The 21st Century Cures Act has been a positive development in bringing to the forefront the need for interoperability in health care delivery and in efforts to enable precision medicine.

    Now, do we have a special opportunity with precision medicine, where the underlying types of diagnostic tests, therapies, etc. are now being developed?  Should standards be developed differently for new technologies in precision medicine (including genomics, the internet of things, etc.) compared to already-existing fields?  What is the best way to encourage industry adoption and prevent information blocking, especially for enabling access to new technologies?

    If standards are developed before technologies are fully implemented, they may be subject to change, leading to increased costs.  If developed after, we may be affecting the lives of patients who miss out on these technologies in the meantime.

    A compromise may be to develop them after core technologies (like sequencing) are available, but before they are fully implemented clinically, by designing a standard (e.g., for genomics) that has multiple layers for adoption, along with different maturity levels.  That time is now for a number of core technologies.  This is the approach we have used in developing SMART on FHIR Genomics at Harvard Medical School/Boston Children's Hospital and FHIR Genomics within the HL7 Clinical Genomics Workgroup.  Perhaps this approach may be suitable to try out in other new fields as they develop in precision medicine and elsewhere.

    An ounce of prevention is worth a pound of cure.

    So, I leave us with a couple of questions: What design considerations and incentives can be created to lead industry toward standardization rather than fragmentation from the very beginning of the technology life cycle?

    How do we prevent a situation in interoperability where non-interoperability fires can arise and are only seen when it is perhaps too late?

    So, the key point is: Do we need surveillance means and metrics to evaluate the interoperability state of emerging fields, and/or perhaps a framework to intercede with incentives and other means, as needed?

    Do we need an interoperability fire brigade?

  6. Anonymous

    A photocopy machine produces exact replicas of the original document.  A fax machine produces exact replicas of the original document.  If not, they would not be trusted and they would be immediately discarded or replaced.

    The sharing of health information has long relied on both photocopies and faxes as a means of sharing content of the original source document.

    So, isn’t this a basic measure of the achievement of interoperability?  Shouldn’t we be able to demonstrate that what goes in (at the point of capture) is the same as what comes out (at each ultimate point of access/use)?

    Since this is seldom the case with our current interoperability schemes, I fear this borders on malpractice.  This results from the fact that we failed to consider the basics, indeed the fundamentals of truth (authenticity) and trust (assurance), starting 30+ years ago.

    Let’s return to the basics and start first to ensure (our priority is) authenticity with a secondary focus on computability.

    It is our fundamental responsibility to support primary use (clinical care, interventions and decision making), indeed data integrity, the integrity of clinical practice and most importantly patient safety.

    For interoperability there is only one source of truth:  information as captured at the source point of origination.  Every output of interoperability must be measured against (be comparable to) the source of truth.  We really cannot improve what is captured at the front-end or make it better at the back-end – despite our application of standards, be they messages, documents, FHIR resources, vocabularies, information models or whatever.
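
    As a minimal, purely illustrative sketch of that measure (nothing here comes from the comment itself; the record fields and function names are hypothetical), one could fingerprint the content captured at the source and compare it against the content delivered at each point of access or use:

        import hashlib
        import json

        def fingerprint(record: dict) -> str:
            """Hash a canonical (sorted-key) JSON rendering of a record."""
            canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
            return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

        def matches_source(source_record: dict, received_record: dict) -> bool:
            # True only if the content received downstream is identical to
            # what was captured at the point of origination.
            return fingerprint(source_record) == fingerprint(received_record)

        # Hypothetical lab result as captured vs. as received by a downstream system.
        captured = {"patient_id": "123", "test": "HbA1c", "value": 6.1, "unit": "%"}
        received = {"patient_id": "123", "test": "HbA1c", "value": 6.1, "unit": "%"}
        print(matches_source(captured, received))  # True only if nothing changed in transit

    A check like this only demonstrates that content survived exchange unaltered; it says nothing about whether the data were correct at capture, which the comment treats as the single source of truth.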

  7. Anonymous

    Thank you for the fabulous forum last week! We did get a lot out of it.

    Since the HIT ecosystem is so vast, and people’s needs are so varied, I would like to suggest that next year the forum format change to more targeted breakout sessions in the afternoon instead of having everyone in the same large room on both days.  My initial ideas for breakout sessions are: one for the providers; one for the technical folks (FHIR and mobile apps); one for the federal government folks; a fourth for legal and policy; and one to showcase what projects ONC is currently doing (the US FHIR core).


  8. Anonymous

    CentriHealth (now a wholly owned subsidiary of UnitedHealth Group) recently submitted comments on the proposed ONC Interoperability Standards Measurement Framework, noting the strong bias toward "achievement" of interoperability based on quantitative assessment (e.g., transaction volumes) instead of qualitative assessment (i.e., fitness for use).  The comments, submitted 31 July 2017, are posted here:  https://u10786720.dl.dropboxusercontent.com/u/10786720/CentriHealth-Comments%20on%20ONC%20Interoperability%20Standards%20Measurement-20170731-ALL.pdf