Total Cost of Ownership Models Favor Hybrid Vendor Neutral Archive Configurations

My recently completed white paper, The Anatomy of a Vendor Neutral Archive (VNA) Done Right: The Case for Silo Busting, focuses on the Vendor Neutral Archive, but it is more than just another rehash of the technical argument.  Well, there are the obligatory opening paragraphs that present the technical background, but that’s just to make sure we are all on the same page with respect to system descriptions and vocabulary.  The real meat of this paper is a presentation of system architectures, specifically architectures that support business continuity, and a brief look at a real-world Cost Model.

Since a dual-sited VNA is both large and complex, requiring geographically separated data centers, the obvious questions are: [1] what are the best deployment options, and [2] what are the associated Total Cost of Ownership (TCO) figures?  The paper considers concepts like Cloud Infrastructure and Software as a Service, because they can have a significant impact on TCO.  In my opinion, organizations that do not already have a remote secondary data center and have limited IT resources need to seriously consider any strategy that simplifies system management and lowers costs.

The Cost Model is very revealing, as it compares a dual-sited, on-premise, self-managed VNA to a dual-sited, on-premise/off-premise, vendor-managed (SaaS) VNA.  The latter is now being referred to as a Hybrid VNA.  The model was built for five different organization profiles using comparable configurations and real-world infrastructure and operational costs.

Comparisons - 5 year TCO for Capital and Hybrid VNA

In the Table reproduced here, you can see the encouraging results.  The paper was sponsored by an unrestricted grant from Iron Mountain, and I assure the reader that I was involved in assembling the components of the model and approved every one of the line item costs.

Organizations that are getting serious about deploying a VNA will need a positive cost model to win project approval.  As part of that process, I strongly encourage looking at the Hybrid VNA.

Putting Half of the Vendor Neutral Archive in the Cloud Makes Sense

Organizations looking at deploying a Vendor Neutral Archive have some hard decisions to make.  While there are several motivations for moving all of the enterprise image data (radiology, cardiology, endoscopy, etc.) from disparate PACS archives to the consolidated VNA, the economic realities will make it a tough sell in many healthcare organizations. A properly configured VNA, one that provides reliable Disaster Recovery and Business Continuity, should have mirrored Primary and Secondary subsystems located in geographically separate data centers.  That makes a VNA at least twice as big as all of the organization’s PACS combined!

Furthermore, the VNA application suite is considerably more sophisticated than that of the department PACS.  Additional FTE resources, several with specialized expertise, will be required to administer the tag-mapping library, create and manage the retention policies, and monitor overall system performance, the security programs, and storage consumption.

All of which is to say, a properly configured VNA is going to be expensive to deploy and expensive to operate.  Making the economic argument for the VNA is going to be very difficult, because the Total Cost of Ownership of a VNA will almost always be higher than the TCO of a heterogeneous PACS environment, simply because most department PACS have a weak DR solution and no Business Continuity solution.  While the VNA will make a number of future data migrations unnecessary, the costs of those avoided migrations are typically not allowed in the cost models.

One possible solution to the economic challenge is to leverage Cloud Infrastructure and Software as a Service.  Rather than capitalizing and self-managing the Secondary VNA subsystem in a second geographically remote data center (which most organizations do not now have), the entire Secondary subsystem is operationalized and hosted in a Public (multi-tenant) Cloud Infrastructure.  Additional savings can be realized if the entire VNA, both the on-premise Primary and the off-premise Secondary subsystems, are managed under a Software as a Service contract.  In this scenario, both the on-premise and off-premise storage is delivered and billed on an as-needed basis, and all of the management resources and off-premise hardware infrastructure are shared across multiple organizations.

The VNA configuration with the Primary subsystem on-premise and the Secondary subsystem off-premise in a Cloud is referred to as a Hybrid VNA.  If the organization does not believe that it has the IT resources to manage the on-premise Primary subsystem, there are Hybrid VNA vendors that will manage both the on-premise and off-premise subsystems under a Software as a Service contract.  A Hybrid VNA managed entirely under a SaaS contract can have a 30% lower TCO than its capitalized, self-managed, on-premise counterpart.  That 30% savings can be used to make a positive economic argument for deploying the VNA.  For healthcare organizations with limited IT resources and no existing remote data center, the Hybrid VNA may be the only strategy that makes sense.
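The shape of that comparison is easy to sketch.  The short calculation below is purely illustrative; every dollar figure is hypothetical and is not taken from the white paper’s Cost Model.  It simply shows how a low-capex, as-a-service arrangement can come out roughly 30% ahead over five years even when its annual operating costs are higher.

```python
# Illustrative only: all dollar figures are hypothetical, not from the paper's model.
def five_year_tco(capex, annual_opex, years=5):
    """Total Cost of Ownership: up-front capital plus operating costs over the term."""
    return capex + annual_opex * years

# Capitalized, self-managed, dual-sited VNA: hardware and software for two
# data centers, plus FTEs, maintenance, power/cooling at both sites.
self_managed = five_year_tco(capex=2_000_000, annual_opex=400_000)

# Hybrid SaaS VNA: minimal up-front cost; on-premise and cloud storage and
# all management delivered as a service and billed as consumed.
hybrid_saas = five_year_tco(capex=200_000, annual_opex=520_000)

savings = 1 - hybrid_saas / self_managed
print(f"Self-managed: ${self_managed:,}  Hybrid: ${hybrid_saas:,}  Savings: {savings:.0%}")
# Savings: 30%
```

The point of the sketch is the structure, not the numbers: the SaaS model trades a large capital outlay for pay-as-you-go operating expense, which is exactly the trade the five organization profiles in the model quantify.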

Role of Cloud Infrastructure in Vendor-Neutral Archive Adoption

With all the recent hoopla around Cloud Infrastructure, I thought it would be worthwhile studying up on the subject, in order to learn how private and public Clouds might impact the adoption of Vendor-Neutral Archives.  While the concept of remote storage has been around for some time, the new twist that makes the subject much more interesting is the use of web services (HTTP) to exchange data with the Cloud.  Coincidentally, there has been an effort underway since early 2010 to develop a web services methodology for communicating (exchanging) medical image data between diagnostic workstations, PACS servers, Vendor Neutral Archives, Intelligent Storage Solutions, and freestanding UniViewer servers.  The proposed web services protocol for medical imaging is called Medical Imaging Network Transport (MINT).  You can read more about MINT on the project’s web site.  It has been suggested that MINT could replace DICOM as the traditional interface between these devices.  The move from DICOM to web services is motivated by significant performance improvements (DICOM communications involve considerable overhead), as well as the opportunity to take full advantage of the rich metadata that must be included with the image data in a web services protocol.  Rather than attempt to summarize my opinions on this subject in this blog, I invite you to read the recent white paper that EMC commissioned me to write on this subject.  I think that you will find the subject somewhat stimulating.

The Dilemma Presented by non-DICOM Image Data Objects

The primary impetus for deploying a PACS-Neutral Archive is the consolidation of the massive volume of Radiology and Cardiology image data objects into a single, central, enterprise-class repository.  Important secondary objectives include ending costly image data migrations, supporting image data sharing across disparate PACS, and image-enabling the Electronic Medical Record portal.  In all of these cases, we are focusing on DICOM image data objects.  The DICOM image data object is very well defined, and the vast majority of diagnostic medical imaging systems are based on the use of DICOM.  It is natural, then, for the PACS-Neutral Archive to focus on the acquisition, management, and display of DICOM image data objects.

What is to be done with those non-DICOM image data objects?

There are a number of medical image data Sources (modalities) that produce non-DICOM data objects.  In some cases these objects are the images themselves, and in some cases they are clinical information associated with the images or the study.  Object types include PDF, JPEG, MPEG, TIFF, WAV, and other consumer data formats.

What is the best strategy for acquiring, managing, and viewing these non-DICOM image data objects?  My opinion, one that is shared by others, is that we should take advantage of all the benefits of the DICOM standard and convert non-DICOM image data objects to DICOM image data objects.
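As a toy illustration of the conversion idea (not any vendor’s implementation), the sketch below wraps a PDF report in a minimal DICOM-style object.  The header is modeled as a plain dictionary and the helper name is invented; a production converter would build a true DICOM dataset with a DICOM toolkit.  The Encapsulated PDF Storage SOP Class is the standard DICOM mechanism for this particular object type.

```python
# Encapsulated PDF Storage SOP Class UID, defined by the DICOM standard.
ENCAPSULATED_PDF_SOP_CLASS = "1.2.840.10008.5.1.4.1.1.104.1"

def wrap_pdf_as_dicom(pdf_bytes, patient_id, patient_name, study_uid):
    """Wrap a non-DICOM PDF in a DICOM-style object so the Neutral Archive can
    index and manage it by patient and study like any other DICOM object.
    (Header modeled as a dict for illustration only.)"""
    return {
        "SOPClassUID": ENCAPSULATED_PDF_SOP_CLASS,
        "PatientID": patient_id,
        "PatientName": patient_name,
        "StudyInstanceUID": study_uid,
        "MIMETypeOfEncapsulatedDocument": "application/pdf",
        "EncapsulatedDocument": pdf_bytes,
    }

report = wrap_pdf_as_dicom(b"%PDF-1.4 ...", "12345", "DOE^JANE", "1.2.840.99.1.1")
```

Once wrapped this way, the object carries the same patient and study identifiers as the DICOM images it accompanies, which is precisely what lets the archive manage both under one set of retention and access rules.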

In my latest White Paper titled Best Practices Strategy for non-DICOM data in Neutral Archive, which was edited and contributed to by a number of leading developers in the industry, I discuss the Best Practices Strategy for Dealing with non-DICOM Image Data Objects in a PACS-Neutral Archive.  The paper includes DICOM conversion methodologies, shortcomings of managing non-DICOM image objects in their native format, and the future role of XDS-I in image data object management.

There’s a lot of confusing information out there on this subject.  This paper is a must-read for those that are planning or already deploying a Neutral Archive.

Three-Step Strategic Plan for Achieving Meaningful Use of Medical Images

These are difficult times for Healthcare’s C-level administrators, as there are a number of major challenges looming on the horizon, appearing as dark clouds threatening to merge into a perfect storm. First and foremost, I suppose, would be figuring out how to support and encourage Meaningful Use according to the July 13 release of the final Stage 1 guidelines. Still no specific use of the word “images” in the text, but the same two objectives that reference the exchange of “key clinical information” are now codified in the 14 core objectives that hospitals are required to comply with for at least six months before the November 30, 2011 deadline. That’s the last day for eligible hospitals to register and attest in order to receive an incentive payment for FY 2011.

The incentives drop for every year of delay, so in this case, delay will be expensive, and effectively cost the organization precious development money.

While one may argue whether medical images should or could be included in the term “key clinical information”, there is no argument that exchanging images with outside organizations and providers based on data copied to CDs is problematic. It’s also expensive (labor and shipping costs). No wonder then that there are now twelve vendors offering either Electronic Image Share appliances or Cloud-based Image services. Should the C-level administrators look into solving this problem at the risk of taking their eyes off of the Meaningful Use issue? If the two issues are mutually exclusive, probably not.

Perhaps the darkest cloud on the horizon, because it is associated with hundreds of thousands of dollars in service fees, is the upcoming PACS data migrations. This cloud might appear to many as faint and unspecified, but make no mistake…it is there, it is coming, and it is going to be bad. Once again, should the C-level administrators spend time worrying about future data migrations, when there is only a year left to get the Electronic Health Record system up and running and meeting those Stage 1 objectives? If these two issues are mutually exclusive, probably not.

Here’s another important date bearing strong negative implications…2015, the year when Medicare payment adjustments begin for eligible professionals and eligible hospitals that are NOT meaningful users of Electronic Health Record (EHR) technology. “Adjustments” is political nice-nice for lowered reimbursements. Medical Images will most certainly be a stated inclusion in the Meaningful Use criteria by that time.

One way to look at the big picture is that there are a maximum of four years of financial incentives available to hospitals that can demonstrate support of Meaningful Use of key clinical information in every year of eligibility. Deploying an IT and Visualization infrastructure over a five-year period that will ultimately deliver all of a patient’s longitudinal medical record data to the physicians and caregivers is going to be expensive. It makes perfect sense to develop a Strategic Plan that goes after every bit of incentive funding available. That plan can and should weave all of the looming challenges into a single, cohesive, multi-step plan. The aforementioned challenges are not mutually exclusive.

If one takes the position that electronic sharing of medical images outside of the organization is supportive of Stage 1 objectives, Step 1 of the Strategic Plan would be to deploy an electronic Image Share Solution. Whether that solution is an on-site, capitalized appliance or a Cloud-based service is another discussion, as the pros and cons are very organization-specific. Just make sure that the solution has upgrade potential, and is not a dead-end product.

By mid-2011 it’s time to start deploying Step 2 of the Strategic Plan…image-enabling the EHR. This might seem like an early jump on the image access issue, but we don’t know if specific mention of images will show up in the core objectives for Stage 2 or Stage 3, so why risk having to scramble to catch up? Perhaps the easiest way to image-enable the EHR would be to deploy a standalone universal viewer (display application). There are already a number of good universal viewers that require minimal server resources, feature server-side rendering, and require zero or near-zero client software. The IT department develops a simple URL interface between the EHR Portal and the universal viewer, and then individual interfaces between the universal viewer application and all of the image repositories in the enterprise (i.e. the PACS). Ah, but there’s the rub. All those PACS interfaces are going to be expensive to develop, maintain, and replace with each new PACS, and there is no assurance that the universal viewer will be able to interpret all the variances in those disparate PACS headers.
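A URL interface of the kind described can be as simple as the sketch below.  The viewer address and parameter names here are invented for illustration, since every universal viewer defines its own launch syntax; the point is only that the EHR Portal hands the viewer a patient/study context in a URL and the viewer does the rest.

```python
from urllib.parse import urlencode

# Hypothetical launch-URL builder: base URL and parameter names are illustrative.
def viewer_launch_url(base, patient_id, accession):
    """Build the URL the EHR Portal embeds to open a study in the universal viewer."""
    return f"{base}?{urlencode({'patientID': patient_id, 'accession': accession})}"

url = viewer_launch_url("https://viewer.example.org/launch", "12345", "ACC-9876")
# e.g. https://viewer.example.org/launch?patientID=12345&accession=ACC-9876
```

The simplicity of this single interface is exactly why the per-PACS interfaces behind the viewer, not the Portal link itself, are where the cost and fragility accumulate.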

Those of you that have been following my posts on this web site can see where this is going. The best solution, certainly the best long-term solution, is the deployment of a PACS-Neutral Archive and an associated Universal Viewer (aka UniViewer). The EHR is not designed to manage image data, relying instead on interfaces between its Physician Portal and the various established image data repositories in the enterprise. The PNA solves most of the organization’s data management problems by consolidating all of the image data into a single “neutral” enterprise repository, which directly supports and encourages Meaningful Use of all the data objects that will constitute the patient’s longitudinal medical record. The problem is, most organizations will not be prepared to deploy a PACS-Neutral Archive in 2011, so this would be a bit much to schedule for Step 2.

My Step 2 would be to expand the Image Share solution from Step 1 to include more storage…enough storage to accommodate the image data that the organization will start migrating from each of its department PACS. Of course, this would mean making sure that the Image Share solution chosen in Step 1 is capable of becoming a PACS-Neutral Archive. At a minimum it would have to support bi-directional tag morphing. By the time the organization has completed the migration of the most recent 12 to 18 months of PACS image data, it will be possible to support Meaningful Use of the most relevant image data both inside and outside the organization. It is important to appreciate that the set of features/functions of a PACS-Neutral Archive required to meet the objectives of Step 2 (while the data is being migrated) is a fraction of the full set of PNA features/functions, so the cost of the software licenses required for Step 2 should be a fraction of the cost of the licenses for a complete PNA. Fortunately, there are a few PNA vendors that appreciate this subtlety.

Step 3 could occur sometime beyond 2012, when the organization has sufficient funds approved to turn on all of the features and functions of a PNA and purchase sufficient storage to accommodate all of the enterprise’s image data.

In this Strategic Plan, all of the major challenges looming over the horizon that have to do with images are addressed and solved in three creative yet logical Steps. Using the infrastructure to support and encourage Meaningful Use in turn qualifies the organization for significant financial incentives that should go a long way toward financing the Plan.

Hospitals required to demonstrate Electronic Image Sharing in 2011

Despite the key role that medical imaging plays in patient care, the inclusion of medical images in the Meaningful Use criteria for ARRA funding was supposedly all the way out in 2015.  One would think that would give a healthcare organization plenty of time for planning, choosing a solution, budgeting, and picking a vendor.

In theory, there are a number of ways to support Meaningful Use of images through the Physician Portal.  Whether you believe the best approach is [1] an Enterprise Archive with a UniViewer, [2] a multi-department PACS with its UniViewer, or [3] a continuation of individual department PACS, each with their own viewers; four-plus years would seem to be plenty of time to watch what the early adopters deploy and figure out your own strategy.

I think those four years just disappeared…in a puff.

In a recent article, Keith Dreyer, D.O., Ph.D., included a statement in his conclusion that came as something of a surprise to me.   That statement is worth repeating here in its entirety.  The underlines are mine.

“The Centers for Medicare and Medicaid Services proposed rulemaking of December 2009 suggests that providers will be required to demonstrate cross-provider patient medical data sharing by 2011. Furthermore, at least 80% of patient requests for electronic medical data must be able to be delivered within 48 hours. It is expected that medical imaging will be an important component of these requirements. As the federal government begins to require even more communication among all healthcare providers, the need for standards-based technology will undoubtedly become an integral part of the medical imaging IT infrastructure.”

“By taking a proactive approach and deploying technology such as image sharing applications, your department—and organization—will be better prepared for the impending future.”

Since this admittedly came as a surprise to me, I did a search and came up with an article in Healthcare IT News that listed the actual wording of the December rulemaking that Dr. Dreyer was interpreting.  Sure enough, in #15 and #17 in the list of 23 Stage 1 Meaningful Use criteria, there appears a reference to “diagnostic test results”, and one can easily agree with Dr. Dreyer that this should be interpreted to include the actual images themselves.

What a timely discovery!

Medical Image (data) Sharing is already a hot subject.  By my count there are already 20 companies pitching some version of electronic Image Sharing…data transfer from site A to site B over a Virtual Private Network (VPN) or through an encryption application over the internet.  In most cases, these products are simply replacing the method of data transfer, replacing CDs with a network.  Most of these solutions fail to address a more subtle problem with data exchange between systems.  That problem is data compatibility.

All PACS systems are largely DICOM-conformant, but that conformance in and of itself does not guarantee data compatibility between different PACS.  Image data formatted by PACS A is not necessarily going to be fully compatible with PACS B just because the data is in the DICOM format.  I’ve already posted a piece on this subject on this web site. These new electronic image sharing products/services must be able to perform bi-directional dynamic tag morphing on the image data being transferred between systems in order to assure compatibility on the receiving end.
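To make the idea concrete, here is a toy sketch of bi-directional tag morphing.  Headers are modeled as plain dictionaries and the single rule is invented; real morphing engines operate on binary DICOM attributes under much richer, vendor-pair-specific rule sets.

```python
# Invented example rule: PACS A stores the study description in an attribute
# that PACS B expects under a different name. Real rule sets are far larger.
A_TO_B = {"StudyDescription": "SeriesDescription"}
B_TO_A = {dst: src for src, dst in A_TO_B.items()}  # invert for the return trip

def morph(header, rules):
    """Move/rename header attributes so the receiving PACS can interpret them."""
    out = dict(header)
    for src, dst in rules.items():
        if src in out:
            out[dst] = out.pop(src)
    return out

hdr_a = {"PatientID": "12345", "StudyDescription": "CT CHEST W/O CONTRAST"}
hdr_b = morph(hdr_a, A_TO_B)        # outbound: format the data for PACS B
round_trip = morph(hdr_b, B_TO_A)   # inbound: restore PACS A's format
assert round_trip == hdr_a          # "bi-directional" means lossless both ways
```

The round-trip assertion is the essential property: data sent to the other system and later returned must come back in the form the originating PACS expects.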

What makes Dr. Dreyer’s conclusions regarding electronic image sharing in 2011 so interesting is that they link Image Sharing with the larger subject of Meaningful Use by 2015.

I believe Meaningful Use in 2015 will depend on Ease of Use, and that strongly suggests a single consolidated image data repository and a single UniViewer, and the foundation of that concept is dynamic tag morphing…the ability to make image data from disparate PACS compatible with a single viewer.   So the PACS-Neutral Archive and the Image Sharing System have a very important key ingredient in common…Bi-directional Dynamic Tag Morphing.

There may be plenty of time to build the infrastructure necessary to achieve Meaningful Use of image data in 2015, but there’s no point in overlooking opportunities to build the stepping stones of that infrastructure this year.  An Image Sharing solution that includes the tag morphing application might easily be expanded, step-by-step, year-by-year to become the Neutral Archive an organization will need in 2015.

Picking the right Image Sharing solution, the one that grows into the Neutral Archive, means having the bigger plan in place for the Neutral Archive.  Getting from 2011 to 2015 with the least number of dead-ends, restarts, forklifts, etc., means taking the time to build the big plan now.  Thank you, Dr. Dreyer, for providing a more immediate motivation.

What’s it going to take to achieve Meaningful Use of Images?

The other day a friend of mine forwarded to me a link to the Imaging Technology News eNews web site.  My friend encouraged me to look on the left bar of the web page and find the invitation to participate in their current survey.  The question was “Will PACS/RIS meet the meaningful use criteria to qualify for incentive dollars?”  If the survey is still running, you can check out the current results here.

Last time I checked, 33% thought that PACS/RIS would meet the criteria and another 30% thought that there’s a good chance it will.

I’d love to see the demographic of the survey participants, and I’d love to see a list of their assumptions.

I’m among those that responded with a solid “no”, convinced that the PACS/RIS as we know it will not qualify for Meaningful Use, because it simply doesn’t have what it takes, and most likely never will.

If the survey participants gave serious thought to the question, they should have realized that the most critical component of what it takes to sustain Meaningful Use will be “ease of use”.  Most physicians are far too busy to learn and remember how to use more than one image viewer.  Most physicians are far too busy to switch back and forth between multiple viewers to assemble a montage of all the relevant clinical information in a single viewing window.  That’s exactly what will happen if we continue on the present path of developing individual URL links between the Physician Portal and the data elements being stored in each of the specialized departmental PACS, and using those department PACS viewers to view the data.  This approach shouldn’t make sense to IT, and it won’t make sense to the physician users.  So the participants must have been assuming that an all-encompassing Enterprise PACS will emerge, a single PACS that will embody all of the specialized department PACS requirements and thereby become the Uni-PACS.

In my opinion, it is highly unlikely that a current generation Radiology or Cardiology PACS or any other departmental PACS for that matter, will evolve in the next few years into an Enterprise Data Repository capable of managing the patient’s longitudinal record of all clinical information.  I seriously doubt that they will be able to manage all of the image information, much less all of the non-DICOM and non-image data objects.

Managing all of this clinical data is probably the easier part.  The harder part will be providing all of the expected display and processing applications that are specialized for each of the contributing imaging departments.  This is not to say that some of the larger vendors won’t try to become an all-encompassing enterprise PACS, or at least claim to be the Whopper of PACS, but I don’t see that happening.

In my opinion, the more likely scenario will be the Enterprise Neutral Archive fulfilling the role of the Enterprise Data Repository, and the (interfaced or embedded) UniViewer will provide the unified set of viewing tools that the physicians will use to access and view all of a patient’s clinical information, both the image and the non-image data being managed by that Neutral Archive.

Today, more and more healthcare organizations are “getting it”.  They see all of the advantages of separating the “archive” data management applications from the departmental PACS.  And it’s natural to add a viewer to this new-generation Archive.  Sooner or later, each of the PACS vendors will “get it”, and at that moment the push will be on in their R&D groups to further differentiate their department PACS products with the specialized applications unique to that department.  Their PACS will have to become an even better, specialized tool for each department, because the Neutral Archive will have already become the tool of choice for the Enterprise.  Meaningful Use will be much easier to achieve if the physicians know they only have to go to one repository and only have to use one viewing application to assemble all of the relevant clinical information in a single viewing session.  Get it?

Next Generation PACS will be Smaller

I read an article today in the Health Imaging & IT electronic publication.  In this article on the next generation PACS, the author states his belief that the next generation system will have to become bigger, become all-encompassing, become a PACS for every department; or at least be able to interface with the other systems across the enterprise.  For good measure, the article mentions the need for a web product good enough to support meaningful use.

There’s nothing much new here; in fact, the vision is distorted.

The major PACS vendors have been working on their Enterprise PACS for some time now, assuming that the “enterprise” consists of Radiology and Cardiology.  How’s that been working out?  How many vendors have achieved fully functional Radiology and Cardiology application packages that run on a single platform with a consolidated Directory database and can exchange image data with each other?  After all this time, there are perhaps two, depending on one’s interpretation of the adjectives I used in the definition.  History suggests that folding in Pathology, Ophthalmology, Dental, etc. is going to take some time.  I don’t think we can afford to wait.

As for interfacing with other systems across the enterprise…that certainly sounds easier for the major PACS vendors to achieve than trying to be pretty good at all those individual department PACS applications.  Unfortunately that’s not going to be easy either, because there are simply too many idiosyncrasies in the way the individual vendors have implemented DICOM.  Don’t misunderstand, the implementations are largely “conformant”, they’re simply not completely compatible.  You know that, right?

I offer as simple irrefutable evidence two well known issues:  [1] data exchange between PACS via CD is problematic, and [2] replacement of one generation PACS by another requires a costly and time-consuming data migration.

I’m making an issue of this issue again, because it is my opinion that the next generation PACS is not going to become the bigger Enterprise PACS, nor is it going to suddenly start playing nice with the other PACS.

In my opinion, the next generation PACS is going to get a lot smaller, focusing on and becoming very good at supporting a specific imaging department’s workflow and providing its diagnostic tools.  Some of this functionality will most likely migrate up-stream to the actual modalities and their associated workstations, making this generation PACS even smaller.  The next generation PACS will also lose a lot of weight.  There will be the appropriate but minimal working storage, but certainly nothing like the TeraBytes of girth in the current systems.  As for short-term and long-term archiving…nothing.  That’s not where to put archiving.

Basically the next generation of PACS will be individual department-specific applications sitting on their own dedicated servers, each embellished with the logo of that department’s favorite vendor, and interfaced to a PACS-Neutral Enterprise Archive.

The Neutral Archive will dynamically manage all those cross-vendor idiosyncrasies, which the PACS vendors should really appreciate, because it means they can stop pretending that they are going to fix the problem they created in the first place.  The PACS vendors can go back to doing what they do well: building workflow and diagnostic tools.  The Neutral Archive vendors will take over the significant task of managing all of the data from across the enterprise, assuring full interoperability between the PACS, and providing the level of Information Lifecycle Management that is long overdue in this industry.

As for the holy grail…enterprise-wide access to all of the enterprise data through the EMR Portal using a single viewer…the PACS vendors can give up trying to figure that one out as well.  Most of their “Web Viewer” solutions can barely lift a radiology image.  There are some truly good “UniViewers,” as I call them, on the market, and more in the works.  What’s more, they’re simple, standalone applications that don’t have to be embedded into the bowels of the Archive.  They could be as easily changed as a tie, albeit more expensive than a tie, but you get my point.

My point is that rather than looking for PACS to become more than they already are, and rather than taking up pitchforks in the name of DICOM convergence, think small.  It’s time to think specialization.  Award true excellence that has been surgically applied to a specific task: a department-specific PACS, a Neutral Enterprise Archive, and a UniViewer for the Portal.  Think “meaningful use”.

Next Step in Image Sharing…Beyond the CD/DVD

The task of getting a patient’s medical images into the hands of the Specialists, Surgeons, and Primary Care physicians becomes considerably more complicated when those images are produced in an “outside” organization.  The practice of forwarding film-based images ahead of or with the patient is increasingly rare, having been replaced with the conveyance of digital copies of the patient’s images on CD or DVD.  While this method of data exchange between organizations is considered more efficient and less expensive than forwarding films, the problems associated with data exchange using the CD/DVD are now legendary.

Aside from such obvious issues as viewing software and media compatibility, the principal problem is frequently basic data incompatibility.  The DICOM standard allows a significant degree of “customization” of the DICOM image data header by the PACS vendors.  In a white paper recently written by Dr. Wayne DeJarnette, titled Context Management and Tag Morphing in the Real World and posted on the company’s informational web site, there are 10 examples cited where certain key pieces of information stored in the DICOM header need to be created, modified, or moved in order for one PACS to be able to properly interpret the data created by another PACS.  I highly recommend reading this paper to catch up on the subject generally referred to as Tag Morphing.

Apparently the problems associated with sharing medical image data using CD/DVD media have reached critical mass, because a number of solutions in the form of Data Exchange Servers and Data Exchange Services have recently entered the market.  The focus of these new products and fee-per-study services is clearly to put an end to the pains of data exchange.  Unfortunately, there is now yet another set of issues.

Clearly the most exciting solution to the data exchange issue is the “Image Share Service in a Cloud”.  How can one not get excited about anything in a cloud?  I counted a half dozen such “cloud” service solutions being exhibited at the 2009 RSNA or being advertised since.  The simple, high-level summary description of this fee-per-study service is as follows.  An authorized organization/user accesses the upload application through a secure web site.  A couple of simple clicks and data insertions later, a patient’s medical image data is uploaded to a central server in the cloud.  There are a number of methods for announcing the availability of these images in the cloud to the intended recipient, ranging from email notification to a phone call.  The authorized organization/user then accesses the secure web site hosted by the cloud server and is granted access to only those images intended for their use.

Here is the interesting part.  The intended user downloads a very small piece of client software to their PC (in some cases no client software at all, a “zero” footprint), and this allows the user to view the images on their PC.  Most of the display applications I have seen associated with this version of the cloud service are based on what is called “server-side rendering”, meaning all of the image rendering, processing, etc. being directed by the user is actually executed on the server in the cloud.  The result of this rendering, an HTML page, is all that is actually downloaded to the user’s PC.  The actual image data itself never leaves the secure server in the cloud, making this a very HIPAA-compliant application.
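
The division of labor can be sketched in a few lines.  Everything below is hypothetical (the store, the function names, the window/level parameters are illustrative); the point is only that the pixel data lives on the server and the client receives nothing but rendered markup.

```python
# Hypothetical sketch of the server-side rendering split.  All names are
# illustrative.  Pixel data stays in SERVER_STORE; the client only ever
# receives the rendered HTML string.

SERVER_STORE = {
    "study-123": {"patient": "DOE^JANE", "pixels": b"\x00\x01\x02\x03"},
}

def render_to_html(study_id, window, level):
    """Server side: apply the user's display parameters and emit HTML."""
    study = SERVER_STORE[study_id]
    # (a real system would rasterize the windowed image to a JPEG/PNG
    # referenced by the page; here a text placeholder stands in for it)
    return (f"<html><body><h1>{study['patient']}</h1>"
            f"<p>rendered at W{window}/L{level}</p></body></html>")

def client_view(study_id, window=400, level=40):
    """Client side: receives only the rendered markup, never the pixels."""
    return render_to_html(study_id, window, level)

page = client_view("study-123")
```

Because `page` contains no pixel data, nothing protected ever crosses to the client beyond what is visible on screen, which is the basis of the HIPAA argument made above.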

The current state of server-side rendering display applications allows for support of full-fidelity (lossless) images and a full range of image processing features (2D, 3D, even orthopedic templating), so the display application associated with most of these cloud-based image exchange services should be well received by a wide range of physicians seeking access to a patient’s images that were produced in an “outside” organization.

What I find most interesting about this approach to image sharing is that it totally avoids the data incompatibility problems encountered when an organization attempts to actually import digital image data from an “outside” PACS into its local PACS.  Instead of importing “outside” study data into the local PACS, so the images can be accessed and viewed by the physicians using the local PACS web viewer, the cloud solution depends on its own embedded display application to access and display the image data.  Just as every PACS customizes the image headers of incoming image data, the cloud server only has to make the incoming study data produced by the contributing PACS completely compatible with its own display application.  Moreover, these new server-side rendering display applications frequently offer a wider range of features and functions than the incumbent local PACS web viewer.  It’s a clever solution that simply sidesteps the data incompatibility problem.  As mentioned, this version of image sharing is available as either a purchased/leased “appliance” or a fee-per-study cloud-based service.

However clever this solution appears, it is important to remember that this version of image sharing does not solve the data incompatibility problem.  If an organization wishes to assimilate a patient’s image study data created by an outside organization into that patient’s local longitudinal medical record (acquire the outside study data into the local PACS and add the study to the patient’s local folder), the data, more specifically the DICOM headers, must first be modified to satisfy the idiosyncrasies of the receiving PACS.  That means executing Tag Morphing of the type and complexity described in the DeJarnette white paper.  Unless the contributing and receiving organizations have only a few studies a day to exchange, a manual approach to this Tag Morphing would be too labor intensive to be practical, not to mention fraught with the potential for human error.  In short, the exchange of study data between two different organizations, and especially between disparate PACS, requires an appliance or a fee-per-study service that can automatically execute Dynamic Tag Morphing on the incoming DICOM image data headers prior to exporting the data to the recipient PACS.  Any solution that does not support this key process is naively relying on “DICOM conformance”, and we already know the problems with that approach.
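
What makes the morphing “dynamic” is that the appropriate rule set is selected and applied automatically for each sending/receiving PACS pair, with no per-study manual editing.  A minimal sketch, with entirely made-up vendor names and rules, might look like this:

```python
# Hypothetical sketch of Dynamic Tag Morphing: a rule set keyed by the
# (sending PACS, receiving PACS) pair is applied on the fly before export.
# Vendor names and rules are invented for illustration.

MORPH_RULES = {
    # VendorB expects upper-case study descriptions
    ("VendorA", "VendorB"): lambda h: {**h, "StudyDescription": h.get("StudyDescription", "").upper()},
    # VendorC chokes on VendorA's private (0009,xxxx) group, so strip it
    ("VendorA", "VendorC"): lambda h: {k: v for k, v in h.items() if not k.startswith("(0009")},
}

def export_study(header, source, destination):
    """Morph the header for the receiving PACS, then hand it off for export."""
    rule = MORPH_RULES.get((source, destination), lambda h: dict(h))  # default: pass through
    return rule(header)

hdr = {"StudyDescription": "ct chest", "(0009,1010)": "private value"}
to_b = export_study(hdr, "VendorA", "VendorB")  # StudyDescription becomes "CT CHEST"
to_c = export_study(hdr, "VendorA", "VendorC")  # private tag removed
```

The lookup table is the whole point: once the rules for a PACS pair are captured, every study flowing between those systems is corrected automatically, which is what makes high study volumes practical.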

In summary, I think an appliance or a cloud-based service that can provide the physicians with HIPAA-compliant internet access and display of a patient’s outside images is a significant advance over the CD/DVD method of data exchange.  I think the display-only approach is a clever way to avoid the problems inherent in exchanging data between disparate PACS.  The participating organizations simply need to understand their needs and make sure that the chosen solution will meet their expectations.  Products or services that suggest actual data exchange between the PACS is an option should be expected to provide evidence that they support Dynamic Tag Morphing.  Otherwise the organizations will likely end up right back where they are today with their CDs and DVDs.

Note: Currently there’s not a lot of information available on DICOM Tag Morphing on the web.  In addition to the DeJarnette paper already mentioned, you might want to point your favorite search engine at “vendor-neutral archive”, as I’m sure any of those vendors can provide additional information on this subject.

Essential Ingredients of a PACS-Neutral Archive

Whether its name is PACS-Neutral or Vendor-Neutral, the Enterprise Archive whose major feature is data exchange between disparate PACS is sometimes hard to identify.  The situation is similar to the Christmas shopper trying to differentiate the real designer handbag or watch from the fakes.  Simply publishing a list of the “real” Neutral Archives is not sufficient, because products change, and entry to the list could be as easy as making a few key changes.  A better way to identify the real Neutral Archive is to use a carefully conceived checklist.

I came across just such a checklist at this year’s RSNA.  It was a piece of booth literature created by Acuo Technologies.  While I completely agree with a good deal of the Acuo list, I have made some additions and changes to make it my own (the elements from the Acuo list are reprinted here with their permission).  I offer the following checklist to those who are interested in understanding what constitutes a fully functional Neutral Archive.

Bi-Directional Dynamic DICOM Tag Morphing – On-the-fly conversion/mapping of data elements in a DICOM header to facilitate data exchange between disparate PACS.

DICOM – Full conformance with latest DICOM SOP Classes (SCU and SCP) with no exceptions.

Non-DICOM – Methodology for accepting and managing non-DICOM data objects, most notably JPEG and PDF objects, preferably via Web Services, with optional support of WADO for viewing.

ILM Methodology – Intelligent Information Lifecycle Management (ILM): data movements internal and external to the system based on metadata associated with the study data (study date, study type, patient age, etc.).  Separate ILM strategy for each organizational node (facility, department).  Automated, user-defined Data Purge mechanism with manual supervision.  Node-specific Retention Flags that would override a Purge operation.

Unique ID – Ability to make the ID associated with incoming data unique (throughout a multi-facility enterprise) without the need for a Master Patient Index (MPI).  Internal pseudo-MPI capabilities tie individual facility MRNs to the same patient (in a multi-facility organization).

Pre-fetching / Auto-routing – HL7- or DMWL-enabled pre-fetching of relevant prior data being managed by the Neutral Archive, and auto-routing of that data to the appropriate department PACS, either directly or through the Archive’s local Facility Cache.

Flexible Architecture – Server and storage hardware agnostic; primary and secondary mirrored subsystems; Active-Active or Active-Passive modes with automated failover and automated reconciliation between the two subsystems.

Data Integrity – Synchronized updating of metadata (patient/study level changes).  The Neutral Archive will propagate updates received from the RIS(s), or via manual update, to all destinations that received studies, so that all available patient information stays in sync.

Storage Reclamation – Ability to reclaim storage space following media migration or data purge.

Data Compression – Preferred compression syntax is JPEG 2000 Lossless but capable of negotiating any DICOM supported compression syntax.  No proprietary compression syntax.

XDS-I Manifest – Automated creation of XDS-I manifest for all data objects ingested by the Neutral Archive, with Optional XDS-I Registry and Repository available when needed.

Transaction Logging – HIPAA-compliant logging and reporting by organizational node.

Experience – Neutral Archive vendor has years of experience in the business: millions of studies migrated between all of the major PACS, and a dozen or more instances of the Neutral Archive deployed.
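
To illustrate the ILM item above, here is a minimal sketch of a purge decision in which a node-specific Retention Flag overrides the purge policy.  The field names and the seven-year retention period are illustrative assumptions only, not a recommendation.

```python
# Hypothetical sketch of an ILM purge decision with node-specific
# retention flags.  Field names and the 7-year period are assumptions.

from datetime import date

RETENTION_YEARS = 7
RETENTION_FLAGS = {"facility-A": {"study-9"}}  # per-node legal/clinical holds

def eligible_for_purge(study, node, today):
    """A study may be purged only if it is past retention AND not held."""
    if study["id"] in RETENTION_FLAGS.get(node, set()):
        return False  # a Retention Flag overrides the purge policy
    age_years = (today - study["study_date"]).days / 365.25
    return age_years > RETENTION_YEARS

old_study  = {"id": "study-1", "study_date": date(2001, 5, 1)}
held_study = {"id": "study-9", "study_date": date(2001, 5, 1)}
# old_study is past retention; held_study is equally old but flagged at
# facility-A, so it survives the automated purge pass.
```

In a real archive the metadata driving these rules (study date, study type, patient age) would come from the DICOM headers and the database, and the “manual supervision” in the checklist would be a review queue in front of the actual deletion.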

The reader will notice that I did not include on the list one of my favorite items: the UniViewer.  While a unified multi-modality viewing application is a highly desirable option, and the key to image-enabling the EMR, I don’t think the viewing application itself needs to be an integrated component of the Neutral Archive.  That’s because it can easily be a free-standing add-on.  There are a number of zero-client, server-side rendering display applications that run on their own server platforms and simply access the image data being managed by the Neutral Archive.  There seems to be no advantage to forcing the Neutral Archive vendor to pick one at the expense of the others.  It seems totally appropriate to let the Health System make this selection, knowing that there are no complicating interface issues.

A final comment… The above list is by no means a complete description or specification of a PACS-Neutral Archive.  Gray Consulting has created a very comprehensive Request for Proposal document that accurately defines a Neutral Archive.  The Technical section of this RFP contains 22 categories of questions that span 120 pages.  Anyone interested in a thorough investigation of this subject is welcome to contact Gray Consulting.