UK Imaging Informatics Group
PACS & VNA Scoring Matrix
Darren James Ferguson posted on Wednesday, July 24, 2013 - 08:49 am
Hi,
I'm wondering if anyone has a scoring matrix I could tailor for a PACS & VNA procurement I'm working on?
Herman Oosterwijk posted on Wednesday, July 24, 2013 - 02:42 pm
Yes, I did a white paper on this subject which has a scoring table attached:
http://otechimg.com/publications/pdf/What_is_a_VNA_anyway.pdf
Neelam Dugar posted on Thursday, July 25, 2013 - 08:47 am
SCORING OF BIDDER RESPONSES AND DEMONSTRATIONS
We used the criteria below to score the bidders' written responses to the 50 items on our functional specification for PACS, RIS and VNA (our spec was based on the Group Spec on this website), and also the 25 items on our scripted demo.

• Fully meets the requirements & very good (6)
• Meets the requirements & satisfactory (5)
• Meets the requirements (with development) - will be ready for go live (4)
• Meets the requirements (with development) – will develop within contract lifetime (defined timeframe provided) (3)
• Meets the requirements but the design is poor from a user perspective - cumbersome and unintuitive (2)
• Partly meets the requirements - no defined plans for development (1)
• Does not meet the requirements (0)

The whole exercise for choosing our PACS, VNA and RIS suppliers was based on the above criteria, was unbiased, and covered approximately 75 items.

We took the total score for each supplier who met our minimum specification. We then took the total cost of the solution over 7 years and divided this by the total score. This gave us the value for functionality. In this way we were not forced simply to buy the cheapest solution; there was a balance between cost and functionality.
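As a rough illustration (a minimal sketch with entirely made-up suppliers, costs and scores, not figures from our actual procurement), the calculation is simply:

    # Hypothetical bidders: total 7-year cost and total functional score.
    suppliers = {
        "Supplier A": {"cost_7yr": 2_100_000, "score": 380},
        "Supplier B": {"cost_7yr": 1_750_000, "score": 295},
    }

    for name, s in suppliers.items():
        cost_per_point = s["cost_7yr"] / s["score"]  # lower = better value for functionality
        print(f"{name}: £{cost_per_point:,.0f} per functionality point")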

OJEU was indeed a great exercise and much was learnt in the process.
Darren James Ferguson posted on Thursday, July 25, 2013 - 10:55 am
Thanks for the information
Shaun Smale posted on Wednesday, July 31, 2013 - 04:15 pm
Darren,
There is some good detail in the article from Herman. I have two high-level recommendations to add before you jump deep into a scoring matrix. Firstly, the scoring technique must be able to separate out the responses; secondly, the questions must be closed and focused. In my experience, both are much easier said than done.
There are many statistical techniques for separating out responses that you may wish to explore. A simple 5-point score against a generic set of questions, particularly given the diverse opinions on what a VNA should be, will often end up with very close results. Neelam's suggestions are a good set of classifications, but I would recommend that you prepare your questions carefully and, more importantly, weight the responses to create separation in the scores. Grouping and scoring each section is also important.
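As a purely illustrative sketch (the section names, weights and scores are invented), weighted, grouped scoring might look something like this:

    # Hypothetical section weights - tune these to reflect your own priorities.
    weights = {"viewing_and_sharing": 3, "storage_and_protection": 2, "service": 1}

    # One bidder's 0-6 scores per section, using Neelam's classification above.
    bidder = {"viewing_and_sharing": 4, "storage_and_protection": 6, "service": 5}

    weighted_total = sum(weights[s] * bidder[s] for s in weights)
    max_possible = sum(w * 6 for w in weights.values())
    print(f"Weighted score: {weighted_total} out of {max_possible}")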
As to the questions for the ubiquitous VNA, I wish to comment on scoring the functionality, putting aside the sections on service, supplier credibility, etc., for now. I believe you should divide the functionality into two main groups and score them independently: one around viewing and sharing the data in the VNA, and the other around storing and protecting the data in the VNA and making it available for sharing.
The first group, focused on sharing, has the greater air time, with discussions around data types, unified views, MPI, XDS registries and so on. This is the more difficult group, littered with complexities and very dependent on the interaction of many parties. Many of the responses in this area may involve some level of development or have dependencies on other external features that may not be delivered.
The second group gets less attention but is fundamental to the strategic management of all healthcare data and supports the sharing functionality. The storage and protection of all data and applications is vital for data availability anywhere, anytime, even in the event of a disaster. It is a given that a VNA solution must protect the data from misuse, corruption and loss, but it also forms part of a strategic data management policy to reduce cost, to future-proof, and to prevent 'lock-in' to a given application or hardware solution.

By abstracting the application from the storage and using multi-copy as opposed to replication, it is possible to change either the application or the storage media without moving the data and, most importantly, without interrupting the clinical service. By using backup, archive and lifecycle management together, the solution can be tailored specifically to the healthcare model, where the majority of data is inactive. A strategic data management solution should be able to present the data in a standards-based format back to an alternative application or location, both to underpin sharing and to ensure continuity of the clinical service during planned and unplanned outages. The same platform can protect specific configuration data and clinical databases to facilitate the fast restoration of a clinical application following a disaster.
So, in summary: you could use Neelam's classification to group the questions based on the likely type of answers and weight them accordingly, with features unlikely to be realised without a lot of collaboration and development at one end, and 'must haves now', with a higher weighting, at the other.
Good luck
Commercial interest declared – BridgeHead Software.
Roy Burnett posted on Wednesday, July 31, 2013 - 10:47 pm
Hi Shaun

Could you expand on your comments:

'By abstracting the application from the storage and using multi-copy as opposed to replication it is possible to change either the application or the storage media without moving the data and most importantly without interrupting the clinical service'?
Thanks.
Shaun Smale posted on Friday, August 02, 2013 - 05:44 pm
Hello Roy,
SAN replication has its place for dynamic, constantly changing data, particularly if you are after a hot fail-over of applications that have a high level of database activity and files in regular use. However, it is also important to back up this data (held on the SANs) regularly - not so much to protect against loss (there will be a second SAN copy at a disparate location), but against data corruption. Bit- and file-level corruption happens for many reasons, and there are tools and techniques to reduce it. The risk is that the corrupted data is replicated, overwriting the good data. Regular backup will reduce the amount of data lost; how much depends on how far back in time the backups extend.

Multi-copy has a number of distinct advantages for static, unchanging data, particularly in healthcare, where there is a lot of inactive data held for long periods for legislative reasons. When a file is received, separate copies are made and stored on disparate media. If one copy is corrupted or lost, the second, third or even fourth copy can be used instead. The significant advantage of multi-copy is that the copies do not all need to be on the same media type. This introduces the concept of vendor-agnostic storage: the hardware can be selected to optimise and balance the performance, cost and risk associated with long-term storage of healthcare data. This removes lock-in to a single vendor or storage platform and allows storage media to be replaced or augmented completely transparently to the application that uses the data. Note: the backup files mentioned above can also be treated as static data, with one copy off-site for disaster recovery.

The data management layer abstracts the data from the application and manages the media, security and lifecycle of the multiple copies across the available storage. The application becomes independent of the underlying storage technologies, drivers, APIs, etc. To replace a storage device, simply make a new copy on the new storage media and retire the old copy. Each copy is validated against the original using a checksum; there is no need to stop the application or update drivers, APIs or network paths.
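To give a feel for what "validated against the original using a checksum" means in practice, here is a minimal sketch (illustrative Python with invented paths, not our actual implementation) of verifying a new copy before the old one is retired:

    import hashlib

    def file_checksum(path, algorithm="sha256", chunk_size=1 << 20):
        """Hash the file in chunks so large studies never need to fit in memory."""
        h = hashlib.new(algorithm)
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    # Only retire the old copy once the copy on the new media matches it exactly.
    old_copy = "/mnt/old_storage/study_0001.dcm"  # illustrative paths
    new_copy = "/mnt/new_storage/study_0001.dcm"
    assert file_checksum(old_copy) == file_checksum(new_copy)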

In our vendor-neutral archive, the data files are securely stored and protected in their native format. If that format is standards-based (and not encrypted or compressed by the source application) - e.g. DICOM Part 10, PDF, JPEG, etc. - the files can be made available to another application. However, this alone does not amount to a full sharing solution, and although these files do not need to be moved as part of a data migration, it does not remove the need for metadata migration when PACS applications are changed. The advantage of a Part 10 DICOM file structure is that the metadata migration can be fast and can be performed ahead of the cutover, with no need to move large files. Once the instance UIDs have been transferred as part of the metadata migration, the new PACS can access the files straight away.
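As an illustration of how lightweight that metadata extraction can be, a rough sketch using the open-source pydicom library (the archive path is invented, and a real migration would write to a database rather than print):

    from pathlib import Path
    import pydicom

    archive_root = Path("/vna/dicom")  # illustrative location of the Part 10 files

    for dcm_path in archive_root.rglob("*.dcm"):
        # stop_before_pixels skips the large pixel data - only the header is read.
        ds = pydicom.dcmread(dcm_path, stop_before_pixels=True)
        print(ds.PatientID, ds.StudyInstanceUID, ds.SOPInstanceUID)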

Sharing requires more than accessing the standards-based file; it requires translation and mapping of critical metadata between the source application and the replacement application. This is particularly relevant to PACS, where a lot of integration is involved and the metadata is held and maintained only in the PACS database. The files themselves are stored unchanged by the PACS, often in a proprietary archive, exactly as received from the source modality; all subsequent updates and corrections are held only in the PACS database. Files can be shared via the PACS application, which updates the DICOM header with the stored metadata as each file is exported.
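Conceptually, that header update on export amounts to no more than the following (again pydicom, with invented file names and corrections; a real PACS does this internally from its own database):

    import pydicom

    # Corrections that exist only in the PACS database, not in the stored file.
    corrections = {"PatientName": "SMITH^JOHN", "AccessionNumber": "A1234567"}

    ds = pydicom.dcmread("study_as_received.dcm")
    for keyword, value in corrections.items():
        setattr(ds, keyword, value)  # overwrite stale header values on the way out
    ds.save_as("study_for_sharing.dcm")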

Hence moving to a DICOM archive is only the first step towards a fast cutover between PACS systems or sharing via a central archive. Metadata must still be migrated before studies can be safely shared or migrated between PACS systems. In practice there are two options:

1. Migrate from PACS to PACS, maintaining a pointer to the file in the archive and only accessing the file via the PACS (and its copy of the metadata).

2. Migrate the metadata from the old PACS to the DICOM archive and/or the new PACS, and maintain synchronised metadata between all systems involved.

The goal is to underpin the clinical service through uninterrupted availability of all healthcare data at any location. Availability is delivered by strategically managing the data independently of the underlying technology and the applications that use it. Standardisation or normalisation of identifiers, coded content and standards-based formats will deliver a unified view of that data.

Commercial interest declared
Shaun Smale – BridgeHead Software
 