UK Imaging Informatics Group
PACS and Multislice CT
Subtopic: Archive through January 10, 2005 (15 posts; last post 10-01-05 09:02 am by Phil McAndrew)
 
Neelam Dugar posted on Monday, January 10, 2005 - 10:00 pm
1. Storage for images is a fraction of the price it was a few years ago (provided the vendor is using off-the-shelf hardware for storage), so quite rightly this will not in future be the limiting factor for storing data (raw or thin slices). The question is: do we really need to store raw data forever?
2. Networks are also a fraction of the cost they used to be. We have recently upgraded our hospital backbone from 1 GB/s to 720 GB/s. (Currently, wireless networking for home use is available at 100 Mbps.)
3. PCs cost a fraction of what they used to. If PACS vendors use off-the-shelf hardware for workstations, then prices for PACS will continue to fall. Greyscale monitors cost a fraction of what they did a few years ago.
4. The web is the way to go. Some vendors supply a totally web-based PACS solution for radiologists and clinicians alike. You would be surprised at the speed of image access: 90 CT slices over a 100 Mbps hospital network are available almost instantaneously.
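A rough sanity check on that last point (a sketch only, assuming uncompressed 512 x 512, 16-bit slices and ignoring DICOM and protocol overheads):

```python
# Back-of-the-envelope transfer time for a 90-slice CT series over a
# 100 Mbps link, assuming uncompressed 512 x 512 x 16-bit slices.

slices = 90
bytes_per_slice = 512 * 512 * 2          # ~0.5 MB per uncompressed slice
total_bytes = slices * bytes_per_slice   # ~47 MB for the series

link_bps = 100e6                         # nominal 100 Mbps line rate
transfer_seconds = total_bytes * 8 / link_bps

print(f"Series size: {total_bytes / 1e6:.1f} MB")        # ~47.2 MB
print(f"Ideal transfer time: {transfer_seconds:.1f} s")  # ~3.8 s before overheads
```

On paper, then, a few seconds per series; real-world speed depends on compression, caching and how the web viewer streams the images.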
Chris Dube posted on Tuesday, January 11, 2005 - 09:37 am
I have questions about the backbone speed you gave for your network. You say that your network backbone is now 720 GB/s. I would be interested to find out what technology you are using, Ethernet or ATM; at the moment the maximum for Ethernet is 10 Gb/s over fibre. What kind of switches are you using for your network? A 10 Gig backbone is still very expensive for most Hospital Trusts in the UK. From a costing point of view, how much does it cost to install a 720 GB/s backbone?
Neelam Dugar posted on Tuesday, January 11, 2005 - 05:19 pm
I am no technical expert; this is what I have been told by our IT department. They spent a large sum of money upgrading the network last year so as to be ready for PACS. However, with delay upon delay with NPfIT, I am not sure whether we will be able to test the network for PACS.
Neelam Dugar posted on Friday, January 14, 2005 - 10:23 am
Does anybody know in what format the raw data from these multislice CT scanners is stored? Is it in DICOM format, such that software like Voxar can use that data to do 3D recons? Or can the raw data only be analysed on the CT vendor's workstation because it is in a proprietary format?
Colin J Roberts posted on Friday, January 14, 2005 - 11:51 am
Before I begin I would like to declare that I work for Barco within the Voxar product group. I am not involved in sales activity for the company.

The DICOM 3.0 standard is the most widely used today, and you can be fairly sure that all vendors' modern CT and MR scanners are capable of offering output in DICOM 3.0 format.

The raw helical data from the MDCT is reformatted on the CT console, initially into slices that are stored on the PACS or modality. I do not know of any current scanner that doesn't reformat the helical data into slices prior to making them available for viewing.

The slices are made available to the scanner workstation following this process, so there is no real difference in how the vendor's modality workstation and any other workstation operate: they essentially get the same data to work with.

In cases where a vendor has total control of an imaging platform (i.e. they are responsible for all the major capital equipment in a hospital), they may have the ability to use their own proprietary file format to store and transfer the slices.

This is becoming less common in the modern digital age as clinicians and IT departments are not as willing to accept 'closed' systems as they once were.

It is also safe to say that ensuring connectivity between vendors' systems was one of the key drivers in the development of the original DICOM standard.
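To illustrate Colin's point that the reconstructed slices arrive as ordinary DICOM objects that any workstation can read, here is a minimal sketch; it assumes the pydicom library and a hypothetical directory of exported slice files:

```python
# Minimal sketch: inspect reconstructed CT slices exported as DICOM files.
# "ct_series/" is a hypothetical directory of slices pushed from the
# scanner or PACS; pydicom is assumed to be installed.
from pathlib import Path

import pydicom

for path in sorted(Path("ct_series").glob("*.dcm")):
    ds = pydicom.dcmread(path)
    print(
        ds.SOPInstanceUID,
        ds.get("SliceThickness"),        # reconstructed slice thickness (mm)
        ds.get("ImagePositionPatient"),  # 3D position, used for MPR/3D recons
        ds.pixel_array.shape,            # typically (512, 512) for CT
    )
```

Because these attributes are defined by the standard rather than by the scanner vendor, third-party software such as Voxar can work from exactly the same objects as the modality workstation.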
Stephen Davies posted on Saturday, February 12, 2005 - 07:04 pm
The responses to my original questions have been extremely helpful and I'm grateful to those who took the time to respond. I now have a further question, based upon the premise that we will report multislice CT examinations on our multi-modality workstations. We would aim to transfer a refined dataset, for example 5 mm reconstructions in axial and coronal planes for standard abdominal work. Special reconstructions would be done on the modality workstation and the product then exported to the PACS system. We need to have an idea of the volume of data which will be passing across our network, and needing to be archived, on a typical 9-to-5 working day. Can anyone help us with this? (We are assuming model efficiency!)

What would a typical daily workload be for a multislice CT scanner in terms of examination mix and complexity (standard DGH)?

Any help would be much appreciated.
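As a rough starting point for the per-study figure (a sketch only; the 45 cm scan coverage, uncompressed 512 x 512, 16-bit slices and a matching coronal slice count are all assumptions, not figures from the thread):

```python
# Rough per-study data volume for a refined dataset: 5 mm axial plus coronal
# reconstructions of an abdomen. Assumed ~45 cm coverage and uncompressed
# 512 x 512 x 16-bit slices; the coronal count is a crude placeholder.

scan_range_mm = 450
slice_thickness_mm = 5
bytes_per_slice = 512 * 512 * 2

axial_slices = scan_range_mm // slice_thickness_mm   # ~90
coronal_slices = axial_slices                        # assumption, for scale only
total_images = axial_slices + coronal_slices
total_mb = total_images * bytes_per_slice / 1e6

print(f"~{total_images} images, ~{total_mb:.0f} MB per study before compression")
```

Multiplying a figure like this by the expected daily examination mix gives a first-order estimate of network and archive load.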
Laurence Sutton posted on Thursday, February 17, 2005 - 05:58 pm
Stephen. I think I work in an average DGH, with the expected amount of scanning workload for cancer imaging (CT and MR). We do about 124,000 exams per year, from CR to MR. Attached is a table of a week's output from the department; the total is just under 1 terabyte per year. We have a 4-slice CT scanner. Our send-to-PACS protocol is similar to that outlined by Rhidian, but it is true to say that we send the majority of the slices we reconstruct to PACS. So it is under 10 GB per week from CT and MR, and between 4 and 5 GB from CT.
Our long-term storage is only 2.6 TB, but because the data is compressed we are only one-third full after 46 months.
RAID is pretty cheap now; I have been told it is around 1800 per terabyte if you purchase independently. Our backup/long-term archive is MOD. So, for easy access and cheap storage: 10-15 TB of RAID with some long-term recovery solution.
How much data flows across a particular part of the network is a difficult topic and will depend on how the network is managed. The raw data in the attached file will give an indication of the output from the modalities; how much is retrieved to workstations etc. is more difficult to estimate, I think.
Laurence.



Attachment: Data Output Per Week..ppt (25.6 k, application/vnd.ms-powerpoint)
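A rough cross-check of these figures (a sketch only; the weekly output, annual total and archive size are taken from the post above, and it assumes all departmental output goes to the long-term archive):

```python
# Cross-check of the quoted figures: just under 10 GB/week from CT and MR,
# ~1 TB/year total departmental output, and a 2.6 TB long-term archive
# one-third full after 46 months of compressed storage.

weekly_ct_mr_gb = 10                     # quoted upper bound, CT + MR
print(f"CT+MR alone: ~{weekly_ct_mr_gb * 52 / 1000:.2f} TB/year")   # ~0.5 TB/year

total_annual_tb = 1.0                    # quoted total output per year
months = 46
used_tb = 2.6 / 3                        # one-third of the 2.6 TB archive
ratio = total_annual_tb * months / 12 / used_tb
print(f"Implied compression ratio: ~{ratio:.1f}:1")   # roughly 4:1
```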
Rhidian Bramley posted on Thursday, February 17, 2005 - 07:32 pm
Thanks Laurence. It is interesting to see what would happen if you did have a 16+ slice scanner and did decide to archive the thin-section data, as some trusts are planning. At 1 mm slice thickness for a CT thorax/abdo/pelvis you would acquire more than 800 images. At your current storage levels of 8759 images per week you could scan only about 10 such patients per week!

I'd be interested to know roughly how many patients you scan on an average day, and what proportion are CT heads. Also, do you know why your ultrasound images are only half the size of a CT or MR image?
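Making that arithmetic explicit (figures from the thread; the ~0.5 MB per uncompressed slice is an assumption):

```python
# Rhidian's thin-slice arithmetic, spelled out. Figures from the thread;
# ~0.5 MB per uncompressed 512 x 512 x 16-bit slice is an assumption.

images_per_week_now = 8759        # Laurence's current weekly image count
images_per_thin_study = 800       # >800 images for a 1 mm thorax/abdo/pelvis

studies_per_week = images_per_week_now / images_per_thin_study
weekly_gb = images_per_week_now * 0.5 / 1024

print(f"Equivalent thin-slice studies per week: ~{studies_per_week:.0f}")  # ~11
print(f"Current weekly image count as raw data: ~{weekly_gb:.1f} GB")      # ~4.3 GB
```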
Andrew Downie posted on Thursday, February 17, 2005 - 09:36 pm
We have a 16-slice CT, scan 25+ patients a day, and half or fewer are heads. We could probably produce 8,000 1 mm images per day. Currently we don't keep the 1,300 images from a typical CT peripheral angiogram, for example, but if there were a central store that someone else was paying for I might be tempted. Ideally the 1 mm slices could be kept for a short period (months) and then discarded, but I doubt any PACS can do this automatically.

We are looking at this issue in Scotland. If one department insists on sending 1mm slices to the proposed central archive, so will all the others, I suspect. Do we need a national policy on what gets archived centrally, and would it work?
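On the point about keeping 1 mm slices for a few months and then discarding them: no such PACS feature is claimed here, but as an illustration of what an automatic sweep over a local interim store might look like (pydicom assumed; the directory, the 6-month cutoff and the "thin = 1 mm or less" definition are all hypothetical):

```python
# Illustrative retention sweep for a hypothetical local store of thin-slice CT.
# Thin slices (SliceThickness <= 1 mm) older than ~6 months are deleted.
from datetime import datetime, timedelta
from pathlib import Path

import pydicom

CUTOFF = datetime.now() - timedelta(days=183)   # ~6 months, assumed policy
STORE = Path("interim_store")                   # hypothetical flat DICOM store

for path in STORE.glob("*.dcm"):
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    thickness = float(ds.get("SliceThickness", 0) or 0)
    study_date = datetime.strptime(ds.StudyDate, "%Y%m%d")
    if thickness <= 1.0 and study_date < CUTOFF:
        path.unlink()   # discard thin slices past the retention period
```

In practice this lifecycle management would need to live in, or alongside, the archive itself, which is exactly the gap identified above.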
Rhidian Bramley posted on Thursday, February 17, 2005 - 10:37 pm
Thanks Andrew. One option is to have a local CT archive for the thin section data (for a short period). Individual studies can then be sent to PACS if required, or as discussed, reviewed on the modality specific workstations and any further reconstructions/key images then sent to PACS.

I agree there needs to be some fair allocation of storage space in a shared central archive. If the only concern is cost, then funded service level agreements may offer the most flexible arrangement, as these would enable departments to purchase additional storage capacity as required. If the central archive delivers the cost savings envisaged for the ASP model of large-volume data storage, then it should be more economical to send images centrally than to maintain a local archive.

In the NWWM cluster a CCN has been put forward on this basis, to divide payments/allocations by numbers of examinations (as a surrogate for volume of data) instead of the small/medium/large trust basis in the original contract. I don't think this goes into the detail of annual storage SLAs for each trust, although it was felt this could be addressed in future by flexing internal funding allocations through the SHAs.

In practice a trust sending large volumes of data may impact the solution in ways other than just storage costs, such as network capacity and retrieval times. If this has a detrimental effect, then an interim solution may be to reach agreement and cap the number of images (per exam) that trusts are allowed to archive centrally.
Neelam Dugar posted on Friday, February 18, 2005 - 10:25 am
ASP and PACS
http://www.thedesktop.com/emed/asp_9.html
Ross Clark posted on Friday, February 18, 2005 - 10:43 am
I think the main challenge in deciding what to store is what you make accessible to the average end-user.

Experience says that they are only interested in the 'representative' images.

We have a 16-slice scanner (about 25 exams/day, about 20% of which are heads). 1 mm recons are sent to the modality workstation and stored on MOD, and a representative set, similar to what used to be filmed, is sent to the PACS store for general consumption. The radiologist reports from the modality workstation and has the ability to store from there any additional MIPs, MPRs or 3D recons they feel are helpful. Almost no CT reporting is done on actual PACS workstations.

Of course it would be nice to store all your raw data or 1 mm recons on the off chance that in four years' time you might want to go back and do some further reconstructions. The practical problem, as Rhidian says, would be in stopping these being prefetched or queried up from the store for/by someone who cannot make use of them, thus wasting time and bandwidth.

From a Scottish perspective, I think a binding and robust national policy must be agreed before any storage to the proposed central store is started.
June Bea posted on Tuesday, May 17, 2005 - 12:32 am
Attachment: [IF-RND-350S-1000]Reformat Gateway Specification.pdf (332.2 k, application/pdf)


The Reformat Gateway receives DICOM files from modalities or the PACS server and generates MPR images automatically. According to the study information, axial, coronal or sagittal images are generated with a preset interval, thickness and window width/level.
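The product itself is specified in the attached PDF; as a generic illustration of what MPR generation involves (not the gateway's actual implementation), the sketch below stacks axial DICOM slices into a volume and extracts one coronal reformat, assuming pydicom, NumPy and a hypothetical slice directory:

```python
# Generic MPR illustration (not the Reformat Gateway's implementation):
# stack axial CT slices into a volume and pull out one coronal plane.
from pathlib import Path

import numpy as np
import pydicom

# Hypothetical directory holding one axial series.
slices = [pydicom.dcmread(p) for p in Path("axial_series").glob("*.dcm")]
# Order slices along the patient z-axis using ImagePositionPatient.
slices.sort(key=lambda ds: float(ds.ImagePositionPatient[2]))

volume = np.stack([ds.pixel_array for ds in slices])   # shape (z, rows, cols)

# A single coronal reformat is a fixed-row plane through the volume.
coronal = volume[:, volume.shape[1] // 2, :]            # shape (z, cols)
print("Coronal reformat shape:", coronal.shape)
```

A production gateway would additionally resample to the preset interval and thickness and apply the window width/level, as described in the specification.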
 