Description
In the field we're having a lot of issues completing maintenance on large databases over 600 gigabytes, especially on tblDicomImage and tblFile. These two tables normally comprise 95 to 98% of the total database size, which means they can take anywhere from 950 to 980 gigabytes in a 1 TB database. Optimizing the statistics and indexes can take upwards of 3 days to complete, and some customers have more than one database that size or bigger. If a customer needs a 64bit conversion, then the database is large enough to have maintenance issues. At UPMC they are currently setting up the first partitioned archive database; once that is completed and we have data on how it is working, I would like to see partitioning added to the 64bit conversion. In the field we are increasingly spending large amounts of time customizing the Index Optimize job at sites with these large databases. Maintenance should be easier if at least tblDicomImage is partitioned, although tblFile should be as well.
I provided a description with the original request but it is not showing up, so here is the gist of what I put in with the request.
If we need to do a 64bit conversion, then the archive database is large enough that we're more than likely having trouble completing maintenance. The two largest tables in the archive database are tblDicomImage and tblFile, and they comprise 95 to 98% of the total size of an archive database; on a 1 TB database these tables would take anywhere from 950 to 980 gigabytes of space. We are increasingly struggling in the field to complete maintenance on larger databases, and at some sites the index optimize job can take over 3 days to complete, or never completes and has to be stopped.
At UPMC they have installed the first partitioned database, with tblDicomImage being the targeted table. If it is successful, I would like to include this in the 64bit conversion, as you cannot partition an already existing database. Please vote if you agree.
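For reference, partitioning in SQL Server is set up with a partition function and a partition scheme, and the table's clustered index is then built on that scheme. A minimal sketch of the idea follows; the function, scheme, column, and filegroup names here are hypothetical examples, not taken from the actual archive schema:

```sql
-- Hypothetical sketch: range partitioning by month on an insert-date column.
-- pfDicomByMonth, psDicomByMonth, and dtInsertDate are example names only.
CREATE PARTITION FUNCTION pfDicomByMonth (datetime)
AS RANGE RIGHT FOR VALUES ('2023-01-01', '2023-02-01', '2023-03-01');

-- Map every partition to the PRIMARY filegroup for simplicity;
-- a real deployment could spread partitions across filegroups.
CREATE PARTITION SCHEME psDicomByMonth
AS PARTITION pfDicomByMonth ALL TO ([PRIMARY]);

-- The table is partitioned by creating its clustered index on the scheme,
-- which requires rebuilding the table -- this is why it is far easier to do
-- as part of a conversion than on an existing populated database:
-- CREATE CLUSTERED INDEX IX_tblDicomImage_dtInsertDate
--   ON tblDicomImage (dtInsertDate) ON psDicomByMonth (dtInsertDate);
```

Once the table is partitioned, index maintenance can rebuild or reorganize individual partitions instead of the whole table, which is the main reason it should help with the multi-day Index Optimize runs described above.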
Thanks,
Sue