On-Demand Webinar: How to Improve Your Data Integrity

Discover a solution that can help you ensure your data is accurate and consistent

5 Dec 2018
Holly McHugh

Data integrity violations have become an increasing concern, as repeated non-compliance has led to regulatory actions. New data-handling procedures need to be efficient and affordable, whilst ensuring data is accurate, reliable and compliant.

In a recent webinar, now available on demand, Christoph Jansen of Mettler Toledo presents a simple yet effective solution to improve data integrity and discusses case studies of previous non-compliance.

Missed the live event? Don’t worry, you can watch the webinar at any time here>>

At the end of the webinar, our guest speaker answered live questions from attendees; the highlights of this Q&A session can be read below.

Q: Do you recommend updating the software on a regular basis, or is it acceptable to skip updates to save effort?

A: I have discussed this with other experts and the recommendation is to update on a regular basis. The reason is that LabX releases an updated version every year, but the changes are only incremental and are described in the release note. You don't have to carry out a complete revalidation; a partial revalidation is sufficient. The parts that need to be reconsidered will emerge from a risk assessment based on the release note, and the changes to your documentation will be incremental. If you do this step by step every year, each step remains small.

We bring out new releases of LabX for good reasons, such as fixing potential bugs or adding functionality. We add new instruments, new instrument classes and new features, and you can benefit from these if you follow the updates. And again, the validation effort for such an update can be quite small. If it is a big step, then it will also add a lot of important features, and you benefit in the end.

Q: Do you recommend doing a file-based result transfer to ERP or LIMS, or a transfer via an application programming interface? What are the arguments for and against each option?

A: A file-based transfer is easier to implement and works well, and I know that some ERP systems may support this type of solution. An application programming interface (API) is easier to handle and maintain in routine use. I recently learned that there may have been changes in the requirements, such that you also have to keep the intermediate data (the transferred files). That is the comma-separated values (CSV) file that is sent from the software to a protected folder and then read by the receiving software, i.e. the LIMS or the ERP system. Those files must be kept as well, and you have to make sure that they are not altered or modified, so you have to find measures to protect and archive them. With modern data storage systems I wouldn't worry too much about the storage capacity, but the effort of maintaining this could be a bit of a pain. It is your decision; I cannot say which one is better. Today I would prefer the API data transfer over making it via CSV.
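As a rough illustration of the archiving point above, here is a minimal Python sketch, not part of LabX or any vendor tooling: the folder paths and function names are hypothetical. It copies each transferred CSV into an archive folder together with a SHA-256 checksum, so that later alteration of the archived copy can be detected.

```python
import hashlib
import shutil
from pathlib import Path

# Hypothetical locations -- adjust to your own transfer and archive folders.
TRANSFER_DIR = Path("/data/labx_transfer")  # protected folder the instrument software writes to
ARCHIVE_DIR = Path("/data/labx_archive")    # long-term archive for the intermediate files


def archive_transfer_file(csv_path: Path) -> str:
    """Copy a transferred CSV into the archive and store its SHA-256 digest.

    Keeping the digest alongside the file makes it possible to show later
    that the archived copy has not been altered or modified.
    """
    digest = hashlib.sha256(csv_path.read_bytes()).hexdigest()
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    shutil.copy2(csv_path, ARCHIVE_DIR / csv_path.name)
    # Record the checksum next to the archived file.
    (ARCHIVE_DIR / (csv_path.name + ".sha256")).write_text(digest + "\n")
    return digest


def verify_archived_file(filename: str) -> bool:
    """Re-compute the digest of an archived file and compare with the stored one."""
    archived = ARCHIVE_DIR / filename
    stored = (ARCHIVE_DIR / (filename + ".sha256")).read_text().strip()
    return hashlib.sha256(archived.read_bytes()).hexdigest() == stored
```

In practice, the protected folder would also be locked down with operating-system permissions and an audit trail; a checksum only detects changes, it does not prevent them.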

You can catch up with the on-demand version of this webinar at a time that suits you

SelectScience runs 3-4 webinars a month across various scientific topics; discover more of our upcoming webinars>>
