Runs cutting out mid-processing
Has anyone ever experienced this: processing a run with a method set that has 2 derived channels in it, two single wavelengths (220nm and 240nm). It gets about 3 injections in, sometimes more or less, then stops, and the message says "Not enough memory allocation" or "Failed to map chrom field". But it's random! If I log out and back in and process the same run, it goes to completion no problem. Other times it will work for a few runs then cut out again, same messages.
It ONLY happens on runs with derived-channel method sets. I create a new method set and try that: still happens. I try processing with the acquisition method set vs picking a method set: no, still random. It never happens with a run without derived channels.
This is driving me mad. Waters support say to increase the raw data file size in the project, and that using several intersample custom fields across 2 channels could be clogging it up, but then why does it work fine sometimes?? I have enough table space, but yes, my raw data file is 1200MB, so a bit high.
I just can't understand exactly what could be causing this; it doesn't seem to be a known bug. I'm on Empower 2 Build 2154 - any ideas?
Best Answers
Hi MJS. Yes, I think I'm going to have to put this issue down to file size, plus there may be an issue with this specific sample set. It's in a testing project, copied over from a live project, and it's had tonnes and tonnes of changes to the processing method (CConst values, component RT, CCalRef1 etc.) and tonnes of result sets too, as I'm testing out different custom fields.
It's very possible that something has become corrupt in the interim; it's just strange that sometimes it works and other times it doesn't. I'm hoping I don't encounter similar issues when I release the custom fields. Thanks again.
Custom fields are given a certain memory space in which to calculate. Depending on what you are doing, this memory space may become full, and processing will abort mid-sequence. If you log out and back in, you have effectively cleared that memory space, at which point you are able to fully process your sample set.
This commonly happens when intersample calculations are not properly limited, or when you are using too many calculations. Proper limitation may include fully specifying all parameters in the intersample calculation, as well as setting the custom field to run on the current result set only.
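To make that concrete, here is a sketch using the Label.Injection.(Field) pattern from the original post (the explicit-injection variant is my assumption of what "fully specifying" looks like, not Waters-confirmed syntax):

    A.%.(Area)*(CConst1/CConst2)   <- % wildcard: every injection of every sample labelled A must be searched
    A.1.(Area)*(CConst1/CConst2)   <- explicit injection number: only injection 1 of the A-labelled samples is read

The narrower each reference (label, injection, and peak where the syntax allows it), the less intermediate data the calculation has to hold in its memory space.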
Answers
I recall something similar happening to one analyst a few years ago, and it was specific to just one or two sample sets within the same project (Enterprise system/Citrix, Empower 2 at the time). Nobody else was experiencing issues with PDA analysis in the other projects. It did seem to be that the project's raw data was just getting too large and the system couldn't handle it any more. I think I had even sent Waters my project so they could troubleshoot it directly.
What worked, I think, was having the analyst manually process the sample set through the Review window (tedious, but it would likely bypass any memory issue), and I created a new project for anything new. I had also tried copying the sample set to a fresh project, but I don't think that worked. It was a few years ago, so it may have, and maybe I'm just misremembering. Good luck!
Hi guys, is the raw data file grown by processing runs with large numbers of intersample custom fields? I don't understand how logging off and on would clear the memory; wouldn't you be adding to it all the time, with each sample set that is processed?
I think that's probably what's happening to me. My project is designed purely to test a whole range of custom fields, intersample summaries etc., and a lot of them are of the type A.%.(Area)*(CConst1/CConst2). I don't specify which peak to calculate on, so PDA runs could really pile on the memory for the samples labelled A.
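As a rough back-of-the-envelope illustration of why those wildcards add up (the numbers here are invented, not from this thread): with 10 samples labelled A, 3 injections each, 2 derived channels and 15 integrated peaks per channel, one unqualified A.%.(Area) reference already has 10 x 3 x 2 x 15 = 900 candidate values to resolve, and every intersample custom field repeats that search, so a couple of dozen such fields quickly turns into tens of thousands of lookups per processing run.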
In the Custom Field settings, check the Search Order setting. We have had issues when the Search Order is set to Result Set First. When there are a large number of result sets in the project, this can cause out-of-memory problems: with Result Set First, the CF will query all of the available result sets for values to include in the CF result.