System Suit graphs over multiple result sets

I use a system suit summary plot to graph data from sample sets. In a sample set with 50 samples, I need to plot an average value per 10 samples, so 5 points on the graph. This works fine with one sample set into one result set, but when I run these samples over 2 or more sample sets and use the report to graph the data, I get a tonne of error messages along the lines of "Was looking for processing method ID 1009 and found 1008 instead, used 1008... Was looking for processing method ID 1010 and found 1011 instead..." etc. I don't have the option to suppress error logs.
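For clarity, here is what the per-10-sample averaging amounts to, sketched outside Empower. This is illustrative only: `block_averages` is a hypothetical helper, the area values are made up, and none of this is Empower custom-field syntax.

```python
# Illustrative sketch only -- NOT Empower custom-field syntax.
# Mimics the averaging described above: 50 results -> one averaged
# point per block of 10 samples -> 5 points on the graph.

def block_averages(values, block_size=10):
    """Average consecutive blocks of up to `block_size` values."""
    return [
        sum(chunk) / len(chunk)
        for i in range(0, len(values), block_size)
        if (chunk := values[i:i + block_size])
    ]

areas = [float(100 + i) for i in range(50)]  # 50 dummy peak areas
points = block_averages(areas)               # 5 plotted points
print(points)  # -> [104.5, 114.5, 124.5, 134.5, 144.5]
```

In Empower the same grouping would be done by the custom field, as discussed below, rather than by external code.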


Any ideas why I'm getting this over multiple sample sets? I wouldn't have thought chart data is limited to only one result set, as surely you can use it to track resolution and tailing over multiple runs?

Answers

  • How are you doing the averaging? By a custom field?
  • Hi Heather. Yes, the average is a custom field along the lines of SAME.%..AVE(Area); however, even when graphing a custom field like Area*CConst1 over more than one result set, I still get that pile of error messages. Is it the search order causing this?
  • Bump. 

    Anyone any ideas how I can get around this issue? I want to graph a peak custom field over multiple sample sets, and I'm getting a big error box saying that Empower is looking for multiple processing methods... surely you can graph values over more than one result set without encountering this? Is there a filter or order-by condition to remove this?
  • Are these results from different projects? Is the CF set up to look outside a given result set?
  • No, the results are all in the same project but in different result sets. The search order is Result Set Only for the CF, so that might be what's confusing it. The error messages I get at the end of the graph are "Was looking for Proc Method ID 2009, instead found Proc Method ID 2088, Proc Method ID 2088 was used instead", followed by "Was looking for Proc Method ID 2011, instead found Proc Method ID 2099, Proc Method ID 2099 was used", and so on, which suggests it could be an ordering thing as well?


    The graph actually does plot the correct data, but the error log that follows it is very unsightly.

  • I think I see what you are getting at... your plot is fine, but you get the error log at the end of the actual report stating this; it's not that there are errors preventing the plot from working. See the attachment to confirm... I grabbed a few random result sets and simply plotted Area rather than even a custom field...

    I'm not aware of any way to suppress the error log (it plagues me in other scenarios as well). Maybe the ability to suppress/hide it is a worthy feature improvement idea you could submit?



  • Yes, that's exactly what I get too, which is misleading, because when I made the graphs I was following examples from Waters slides that suggest tracking USP Tailing or Resolution values over the lifetime of the column. That means lots of sample sets, which means lots of processing method IDs. In their slides they conveniently leave out this error log. Maybe it's only suitable for pure development work where you don't generate results and only plot values using raw data.

    There is actually an option in the report method properties to "Suppress Wrong Channel and Chrom Type errors", but this is viewed with suspicion by auditors and even admin staff, who ask "Why do you need that?". I tried to explain that these error logs are just a visual message (e.g. when selecting multiple injections of only one channel, Empower still looks for the other channels), but it made no difference; "suppress" is a bad word in this industry!

    It does still work, mind you, so I may just live with it; it just sticks out a mile and will prompt any auditor to ask deeper questions.
  • I'm not sure that the Suppress report errors setting will hide those specific messages, but it's worth a try. The setting only suppresses report errors on the report itself; I believe the messages will still be recorded somewhere in Empower, so anyone who felt a need to see them still could. They aren't lost, just not on the report...
  • "Any ideas why I'm getting this over multiple sample sets? I wouldn't have thought chart data is limited to only one result set, as surely you can use it to track resolution and tailing over multiple runs?"

    To answer why: it is for transparency and traceability. Empower needs to show the associated processing methods to trace the threshold values for UCL/LCL, UWL/LWL, etc.

    The processing method contains the settings for System Suit charts. Using multiple processing methods (i.e. different PM IDs) can be precarious: they may contain different settings and so may present data incorrectly, and differing settings between PM IDs could even be an attempt to present data in a more polished way than is warranted. Empower therefore reports each ID for transparency and traceability. One simply needs to verify that the System Suit tab settings are consistent across the PM IDs.

    QAs tend to look at these as "error messages"; I think presenting them instead as a QA tool for traceable and transparent reporting truly reflects the messages' purpose.
  • Thanks medicnman. Does that mean that the first PM ID, and any associated UCL and LCL values, will be used to graph the data even though it is spread over 3 result sets? What I mean is: if Empower spots 3 PM IDs, will it revert to all the integration and limit settings of the first PM ID to calculate and graph the 3 values, including amounts etc.? That's what I take from the message "Was looking for Proc Method ID 2999, found 3000 instead. Proc Method ID 2999 will be used to make the report".

    Now, I haven't found this to change my results: my result for result set 1 was 98, and the other two were 100 and 103, so even using the original Proc Method ID kept the same results, if that is indeed what it does?
  • The Empower team are looking into this, and yes, the likely cause is the reliance on the processing method to find the limits.
  • Thanks Heather, it will be interesting to find out exactly what Empower does with the data when it encounters multiple PM IDs. As I said, it didn't change any of my values, but that doesn't mean it doesn't revert to PM ID 1 for all values.