Percentage Deviation on calibration curve

edited April 14 in Informatics
If I prepare two working standards separately (Working Standard A and Working Standard B) and need to calculate the % Difference between them, can I run the two injections and simply check the % Deviation value on the calibration curve? What exactly does this value mean, and how would I need to set up my sample set for that to work?

At present our methods state to compare the difference using the weight-adjusted areas. So Working Standard A could have an area of 12000 and a sample weight of 98 mg (theoretical sample weight of 100 mg), and Working Standard B could have an area of 11987 and a sample weight of 101 mg.

The calculation we use is: (difference between weight-adjusted areas / mean of weight-adjusted areas) * 100. Would % Deviation help?
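For reference, the calculation described above can be sketched in Python using the example numbers from the question. This assumes "weight-adjusted area" means the raw peak area normalized by the actual weighed amount (area per mg); if your method adjusts by a different factor (e.g. theoretical weight over actual weight), the helper would need to change accordingly.

```python
def weight_adjusted_area(area, weight_mg):
    # Assumption: weight adjustment = area normalized per mg weighed
    return area / weight_mg

def percent_difference(a, b):
    # (difference between the two values / mean of the two values) * 100
    return abs(a - b) / ((a + b) / 2) * 100

# Example numbers from the question
wa_a = weight_adjusted_area(12000, 98)   # Working Standard A
wa_b = weight_adjusted_area(11987, 101)  # Working Standard B
print(round(percent_difference(wa_a, wa_b), 2))  # → 3.12
```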

Best Answer

    The percentage deviation is calculated in Empower by taking the difference between the calculated and actual value and expressing that difference as a percentage of the actual value. As a formula, it is ((Calc. Value − X Value) / X Value) * 100.
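    That formula is a one-liner; a small sketch with hypothetical amounts (the 101.5 and 100.0 values below are illustrative, not from Empower):

```python
def percent_deviation(calc_value, x_value):
    # Empower-style % Deviation: ((calculated - actual) / actual) * 100
    return (calc_value - x_value) / x_value * 100

# Hypothetical example: calculated amount 101.5 against an actual value of 100.0
print(round(percent_deviation(101.5, 100.0), 2))  # → 1.5
```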

    I’d advise against manually checking Empower on this one; you will get a number that is close, but it will not exactly match, on account of the precision Empower uses to calculate and store results and differences between how Empower and Microsoft display numbers.

    However, if you use the formula above to calculate the check standard, then yes, you should be able to use the data that Empower is giving you elsewhere.

    I would think setting up STD A as a true single-level standard and STD B as a control within the sample set should give you back what you are after.
