Please clarify your question. If your question is integration-related, and you are trying to remove an integrated peak, you can do so by selecting the appropriate row in the Peak Table (located in Review) and pressing the Delete key.
No, my question is still not answered, because I do not want to remove the whole curve.
It is a fitting problem: the baseline is not straight. I want to subtract the lower part of the curve so that the peak approaches a Gaussian form. Manually I know how to do that, but how can it be programmed? I hope the question is clear now.
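Since Empower does not expose this as a scriptable step, here is a hypothetical, minimal sketch in plain Python, on synthetic data, of the manual procedure: draw a straight baseline between chosen peak start and end points, subtract it, and then estimate the Gaussian parameters of the corrected peak from its moments. The peak bounds (indices 400 and 600) are picked by eye for this example; none of the names here come from any Waters product.

```python
import math

def subtract_baseline(t, y, i_start, i_end):
    """Remove a linear baseline drawn from (t[i_start], y[i_start]) to
    (t[i_end], y[i_end]); points outside the peak region are zeroed."""
    slope = (y[i_end] - y[i_start]) / (t[i_end] - t[i_start])
    corrected = []
    for i in range(len(t)):
        if i_start <= i <= i_end:
            baseline = y[i_start] + slope * (t[i] - t[i_start])
            corrected.append(y[i] - baseline)
        else:
            corrected.append(0.0)
    return corrected

def gaussian_moments(t, y):
    """Estimate height, centre and sigma of a baseline-corrected peak
    from its zeroth, first and second moments (the time step cancels)."""
    area = sum(y)
    mean = sum(ti * yi for ti, yi in zip(t, y)) / area
    var = sum((ti - mean) ** 2 * yi for ti, yi in zip(t, y)) / area
    return max(y), mean, math.sqrt(var)

# Synthetic example: a Gaussian peak (centre 5.0, sigma 0.3) riding on
# a sloping baseline.
t = [i * 0.01 for i in range(1001)]
y = [0.5 + 0.2 * ti + 2.0 * math.exp(-((ti - 5.0) ** 2) / (2 * 0.3 ** 2))
     for ti in t]
yc = subtract_baseline(t, y, 400, 600)   # peak region chosen by eye
height, centre, sigma = gaussian_moments(t, yc)
```

The moment estimates recover roughly the centre and width used to build the synthetic peak; a real implementation would refine them with a nonlinear least-squares fit.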
When you say 'curve', are you referring to the peak (as opposed to the baseline), and do you have a small, unresolved peak on the tail of your parent peak? To integrate that small peak automatically, so that its area is not included in the area of the parent peak, you can try adding one of the available integration events to skim the small (child) peak off of the parent peak. For Traditional integration, you can use a Tangential or an Exponential Skim event (set the Value parameter to 0). With ApexTrack integration, you can use a Gaussian Skim event; you may also need to add a Detect Shoulders event in addition to the Gaussian Skim. In either case, you will need to set your Peak Width and Threshold parameters properly as well. For good information on the use of all of these parameters, you may want to consult the Empower Data Acquisition and Processing Theory Guide, which is posted on the Waters website.
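To illustrate what a skim event does, here is a simplified sketch in plain Python on synthetic data, not Waters' actual algorithm: a tangential skim can be approximated by drawing a straight line from the parent/child valley to the end of the child peak and assigning only the area above that line to the child. The valley and end indices below were chosen by inspecting the synthetic trace.

```python
import math

def tangential_skim(t, y, i_valley, i_end):
    """Area above a straight skim line drawn from the parent/child
    valley to the chosen end of the child peak."""
    slope = (y[i_end] - y[i_valley]) / (t[i_end] - t[i_valley])
    dt = t[1] - t[0]
    child_area = 0.0
    for i in range(i_valley, i_end + 1):
        skim = y[i_valley] + slope * (t[i] - t[i_valley])
        child_area += max(y[i] - skim, 0.0) * dt  # clip below the line
    return child_area

# Synthetic trace: a large parent peak with a small child on its tail.
t = [i * 0.01 for i in range(1001)]
parent = [2.0 * math.exp(-((ti - 3.0) ** 2) / (2 * 0.5 ** 2)) for ti in t]
child = [0.3 * math.exp(-((ti - 4.5) ** 2) / (2 * 0.15 ** 2)) for ti in t]
y = [p + c for p, c in zip(parent, child)]

# Valley near t = 4.2 and child end near t = 5.0, picked by inspection.
area_child = tangential_skim(t, y, 420, 500)
```

The skimmed child area comes out somewhat smaller than the child Gaussian's true area, because the skim line also cuts away part of the child that overlaps the parent tail; that trade-off is inherent to skimming.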
I am still unsure that I fully understand the question being asked. However, that never stops me from having an opinion. Piet, if the problem you are having is integrating very small peaks that ride on a rising, falling or undulating baseline, you may find the valley-to-valley event in Empower (I assume) very useful. I use this event very often. I find that it tends to enforce peak integration consistency even if the integration is not "perfect" from a theoretical standpoint. In my work, consistency is paramount. Running a set of standards under identical processing parameters essentially normalizes the whole assay to those conditions and yields predictable behavior. For low-concentration or trace-level work, I often combine the valley-to-valley event with a very low peak detection threshold. This almost always identifies the peaks and ignores the unstable baseline.
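For readers who want to experiment outside Empower, the valley-to-valley idea can be sketched in a few lines of plain Python; this is an illustration on synthetic data, not Waters' actual algorithm. Find the local minima, draw a straight baseline between each pair of consecutive valleys, and keep a peak only if its apex rises above that baseline by more than a low detection threshold.

```python
import math

def find_valleys(y):
    """Indices of local minima, plus both endpoints of the trace."""
    valleys = [0]
    for i in range(1, len(y) - 1):
        if y[i] <= y[i - 1] and y[i] < y[i + 1]:
            valleys.append(i)
    valleys.append(len(y) - 1)
    return valleys

def integrate_valley_to_valley(t, y, threshold=0.05):
    """Return (apex time, area) for each peak between consecutive
    valleys, with the baseline drawn valley to valley; peaks whose apex
    is below the threshold are ignored."""
    dt = t[1] - t[0]
    peaks = []
    valleys = find_valleys(y)
    for a, b in zip(valleys, valleys[1:]):
        if b - a < 2:
            continue
        slope = (y[b] - y[a]) / (t[b] - t[a])
        heights = [y[i] - (y[a] + slope * (t[i] - t[a]))
                   for i in range(a, b + 1)]
        if max(heights) > threshold:
            apex = t[a + heights.index(max(heights))]
            peaks.append((apex, sum(heights) * dt))
    return peaks

# Synthetic trace: two small peaks on a slowly rising baseline.
t = [i * 0.01 for i in range(1001)]
y = [0.05 * ti
     + 1.0 * math.exp(-((ti - 3.0) ** 2) / (2 * 0.3 ** 2))
     + 0.4 * math.exp(-((ti - 5.0) ** 2) / (2 * 0.3 ** 2))
     for ti in t]
peaks = integrate_valley_to_valley(t, y, threshold=0.05)
```

Both synthetic peaks are found at their apex times, while the featureless rising stretch after the second valley stays below the threshold and is ignored, which mirrors the "consistent even if not theoretically perfect" behavior described above.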
I hope I got the question (and answer) right,