Maximizing the minimum absolute contrast-to-noise ratios (CNRs) between a desired feature and multiple interfering processes, by linear combination of images in a magnetic resonance imaging (MRI) scene sequence, is attractive for MRI analysis and
interpretation. A general formulation of the problem is presented, along with a novel solution utilizing the simple and numerically stable method of Gram-Schmidt orthogonalization. We derive explicit solutions for the case of two interfering features first, then for three interfering features, and, finally, using a typical example, for an arbitrary number of interfering features. For the case of two interfering features, we also provide simplified analytical expressions for the signal-to-noise ratios (SNRs) and CNRs of the filtered images. The technique is demonstrated through its applications to simulated and acquired MRI scene sequences of a human brain with a cerebral infarction. For these applications, a 50 to 100% improvement in the smallest absolute CNR is obtained.
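As a rough illustration of the idea (not the paper's derivation), the following sketch shows how Gram-Schmidt orthogonalization can produce combination weights that null interfering features while retaining the desired one. It assumes each feature is characterized by a signature vector of its intensities across the N images of the scene sequence; the function name `gram_schmidt_filter` and the variable names are hypothetical.

```python
import numpy as np

def gram_schmidt_filter(desired, interferers):
    """Compute combination weights w for the N images of a scene sequence
    such that w is orthogonal to every interferer signature (nulling those
    features in the combined image) while keeping a component along the
    desired feature's signature."""
    # Build an orthonormal basis for the interferer subspace via Gram-Schmidt.
    basis = []
    for s in interferers:
        v = np.asarray(s, dtype=float).copy()
        for b in basis:
            v -= (v @ b) * b  # remove components along earlier basis vectors
        n = np.linalg.norm(v)
        if n > 1e-12:          # skip linearly dependent signatures
            basis.append(v / n)
    # Project the interferer subspace out of the desired signature.
    w = np.asarray(desired, dtype=float).copy()
    for b in basis:
        w -= (w @ b) * b
    return w / np.linalg.norm(w)

# Hypothetical example: 3-image sequence, one desired and two interfering features.
desired = np.array([1.0, 2.0, 3.0])
interferers = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
w = gram_schmidt_filter(desired, interferers)
# The filtered image would then be sum_k w[k] * image_k, in which both
# interfering features cancel exactly.
```

Because the weights are orthogonal to every interferer signature, those features contribute zero contrast in the combined image; the surviving component along the desired signature determines the achievable CNR against the (suitably normalized) noise.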