Stars are regularly observed in the visible channels of the GOES Imagers for real-time navigation operations. However, we
have also been using star observations off-line to deduce the rate of degradation of the responsivity of the visible channels.
We estimate degradation rates from the time series of the intensities of the Imagers' output signals when viewing stars,
available in the GOES Orbit and Attitude Tracking System (OATS). We begin by showing our latest results in monitoring
the responsivities of the visible channels of the Imagers on GOES-8, -9, -10, -11 and -12. Unfortunately, the OATS
computes the intensities of the star signals with approximations suitable for navigation, not for estimating accurate signal
strengths, and thus we had to develop objective criteria for screening out unsuitable data. With several layers of screening,
our most recent trending method yields smoother time series of star signals, but the time series are drawn from a smaller
pool of stars. To simplify the task of data selection and to recover stars that had been rejected in the screening, we
tested a technique that accessed the raw star measurements before they were processed by the OATS.
We developed formulations that not only produced star signals more suitable for monitoring the changes in the Imager's
outputs from views of constant-irradiance stellar sources, but also gave more information on the radiometric characteristics
of the visible channels. We present specifics of this technique together with sample results. We discuss improvements in
the quality of the time series that allow for more reliable inferences on the gradually changing responsivities of the visible
channels. We describe further contributions of this method to monitoring of other performance characteristics of the visible
channel of an Imager.
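As an illustration of the trending concept described above, the following minimal Python sketch fits an exponential decay to a star-signal time series and screens outliers by iterative sigma-clipping. The function name, the single-layer sigma-clip screen, and the synthetic data are our illustrative assumptions, not the OATS processing or the paper's actual screening criteria.

```python
import numpy as np

def fit_degradation_rate(days, signals, n_sigma=3.0, max_iter=5):
    """Estimate an annual degradation rate from a star-signal time series.

    Fits S(t) = S0 * exp(-r * t) by linear regression on log(signal),
    iteratively sigma-clipping outlying measurements -- a simple stand-in
    for the multi-layer screening described in the text.
    """
    days = np.asarray(days, dtype=float)
    log_s = np.log(np.asarray(signals, dtype=float))
    keep = np.ones(days.size, dtype=bool)
    slope = intercept = 0.0
    for _ in range(max_iter):
        # Linear fit in log space over the currently accepted points.
        slope, intercept = np.polyfit(days[keep], log_s[keep], 1)
        resid = log_s - (slope * days + intercept)
        # Reject measurements far from the trend line.
        new_keep = np.abs(resid) < n_sigma * np.std(resid[keep])
        if new_keep.sum() < 3 or np.array_equal(new_keep, keep):
            break
        keep = new_keep
    # Convert the per-day slope of log(signal) into a fractional loss per year.
    return -slope * 365.25, keep
```

Applied to synthetic data with a 5% per year decay and one spurious low measurement, the clipping flags the outlier and the fitted rate recovers the injected trend.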