Charge-transfer efficiency (CTE) is a parameter associated with the optical performance and radiometric accuracy of a charge-coupled device (CCD). While modern CCDs are typically quoted as having CTE > 0.99999, we show that (a) this efficiency can be degraded by exposure to energetic protons and (b) the measured efficiency, at least for irradiated devices, depends on the size of the signal being transferred. A comparison of two CTE-measurement techniques is presented, with emphasis on a 55Fe x-ray charge-generation and collection technique tailored especially for use with linear CCDs. While our technique follows the same general x-ray method widely used to characterize area CCDs, its implementation, including the processing of the resulting experimental data, is somewhat novel. The x-ray technique requires no special device circuitry or equipment and, in our configuration, may be used on virtually any linear or bilinear CCD; the same general technique applies to area arrays. We show that the technique is appropriate for characterizing small-signal CTE with or without a high background. We also show that the electrical-injection technique, while requiring injection circuitry to be designed and built into the CCD structure, is more appropriate for large-signal CTE measurement. Data are presented showing experimental pre- and post-irradiation CTE measured as a function of signal level and background level for a two-phase linear CCD. Finally, we show that in our tests the CTE degraded even at very small accelerated doses, but that further degradation was at least partly compensated by an enhanced background due to increasing dark current.
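As a minimal sketch of why CTE values so close to unity still matter: a charge packet that undergoes N transfers retains approximately a fraction CTE^N of its original charge, so small per-transfer losses compound over the length of the register. The 2048-pixel register length below is an illustrative assumption, not a parameter from this paper, and the model ignores the signal-size dependence and deferred-charge effects the abstract discusses.

```python
# Illustrative sketch (not from the paper): compounded charge loss
# over many transfers for a packet moved through a CCD register.
# Assumes a simple per-transfer loss model: retained = CTE**N.

def retained_fraction(cte: float, n_transfers: int) -> float:
    """Fraction of the original charge packet remaining after n_transfers."""
    return cte ** n_transfers

# Hypothetical 2048-pixel linear register: a pixel at the far end
# needs up to ~2048 transfers to reach the output amplifier.
for cte in (0.99999, 0.9999, 0.999):
    frac = retained_fraction(cte, 2048)
    print(f"CTE = {cte}: {frac:.4f} of the charge survives 2048 transfers")
```

Even at the quoted CTE > 0.99999, roughly 2% of a far-end packet's charge is deferred over a 2048-transfer register, which is why proton-induced CTE degradation directly impacts radiometric accuracy.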