A new and simple mathematical model for describing radiation-induced absorption in optical fibers is presented. It treats radiation-induced defect generation and decay as a series of superposable infinitesimal growth and decay events. Unlike the existing empirical power-law growth equation, the new equation is non-empirical and dose-rate dependent, and it describes the growth and decay of the induced defects simultaneously. It can predict the long-exposure, low-dose-rate induced fiber loss typical of a space mission from short-exposure, high-dose-rate results produced in a ground-based laboratory. Numerically, the derived equation can also simulate the effects of irregular radiation events such as a solar-flare radiation burst. For a constant dose rate, the general equation reduces to a simple analytical form that agrees reasonably well with experiment.
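The superposition idea can be illustrated with a minimal numerical sketch. The paper's actual equation is not reproduced here; instead we assume, hypothetically, that each infinitesimal dose increment D'(s) ds creates absorption proportional to a generation coefficient g, and that each such increment then decays first-order with a single time constant tau (both parameter names and values below are illustrative assumptions, not values from the paper):

```python
import numpy as np

def induced_loss(dose_rate, t, g=1.0, tau=100.0):
    """Induced absorption A(t) as a superposition of infinitesimal
    growth events, each decaying first-order with time constant tau:
        A(t) = integral_0^t  g * D'(s) * exp(-(t - s)/tau)  ds
    dose_rate: callable giving D'(s); t: 1-D array of times.
    g and tau are hypothetical illustration parameters."""
    A = np.zeros_like(t)
    for i, ti in enumerate(t):
        s = np.linspace(0.0, ti, 400)
        f = g * dose_rate(s) * np.exp(-(ti - s) / tau)
        # trapezoidal rule over the event history
        A[i] = np.sum(0.5 * (f[1:] + f[:-1]) * (s[1:] - s[:-1]))
    return A

t = np.linspace(0.0, 500.0, 51)
D0 = 2.0  # constant background dose rate (arbitrary units)

# Constant dose rate: the superposition integral reduces to the
# closed form A(t) = g * D0 * tau * (1 - exp(-t/tau)).
A_num = induced_loss(lambda s: np.full_like(s, D0), t, g=1.0, tau=100.0)
A_ana = 1.0 * D0 * 100.0 * (1.0 - np.exp(-t / 100.0))

# Irregular event: a solar-flare-like Gaussian burst on top of the
# background, handled by the same numerical superposition.
burst = lambda s: D0 + 50.0 * np.exp(-((s - 200.0) / 10.0) ** 2)
A_burst = induced_loss(burst, t)
```

For the constant-rate case the numerical result matches the saturating-exponential closed form, while the burst profile shows the extra absorption growing during the event and then partially annealing away, which is the qualitative behavior the abstract describes.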