Land surface temperature (LST) and sea surface temperature (SST) are important quantities for many environmental
models, and remote sensing is a feasible and promising way to estimate them on a regional and global
scale. Many algorithms have been devised to estimate LST and SST from satellite data, most of which
require a priori information about the surface and the atmosphere. However, the high variability of surface and
atmospheric parameters causes these traditional methods to produce significant estimation errors, making
their application on a global scale problematic. A recently proposed approach involves the use of support vector
machines (SVMs). Trained on satellite data and corresponding in-situ measurements, an SVM generates an approximation
of the relation between the two, which can subsequently be used to estimate unknown surface temperatures
from additional satellite data. Such a strategy requires the user to set several internal parameters.
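To make the regression step concrete, the following is a minimal sketch using scikit-learn's SVR with an RBF kernel; the arrays `X_sat` and `y_insitu` are hypothetical placeholders standing in for satellite-derived features and in-situ temperature measurements, not data from the paper.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical training data: each row holds satellite-derived features
# (e.g., channel brightness temperatures) for one location/time; the
# target is the corresponding in-situ surface temperature measurement.
rng = np.random.default_rng(0)
X_sat = rng.normal(290.0, 5.0, size=(200, 4))             # placeholder features [K]
y_insitu = X_sat.mean(axis=1) + rng.normal(0.0, 0.5, 200)  # placeholder targets [K]

# Epsilon-insensitive support vector regression with an RBF kernel.
# C, epsilon, and gamma are the "internal parameters" the user must set.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.1, gamma=0.1)
svr.fit(X_sat, y_insitu)

# The trained model can then estimate surface temperature for new
# satellite observations lacking in-situ measurements.
X_new = rng.normal(290.0, 5.0, size=(5, 4))
print(svr.predict(X_new))
```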
In this paper, a method is proposed for automatically setting these parameters to values that lead to minimum
estimation errors. This is achieved by minimizing a functional correlated with regression errors (namely, the
"span bound," an upper bound on the leave-one-out error), which can be computed using only the training set,
without the need for a further validation set. Powell's algorithm is used to minimize this functional, because it
is applicable to nondifferentiable functions as well. Experimental results generated by the proposed method turn
out to be very similar to those obtained by cross-validation and by a grid search for the parameter configuration
yielding the best test-set accuracy, while dramatically reducing computational time.
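As an illustrative sketch of the parameter-selection loop, the snippet below minimizes a training-set-only objective over (C, epsilon, gamma) with SciPy's derivative-free Powell method. Note that, as an assumption of this sketch, a directly computed leave-one-out error stands in for the paper's span bound, which is instead derived from the support-vector geometry and is much cheaper to evaluate.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVR

# Small hypothetical training set (placeholder, as in the sketch above).
rng = np.random.default_rng(0)
X_sat = rng.normal(290.0, 5.0, size=(60, 4))
y_insitu = X_sat.mean(axis=1) + rng.normal(0.0, 0.5, 60)

def training_set_objective(log_params, X, y):
    """Error estimate computed from the training set alone.

    Stand-in for the span bound: here the leave-one-out error is
    evaluated directly by retraining, whereas the actual bound is
    obtained from the support vectors and kernel matrix without
    retraining.
    """
    C, epsilon, gamma = np.exp(log_params)  # log space keeps parameters positive
    svr = SVR(kernel="rbf", C=C, epsilon=epsilon, gamma=gamma)
    scores = cross_val_score(svr, X, y, cv=LeaveOneOut(),
                             scoring="neg_mean_squared_error")
    return -scores.mean()

# Powell's method needs no derivatives, so a nondifferentiable
# objective such as the span bound poses no difficulty.
result = minimize(training_set_objective,
                  x0=np.log([1.0, 0.1, 0.1]),
                  args=(X_sat, y_insitu),
                  method="Powell")
C_opt, eps_opt, gamma_opt = np.exp(result.x)
print(f"C={C_opt:.3g}, epsilon={eps_opt:.3g}, gamma={gamma_opt:.3g}")
```

Optimizing in log space is a common choice here, since all three SVR parameters must remain positive and typically vary over several orders of magnitude.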