Anomaly detection is an important task in hyperspectral data exploitation. A standard approach in the literature is the method developed by Reed and Yu, also called the RX algorithm, which is based on the Mahalanobis distance and has been widely used in hyperspectral imaging applications. A variation of this algorithm, known as kernel RX (KRX), applies the same concept to a sliding window centered around each image pixel. KRX is computationally very expensive because, for every image pixel, a covariance matrix and its inverse have to be calculated. In this work, we develop an efficient implementation of the KRX algorithm. Our approach makes use of optimized linear algebra libraries and further develops a parallel implementation for multi-core platforms, a well-known, inexpensive, and widely available high-performance computing technology. Experimental results are provided for two hyperspectral data sets: the first was collected by NASA's Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) over the World Trade Center (WTC) in New York five days after the terrorist attacks, and the second by the Hyperspectral Digital Image Collection Experiment (HYDICE) sensor. Detection accuracy, evaluated using receiver operating characteristic (ROC) curves, indicates that KRX can significantly outperform the classic RX while achieving close-to-linear speedup on state-of-the-art multi-core platforms.
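To make the underlying detector concrete, the Mahalanobis-distance test at the core of RX can be sketched as follows. This is an illustrative NumPy sketch of the global (whole-image) variant, not the paper's optimized multi-core implementation; the function name `rx_scores` and the synthetic data are ours.

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly scores for a hyperspectral cube.

    cube: array of shape (rows, cols, bands). Returns a (rows, cols)
    map of squared Mahalanobis distances from the global background.
    """
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(np.float64)
    mu = X.mean(axis=0)
    # Background covariance and its (pseudo-)inverse. This is the
    # costly step that the windowed variant repeats for every pixel,
    # which motivates the optimized parallel implementation.
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.pinv(cov)
    diff = X - mu
    # Per-pixel quadratic form: diff_i^T * cov_inv * diff_i.
    scores = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return scores.reshape(rows, cols)

# Tiny synthetic example: Gaussian background plus one injected anomaly.
rng = np.random.default_rng(0)
cube = rng.normal(0.0, 1.0, size=(8, 8, 5))
cube[4, 4, :] += 10.0  # anomalous pixel at (4, 4)
scores = rx_scores(cube)
```

The windowed version replaces the global mean and covariance with statistics estimated from a sliding window around each pixel, so the covariance estimation and inversion above is executed once per pixel rather than once per image.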