Non-destructive evaluation (NDE) techniques for condition monitoring of remote solid structures have advanced considerably in recent years. Algorithms that estimate sensor integrity and correct for noise are a crucial aspect of NDE. This paper presents a sensor validation approach that verifies sensor integrity, identifies and corrects noise effects, and selects the best possible array of sensors for multi-sensor fusion. The proposed methodology uses a novel change detection algorithm for noise correction and a clustering algorithm to isolate useful signal information from the sensor data. It was used for sensor selection in an NDE field study, where multiple sensors were used to examine a solid structure. The methodology achieved 97% accuracy in the experiments, indicating its efficacy.
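The abstract's change detection step could be sketched, for illustration only, with a classic one-sided CUSUM test; the function name, drift, and threshold values below are assumptions, not details from the paper.

```python
# Illustrative sketch (not the paper's algorithm): a one-sided CUSUM
# detector that flags indices where a sensor stream drifts upward from
# its nominal level, as a change detection step might for noise bursts.

def cusum_changes(samples, target, drift=0.5, threshold=5.0):
    """Return indices where the cumulative positive deviation of
    `samples` from `target` exceeds `threshold` (one-sided CUSUM)."""
    changes, s = [], 0.0
    for i, x in enumerate(samples):
        # Accumulate deviation above the target, discounted by `drift`.
        s = max(0.0, s + (x - target) - drift)
        if s > threshold:
            changes.append(i)
            s = 0.0  # reset the statistic after each detection
    return changes
```

On a stream whose mean jumps from 0 to 3 at index 10, the detector begins flagging shortly after the jump; the drift term suppresses alarms from small fluctuations.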
In this paper, we present a feature selection and classification approach that was used to assess highly noisy sensor data from an NDE field study. Multiple heterogeneous NDT sensors were employed to examine the solid structure. The goal was to differentiate between two types of phenomena occurring in the structure, one benign and the other malignant; manual distinction between the two is almost impossible. To address this, we used sensor validation techniques to select the best available sensor, i.e., the one with the least noise effects and the best defect signature in the region of interest. Hundreds of features were formulated and extracted from the data of the selected sensors. Next, we employed separability and correlation measures to select the most promising set of features. Because the NDE sensors poorly described the different defect types under consideration, the resulting features also exhibited poor separability. The focus of this paper is on how one can improve classification under these constraints while minimizing the risk of overfitting (the amount of field data was small). Results are shown for a number of different classifiers and classifier ensembles that were tuned to a fixed true-positive rate using the Neyman-Pearson criterion.
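Tuning a classifier to a fixed true-positive rate in the Neyman-Pearson spirit amounts to choosing a score threshold from held-out data. The sketch below is a minimal illustration under that assumption; the function names and data are hypothetical, not from the study.

```python
# Illustrative sketch: pick the score threshold that achieves a target
# true-positive rate, then measure the resulting false-positive rate.
# Labels are 1 (malignant defect) or 0 (benign); higher score = more
# suspicious. This mirrors the Neyman-Pearson idea of fixing one error
# rate and minimizing the other.
import math

def threshold_for_tpr(scores, labels, target_tpr):
    """Largest threshold whose TPR on (scores, labels) meets target_tpr."""
    pos = sorted((s for s, y in zip(scores, labels) if y == 1), reverse=True)
    k = math.ceil(target_tpr * len(pos))  # positives that must pass
    return pos[k - 1]

def false_positive_rate(scores, labels, thresh):
    """Fraction of negatives scoring at or above the threshold."""
    neg = [s for s, y in zip(scores, labels) if y == 0]
    return sum(s >= thresh for s in neg) / len(neg)
```

With overlapping score distributions, lowering the threshold to reach a higher target TPR directly trades off against a higher false-positive rate, which is the constraint the ensembles in the paper are tuned under.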
Rapid developments in sensor technology and its applications have energized research efforts towards devising a firm theoretical foundation for sensor management. Ubiquitous sensing, wide-bandwidth communications, and distributed processing provide both opportunities and challenges for sensor and process control and optimization. Traditional optimization techniques cannot simultaneously consider the highly non-commensurate measures involved in sensor management within a single optimization routine. Market-oriented programming provides a valuable and principled paradigm for designing systems to solve this dynamic and distributed resource allocation problem. We have modeled the sensor management scenario as a competitive market, wherein the sensor manager holds a combinatorial auction to sell the various items produced by the sensors and the communication channels. However, standard auction mechanisms are not directly applicable to the sensor management domain. For this purpose, we have developed a specialized market architecture, MASM (Market architecture for Sensor Management). In MASM, the mission manager is responsible for assigning tasks and corresponding budgets to the consumers, and the sensor manager is responsible for allocating resources to the various consumers. In addition to a modified combinatorial winner determination algorithm, MASM has specialized sensor network modules that address commensurability issues between consumers and producers in the sensor network domain. A preliminary multi-sensor, multi-target simulation environment has been implemented to test the performance of the proposed system. MASM outperformed the information-theoretic sensor manager in meeting the mission objectives in the simulation experiments.
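The core of a combinatorial auction is winner determination: selecting the revenue-maximizing set of non-overlapping bundle bids. The toy sketch below brute-forces that selection for illustration; MASM's own modified algorithm is not described here, and the bid data are invented.

```python
# Toy winner-determination sketch: each consumer bids a price for a
# bundle of sensor/channel items; the auctioneer picks the set of
# non-overlapping bundles with maximum total revenue. Brute force over
# all bid subsets, so only suitable for very small auctions.
from itertools import combinations

def winner_determination(bids):
    """bids: list of (bundle, price) with bundle a frozenset of items.
    Returns (winning_bids, total_revenue)."""
    best, best_rev = [], 0.0
    for r in range(1, len(bids) + 1):
        for combo in combinations(bids, r):
            items = [i for bundle, _ in combo for i in bundle]
            if len(items) != len(set(items)):
                continue  # two bundles claim the same item: infeasible
            rev = sum(price for _, price in combo)
            if rev > best_rev:
                best, best_rev = list(combo), rev
    return best, best_rev
```

Note that a bundle bid can lose even when it is the single highest bid, if disjoint smaller bids jointly pay more; this complementarity is what makes winner determination NP-hard and motivates specialized algorithms such as the modified one in MASM.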