Discrimination between different rocket types is an important application of infrasound in event monitoring at ranges of 0-100 km. This is in contrast to traditional nuclear weapons monitoring, which leverages infrasound propagation over thousands of kilometers. This research demonstrates the use of deep neural network architectures to discriminate infrasonic signals produced by rocket launches and collected by a near-field infrasound sensor array. The data collection contains three space-bound rocket classes: Delta IV, Atlas V, and Falcon 9. In particular, we investigate the classification accuracy of a multi-class convolutional neural network (CNN) and a deep neural network (DNN) on various feature representations, such as neural network derived features, spectrograms, and wavelet scattering transform coefficients. Our experiments validate the viability of a CNN and DNN framework for near-field infrasonic applications, with our proposed method achieving favorable results.
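As a minimal illustration of the CNN branch of such a framework (not the architecture evaluated in the paper), the sketch below builds a small PyTorch classifier over log-spectrogram inputs; the layer sizes, input dimensions, and class count are illustrative assumptions.

```python
# Minimal sketch: a small CNN that classifies spectrogram "images" of
# infrasound records into the three rocket classes. Shapes and
# hyperparameters are illustrative, not the paper's configuration.
import torch
import torch.nn as nn

class SpectrogramCNN(nn.Module):
    def __init__(self, n_classes=3):  # Delta IV, Atlas V, Falcon 9
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):                        # x: (batch, 1, freq_bins, time_frames)
        return self.classifier(self.features(x))

model = SpectrogramCNN()
logits = model(torch.randn(8, 1, 64, 128))       # dummy batch of spectrograms
print(logits.shape)                              # torch.Size([8, 3])
```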
Infrasonic waves continue to be a staple of threat identification due to their presence in a variety of natural and man-made events, along with their low-frequency characteristics supporting detection over great distances. Considering the large set of phenomena that produce infrasound, it is critical to develop methodologies that exploit the unique signatures generated by such events to aid in threat identification. In this work, we propose a new infrasonic time-series classification technique based on the recently introduced Wavelet Scattering Transform (WST). Leveraging concepts from wavelet theory and signal processing, the WST induces a deep feature mapping on time series that is locally time invariant and stable to time-warping deformations through cascades of signal filtering and modulus operators. We demonstrate that WST features can be utilized with a variety of classification methods to improve discrimination. Experimental validation on the Library of Typical Infrasonic Signals (LOTIS), containing infrasound events from mountain associated waves, microbaroms, internal atmospheric gravity waves, and volcanic eruptions, illustrates the effectiveness of our approach and demonstrates that it is competitive with other state-of-the-art classification techniques.
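To make the WST-plus-classifier pipeline concrete, the following sketch pairs time-averaged scattering coefficients with an off-the-shelf classifier. It assumes the kymatio library for the scattering transform; the scale parameter J, wavelets-per-octave Q, record length, and the logistic-regression classifier are placeholder choices, not the paper's configuration, and the data is random stand-in material.

```python
# Illustrative WST feature pipeline (assumed libraries: kymatio, scikit-learn).
import numpy as np
from kymatio.numpy import Scattering1D
from sklearn.linear_model import LogisticRegression

T = 2 ** 13                                  # assumed samples per infrasound record
scattering = Scattering1D(J=8, shape=T, Q=8)  # placeholder scales/filters

def wst_features(x):
    """Time-averaged scattering coefficients as a fixed-length feature vector."""
    Sx = scattering(x.astype(np.float64))    # (n_coeffs, n_time_frames)
    return Sx.mean(axis=-1)                  # local time invariance via averaging

# Dummy data standing in for labeled LOTIS waveforms.
X = np.stack([wst_features(np.random.randn(T)) for _ in range(40)])
y = np.random.randint(0, 4, size=40)         # four infrasonic event classes
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict(X[:5]))
```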
Infrasound propagation through various atmospheric conditions and interaction with environmental factors induce highly non-linear and non-stationary effects that make it difficult to extract reliable attributes for classification. We present featureless classification results on the Library of Typical Infrasonic Signals using several deep learning techniques, including long short-term memory, self-normalizing, and fully convolutional neural networks, with statistical analysis to establish significantly superior models. In general, the deep classifiers achieve near-perfect classification accuracies on the four classes of infrasonic events: mountain associated waves, microbaroms, auroral infrasonic waves, and volcanic eruptions. Our results provide evidence that deep neural network architectures should be considered the leading candidates for classifying infrasound waveforms, which can directly benefit applications that seek to identify infrasonic events, such as severe weather forecasting, natural disaster early warning systems, and nuclear weapons monitoring.
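A minimal sketch of one such featureless classifier is shown below: an LSTM reading raw waveform windows directly, written in PyTorch. The hidden size, window length, and class ordering are illustrative assumptions, not the configurations evaluated in the paper.

```python
# Sketch of a "featureless" deep classifier: an LSTM over raw samples.
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    def __init__(self, n_classes=4):   # MAW, microbarom, auroral, volcanic
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):              # x: (batch, time, 1) raw samples
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1])      # classify from the final hidden state

model = LSTMClassifier()
logits = model(torch.randn(8, 512, 1))  # dummy batch of waveform windows
print(logits.shape)                      # torch.Size([8, 4])
```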
In this work, we investigate and compare centrality metrics on several datasets. Many real-world complex systems can be addressed using a graph-based analytical approach, where nodes represent the components of the system and edges are the interactions or relationships between them. Different systems such as communication networks and critical infrastructure are known to exhibit common characteristics in their behavior and structure. Infrastructure networks such as power grids, communication networks, and natural gas pipelines are interdependent. These systems are usually coupled such that failures in one network can propagate and affect the entire system. The purpose of this analysis is to perform a metric analysis on synthetic infrastructure data. Our view of critical infrastructure systems holds that the function of each system, and especially the continuity of that function, is of primary importance. In this work, we view an infrastructure as a collection of interconnected components that work together as a system to achieve a domain-specific function. The importance of a single component within an infrastructure system is based on how it contributes to that function, which we assess with centrality metrics.
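The sketch below illustrates the kind of metric comparison described here, using networkx on a toy random graph standing in for the synthetic infrastructure data; the topology and the particular set of metrics shown are illustrative assumptions.

```python
# Hedged sketch: rank "components" (nodes) by several centrality metrics.
import networkx as nx

G = nx.barabasi_albert_graph(n=50, m=2, seed=1)  # placeholder topology

centralities = {
    "degree": nx.degree_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
    "closeness": nx.closeness_centrality(G),
}

# Compare which nodes each metric deems most important to the system.
for name, scores in centralities.items():
    top = sorted(scores, key=scores.get, reverse=True)[:5]
    print(f"{name:>12}: top nodes {top}")
```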
A novel approach using a support vector machine (SVM) is proposed to classify bare earth points in LiDAR point clouds. Using graph-based segmentation, the LiDAR point cloud is segmented into a set of topological components. Several features establishing relationships from those components to their neighboring components are formulated. The SVM is then trained on the segment features to establish a model for the classification of bare earth and non-bare-earth points. Quantitative results are presented for training and testing the proposed SVM classifier on the ISPRS data set. Using the ISPRS data set as a training set, qualitative results are presented by testing the proposed SVM classifier on data downloaded from Open Topography, which covers a variety of different landscapes and building structures in Frazier Park, California. Despite the data being captured from different sensors and collected from scenes with different terrain types and building structures, the results shown were obtained with no parameter changes. Furthermore, a confidence value is returned indicating how well the previously unseen data fits the SVM's trained model for bare earth recognition.
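A hedged sketch of the segment-level classification stage is shown below, with scikit-learn's SVC standing in for the actual implementation; the feature vectors and labels are randomly generated placeholders for the segment features described above, and the probability output illustrates the returned confidence value.

```python
# Sketch: SVM over per-segment features with a confidence score.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Dummy per-segment features (e.g., height relative to neighbors, planarity);
# label 1 = bare earth, 0 = non-bare-earth. Both are placeholders.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 6))
y_train = rng.integers(0, 2, size=200)

clf = make_pipeline(StandardScaler(), SVC(probability=True))
clf.fit(X_train, y_train)

X_new = rng.normal(size=(5, 6))
conf = clf.predict_proba(X_new)[:, 1]   # confidence a segment is bare earth
print(conf)
```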
We discuss a robust method for optimal oil probe path planning inspired by medical imaging. Horizontal wells require three-dimensional steering made possible by the rotary steerable capabilities of the system, which allow the hole to intersect multiple target shale gas zones. Horizontal "legs" can be over a mile long; the longer the exposure length, the more oil and natural gas is drained and the faster it can flow. More oil and natural gas can be produced with fewer wells and less surface disturbance. Horizontal drilling can help producers tap oil and natural gas deposits under surface areas where a vertical well cannot be drilled, such as under developed or environmentally sensitive areas. Drilling creates well paths with multiple twists and turns that attempt to reach multiple accumulations from a single well location. Our algorithm can be used to augment current state-of-the-art methods. Our goal is to obtain a 3D path with nodes describing the optimal route to the destination. The algorithm scales to big data volumes and reduces the cost of planning probe insertion. Our solution may help increase the ratio of energy extracted to input energy.
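As a rough illustration of the optimal-path idea (not the authors' algorithm, which the abstract does not specify), the sketch below runs A* over a toy 3D voxel cost grid; the cost volume, endpoints, neighborhood, and heuristic scaling are all assumptions.

```python
# Hedged sketch: A* over a 3D voxel grid whose cell costs could encode
# geology or steering constraints. All inputs here are toy placeholders.
import heapq
import itertools
import numpy as np

def astar_3d(cost, start, goal):
    """Return a list of voxel coordinates from start to goal minimizing summed cost."""
    nbrs = [(dx, dy, dz) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            for dz in (-1, 0, 1) if (dx, dy, dz) != (0, 0, 0)]
    # Heuristic scaled by (min cost / max step length) so it never overestimates.
    scale = cost.min() / np.sqrt(3)
    h = lambda p: scale * np.linalg.norm(np.subtract(p, goal))
    tie = itertools.count()                          # heap tie-breaker
    open_set = [(h(start), next(tie), 0.0, start, None)]
    came_from, g_best = {}, {start: 0.0}
    while open_set:
        _, _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue
        came_from[cur] = parent
        if cur == goal:                              # reconstruct by backtracking
            path = [cur]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        for d in nbrs:
            nxt = (cur[0] + d[0], cur[1] + d[1], cur[2] + d[2])
            if not all(0 <= nxt[i] < cost.shape[i] for i in range(3)):
                continue
            g2 = g + cost[nxt]                       # cost of entering the voxel
            if g2 < g_best.get(nxt, np.inf):
                g_best[nxt] = g2
                heapq.heappush(open_set, (g2 + h(nxt), next(tie), g2, nxt, cur))
    return None                                      # goal unreachable

grid = np.random.rand(20, 20, 20) + 0.1              # toy drilling-cost volume
path = astar_3d(grid, (0, 0, 0), (19, 19, 19))
print(len(path), "waypoints")
```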
LiDAR is an efficient optical remote sensing technology with applications in geography, forestry, and defense. Its effectiveness is often limited by the signal-to-noise ratio (SNR). Geiger-mode avalanche photodiode (APD) detectors operate above the breakdown voltage, where a single photoelectron can initiate a current surge, making the device very sensitive. These advantages come at the expense of requiring computationally intensive noise filtering techniques. Noise degrades the imaging system and reduces its detection capability. Common noise-reduction algorithms have drawbacks such as overly aggressive filtering or decimation of the data to improve quality and performance. In recent years, there has been growing interest in GPUs (Graphics Processing Units) for their massively parallel processing capability. In this paper, we leverage this capability to reduce processing latency. The Point Spread Function (PSF) filter algorithm, a local spatial measure, has been GPGPU accelerated. The idea is to use a kernel density estimation technique for point clustering. We associate a local likelihood measure with every point of the input data, capturing the probability that a 3D point is a true target-return photon or noise (background photons, dark current). This process suppresses noise and allows for the detection of outliers. We apply this approach to the LiDAR noise filtering problem, for which we achieve a speed-up factor of 30-50 times over a traditional sequential CPU implementation.
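The following CPU sketch conveys the kernel-density idea behind this kind of filter; the paper's version runs as a GPGPU kernel, and the radius, kernel shape, keep fraction, and toy point cloud here are illustrative assumptions.

```python
# Hedged sketch: score each 3D return by a Gaussian-weighted neighbor count
# (a simple kernel density estimate) and drop low-density points as noise.
import numpy as np
from scipy.spatial import cKDTree

def density_filter(points, radius=1.0, keep_fraction=0.8):
    """Keep the densest points; low scores are treated as background/dark noise."""
    tree = cKDTree(points)
    scores = np.empty(len(points))
    for i, nbrs in enumerate(tree.query_ball_point(points, r=radius)):
        d = np.linalg.norm(points[nbrs] - points[i], axis=1)
        scores[i] = np.exp(-(d / radius) ** 2).sum()   # kernel-weighted density
    threshold = np.quantile(scores, 1.0 - keep_fraction)
    return points[scores >= threshold], scores

# Toy cloud: a dense "target" cluster plus uniform background noise.
rng = np.random.default_rng(0)
target = rng.normal(0, 0.5, size=(500, 3))
noise = rng.uniform(-10, 10, size=(200, 3))
cloud = np.vstack([target, noise])
kept, _ = density_filter(cloud)
print(f"kept {len(kept)} of {len(cloud)} points")
```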
A novel use of Felzenszwalb's graph-based efficient image segmentation algorithm is proposed for segmenting 3D volumetric foliage penetrating (FOPEN) Light Detection and Ranging (LiDAR) data for automated target detection. The authors propose using an approximate nearest neighbors algorithm to establish the neighbors of points in 3D and thus form the graph for segmentation. Following graph formation, the angular difference between the points' estimated normal vectors is proposed for the graph edge weights. The LiDAR data is then segmented in 3D, and metrics are calculated from the segments to determine their geometrical characteristics and thus their likelihood of being a target. Finally, the bare earth within the scene is automatically identified to avoid confusing flat bare earth with flat targets. The segmentation, the calculated metrics, and the bare earth identification all culminate in a target detection system deployed for FOPEN LiDAR. General purpose graphics processing units (GPGPUs) are leveraged to reduce processing times for the approximate nearest neighbors and point normal estimation algorithms such that the application can run in near real time. Results are presented on several data sets.
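A simplified sketch of the graph-construction stage is given below: neighbors via a k-d tree, PCA-based normal estimation, and angular edge weights. The full Felzenszwalb merge criterion is replaced with a fixed-angle union-find merge for brevity, so this is an assumption-laden stand-in rather than the deployed system, and k, the angle threshold, and the toy points are placeholders.

```python
# Hedged sketch: kNN graph with normal-angle edge weights, merged by a
# simple fixed threshold instead of the full Felzenszwalb criterion.
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=10):
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nbrs in enumerate(idx):
        nb = points[nbrs] - points[nbrs].mean(axis=0)
        # Normal = direction of smallest variance in the local neighborhood.
        _, _, vt = np.linalg.svd(nb, full_matrices=False)
        normals[i] = vt[-1]
    return normals, idx

def segment(points, k=10, max_angle=np.deg2rad(15)):
    normals, idx = estimate_normals(points, k)
    parent = np.arange(len(points))
    def find(a):                                  # union-find with path halving
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    for i, nbrs in enumerate(idx):
        for j in nbrs[1:]:                        # skip self (first neighbor)
            # Edge weight: angle between endpoint normals (sign-invariant).
            w = np.arccos(np.clip(abs(normals[i] @ normals[j]), -1.0, 1.0))
            if w < max_angle:                     # merge smooth neighbors
                parent[find(i)] = find(j)
    return np.array([find(i) for i in range(len(points))])

pts = np.random.rand(300, 3)                      # toy stand-in for FOPEN LiDAR
labels = segment(pts)
print(len(np.unique(labels)), "segments")
```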