We present a proof-of-concept of a lightweight and low-power network intrusion detection system (NIDS) using a
commercially available neural network chip. Such a system is well-suited to the increasing deployment of low-power
devices with ubiquitous internet connectivity. Our proposal makes use of previous work on extracting a feature vector
from network packets using a histogram of hashed n-grams. The commercially available CogniMem CM1K device
implements a version of the Restricted Coulomb Energy neural network classifier, which was used to classify the
resulting feature vectors at high speed and low power. In this paper, we describe our feature extraction technique for
network packets and the classification algorithm used by the CM1K chip, and present initial classification results on a
fabricated test set. Despite the generality of the RCE algorithm and our ‘plug-in’ approach to the classification task,
with no fine-tuning of the hardware to our problem domain, we obtain surprisingly good classification results even on
highly imbalanced and restricted training sets.
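The feature-extraction step described above can be illustrated with a short sketch. The n-gram window size, the choice of CRC32 as the hash, and the histogram width below are illustrative assumptions, not the parameters used in our system:

```python
import zlib

def ngram_histogram(payload: bytes, n: int = 3, bins: int = 64) -> list[int]:
    """Build a fixed-width feature vector from a packet payload by
    hashing each byte n-gram into one of `bins` histogram buckets."""
    hist = [0] * bins
    for i in range(len(payload) - n + 1):
        gram = payload[i:i + n]
        # CRC32 stands in here for whatever hash the real system uses.
        hist[zlib.crc32(gram) % bins] += 1
    return hist

# A 24-byte payload yields 22 overlapping 3-grams spread over 64 bins.
v = ngram_histogram(b"GET /index.html HTTP/1.1")
print(len(v), sum(v))  # 64 22
```

The resulting fixed-length vector is what a classifier such as the RCE network on the CM1K would consume; the payload bytes themselves never need to be interpreted.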
Sensor networks are becoming prolific and now play an increasingly important role in acquiring and processing information. Cyber-Physical Systems research focuses on the investigation of integrated systems that include sensing, networking, and computation. The physics of seismic and electromagnetic field measurement requires special consideration of how to design electromagnetic field measurement networks, alongside seismic measurement networks, for both research and the detection of earthquakes and explosions. In addition, although the electromagnetic sensor network itself can be designed and deployed as a research tool with a great deal of flexibility, the placement of the measuring nodes must be designed based on a systematic analysis of the seismic-electromagnetic interaction.
In this article, we review observations of the co-seismic electromagnetic field generated by earthquakes and by man-made sources such as vibrations and explosions. The theoretical investigation allows the distribution of sensor nodes to be optimized and could be used to support existing geological networks. The placement of sensor nodes has to be determined based on the physics of the electromagnetic field distribution above ground level. The results of theoretical investigations of seismo-electromagnetic phenomena are considered in Section I. First, we compare the relative contributions of various types of mechano-electromagnetic mechanisms and then analyze in detail the calculation of electromagnetic fields generated by piezomagnetic and electrokinetic effects.
Small-to-medium-sized businesses lack the resources to deploy and manage high-end advanced solutions to deter sophisticated threats from well-funded adversaries, yet evidence shows that these businesses are becoming key targets. As malicious code and network attacks become more sophisticated, classic signature-based virus and malware detection methods are less effective. To augment current malware detection methods, we developed a proactive approach that uses open-source tools and intelligence to discover the patterns and behaviors of malicious attacks and adversaries. Technical and analytical skills are combined to track adversarial behavior, methods, and techniques.
We established a controlled (separated-domain) network to identify, monitor, and track malware behavior and thereby increase understanding of the methods and techniques used by cyber adversaries. We created a suite of tools that observes network and system performance, looking for anomalies that may be caused by malware. The toolset collects information from open-source tools and provides meaningful indicators that the system is under attack or has been attacked. When malware was discovered, we analyzed and reverse-engineered it to determine how it could be detected and prevented. Results have shown that, with minimal resources, cost-effective capabilities can be developed to detect abnormal behavior that may indicate malicious software.
The smart grid is the integration of computing and communication technologies into a power grid with the goal of enabling real-time control and a reliable, secure, and efficient energy system. With the increased interest of the research community and stakeholders in the smart grid, a number of solutions and algorithms have been developed and proposed to address issues related to smart grid operations and functions. These technologies and solutions need to be tested and validated in software simulators before implementation. In this paper, we develop a general smart grid simulation model in the MATLAB/Simulink environment that integrates renewable energy resources, energy storage technology, and load monitoring and control capability. To demonstrate and validate the effectiveness of our simulation model, we created simulation scenarios and performed simulations using a real-world data set provided by the Pecan Street Research Institute.
The increasing use of online collaboration and information sharing over the last decade has resulted in an explosion of criminal and anti-social activities in online communities. Detection of such behaviors is of interest to commercial enterprises that want to guard themselves against cyber criminals, and to military intelligence analysts who seek to detect and counteract cyberwars waged by adversarial states and organizations. The most challenging behaviors to detect are those involving multiple individuals who share actions and roles in the hostile activities and who individually appear benign. To detect these behaviors, theories of group behaviors and interactions must be developed. In this paper we describe our exploration of data from a collaborative social platform to categorize the behaviors of multiple individuals. We applied graph matching algorithms to identify consistent social interactions. Our research led us to the conclusion that complex collaborative behaviors can be modeled and detected using the concept of group behavior grammars, in a manner analogous to natural language processing. These grammars capture constraints on how people take on roles in virtual environments, form groups, and interact over time, providing the building blocks for scalable and accurate multi-entity interaction analysis and social behavior hypothesis testing.
Understanding the mechanism behind large-scale information dispersion through complex networks has important implications for a variety of industries ranging from cyber-security to public health. With the unprecedented availability of public data from online social networks (OSNs) and the low-cost nature of most OSN outreach, randomized controlled experiments, the "gold standard" of causal inference methodologies, have been used with increasing regularity to study viral information dispersion. And while these studies have dramatically furthered our understanding of how information disseminates through social networks by isolating causal mechanisms, there are still major methodological concerns that need to be addressed in future research. This paper delineates why modern OSNs are markedly different from traditional sociological social networks and why these differences present unique challenges to experimentalists and data scientists. The dynamic nature of OSNs is particularly troublesome for researchers implementing experimental designs, so this paper identifies major sources of bias arising from network mutability and suggests strategies to circumvent and adjust for these biases. This paper also discusses the practical considerations of data quality and collection, which may adversely impact the efficiency of the estimator. The major experimental methodologies used in the current literature on virality are assessed at length, and their strengths and limits identified. Other, as-yet-unsolved threats to the efficiency and unbiasedness of causal estimators, such as missing data, are also discussed. This paper integrates methodologies and lessons from a variety of fields under an experimental and data science framework in order to systematically consolidate and identify current methodological limitations of randomized controlled experiments conducted in OSNs.
This paper presents a threat-driven quantitative mathematical framework for secure cyber-physical system design
and assessment. Called The Three Tenets, this originally empirical approach has been used by the US Air Force
Research Laboratory (AFRL) for secure system research and development. The Tenets were first documented
in 2005 as a teachable methodology. The Tenets are motivated by a system threat model that itself consists of
three elements which must exist for successful attacks to occur:
– system susceptibility;
– threat accessibility; and
– threat capability.
The Three Tenets arise naturally by countering each threat element individually. Specifically, the tenets are:
Tenet 1: Focus on What’s Critical - systems should include only essential functions (to reduce susceptibility);
Tenet 2: Move Key Assets Out-of-Band - make mission essential elements and security controls difficult
for attackers to reach logically and physically (to reduce accessibility);
Tenet 3: Detect, React, Adapt - confound the attacker by implementing sensing system elements with
dynamic response technologies (to counteract the attackers’ capabilities).
As a design methodology, the Tenets mitigate reverse engineering and subsequent attacks on complex systems.
Quantified by a Bayesian analysis and further justified by analytic properties of attack graph models, the Tenets
suggest concrete cyber security metrics for system assessment.
Radar sensors can be viewed as a limited wireless sensor network consisting of radar transmitter nodes, target nodes, and
radar receiver nodes. The radar transmitter node sends a communication signal to the target node, which then reflects it in
a known pattern to the radar receiver nodes. This type of wireless sensor network is susceptible to the same types of
attacks as a traditional wireless sensor network, but there is less opportunity for defense. The target nodes in the network
are unable to validate the return signal, and they are often uncooperative. This leads to ample opportunities for spoofing
and man-in-the-middle attacks. This paper explores some of the fundamental techniques that can be used against a
limited wireless network system, as well as the techniques that can be used to counter them.
The motivation for this work comes from a desire to improve the resilience of mission-critical cyber-enabled systems,
including those used in critical infrastructure domains such as cyber, power, water, fuel, finance, healthcare,
agriculture, and manufacturing. Resilience can be defined as the ability of a system to persistently meet its performance
requirements despite the occurrence of adverse events. Characterizing the resilience of a system requires a clear
definition of the performance requirements of the system of interest and an ability to quantify the impact of the
adverse events of concern on performance. A quantitative characterization of system resilience allows the resilience requirements
to be included in the system design criteria. Resilience requirements of a system are derived from the service level
agreements (SLAs), measures of effectiveness (MOEs), and measures of performance (MOPs) of the services or
missions supported by the system. This paper describes a methodology for designing resilient systems. The components
of the methodology include resilience characterization for threat models associated with various exposure modes,
requirements mapping, subsystem ranking based on criticality, and selective implementation of mitigations to improve
system resilience to a desired level.
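One deliberately simple way to quantify "persistently meets its performance requirements" is the fraction of time the measured performance stays above a requirement threshold. The metric, trace, and threshold below are illustrative assumptions, not the characterization used in the methodology itself:

```python
def resilience(performance, requirement):
    """Fraction of time steps during which measured performance meets
    or exceeds the requirement (a toy resilience metric)."""
    met = sum(1 for p in performance if p >= requirement)
    return met / len(performance)

# Hypothetical performance trace with a dip caused by an adverse
# event around t = 3..4, followed by recovery.
trace = [1.0, 1.0, 1.0, 0.4, 0.5, 0.8, 1.0, 1.0]
print(resilience(trace, requirement=0.7))  # 0.75
```

A metric of this shape makes resilience a number that can appear directly in design criteria, e.g. "resilience >= 0.99 against the threat models of concern."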
This paper presents a model that brings data input/output under control in closed network systems, maintains the system
securely, and controls the flow of information through a Main Control Computer, which also protects network traffic
against cyber-attacks. The network, which can be controlled single-handedly thanks to a system designed to enable
network users to enter data into the system or extract data from it securely, is intended to minimize security gaps.
Moreover, data input/output records can be kept by means of the user account assigned to each user, making retroactive
tracking possible on request. Because the cyber-security measures that would otherwise need to be taken for each
computer on the network incur high cost, this model is intended to provide a cost-effective working environment in
which only the Main Control Computer requires up-to-date hardware.
Wars previously waged on land and sea have extended into air and space due to advances in aircraft and satellite systems. Rapid improvements in information technologies have given rise to the concept of cyberspace, which is considered the fifth dimension of war. While transferring information quickly from the physical domain to the electronic/digital domain, cyberspace has given rise to many threats and methods, such as cyber-attack, cyber-crime, and cyber war, which are spreading rapidly. Individuals, institutions, and organizations have begun to take their own cyber-security precautions to cope with these threats. This study provides information about the concepts of and advances in cyberspace in order to raise comprehensive awareness. It also focuses on the effects of these developments on the battlefield and analyzes them.
A theoretical possibility of non-resonant, fast, and efficient (up to 40 percent) heating of very thin conducting cylindrical targets by broad electromagnetic beams was predicted in [Akhmeteli, arXiv:physics/0405091 and 0611169] based on a rigorous solution of the diffraction problem. The diameter of the cylinder can be orders of magnitude smaller than the wavelength (for the transverse geometry) or the beam waist (for the longitudinal geometry) of the electromagnetic radiation. Experimental confirmation of the above results is presented [Akhmeteli, Kokodiy, Safronov, Balkashin, Priz, Tarasevitch, arXiv:1109.1626 and 1208.0066].
Developing a customized micro-mirror array (MMA) is costly and time consuming, and characterization experiments can be nearly as labor-intensive as fabrication. It is therefore desirable to have a computational simulation as a cost-effective and efficient means to conduct exploratory investigations, so that appropriate parameters and optical characteristics can be obtained prior to any physical tests. In this article, we present our simulation of a novel MMA and a preliminary diffraction analysis. Using geometric mapping, we created a three-dimensional visualization of the 5 × 5 MMA based on user-specified poses, which provides an interactive virtual view of the MMA geometry. A number of parameters allow customizable simulation of various micro-mirror poses within the hardware limits. In addition, the far-field diffraction is simulated based on the given mirror geometry. This simulation package provides a tool for studying the optical properties of MMAs, and it enables computational means to evaluate the performance of an MMA as a beam-steering element of Infrared Countermeasure (IRCM) systems.
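For intuition, a far-field (Fraunhofer) computation of this kind can be sketched in one dimension by treating each mirror as a point emitter whose phase is set by its piston displacement. The wavelength, pitch, and piston values below are arbitrary illustrative choices, not the parameters of the simulated MMA:

```python
from cmath import exp, pi

def far_field_intensity(pistons, wavelength, pitch, angles):
    """Far-field intensity of a 1-D mirror array under the small-angle
    approximation. Each mirror m contributes a unit emitter with phase
    from its piston z (round trip, factor 2) plus its position m*pitch."""
    k = 2 * pi / wavelength
    out = []
    for theta in angles:
        field = sum(exp(1j * (k * 2 * z + k * pitch * m * theta))
                    for m, z in enumerate(pistons))
        out.append(abs(field) ** 2)
    return out

# A flat 5-mirror array concentrates energy on axis (theta = 0).
flat = far_field_intensity([0.0] * 5, wavelength=4e-6, pitch=1e-4,
                           angles=[-0.02, 0.0, 0.02])
print(flat[1] == max(flat))  # True
```

Changing the piston pattern redistributes the intensity across angles, which is the basic beam-steering mechanism the full simulation evaluates in two dimensions.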
We discuss a robust method for quantifying change in multi-temporal remote sensing point data in the presence of affine registration errors. Three-dimensional image processing algorithms can be used to extract and model an electronic module, consisting of a self-contained assembly of electronic components and circuitry, using an ultrasound scanning sensor. Mutual information (MI) is an effective measure of change. We propose a multi-resolution 3D fractal algorithm that is a novel extension of MI and regional mutual information (RMI). Our method is called fractal mutual information (FMI). This extension efficiently takes the neighborhood fractal patterns of corresponding voxels (3D pixels) into account. The goal of this system is to quantify the change in a module due to tampering and to provide a method for quantitative and qualitative change detection and analysis.
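The plain MI building block can be sketched from its histogram definition. This is the standard plug-in estimator on quantized intensities, not the FMI extension itself; the bin count and the uniform test volumes are illustrative assumptions:

```python
from collections import Counter
from math import log
import random

def mutual_information(a, b, bins=8):
    """Mutual information (in nats) between two equal-length intensity
    sequences in [0, 1), estimated from their joint histogram."""
    q = lambda x: min(int(x * bins), bins - 1)   # quantize to a bin index
    pairs = [(q(x), q(y)) for x, y in zip(a, b)]
    n = len(pairs)
    pxy = Counter(pairs)                          # joint counts
    px = Counter(x for x, _ in pairs)             # marginal counts
    py = Counter(y for _, y in pairs)
    return sum(c / n * log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

random.seed(0)
vol = [random.random() for _ in range(4096)]      # flattened voxel volume
noise = [random.random() for _ in range(4096)]    # unrelated second scan
# Identical volumes share far more information than unrelated ones.
print(mutual_information(vol, vol) > mutual_information(vol, noise))  # True
```

In the change-detection setting, a drop in MI between corresponding regions of two scans flags a possible modification; the fractal extension additionally weighs neighborhood voxel patterns.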
Data sets are often modeled as point clouds lying in a high-dimensional space. In practice, they usually reside on or near a much lower-dimensional manifold embedded in the ambient space; this feature allows both a simple representation of the data and accurate performance for statistical inference procedures such as estimation, regression, and classification. In this paper we propose a framework based on geometric multi-resolution analysis (GMRA) to tackle the problem of classifying data lying around a low-dimensional set M embedded in a high-dimensional space R^D. We test our algorithms on real data sets and demonstrate their efficacy in the presence of noise.
We present an optical cybersecurity crypto-module as a resilient cyber defense agent. It has no hardware signature since it is bitstream-reconfigurable: a single hardware architecture functions as any selected device among all possible devices with the same number of inputs. For a two-input digital device, a 4-digit bitstream of 0s and 1s determines which of a total of 16 devices the hardware performs as. Accordingly, the hardware itself is not physically reconfigured, but its performance is. Such a defense agent allows an attack to take place, rendering it harmless. On the other hand, if the system is already infected with malware that sends out information, the defense agent allows the information to go out, rendering it meaningless. The hardware architecture is immune to side attacks, since such an attack would reveal information about the attack itself and not about the hardware. This cyber defense agent can be used to secure a point-to-point link, a point-to-multipoint link, a whole network, and/or a single entity in cyberspace, thereby ensuring trust between cyber resources. It can provide secure communication over an insecure network. We provide the hardware design and explain how it works. Scalability of the design is briefly discussed. (Protected by United States Patents No. US 8,004,734; US 8,325,404; and other national patents worldwide.)
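The bitstream selection of a two-input device can be made concrete with a small software analogue: the 4 bits act as a truth table, so one fixed mechanism realizes all 16 possible two-input gates. This is a lookup-table model for illustration, not the optical hardware, and the bit-ordering convention below is an assumption:

```python
def make_device(bitstream: str):
    """Return a two-input logic device selected by a 4-bit bitstream.
    The bitstream is read as a truth table: bit k is the output for
    inputs (a, b) with k = 2*a + b."""
    assert len(bitstream) == 4 and set(bitstream) <= {"0", "1"}
    table = [int(bit) for bit in bitstream]
    return lambda a, b: table[2 * a + b]

# Under this convention, "0001" is AND, "0111" is OR, "0110" is XOR.
and_gate = make_device("0001")
xor_gate = make_device("0110")
print(and_gate(1, 1), xor_gate(1, 0))  # 1 1
```

Reloading the bitstream changes which device the same mechanism performs as, which is why the agent presents no fixed hardware signature to an attacker probing its behavior.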