The recent war on terrorism and the rise of urban warfare have been major catalysts for increased interest in the development of disposable, unattended wireless ground sensors. While the application of these sensors to hostile domains has generally been governed by specific tasks, this research explores a unique paradigm capitalizing on the fundamental functionality of sensor systems. This functionality comprises a sensor's ability to Sense (multi-modal sensing of environmental events), Decide (smart analysis of sensor data), Act (response to environmental events), and Communicate (internally within the system and externally to humans) (SDAC). The main concept behind SDAC sensor systems is to integrate hardware, software, and networking to generate 'knowledge and not just data'. This research explores the use of wireless SDAC units that collectively make up a sensor system capable of persistent, adaptive, and autonomous behavior. These systems are based on the evaluation of scenarios and existing systems covering various domains. This paper presents a promising view of sensor network characteristics, which will eventually yield smart (intelligent collective) network arrays of SDAC sensing units generally applicable to multiple related domains. This paper also discusses and evaluates the demonstration system developed to test the concepts related to SDAC systems.
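The Sense-Decide-Act-Communicate cycle described above can be sketched as a simple per-node loop. This is an illustrative sketch only, assuming a single normalized reading, a fixed decision threshold, and a dictionary message format, none of which come from the paper itself.

```python
import random

# Hypothetical sketch of one pass through an SDAC node's loop.
# ALARM_THRESHOLD and the message fields are illustrative assumptions.
ALARM_THRESHOLD = 0.8  # assumed normalized level that warrants a response

def sense():
    """Sense: read an environmental value (simulated here)."""
    return random.random()

def decide(reading):
    """Decide: smart analysis of the raw reading."""
    return reading > ALARM_THRESHOLD

def act(alarm):
    """Act: respond to the environmental event."""
    return "respond" if alarm else "idle"

def communicate(reading, action):
    """Communicate: report knowledge, not just raw data."""
    return {"reading": round(reading, 3), "action": action}

def sdac_step():
    reading = sense()
    action = act(decide(reading))
    return communicate(reading, action)

print(sdac_step())
```

The point of the sketch is that the message leaving the node carries an interpreted action alongside the reading, i.e. knowledge rather than data alone.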
Wireless sensor networks allow detailed sensing of otherwise
unknown and inaccessible environments. While it would be
beneficial to include cameras in a wireless sensor network because
images are so rich in information, the power cost of transmitting
an image across the wireless network can dramatically shorten the
lifespan of the sensors. This paper investigates various
compression techniques and the cost these algorithms impose on the
lifespan of the sensor nodes. We further describe a new
paradigm for cameras and wireless networks. Rather than focusing
on transmitting images across the network, we show how an image
can be processed locally for key features using simple detectors.
Contrasted with traditional event detection systems that trigger
an image capture, this enables a new class of sensors that uses a
low-power imaging sensor to detect a variety of visual cues.
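The local-processing idea above can be illustrated with one of the simplest possible detectors: frame differencing, in which the node compares consecutive grayscale frames and transmits only a few bytes summarizing the visual cue. The frames, thresholds, and message format below are illustrative assumptions, not the detectors evaluated in the paper.

```python
# Sketch: detect a visual cue locally instead of transmitting the image.
# Both thresholds are assumed values for illustration.
DIFF_THRESHOLD = 30      # assumed per-pixel change that counts as motion
MIN_CHANGED_PIXELS = 4   # assumed changed-pixel count to report an event

def detect_motion(prev_frame, frame):
    """Return (event, changed_count) by differencing two grayscale frames."""
    changed = sum(
        1
        for prev_row, row in zip(prev_frame, frame)
        for p, q in zip(prev_row, row)
        if abs(p - q) > DIFF_THRESHOLD
    )
    return changed >= MIN_CHANGED_PIXELS, changed

# Two tiny 4x4 grayscale "frames": a bright 2x2 blob appears in the second.
prev_frame = [[10] * 4 for _ in range(4)]
frame = [row[:] for row in prev_frame]
for r in (1, 2):
    for c in (1, 2):
        frame[r][c] = 200

event, changed = detect_motion(prev_frame, frame)
# Only this short summary, not the frame, would cross the radio link.
print({"event": event, "changed_pixels": changed})
```

Transmitting this summary costs a handful of bytes, versus kilobytes for even a heavily compressed image, which is the power trade-off the abstract highlights.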
The wireless intelligent monitoring and analysis system is a proof of concept directed at discovering solutions for
providing decentralized intelligent data analysis and control for distributed containers equipped with wireless sensing
units. The objective was to embed smart behavior directly within each wireless sensor container, through the
incorporation of agent technology into each sensor suite. This approach provides intelligent directed fusion of data based
on a social model of teaming behavior. This system demonstrates intelligent sensor behavior that converts raw sensor
data into group knowledge to better understand the integrity of the complete container environment. The emergent team
behavior is achieved with lightweight software agents that analyze sensor data based on their current behavior mode.
When the system starts up or is reconfigured, the agents self-organize into virtual random teams based on the
leader/member/lonely paradigm. The team leader collects sensor data from its members and investigates all abnormal
situations to determine the legitimacy of high sensor readings. The team leaders flag critical situations and report this
knowledge back to the user via a collection of base stations. This research provides insight into the issues and
concerns associated with integrating the multi-disciplinary fields of software agents, artificial life, and
autonomous sensor behavior into a complete system.
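The leader/member teaming behavior described above can be sketched as a leader aggregating its members' readings and flagging a critical situation only when high readings are corroborated. The class names, the quorum-style corroboration rule, and the thresholds below are assumptions for illustration, not the agents' actual decision logic.

```python
# Sketch of a team leader investigating abnormal readings.
# HIGH_READING and QUORUM are assumed values, not from the paper.
HIGH_READING = 75.0   # assumed sensor level considered abnormal
QUORUM = 2            # assumed number of agreeing sensors to confirm

class SensorAgent:
    def __init__(self, name, reading):
        self.name = name
        self.reading = reading

class TeamLeader(SensorAgent):
    def __init__(self, name, reading, members):
        super().__init__(name, reading)
        self.members = members

    def investigate(self):
        """Collect team data and flag only corroborated high readings."""
        team = [self] + self.members
        high = [a.name for a in team if a.reading > HIGH_READING]
        return {"critical": len(high) >= QUORUM, "high_sensors": high}

members = [SensorAgent("m1", 80.0), SensorAgent("m2", 40.0),
           SensorAgent("m3", 82.5)]
leader = TeamLeader("leader", 30.0, members)
report = leader.investigate()  # would be relayed to a base station
print(report)
```

Requiring corroboration before flagging is one way a leader could "determine the legitimacy of high sensor readings", converting individual raw values into group knowledge as the abstract describes.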