An innovative approach to building operation, called the adaptive optimal operation methodology (AOOM), is developed and validated in this study. The AOOM, which resides in the building energy management and control system, estimates the building and heating, ventilating, and air conditioning (HVAC) system loads and parameters and supplies the local controllers with the optimal set points that minimize energy cost while maintaining occupant comfort. The AOOM uses the recursive least squares method with an adaptive forgetting factor to estimate the parameters of the building zones and HVAC systems. A genetic algorithm optimizer together with a system model is then used to generate the optimal set points, such as the supply air temperature set point as well as the minimum air flow rate and zone temperature set points for each zone. The system model is validated through different types of experiments. System-level validation experiments conducted during the heating and cooling seasons indicate that the HVAC system operated under the AOOM consumes 3 to 10.8% less heated-water energy during the heating season and 1 to 4% less electrical energy during the cooling season when compared with a commercial operation methodology.
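The estimator at the core of the methodology can be sketched as follows: a scalar recursive least squares (RLS) update with a forgetting factor, fitting y = theta·x online. This is a minimal sketch, not the paper's implementation; the fixed lambda and all numeric values are illustrative stand-ins for the adaptive forgetting factor described above.

```python
# Sketch of recursive least squares (RLS) with a forgetting factor, the kind
# of estimator used for zone/HVAC parameters. Scalar model y = theta * x;
# the fixed lambda here is an illustrative stand-in for the paper's adaptive
# forgetting factor.

def make_rls(theta0=0.0, p0=100.0, lam=0.95):
    state = {"theta": theta0, "p": p0}

    def update(x, y):
        p, theta = state["p"], state["theta"]
        k = p * x / (lam + x * p * x)                 # gain
        state["theta"] = theta + k * (y - x * theta)  # correct the estimate
        state["p"] = (p - k * x * p) / lam            # discount old information
        return state["theta"]

    return update

rls = make_rls()
for x in range(1, 21):            # noiseless data from y = 2.5 x
    theta = rls(float(x), 2.5 * x)
```

With a forgetting factor below one, recent measurements dominate, which is what lets the estimator track slowly drifting building parameters.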
In today's global world, manufacturers face many challenges, such as product design with distributed and collaborative workflows. Complexity in collaborative product design arises from the need to synthesize different perspectives on a problem. Specifically, identifying dependencies in the product design process, as well as integrating and sharing computing applications among design teams, is critical for manufacturing efficiency. Web services are considered key to collaborative product design over the Internet. Web services alone are passive, whereas agents can provide alerts and updates when new information becomes available. In this paper, an agent-based Web services architecture is proposed and applied to augment manufacturability. The agent-based Web services architecture not only makes system interoperation feasible but also increases the efficiency of distributed collaboration.
The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer-driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors, including conflicting objectives, mismatched incentives, and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of a synthesized supply chain system can be evaluated quantitatively; and (3) reachability analysis is used to evaluate system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.
Due to increased competition in today's global environment, companies are embracing Collaborative Product Development (CPD) as a path to success in new product development. CPD is an engineering process that involves decision making through iterative communication and coordination among product designers throughout the product lifecycle. The high level of collaboration and communication in a CPD environment requires a robust distributed information system. In this paper, we review existing research related to CPD and propose an information framework, termed VE4PD, based on the integration of Web services and agent technologies to manage the CPD process across the product lifecycle. VE4PD maintains information consistency and enables proactive information updates to facilitate faster and more efficient CPD. In addition, the use of agent technology helps to leverage the integration of server and client applications. An implementation system is developed to validate the application.
This paper presents a methodology for issue resolution in conceptual design. Conceptual design is the preliminary phase of design, in which both well-defined problem specifications and high-level design solutions are developed. It has been shown that eighty-five percent of lifecycle costs are determined during the conceptual design phase of product development. The architecture of conceptual design generally comprises three modules: problem definition, issue resolution, and concept formation. Problem definition generates a problem description, a set of requirements, and/or a set of design constraints. After the problem is defined, consultation with professionals may be necessary to identify the actual issues relevant to the design. A more objective approach for convincing participants of the correctness of the chosen issues is to use data. After the design issues are identified, the conflicts among them need to be resolved in the issue resolution module. The proposed approach provides a methodology for designers to determine solutions to the appropriate issues that fit the problem definition, and it improves the efficiency of the conceptual design process. It is based on an intuitively appealing methodology, the analytic hierarchy process (AHP). Our approach has been applied in the magnesium alloy industry to the conceptual design of WIP containers, where the conflicts among the identified issues were resolved successfully. This real-world case demonstrates the practicality and efficiency of the proposed methodology.
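The AHP machinery the methodology rests on can be sketched briefly: priorities are derived from a pairwise comparison matrix (here via the common row geometric-mean approximation) and checked with the standard consistency ratio. The 3×3 matrix below is an invented, perfectly consistent example, not the WIP-container case from the paper.

```python
from math import prod

# Minimal AHP sketch: priority weights from row geometric means of a pairwise
# comparison matrix, plus the standard consistency ratio (CR) check.

def ahp_priorities(matrix):
    n = len(matrix)
    gm = [prod(row) ** (1.0 / n) for row in matrix]   # row geometric means
    total = sum(gm)
    weights = [g / total for g in gm]
    # lambda_max estimated from the weighted-sum check (A w)_i / w_i
    lam_max = sum(
        sum(matrix[i][j] * weights[j] for j in range(n)) / weights[i]
        for i in range(n)
    ) / n
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # random consistency indices
    ci = (lam_max - n) / (n - 1) if n > 2 else 0.0
    cr = ci / ri[n] if ri[n] else 0.0
    return weights, cr

# Consistent example: issue A is judged twice B and four times C.
A = [[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]]
w, cr = ahp_priorities(A)
```

A CR below 0.1 is the usual threshold for accepting the pairwise judgments; a perfectly consistent matrix yields CR near zero.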
This paper examines the forecasting performance of multi-layer feedforward neural networks in modeling a particular foreign exchange rate, the Japanese Yen/US Dollar. The effects of two learning methods, back-propagation and a genetic algorithm, were investigated with the neural network topology and other parameters held fixed. Early results indicate that this hybrid system seems well suited to forecasting foreign exchange rates. The neural networks and genetic algorithm were programmed using MATLAB®.
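The genetic-algorithm side of the comparison can be sketched as follows: a fixed 1-2-1 feedforward network whose weights are evolved by truncation selection plus Gaussian mutation with elitism. The topology, rates, and the toy target function are illustrative stand-ins, not the Yen/USD setup or the MATLAB® code from the paper.

```python
import random, math

# GA-as-trainer sketch: evolve the weights of a fixed feedforward network
# (1 input, 2 tanh hidden units, linear output) to fit a toy target function.

def net(w, x):
    h1 = math.tanh(w[0] * x + w[1])
    h2 = math.tanh(w[2] * x + w[3])
    return w[4] * h1 + w[5] * h2 + w[6]

def mse(w, data):
    return sum((net(w, x) - y) ** 2 for x, y in data) / len(data)

random.seed(0)
data = [(x / 10.0, (x / 10.0) ** 2) for x in range(11)]   # fit y = x^2 on [0, 1]
pop = [[random.uniform(-1, 1) for _ in range(7)] for _ in range(30)]
initial_best = min(mse(w, data) for w in pop)
for _ in range(150):
    pop.sort(key=lambda w: mse(w, data))
    parents = pop[:10]                       # truncation selection (elitist)
    children = [[g + random.gauss(0, 0.1) for g in random.choice(parents)]
                for _ in range(20)]
    pop = parents + children
final_best = min(mse(w, data) for w in pop)
```

Because the best parents are carried over unchanged each generation, the best fitness is non-increasing, which makes the GA a drop-in alternative to back-propagation when gradients are unavailable or unreliable.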
This paper proposes a new method for analyzing the solution space of multi-factor manufacturing scheduling problems. The proposed method is introduced together with two new concepts: the relation matrix and the decision matrix. The method simplifies a multi-factor problem into a number of two-factor sub-problems, which are then analyzed individually. Closed-form expressions for the number of feasible solutions to multi-device, multi-worker, and multi-task problems are obtained. The method can be used not only to calculate the number of possible or feasible solutions, but also to obtain these solutions in simple cases. It is particularly useful in very complex situations, since the results of solution space analysis can help choose appropriate techniques or algorithms for solving complex scheduling problems.
A new generation of industrial robots needs reliable perceptual systems similar to human vision. Human vision is based on the principles of image understanding and active vision. Both principles can be realized in the form of Network-Symbolic systems. Image/video analysis based on Network-Symbolic principles differs significantly from traditional image analysis. Instead of precise computation of 3-dimensional models, such a system converts an image into an "understandable" relational format similar to knowledge models. It is hard to use geometric operations to process natural images. Instead, the brain builds a relational network-symbolic structure of the visual scene, using different cues to establish the relational order of surfaces and objects with respect to the observer and to each other. Spatial order can be represented as a connection graph. There is a generic logic of 3-D structures, based on relational changes of object views in the visual or object buffers. In Network-Symbolic models, the derived structure, not the primary view, is the subject of recognition. Such recognition is not affected by local changes and appearances of the object as seen from a set of similar views. Integrated into industrial robotic systems, Network-Symbolic models can intelligently interpret images and video.
Holding the workpiece for machining, forming, assembly, or inspection operations is a universally encountered problem in the manufacturing world. The apparatus used to accomplish this is a fixture. Using efficient fixtures is a good way to improve process throughput by reducing part setup time, which can be defined as the time required to locate the part safely in the desired position to allow machining. Identification of design requirements, fixture analysis, and fixture synthesis are the phases of computer-aided fixture design. Typical fixture design systems focus on one of these phases, as the knowledge representation requirements differ for each stage. This paper discusses these phases with respect to the general manufacturing scheme and identifies the integration issues that must be addressed for a successful variant approach to the fixture selection problem. A system architecture is discussed that is based on a new graph-based integrated fixture representation. This architecture should facilitate the rapid exploration of existing fixtures and fixture assemblies to find suitable matches for new part manufacture.
Multivariate statistical techniques are used to analyze complex data sets with many independent and dependent variables. A dataset may be analyzed for relationships among variables based on correlation, significance of group differences based on variance and covariance, prediction of group membership, and prediction of the empirical or theoretical structure of the data. The choice among the available multivariate analysis techniques for each of these research questions depends on the nature of the variables, the number of independent and dependent variables, and whether the independent variables can be treated as covariates. This paper describes a software tool that can assist researchers in selecting the appropriate data analysis technique based on the research needs of the data. The data analysis techniques discussed in this paper are discriminant function analysis, multi-way frequency analysis, and logistic regression. The structure underlying a dataset is addressed by multivariate approaches such as principal components analysis, factor analysis, and structural equation modeling. The paper illustrates the software tool on Fisher's Iris data set.
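The kind of decision logic such a tool encodes can be sketched as a small rule-based selector mapping variable types onto the techniques discussed above. The rules below are a simplified illustration under common textbook assumptions, not the tool's actual logic.

```python
# Toy technique selector: choose a multivariate analysis based on the nature
# of the dependent variable (dv_type), the predictor types (iv_types), and
# the research goal. Simplified, illustrative decision rules only.

def suggest_technique(dv_type, iv_types, goal):
    if goal == "structure":
        return "principal components / factor analysis"
    if dv_type == "categorical" and all(t == "categorical" for t in iv_types):
        return "multi-way frequency analysis"
    if dv_type == "categorical" and all(t == "continuous" for t in iv_types):
        return "discriminant function analysis"
    if dv_type == "categorical":
        return "logistic regression"   # mixed predictors, fewer assumptions
    return "multiple regression"
```

For example, predicting iris species (categorical) from four continuous measurements routes to discriminant function analysis, while mixed predictor types route to logistic regression.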
This paper describes an Internet-based software tool developed for the West Virginia State Police Forensics Laboratory. The software enables law enforcement agents to submit crime information to the Forensic Laboratory via a secure Internet connection. Online electronic forms were created to mirror the existing paper-based forms, making the transition easier. The process of submitting case information was standardized and streamlined, thereby minimizing information inconsistency. Once gathered, the crime information is automatically stored in a database and can be viewed and queried by authorized law enforcement officers. The software tool will be deployed in all counties of West Virginia.
Scheduling problems concern the allocation of limited resources over time among both parallel and sequential activities. The majority of these problems are NP-hard. Agent-based approaches have recently been applied to solve some of these difficult problems, particularly distributed scheduling problems. Load balancing has been adopted as an optimization criterion for several scheduling problems. However, in many practical situations, a load-balanced solution may not be feasible or attainable. To deal with this limitation, this paper presents a generic mathematical model of load distribution for resource allocation, called the desired load distribution. The objective is to develop a model for scheduling general parallel machines that can be used both in centralized resource management settings and in agent-based distributed scheduling systems. Unlike many existing agent-based scheduling systems, this model attempts to obtain a globally optimal solution through many-to-many task/resource allocation instead of one-to-many negotiation approaches.
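The idea of allocating against a desired, not necessarily balanced, load distribution can be sketched greedily: each unit task goes to the machine with the largest remaining gap between its desired load and its current load. The capacities and task counts below are illustrative, and the greedy rule is a sketch of the concept rather than the paper's model.

```python
# Greedy allocation toward a desired load distribution: machines need not end
# up equally loaded; they converge to their individual desired loads instead.

def allocate(desired, n_tasks):
    loads = [0] * len(desired)
    for _ in range(n_tasks):
        gaps = [d - l for d, l in zip(desired, loads)]
        i = gaps.index(max(gaps))      # most under-loaded relative to its target
        loads[i] += 1
    return loads

# Machine 0 is twice as capable, so its desired share is twice the others'.
loads = allocate(desired=[4, 2, 2], n_tasks=8)
```

Plain load balancing is the special case where all desired loads are equal.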
Advances in automated data collection tools in design and manufacturing have far exceeded our capacity to analyze the resulting data for novel information. Techniques of data mining and knowledge discovery in large databases promise computationally efficient and accurate means of analyzing such data for patterns and similar structures. In this paper, we present a unique data mining approach for finding similarities in classes of 3D models using the discovery of association rules. PCA is first performed on a 3D model to align it along its first principal axis. The transformed 3D model is then sliced and segmented along multiple principal axes, such that each slice can be interpreted as a transaction in a transaction database. Association-rule discovery is performed on this transaction space for multiple models, and the association rules common to those transactions are stored as a representative of a class of models. We have evaluated the performance of association rules for the efficient representation of classes of shape models. The method is time and space efficient, besides presenting a novel paradigm for searching content dependencies in a database of 3D models.
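The first step of the pipeline, aligning a model along its first principal axis, can be sketched with power iteration on the covariance matrix. A 2-D toy "model" is used here for brevity (the paper works with 3-D models), and all points are invented.

```python
# PCA alignment sketch: find the first principal axis of a point model via
# power iteration on the 2x2 covariance matrix, then express each point in
# along-axis / across-axis coordinates. Slicing the along-axis coordinate
# would then yield the "transactions" for association-rule discovery.

def first_principal_axis(pts, iters=100):
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    dx = [p[0] - mx for p in pts]
    dy = [p[1] - my for p in pts]
    cxx = sum(v * v for v in dx) / n
    cyy = sum(v * v for v in dy) / n
    cxy = sum(a * b for a, b in zip(dx, dy)) / n
    vx, vy = 1.0, 0.0                      # power iteration on the covariance
    for _ in range(iters):
        nx, ny = cxx * vx + cxy * vy, cxy * vx + cyy * vy
        norm = (nx * nx + ny * ny) ** 0.5
        vx, vy = nx / norm, ny / norm
    return (mx, my), (vx, vy)

pts = [(-2, -2.1), (-1, -0.9), (0, 0.1), (1, 1.05), (2, 1.9)]
(mx, my), (vx, vy) = first_principal_axis(pts)
along = [(p[0] - mx) * vx + (p[1] - my) * vy for p in pts]
across = [-(p[0] - mx) * vy + (p[1] - my) * vx for p in pts]
```

After alignment, almost all of the variance lies along the first axis, which is what makes slicing along it a stable, pose-invariant decomposition.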
Effective sensor placement methodologies are desired for the distributed sensor networks frequently encountered in military, environmental, and nano-biotechnology applications. The goal is to provide a (sub-)optimal framework for sensor resource management, placing the sensors such that they provide accurate coverage within the required location and range probability. The problem is not trivial, as the sensors might not be of equal capacity, the terrain upon which the sensors are deployed might have many obstacles, and some sensors might fail. In some applications, areas of the sensor field are marked as preferential, with a high desired probability of detection and coverage. In this paper, we propose a unique sensor placement computing framework for preferential coverage of the sensor field that tries to deploy a minimum number of sensors. The proposed approach treats the sensor field as an image, which provides the advantage of pixel-level accuracy in sensor placement. A unique algorithm is presented that initially concentrates on the preferential regions and then proceeds to cover the remaining uncovered regions of the sensor field. Our approach has shown significant improvement in time performance compared with a greedy approach and has strong potential for several mission-critical applications.
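The two-pass idea, preferential regions first, then the rest of the field, can be sketched on a pixel grid. The grid size, sensing radius, and placement-at-uncovered-cell rule below are illustrative simplifications, not the paper's algorithm.

```python
# Pixel-level placement sketch: the field is a grid (the "image"); pass 1
# covers preferential cells, pass 2 sweeps the remaining field, placing a
# sensor at each still-uncovered cell.

def place_sensors(width, height, preferential, radius=2):
    covered, sensors = set(), []

    def cover(cx, cy):
        sensors.append((cx, cy))
        for x in range(width):
            for y in range(height):
                if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                    covered.add((x, y))

    all_cells = [(x, y) for x in range(width) for y in range(height)]
    for region in (preferential, all_cells):    # preferential region first
        for cell in region:
            if cell not in covered:
                cover(*cell)
    return sensors, covered

pref = [(1, 1), (1, 2), (2, 1), (2, 2)]
sensors, covered = place_sensors(8, 8, pref)
```

Treating the field as an image means obstacles or sensing-probability maps can be folded in simply by masking cells out of the coverage test.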
The impact between rigid bodies with friction is studied. The rolling friction moment and the coefficient of rolling friction are introduced, and an improved mathematical model of impact with friction is presented. The influence of the moment of rolling friction on the energy dissipated by friction during the impact is analyzed. For a simple pendulum, using the energetic coefficient of restitution, more energy is dissipated for larger values of the coefficient of kinetic friction and contact radius, and for smaller values of the length of the beam.
This paper presents the results of simulations and experiments characterizing dry machining with a new design technology: a heat pipe installed in a cutting tool to remove the heat produced at the tool-chip interface, which causes thermal damage and tool wear. Analysis of the results with a heat transfer finite element model indicates that the particular heat pipe used was capable of removing heat with a significant reduction in the rise of the tool-chip interface temperature above the surrounding environment at steady operating conditions. Measurements of the variation of the tool insert temperature with time are reported. Both cases, with and without the heat pipe, were considered. In the end, the project yields a dry machining characterization of the influence of the embedded heat pipe on the mechanical properties of the insert and workpiece, the tool-chip interface temperature, and tool wear. The results of this study are useful for cutting tool design and implementation in environmentally conscious manufacturing applications.
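The steady-state effect reported above can be illustrated with a lumped-capacitance sketch: modeling the heat pipe as a larger effective heat-transfer conductance draining the tool-chip heat input lowers the steady temperature rise. All numbers (heat input, conductances, capacitance) are invented for illustration and do not come from the paper's finite element model.

```python
# Lumped thermal sketch: dT/dt = (Q_in - h*(T - T_amb)) / C, integrated with
# explicit Euler to near steady state. The heat pipe is modeled only as a
# larger conductance h; steady temperature is T_amb + Q_in / h.

def steady_tip_temperature(q_in, conductance, t_amb=25.0, c=50.0,
                           dt=0.01, steps=200000):
    t = t_amb
    for _ in range(steps):
        t += dt * (q_in - conductance * (t - t_amb)) / c
    return t

t_plain = steady_tip_temperature(q_in=200.0, conductance=0.5)   # no heat pipe
t_pipe = steady_tip_temperature(q_in=200.0, conductance=2.0)    # with heat pipe
```

Quadrupling the effective conductance cuts the steady temperature rise above ambient by a factor of four in this toy model, which is the qualitative trend the experiments report.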
Globalization and miniaturization, two trends in production technology, are driving R&D activities focused on agile micro-assembly systems with autonomous subcomponents. Besides product and process flexibility, agility in small production volumes can mainly be achieved by controlling the system in different operational modes and switching between them efficiently. Therefore, an agile micro-assembly structure is presented that can be controlled both in manual mode by teleoperation and in semi-automatic mode. To assure highly efficient use of production resources in all operational modes, several cooperating sensor components have been developed. Thus a human operator can focus on the main processes, while secondary processes such as adjusting and calibrating sensor modules are handled by autonomous functions. In automatically controlled systems, the same agents can speed up the main production processes and minimize setup times. To switch simply between manual and automatic modes, a smart teaching procedure is integrated into the control framework.
The semiconductor industry accounts for 1.3% to 2% of total US electricity consumption in the manufacturing sector. Energy in the form of electricity is required to operate the process tools, maintain clean room conditions, and run heating, ventilation, and air-conditioning (HVAC) units, chillers, and other equipment. The process tools account for 40% of the operating costs in a semiconductor fabrication unit. Since a significant amount of energy is used by the process tools, it becomes necessary to determine the process parameters that govern energy consumption. A model is built in this study to estimate the energy requirement of any particular process in semiconductor manufacturing based on the input variables. It is intended to enable the estimation of process energy beforehand through analysis of the process parameters governing energy. This paper also reports a sensitivity analysis of process variables with respect to energy. Physical energy measurement in a semiconductor fabrication unit is often time consuming and uneconomical. Research in this area will help production managers in semiconductor fabrication facilities to effectively select production parameters and use the process tools based on the results of the analysis.
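The combination of a parametric energy model and a sensitivity analysis can be sketched as follows: a simple two-term energy model for one process step and a one-at-a-time normalized sensitivity (percent change in energy per percent change in a parameter). The model form and every number below are invented placeholders, not the study's fitted model.

```python
# Sketch: parametric process-energy model + normalized one-at-a-time
# sensitivity d(ln E)/d(ln x), estimated by a small finite difference.

def process_energy(p):
    # energy = tool power * process time + overhead power * total time
    return p["tool_kw"] * p["proc_h"] + p["idle_kw"] * (p["proc_h"] + p["setup_h"])

def sensitivity(params, key, eps=1e-6):
    base = process_energy(params)
    bumped = dict(params, **{key: params[key] * (1 + eps)})
    return (process_energy(bumped) - base) / base / eps

params = {"tool_kw": 12.0, "idle_kw": 3.0, "proc_h": 2.0, "setup_h": 0.5}
s_tool = sensitivity(params, "tool_kw")
s_setup = sensitivity(params, "setup_h")
```

Ranking parameters by this elasticity tells a production manager which knobs matter most for energy before any physical measurement is attempted.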
We address the problem of efficiency in image texture analysis. Motivated by the statistical occupancy model, we introduce the notion of patch re-occurrences. Using these re-occurrences, we propose the use of approximate textural features in image analysis. We describe how the proposed approximate features can be extracted for Gabor filters, a popular texture analysis method. Preliminary results on image texture classification show that the proposed method can improve processing efficiency without introducing any significant degradation in classification results.
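The Gabor filtering that the approximate features build on can be sketched minimally: one oriented kernel (real part only) correlated with a small synthetic texture. A full extractor would run a bank of orientations and frequencies over every patch; the sizes and parameters here are illustrative.

```python
import math

# Minimal Gabor sketch: an oriented kernel responds strongly to a texture
# whose stripes match its orientation and wavelength, and weakly otherwise.

def gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.0):
    half = size // 2
    k = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            g = math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
            row.append(g * math.cos(2 * math.pi * xr / wavelength))
        k.append(row)
    return k

def response(kernel, image):   # correlation at the image centre
    return sum(kernel[j][i] * image[j][i]
               for j in range(len(kernel)) for i in range(len(kernel)))

stripes_x = [[math.cos(2 * math.pi * x / 4.0) for x in range(9)] for _ in range(9)]
stripes_y = [[math.cos(2 * math.pi * y / 4.0) for _ in range(9)] for y in range(9)]
k0 = gabor_kernel(theta=0.0)          # tuned to variation along x
r_match = response(k0, stripes_x)
r_orth = response(k0, stripes_y)
```

The cost the paper attacks is exactly this: the full filter-bank correlation is expensive per patch, so re-occurring patches need not be filtered again.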
Plant assessments have found that plant personnel are fairly uncertain about motor loading. Decisions about motor sizing are based on judgment or the simple slip method, to avoid the burden of rigorous load determination methods. Accurate load determination methods involve working near high voltages, and hence raise safety issues, are time consuming, and are uneconomical. A method that is simple to use, economical, safe, and fairly accurate is needed, on which motor sizing decisions can be based. This paper addresses data acquisition and analysis for tackling this issue. Data were collected from fifteen facility energy audits by carrying out motor load monitoring with an Amprobe™ data logger and DM II Pro acquisition software, as well as the stroboscope method. Statistical analysis can then be carried out to establish a relationship between the two, so that the actual load factor can be predicted from the load factor obtained by the slip method.
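The slip method the statistical model is calibrated against is a one-line formula: load factor ≈ measured slip divided by full-load slip, with synchronous speed determined by line frequency and pole count. The nameplate values in the example are typical for a 4-pole motor and are illustrative, not audit data from the paper.

```python
# Slip method for estimating motor load factor from a speed measurement.
# sync_rpm = 120 * f / poles; load ≈ (sync - measured) / (sync - full-load).

def slip_load_factor(measured_rpm, nameplate_full_load_rpm, poles, freq_hz=60.0):
    sync_rpm = 120.0 * freq_hz / poles
    return (sync_rpm - measured_rpm) / (sync_rpm - nameplate_full_load_rpm)

# A 4-pole, 60 Hz motor (1800 rpm synchronous, 1750 rpm at full load)
# measured at 1775 rpm is running at roughly half load.
lf = slip_load_factor(measured_rpm=1775.0, nameplate_full_load_rpm=1750.0, poles=4)
```

Its appeal is exactly what the paper exploits: speed can be read safely with a stroboscope, with no electrical connection, which is why a correction model mapping slip-method estimates to actual load factors is worth building.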
Virtual manufacturing is a new and emerging concept for integrating different areas of manufacturing by using computer technology for the creation and execution of virtual models. Virtual manufacturing is defined as a computer-based system that consists of evolving models of manufacturing systems and processes and is exercised to enhance one or more attributes of the real system. Manufacturing as a whole is a very complex system consisting of various interacting, interrelated, and interdependent subsystems and processes. Construction, validation, and calibration of an all-inclusive virtual manufacturing system is difficult and sometimes impossible. The object-oriented approach suggested here helps to simplify this task. The entire virtual system is constructed from smaller blocks, or objects. Each object is a sub-model that is created, validated, and calibrated independently. Objects are integrated at different levels to form higher-level subsystems, which in turn create the whole system. The concept is illustrated with the example of a virtual machining operation, representing one of the smallest building blocks in a comprehensive virtual manufacturing system. Results from real experiments were used to validate and calibrate the virtual machining operation and to prove its adequacy.
Rapid prototyping (RP) technology, such as Laser Engineered Net Shaping (LENS™), can be used to fabricate heterogeneous objects with gradient variations in material composition. These objects are generally characterized by enhanced functional performance. Past research on the design of such objects has focused on representation, modeling, and functional performance. However, the inherent constraints in RP processes, such as system capability and processing time, lead to heterogeneous objects that may not meet the designer's original intent. To overcome this situation, the research presented in this paper focuses on the identification and implementation of manufacturing constraints in the design process. A node-based finite element modeling technique is used for representation and analysis, and the multicriteria design problem corresponds to finding the nodal material compositions that minimize structural weight and maximize thermal performance. The optimizer used in this research is a real-valued Evolutionary Strategy (ES), which is well suited to this type of multi-modal problem. Two limitations of the LENS manufacturing process that affect the design process are identified and implemented. One of them is manufacturing time, which is considered an additional criterion to be minimized in the design problem for a preselected tool path. A brake disc rotor made of two materials, aluminum for light weight and steel for superior thermal characteristics, is used to illustrate the tradeoff between manufacturability and functionality.
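The optimizer can be sketched as a small (mu + lambda) evolution strategy with Gaussian mutation over nodal material fractions in [0, 1]. The weighted-sum objective below (lighter aluminum versus better thermal behavior of steel) is an invented stand-in for the paper's FE-based criteria, and all parameters are illustrative.

```python
import random

# (mu + lambda) evolution strategy sketch: Gaussian mutation, clamping to
# [0, 1], and truncation ("plus") selection over parents and children.

def objective(x, w_weight=0.5, w_thermal=0.5):
    weight = sum(x) / len(x)                            # more steel -> heavier
    thermal = sum((1 - xi) ** 2 for xi in x) / len(x)   # less steel -> worse
    return w_weight * weight + w_thermal * thermal

def es_minimize(n=6, mu=5, lam=20, gens=100, sigma=0.1, seed=1):
    random.seed(seed)
    pop = [[random.random() for _ in range(n)] for _ in range(mu)]
    start_best = min(objective(p) for p in pop)
    for _ in range(gens):
        children = []
        for _ in range(lam):
            parent = random.choice(pop)
            children.append([min(1.0, max(0.0, g + random.gauss(0, sigma)))
                             for g in parent])
        pop = sorted(pop + children, key=objective)[:mu]   # plus selection
    return pop[0], start_best

best, start_best = es_minimize()
```

Because parents compete with their children, the best objective value never worsens; adding a manufacturing-time term to the objective, as the paper does, changes only the fitness function, not the loop.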
An embedded system is a hybrid of hardware and software, combining the flexibility of software with the real-time performance of hardware. Embedded systems can be considered assemblies of hardware and software components. An Open Embedded System Model (OESM) is currently being developed at NIST to provide a standard representation and exchange protocol for embedded systems and for system-level design, simulation, and testing information. This paper proposes an approach to representing an embedded system feature-based model in OESM, i.e., an Open Embedded System Feature Model (OESFM), addressing models of embedded system artifacts, components, features, and configuration/assembly. The approach provides an object-oriented UML (Unified Modeling Language) representation of the embedded system feature model and defines an extension to the NIST Core Product Model. The model provides a feature-based component framework allowing the designer to develop a virtual embedded system prototype by assembling virtual components. The framework not only provides a formal, precise model of the embedded system prototype but also offers the possibility of designing variations of the prototype, whose members are derived by changing certain virtual components with different features. A case study example is discussed to illustrate the embedded system model.
In today's intensely competitive and highly volatile business environment, the consistent development of low-cost, high-quality products that meet functionality requirements is key to a company's survival. Companies continuously strive to reduce costs while still producing quality products to stay ahead of the competition. Many companies have turned to target costing to achieve this objective. Target costing is a structured approach to determining the cost at which a proposed product, meeting the quality and functionality requirements, must be produced in order to generate the desired profits. It subtracts the desired profit margin from the company's selling price to establish the allowable manufacturing cost of the product. An extensive literature review revealed that companies in the automotive, electronics, and process industries have reaped the benefits of target costing. However, the target costing approach has not been applied in the machining industry; instead, techniques based on geometric programming, goal programming, and Lagrange multipliers have been proposed for this industry. These models follow a forward approach, first selecting a set of machining parameters and then determining the machining cost. Hence, in this study we have developed an algorithm that applies the concepts of target costing: a backward approach that selects the machining parameters based on the required machining cost, and is therefore more suitable for practical applications in process improvement and cost reduction. A target costing model was developed for the turning operation and was successfully validated using practical data.
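The backward direction can be sketched for a single turning pass: derive the allowable machining cost from price and margin, then keep only the cutting speeds whose cost meets it. The two-term cost model (machining-time cost falling with speed, tool-wear cost rising with it) and all numbers are illustrative, not the paper's validated model.

```python
# Target costing sketch for turning: work backward from price and margin to
# an allowable cost, then filter candidate cutting speeds against it.

def target_cost(selling_price, profit_margin):
    return selling_price * (1.0 - profit_margin)

def machining_cost(v, c_time=600.0, c_tool=0.02):
    return c_time / v + c_tool * v    # $ per part at cutting speed v (m/min)

target = target_cost(selling_price=9.0, profit_margin=0.2)     # 7.2 $/part
candidates = [100.0, 150.0, 200.0, 250.0]
feasible = [v for v in candidates if machining_cost(v) <= target]
best_v = min(feasible, key=machining_cost)
```

A forward approach would have picked a speed first and accepted whatever cost resulted; here the cost requirement prunes the parameter space before any selection is made.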
We present a system for intelligent machine fault detection and analysis. The system examines signals in real time, determines the quality of the signature for the entire set of signals, and evaluates the error states of these signal combinations or signatures. This approach of continually evaluating signal quality allows for predictive maintenance of the manufacturing system. The signals from the manufacturing system are obtained through a standard, optically isolated interface; the signals entering this Remote Observation Manufacturing Equipment (ROME) system are processed and evaluated in real time, and a history of these signals is stored. The system allows signals to be monitored continuously and recorded until a fault occurs. The graphical user interface provides user-controlled visualization of the full family of signals at various time instants. These analog and digital signals are synchronized with color images from two cameras and can be viewed with this GUI. The user can review both error and normal condition states using this interface.
This paper puts forward methods for the grinding of diamond: (1) a new method of measuring the grinding force through tests on the rotational speed of the driven belt pulley; and (2) at the initial stage of grinding, using the grinding force and grinding area to control the feed speed. The variation rule relating the rotational speed to the grinding force in the normal direction (referred to simply as the grinding force) is derived from the experiments, and a formula is derived for the grinding surface area of a circular diamond as it varies with the amount ground from the blank. The adoption of these methods enables precise and efficient real-time control of the diamond grinding process.