We describe an integrated perception system developed for use onboard a moving work machine, targeted at applications such as automatic container handling at loading terminals. The main emphasis is on the environment perception tasks required by autonomous or semi-autonomous operation: obstacle detection, container position determination, localization for efficient navigation, and measurement of container docking and grasping locations. Practical experience with several different technologies is reported. For close-range measurement, such as container row following, ultrasonic sensing was used together with associated control software. For obstacle and docking position detection, active 3D vision techniques based on structured lighting were developed, also utilizing motion estimation. Depth-from-defocus methods were developed for passive 3D vision. For localization, data from several sources were fused: dead-reckoning data from odometry and an inertial unit, together with several alternative external localization devices, namely real-time kinematic GPS and inductive and optical transponders. The system was integrated to run on a real-time operating system platform, using a high-level software specification tool that created the hierarchical control structure of the software.
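The active 3D vision approach mentioned above relies on triangulation: a structured-light pattern is projected onto the scene and depth is recovered from the displacement of the pattern in the camera image. A minimal sketch of the underlying depth-from-disparity relation is given below; the focal length, baseline, and disparity values are illustrative assumptions, not parameters from the system described here.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Recover depth by triangulation.

    focal_px     -- camera focal length expressed in pixels (assumed value)
    baseline_m   -- distance between projector and camera centers, in meters
    disparity_px -- observed shift of the projected feature, in pixels

    Depth follows the standard pinhole triangulation relation
    z = f * b / d: larger disparity means the surface is closer.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid triangulation")
    return focal_px * baseline_m / disparity_px


# Example with assumed values: f = 800 px, b = 0.5 m, d = 40 px
z = depth_from_disparity(800.0, 0.5, 40.0)  # 10.0 m
```

In a structured-light setup the projector replaces the second camera of a stereo pair, so the same relation applies with the projector-to-camera distance as the baseline.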
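The localization scheme fuses incremental dead-reckoning data (odometry, inertial unit) with absolute fixes from external devices (RTK GPS, transponders). A common way to combine such sources is a Kalman filter; the one-dimensional sketch below illustrates the predict/update cycle only. It is a generic textbook formulation, not the fusion algorithm of the system described here, and all noise variances are assumed values.

```python
class PositionFuser:
    """Minimal 1D Kalman filter fusing dead reckoning with absolute position fixes."""

    def __init__(self, x0: float, p0: float):
        self.x = x0  # position estimate (m)
        self.p = p0  # estimate variance (m^2)

    def predict(self, delta: float, q: float) -> None:
        """Apply a dead-reckoning increment (e.g. odometry step).

        delta -- measured displacement since the last step
        q     -- process noise variance of the increment (assumed)
        """
        self.x += delta
        self.p += q  # uncertainty grows while dead reckoning

    def update(self, z: float, r: float) -> float:
        """Correct with an absolute fix (e.g. RTK GPS or a transponder).

        z -- measured absolute position
        r -- measurement noise variance of the fix (assumed)
        """
        k = self.p / (self.p + r)      # Kalman gain
        self.x += k * (z - self.x)     # blend prediction and measurement
        self.p *= (1.0 - k)            # uncertainty shrinks after the fix
        return self.x


# Illustrative cycle: one odometry step, then one precise external fix.
f = PositionFuser(x0=0.0, p0=1.0)
f.predict(delta=1.0, q=0.04)
f.update(z=1.1, r=0.01)  # low-variance fix pulls the estimate toward z
```

Because the fix variance (0.01) is much smaller than the predicted variance (1.04), the gain is close to one and the estimate lands near the external measurement, which mirrors why accurate RTK GPS or transponder fixes dominate drifting dead reckoning.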