In recent years, the number of offshore wind farms has been increasing rapidly. Coastal European countries in particular are building numerous offshore wind turbines in the Baltic Sea, the North Sea, and the Irish Sea. During both construction and operation of these wind farms, many specially equipped helicopters are on duty. Due to their flexibility, their hover capability, and their higher speed compared to ships, these aircraft perform important tasks such as helicopter emergency medical services (HEMS) as well as passenger and freight transfer flights. The missions often include specific challenges such as platform landings or hoist operations to drop off workers onto wind turbines. However, adverse weather conditions frequently limit helicopter offshore operations. In such scenarios, the application of aircraft-mounted sensors and obstacle databases together with helmet-mounted displays (HMD) offers great potential to improve the operational capabilities of the helicopters used. By displaying environmental information in a visually conformal manner, these systems mitigate the loss of visual reference to the surroundings and help the pilots maintain proper situational awareness. This paper analyzes the specific challenges of helicopter offshore operations in wind parks by means of an online survey and a structured interview with pilots and operators. Further, the work presents how our previously introduced concept of an HMD-based virtual flight deck could enhance helicopter offshore missions. The advantages of this system – for instance its “see-through the airframe” capability and its highly flexible cockpit setup – enable us to design entirely novel pilot assistance systems. The knowledge gained will be used to develop a virtual cockpit that is tailor-made for helicopter offshore maneuvers.
In the past couple of years, research on display content for helicopter operations has headed in a new direction. The goals already reached could evolve into a paradigm change for information visualization. Advances in technology allow three-dimensional, conformal content to be implemented on a helmet-mounted see-through device. This superimposed imagery exhibits the same optical flow as the environment and is supposed to ease switching between display information and environmental cues. The concept is neither pathbreaking nor new, but it has not yet been successfully established in aviation. Nevertheless, there are certainly advantages to be expected, at least from the perspective of a human-centered system design. In the following pages, these next-generation displays will be presented and discussed with a focus on human factors. Beginning with a recollection of some human-factors-related research facts, an experiment comparing the former two-dimensional research displays will be presented. Before introducing the DLR conformal symbol set and the three experiments on an innovative drift indication, related research activities toward conformal symbol sets will be addressed.
A head-worn display combined with accurate head-tracking allows synthetically generated symbols to be shown in such a way that they appear to be part of the real world. Depending on the specific research context, different terms have been used for the ability to show display elements as parts of the outside world, including contact analog, scene-linked, augmented reality, and outside conformal. While the famous highway-in-the-sky was one of the first applications in avionics, over the years more and more conformal counterparts have been devised for aircraft-related instruments, among them routing information, navigation aids, specialized landing displays, obstacle warnings, drift indicators, and many more. Conformal displays have been developed for more than 40 years. We present a review of some results, as well as a look ahead at research trends for the coming years. We suggest that naturalism is not the best choice for the design of conformal displays; instead, more abstract representations often yield better pilot acceptance.
A degraded visual environment is still a major problem for helicopter pilots, especially during approach and landing. Particularly in the landing phase, the pilot’s eyes must be directed outward in order to find visual cues as indicators for drift estimation. If lateral speed exceeds the limits, it can damage the airframe or, in extreme cases, lead to a rollover. Since poor visibility can contribute to a loss of situation awareness and spatial disorientation, it is crucial to intuitively provide the pilot with the essential visual information needed for a safe landing. With continuing technological advancement, helmet-mounted displays (HMD) will soon become a widespread technology, because their see-through capability allows the pilot to monitor the outside view while flight-phase-dependent symbology is presented on the helmet display. Besides primary flight information, additional information for obstacle accentuation or terrain visualization can be displayed on the visor. Virtual conformal elements such as a 3D pathway depiction or a 3D landing zone representation can help the pilot maintain control until touchdown even in poor visual conditions. This paper describes first investigations of both en-route and landing symbology presented on a helmet-mounted display system in the scope of helicopter flight trials with DLR’s flying helicopter simulator ACT/FHS.
Helicopter operations require well-controlled, minimal lateral drift shortly before ground contact. Any lateral speed exceeding this small threshold can cause a dangerous moment around the roll axis, which may lead to a total rollover of the helicopter. As long as pilots can observe visual cues on the ground, they are able to control the helicopter drift easily. But whenever natural vision is reduced or even obscured, e.g. due to night, fog, or dust, this controllability diminishes. Therefore, helicopter operators could benefit from some type of “drift indication” that mitigates the influence of a degraded visual environment. Generally, humans derive ego motion from the perceived optical flow of environmental objects. The visual cues perceived are located close to the helicopter; therefore, even small movements can be recognized. This fact was used to investigate a modified drift indication. To enhance the perception of ego motion in a conformal HMD symbol set, the measured movement was used to generate a pattern motion in the forward field of view, close to or on the landing pad. The paper discusses this method of amplified ego-motion drift indication. Aspects concerning impact factors such as visualization type, location, and gain are addressed. Further, conclusions from previous studies, a high-fidelity experiment and a part-task experiment, are provided. A part-task study will be presented that compared different amplified drift indications against a predictor. 24 participants, 15 of them holding a fixed-wing license and 4 of them helicopter pilots, had to perform a dual task on a virtual reality headset. A simplified control model was used to steer a “helicopter” down to a landing pad while acknowledging randomly placed characters.
Helicopter operations require well-controlled, minimal lateral drift shortly before ground contact. Any lateral speed exceeding this small threshold can cause a dangerous moment around the roll axis, which may lead to a total roll-over of the helicopter. As long as pilots can observe visual cues on the ground, they are able to control the helicopter drift easily. However, when visibility is reduced or even obscured, e.g. due to night, fog, or dust, this controllability diminishes. Therefore, helicopter operators could benefit from some type of "drift indication" that mitigates the influence of a degraded visual environment.
With continuing technological advancement, helmet-mounted displays (HMD) will become increasingly widespread. At present, HMDs are still expensive and mostly reserved for military operations. The symbol sets fielded are designed for well-trained staff and special missions. Investigating some of these symbol sets revealed that their lateral drift indication does not live up to its promise. With practice, these symbol sets assist well during the approach but lack proper cues once the helicopter hovers. Present developments also focus on three-dimensional symbol sets that are conformal with the environment, all of which present a virtual landing pad. These types of see-through synthetic vision displays allow several new methods of information visualization.
Generally, humans derive ego motion from the perceived environmental optical flow. To enhance this perception, a pattern motion that amplifies the measured own-ship movement was implemented in a conformal HMD symbol set. The paper presents results from an experimental study with 18 pilots from civil and military operators. In this study, the forward landing zone border was replaced by an animated dashed line indicating the amplified ego motion.
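The core idea of amplified ego-motion drift indication can be sketched in a few lines: the dash pattern on the landing-zone border is animated at a rate proportional to the measured own-ship velocity, scaled by a gain greater than one, so that even a small drift produces a clearly perceivable pattern motion. The function names, the gain, and the dash period below are illustrative assumptions, not the published implementation.

```python
# Hypothetical sketch of amplified ego-motion drift indication.
# The dash phase of the landing-zone border advances by the measured
# lateral velocity times a gain, exaggerating small drift visually.

DASH_PERIOD_M = 0.5   # length of one dash-gap cycle on the border (m); assumed
GAIN = 4.0            # amplification of own-ship motion; assumed value

def update_dash_phase(phase_m: float, lateral_speed_mps: float, dt_s: float) -> float:
    """Advance the animated dash pattern by the amplified drift."""
    phase_m += GAIN * lateral_speed_mps * dt_s
    return phase_m % DASH_PERIOD_M  # keep phase within one dash cycle

# Example: drifting sideways at 0.2 m/s over one 25 Hz display frame (0.04 s)
phase = update_dash_phase(0.0, 0.2, 0.04)
print(round(phase, 3))  # pattern moved 4x faster than the real drift
```

Called once per display frame, this keeps the animated dashed line moving against the drift direction; the open question studied in the experiments is which gain and visualization location pilots accept best.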
The availability of new technologies for helmet- and head-mounted displays facilitates the design of innovative cockpit layouts such as a completely virtual flight deck. After the introduction of the so-called "glass cockpit", in which formerly mechanical instruments were converted into digital renderings on large panel screens, the virtual flight deck could be the logical next step. Obviously, such a concept saves the installation costs of conventional display hardware. Furthermore, and probably more importantly, stressful and time-consuming accommodation changes of the pilots' eyes between outside and inside view can be avoided.
In recent months, we have developed a concept for virtual cockpit instrumentation. Our implementation is based on the JedEye™, a monochrome green see-through HMD that offers a resolution higher than HD-TV, good enough to show information as detailed as on presently installed head-down instruments. Our approach augments our latest "3D helicopter landing symbol format" with basic virtual instruments (PFD, ND, knee-board) in the near field of the cockpit environment, in "no-window" areas. In addition, we have implemented a "drag and drop" mechanism that enables pilots to arrange the instrumentation according to their personal preference. Tests in our Generic Cockpit Simulator (GECO) are currently being conducted. As first pilot feedback shows, our concept offers great potential for introduction into the future flight deck.
Helicopter guidance in situations where natural vision is reduced is still a challenging task. Besides newly available sensors, which are able to “see” through darkness, fog and dust, display technology remains one of the key issues of pilot assistance systems. As long as we have pilots in aircraft cockpits, we have to keep them informed about the outside situation. The “situational awareness” of humans is mainly fed by their visual channel. Therefore, display systems that can cross-fade seamlessly from natural vision to artificial computer vision and vice versa are of greatest interest in this context. Helmet-mounted displays (HMD) have this property when they employ a head-tracker to measure the pilot’s head orientation relative to the aircraft reference frame. Together with the aircraft’s position and orientation relative to the world reference frame, the on-board graphics computer can generate images that are perfectly aligned with the outside world. We call image elements that match the outside world “visual-conformal”. Published display formats for helicopter guidance in degraded visual environments apply mostly 2D symbologies, which fall far behind what is possible. We propose a perspective 3D symbology for a head-tracked HMD that shows as many visual-conformal elements as possible. We implemented and tested our proposal in our fixed-base cockpit simulator as well as in our flying helicopter simulator (FHS). Recently conducted simulation trials with experienced helicopter pilots provide first evaluation results for our proposal.
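The transform chain that makes symbology visual-conformal can be illustrated with a minimal sketch: a world-fixed point is mapped through the aircraft pose and the tracked head pose before perspective projection onto the HMD. Real systems use full 6-DoF poses (typically quaternions); the frame conventions, function names, and yaw-only rotations below are simplifying assumptions to keep the example short.

```python
# Minimal sketch (assumed frames) of a visual-conformal projection chain:
# world point -> aircraft body frame -> head frame -> HMD pixel coordinates.
import math

def rot_yaw(p, yaw_rad):
    """Rotate a point (x east, y north, z up) about the vertical axis."""
    x, y, z = p
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return (c * x + s * y, -s * x + c * y, z)

def world_to_display(p_world, ac_pos, ac_yaw, head_yaw, focal_px=1000.0):
    """Project a world point into HMD pixel coordinates (u right, v up)."""
    # 1) world -> aircraft body frame (navigation position + heading)
    p = tuple(pw - pa for pw, pa in zip(p_world, ac_pos))
    p = rot_yaw(p, ac_yaw)
    # 2) aircraft -> head frame (head-tracker measurement)
    x, y, z = rot_yaw(p, head_yaw)
    # 3) simple pinhole projection; y is the viewing (forward) axis here
    return (focal_px * x / y, focal_px * z / y)

# A symbol anchored 100 m ahead stays centered on the display as long as
# aircraft and head rotations are compensated against each other.
print(world_to_display((0.0, 100.0, 0.0), (0.0, 0.0, 0.0), 0.0, 0.0))
```

The essential property is that any head or aircraft rotation is immediately compensated in the projection, so the rendered element appears fixed to the outside world rather than to the helmet.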
Supporting a helicopter pilot during landing and takeoff in a degraded visual environment (DVE) is one of the challenges within DLR's project ALLFlight (Assisted Low Level Flight and Landing on Unprepared Landing Sites). Different types of sensors (TV, infrared, mmW radar and laser radar) are mounted onto DLR’s research helicopter FHS (flying helicopter simulator) to gather different sensor data of the surrounding world. A high-performance computer cluster architecture acquires and fuses all the information into one single comprehensive description of the outside situation. While both TV and IR cameras deliver images at frame rates of 25 Hz or 30 Hz, Ladar and mmW radar provide georeferenced sensor data at only 2 Hz or even less. Therefore, it takes several seconds to detect or even track potential moving-obstacle candidates in mmW or Ladar sequences. Especially if the helicopter is flying at higher speed, it is very important to minimize the detection time of obstacles in order to initiate a re-planning of the helicopter’s mission in a timely manner. Applying feature extraction algorithms to IR images, in combination with algorithms that fuse the extracted features with Ladar data, can decrease the detection time appreciably. Based on real data from flight tests, the paper describes the applied feature extraction methods for moving object detection, as well as data fusion techniques for combining features from TV/IR and Ladar data.
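One building block of fusing a 25 Hz camera stream with a 2 Hz Ladar is temporal association: every slow Ladar scan must be paired with the camera feature frame closest to it in time before features and range data can be combined. The sketch below shows only that association step, with an assumed data model; it is not the project's actual fusion pipeline.

```python
# Illustrative sketch (assumed data model): pair each 2 Hz Ladar scan
# with the nearest-in-time 25 Hz camera feature frame, so a candidate
# obstacle can be confirmed without waiting for several slow scans.
from bisect import bisect_left

class FeatureBuffer:
    """Time-ordered buffer of (timestamp_ms, features) from the fast sensor."""
    def __init__(self):
        self.stamps, self.frames = [], []

    def push(self, t_ms, features):
        self.stamps.append(t_ms)
        self.frames.append(features)

    def nearest(self, t_ms):
        """Return the (timestamp, features) pair closest in time to t_ms."""
        i = bisect_left(self.stamps, t_ms)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(self.stamps)]
        j = min(candidates, key=lambda k: abs(self.stamps[k] - t_ms))
        return self.stamps[j], self.frames[j]

buf = FeatureBuffer()
for k in range(25):                  # one second of 25 Hz camera features
    buf.push(k * 40, [f"feat@{k}"])  # frames stamped every 40 ms
t, frame = buf.nearest(500)          # a Ladar scan stamped at 500 ms
print(t)  # ties between 480 ms and 520 ms resolve to the earlier frame
```

In a real system the buffer would be bounded and the matched camera features would then be back-projected into the Ladar's georeferenced frame, but the nearest-timestamp lookup is the common first step.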
The objective of the project ALLFlight (Assisted Low Level Flight and Landing on Unprepared Landing Sites) is to demonstrate and evaluate the characteristics of different sensors for helicopter operations in degraded visual environments, such as brownout or whiteout. The sensor suite, which is mounted onto DLR's research helicopter EC135, consists of standard color or black-and-white TV cameras, an uncooled thermal infrared camera (EVS-1000, Max-Viz, USA), an optical radar scanner (HELLAS-W, EADS, Germany) and a millimeter wave radar system (AI-130, ICx Radar Systems, Canada). Data processing is designed and realized by a sophisticated, high-performance sensor co-computer (SCC) cluster architecture, which is installed in the helicopter's experimental electronic cargo bay. This paper describes the applied methods and the software architecture in terms of real-time data acquisition, recording, time stamping and sensor data fusion. First concepts for a pilot HMI are presented as well.
In 2008, the German Aerospace Center (DLR) started the project ALLFlight (Assisted Low Level Flight and Landing on Unprepared Landing Sites). This project deals with the integration of a full-scale enhanced vision sensor suite onto DLR's research helicopter EC135. The sensor suite consists of a variety of imaging sensors, including a color TV camera and an uncooled thermal infrared camera. Two different ranging sensors are also part of this sensor suite: an optical radar scanner and a millimeter wave radar system. Both radar systems are equipped with specialized software for experimental modes, such as terrain mapping and ground scanning. To process and display the huge incoming flood of data from these sensors, a compact high-performance sensor co-computer system (SCC) has been designed and realized, which can easily be installed in the helicopter's cargo bay. A sophisticated, high-performance, distributed software architecture for data acquisition, recording, processing, and fusion was developed and implemented during the first project year. The paper describes the challenging mechanical integration of such a comprehensive sensor suite onto the EC135 and explains the architectural hardware and software concept and its implementation on the SCC.