Ground Control Station Human Factors Issues

Unmanned Aerial Systems
(UAS) have a variety of control mechanisms. Small systems may simply use a
handheld remote control. Larger aircraft, however, utilize a more significant
mechanism that is essentially a land-based cockpit. This is generally called a
Ground Control Station (GCS). The United States Air Force flies MQ-1 and MQ-9
UAS from a GCS built by General Atomics Aeronautical Systems, Inc. (GA-ASI).
However, this system has been in service for many years and was not optimally
designed. According to a US Air Force Predator commander, development of the
GCS was never given the time to integrate human factors principles into its
design (Freedberg, 2012).

The control station itself (Figure 1) is a tool to control
the aircraft. Stick and throttle, along with rudder and command screens, turn
physical commands into computer code commands. The data is then transmitted to
the aircraft, which processes the commands and reacts to them. The UAS itself
has minimal onboard control logic; it relies on direct commands to perform its
functions. This design presents many human factors problems.

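As a rough illustration of this direct-command architecture, the control loop amounts to encoding each physical input as a command message and uplinking it for the aircraft to execute. The message fields and format below are hypothetical assumptions for illustration, not the actual GA-ASI datalink:

```python
import struct
from dataclasses import dataclass

# Hypothetical command message; the real GCS datalink format is proprietary.
@dataclass
class ControlCommand:
    pitch_stick: float   # -1.0 (full forward) .. 1.0 (full aft)
    roll_stick: float    # -1.0 (full left) .. 1.0 (full right)
    rudder: float        # -1.0 .. 1.0
    throttle: float      #  0.0 .. 1.0

def encode(cmd: ControlCommand) -> bytes:
    """GCS side: turn physical control positions into an uplink frame."""
    return struct.pack("!4f", cmd.pitch_stick, cmd.roll_stick,
                       cmd.rudder, cmd.throttle)

def decode(frame: bytes) -> ControlCommand:
    """Aircraft side: unpack the uplinked frame and act on it directly."""
    return ControlCommand(*struct.unpack("!4f", frame))
```

The point of the sketch is the one-way flow: the aircraft simply applies the decoded values, with little onboard logic to interpret or moderate them.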
The first problem with the system is the manner in which the pilot receives
attitude information about the aircraft. The aircraft sends down attitude
information to the GCS, and the information is displayed on a HUD. The design
of the HUD is similar to those found in fighter aircraft cockpits (Figure 2).
Although the HUD itself is a proven concept, its execution in this context is
problematic. An important principle of display design is the avoidance of
excessive clutter (Wickens, Gordon, & Liu, 1998). The HUD presents a great
deal of information, but it is not in itself excessively cluttered. The problem
occurs in the placement of the symbology. While in a manned aircraft, the HUD
is a glass panel with the outside world behind it, in the GCS the HUD must have
streaming video behind it. The primary video source on the aircraft is a nose
camera that is fixed in the forward perspective (GA-ASI, 2015). This provides a
view similar to a manned aircraft, where the HUD is superimposed over the
natural horizon. But this is not the only camera on the aircraft, and not the
only video that can be displayed. The aircraft also has a targeting camera that
rotates and points at the Earth, providing a detailed view of scenes below.
Viewing and collecting information through this camera is the aircraft's
primary purpose. When conducting operations, the pilot wants to watch this
feed, and the only place to display it is in place of the nose camera video.
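This substitution matters because attitude symbology overlaid on video only makes sense when the video shares the aircraft's frame of reference. A minimal sketch of that distinction, in which the source names and the coherence rule are illustrative assumptions rather than actual GCS logic:

```python
def hud_overlay_coherent(video_source: str,
                         camera_azimuth_deg: float = 0.0,
                         camera_elevation_deg: float = 0.0) -> bool:
    """Return True if HUD attitude symbology agrees with the video frame.

    The fixed nose camera always looks along the aircraft's longitudinal
    axis, so the HUD horizon matches the video horizon. A targeting camera
    slewed off-axis shows a scene whose orientation no longer corresponds
    to the aircraft's attitude.
    """
    if video_source == "nose":
        return True
    if video_source == "targeting":
        # Coherent only in the special case where the turret happens to be
        # caged straight ahead; any off-axis slew breaks the correspondence.
        return camera_azimuth_deg == 0.0 and camera_elevation_deg == 0.0
    raise ValueError(f"unknown video source: {video_source}")
```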
The HUD attitude symbology and horizon are now at odds with the video view.
During many maneuvers, the video changes angle rapidly, even when
the aircraft maintains a straight and level attitude. The potential for
disorientation is very high. The best way to eliminate this problem is to
separate the HUD from the video. In the current system this is difficult: the
pilot can view only one video source at a time, and datalink bandwidth limits
prevent receiving both video feeds from the aircraft without degrading the
quality of each.

A second significant human factors issue is the presentation of other data.
The aircraft downlinks hundreds of pieces of information, all displayed in
tables (Figure 3). More than 65 tables are available. This
information is helpful, but its format severely limits its usefulness. Mental
models aid perception when the user can perceive pictorial realism and
visualize the moving parts of the data (Wickens et al., 1998). The variable
information tables, of which only two can be viewed at a time, have no
pictorial aspect. They are simply collections of words and numbers that
require significant attention to ascertain their meaning. The purely numeric
presentation also limits the perceived motion of the values: when a number is
changing, such as a rising temperature, it is difficult to ascertain its
motion or the rate of that motion, which delays perception of changes within
the system.

These, among many others, are some of the problems inherent in
GCS systems. There is a significant amount of data and information not
typically available in a manned aircraft cockpit. Presenting it all
effectively is a challenge, and one not currently met by the existing system.
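For the rate-perception problem in particular, a display could restore some of the motion cue an analog gauge provides by computing a trend from successive telemetry samples and rendering it beside the raw number. A sketch under assumed sample formats; none of this reflects the actual GCS software:

```python
def trend_indicator(samples, threshold=0.1):
    """Classify the recent motion of a telemetry value.

    samples: list of (time_seconds, value) pairs, oldest first.
    Returns 'rising', 'falling', or 'steady', based on the average
    rate of change across the sample window (units per second).
    """
    if len(samples) < 2:
        return "steady"
    (t0, v0), (t1, v1) = samples[0], samples[-1]
    rate = (v1 - v0) / (t1 - t0)
    if rate > threshold:
        return "rising"
    if rate < -threshold:
        return "falling"
    return "steady"

# A climbing engine temperature that a static table row would hide:
temps = [(0, 710.0), (1, 712.5), (2, 715.0), (3, 717.5)]
```

Rendering the classification as an arrow or color next to the value would let the operator perceive change at a glance rather than by re-reading and comparing numbers.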
References

Freedberg, S. (2012, August 7). Too Many Screens: Why Drones Are So Hard To Fly, So Easy To Crash. Retrieved from http://breakingdefense.com/2012/08/too-many-screens-why-drones-are-so-hard-to-fly-and-so-easy/

General Atomics Aeronautical Systems, Inc. (2015, August 4). Flight Manual, USAF Series MQ-9 Aircraft, Serial Numbers 004, 006, 008, and Above. California: General Atomics – ASI.

Wickens, C., Gordon, S., & Liu, Y. (1998). An Introduction to Human Factors Engineering. New York: Addison Wesley Longman, Inc.
Appendix
Figure 1
Figure 2
Figure 3