GCS DESIGN CONSIDERATIONS
Shawn M. Wyne
ASCI 638 – Human Factors in Unmanned Systems
Embry-Riddle Aeronautical University
Abstract
Positive control and coordination of unmanned aerial systems (UAS) constitute a unique task set that has not been completely explored from a human factors perspective. Aircraft cockpits
have been studied and improved over decades of research. But UAS, as an
emerging technology, have not had the same level of rigor applied. Is it
sufficient to simply re-create a cockpit on the ground and call it complete? An
aircraft cockpit, while optimized for the tasks it fulfills, also has
limitations in its scope. It has size and weight restrictions. Its equipment
must operate in a high-altitude environment that is cold, at low air pressure, and possibly subject to sustained g-forces. UAS are not encumbered by the same
physical limits of a cockpit. It follows that a renewed analysis should focus
on Ground Control Station (GCS) design and development, in order to optimize
the system for its intended purposes. System limitations and use differences
need to be fully accounted for in any analysis. UAS are not just another
aircraft, and a GCS should not be treated like just another cockpit.
Keywords: UAS, GCS design, GCS systems
Significance of Problem
Cockpit design considerations are a well-documented area of study. The environment and the information-processing and decision-making demands of a cockpit are well understood. But just
because a GCS and a cockpit both control a vehicle that is airborne, does not
mean they are interchangeable, or should be close to the same. A USAF
Scientific Advisory Board report noted “the considerable base of human factors
knowledge derived from cockpit experience may have limited applicability to
future systems” (as cited in Tvaryanas, 2006). Although the Board was
generalizing about the pace of technological advancement, the applicability of
this concept to GCS design is particularly salient.
There are numerous
standards a cockpit design must adhere to. Most are related to safety, in that
any inefficiency or error in pilot action may cause an accident and loss of
life. But for UAS, there is currently no equivalent set of standards for design.
In an effort to support rulemaking for integrating UAS into the national airspace, NASA commissioned a study to analyze FAA regulations and determine which of them relate to human factors issues applicable to the GCS. In short, the 100-page result suggests: nearly all of them (Jones, Harron, Hoffa, Lyall, & Wilson, 2012). This direction implies that the obvious equivalent to GCS design standards is that of manned aircraft cockpits. But this is an imprecise and inappropriate analogy. Although the regulations may imply as much, there are many reasons
they should not be the same. For hardware design, the GCS is not limited in the
way that an aircraft cockpit is limited. The physical environment of an
aircraft cockpit is harsher, and cockpit displays and controls are mandated to
survive in that environment. Land-bound, a GCS does not have the same
limitations. Power supply and computing support have their own limits, as an
aircraft often sacrifices system support or redundancy for less weight. Cockpits
also have limited ability to fully meet ergonomic ideals; things like ejection
seats and crash survivability take priority over crew comfort. Without these
limitations, GCS physical layout and design can surpass that of a manned
cockpit. Some aspects of cockpit design guidance do carry over, as they apply to all human-machine systems, such as those for visual acuity and information display arrangement (Wickens, Gordon, & Liu, 1998). But the design principles for
cockpits leave a number of informational avenues uncovered. Proprioceptive cues
typically available to pilots physically inside an aircraft are notably absent
in a GCS. Inherent auditory and vestibular sensory input is crucially lacking. There is no rush of wind noise to imply excessive airspeed, no engine roar to verify a power setting, no vibration to judge turbulence, and no g-forces to signal a decrease in total energy state. The
displays in a cockpit are largely a visual medium for presenting aircraft system information and states. But even these cover only part of the visual information a pilot receives. Visual environmental cues allow a pilot to gather a
significant amount of information that is important to safe flight simply by
looking outside the cockpit. While zero-visibility flight is technically possible in manned aviation, it is considered the most difficult and requires special training and certification to attempt (FAA, 2015). Research has
indicated significant benefit of utilizing multiple sensory input methods to
increase response time and accuracy (Elliott & Stewart, 2006). Other
technologies, such as speech-based input, have limited applicability in a noisy
cockpit, but may prove useful in a GCS. Even though the sum of information
required of a pilot in a manned aircraft may be the same as that in a GCS, the
systems and interface of the UAS pilot should not be the same.
In analyses of military UAS accidents, up to 69% of all mishaps are attributed to human factors issues (Tvaryanas, Thompson, & Constable, 2006; Waraich, 2013). This figure is significant as an acknowledgement of design flaws in current GCS, but how this knowledge should dictate future design is not yet established. Another field with established standards is that of computer workstations. As a
ground-based unit, a GCS typically has more in common with the computer workstations that control equipment in manufacturing facilities and heavy industry. The commercial standard is the American National
Standards Institute/Human Factors and Ergonomics Society-100 (ANSI/HFES-100).
These standards apply to workstations and input/output devices. Waraich (2013)
also compared several current GCS designs to ANSI/HFES-100 standards and found
them mostly compliant. Given that GCS are compliant with computer workstation human factors standards, yet the majority of accidents are still human factors related, one can conclude that those standards alone are insufficient. The Department of Defense (DoD) has its own standards for design.
Mil-Std-1472G dictates specific human engineering aspects of design, which are largely anthropometric (DoD, 2012). Mil-Std-1787C is narrower in focus and
describes requirements of aircraft display symbology (DoD 2001). All of these
standards are applicable to UAS GCS design, but none of them address UAS
specific problems. Another area that should influence design standards is the framework by which mishaps are evaluated. The DoD utilizes a Human Factors Analysis and
Classification System (HFACS) model to determine human factors influences on
mishap causation. The model looks at organizational influences, unsafe
supervision, preconditions for unsafe acts, and unsafe acts (Waraich, 2013). The
latter two aspects are those traditionally considered design issues. Although
the first two categories are not normally addressed through design standards,
their potential causal influence on mishaps is clear. Even more concerning, the design of the UAS as a whole system can force organizational influences and supervision to operate in a manner that is detrimental to overall safety and efficiency. These problems are not addressed by standards that are purely
anthropometric. GCS standards need to include the full realm of operation and
investigate how the design influences all aspects of operation.
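To make this scope argument concrete, the four HFACS tiers can be treated as a simple checklist against which any candidate design standard is evaluated. The short Python sketch below is purely illustrative; the mapping of standards to tiers is an assumption made for demonstration, not an official interpretation of Mil-Std-1472G or ANSI/HFES-100.

```python
# Illustrative sketch only: the four HFACS tiers used as a coverage checklist.
# The standards-to-tier mapping below is a simplifying assumption.
HFACS_TIERS = (
    "organizational influences",
    "unsafe supervision",
    "preconditions for unsafe acts",
    "unsafe acts",
)

coverage_by_standard = {
    # A purely anthropometric standard mainly shapes the operator's immediate
    # workstation, touching only the lower HFACS tiers (assumption).
    "anthropometric workstation standard": {
        "preconditions for unsafe acts",
        "unsafe acts",
    },
    # A hypothetical operations-wide GCS standard would address every tier.
    "operations-wide GCS standard (proposed)": set(HFACS_TIERS),
}

for standard, covered in coverage_by_standard.items():
    missed = [tier for tier in HFACS_TIERS if tier not in covered]
    print(f"{standard}: misses {missed if missed else 'nothing'}")
```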
Another area of UAS
control that is entirely unique is the quality of data received by UAS crews.
Because it is physically separated from the aircraft, a GCS relies on a datalink to receive video and information. This link creates two problems. The
first is that bandwidth can be a limiting factor. The pilot’s view is already
restricted to what onboard cameras can see, and the video itself may be
degraded or simply poor. Resolution, color, and field of view are all impacted.
Poor or missing information leads to low awareness and mistakes. Synthetic,
computer-generated graphics can be developed to improve operator efficiency
(Calhoun, Draper, Abernathy, Delgado, & Patzek, 2005). Although such a system
creates new problems relating to clutter and accuracy, these issues are already
addressed by existing display standards. Synthetic vision can be a tool to make up for lost visual cues when the pilot is remotely controlling the aircraft. A
second issue unique to GCS design is the effect of temporal distortion due to
latency in datalinks. Simple control inputs can have delayed feedback which
restricts the operator’s ability to assess the accuracy and effects of those
inputs. Research has shown degraded performance when control is impaired by low temporal update rates or transmission delays (McCarley & Wickens, 2004). This is another area in which display design standards are inadequate for the GCS. Standard design principles hold that feedback lag should be avoided whenever possible (Wickens et al., 1998).
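To illustrate why feedback lag matters, the following sketch simulates a simple closed-loop altitude-tracking task in which the controller only sees measurements that arrive several control cycles late. All dynamics, gains, and latency values are hypothetical and chosen only to make the effect visible; they are not taken from the cited studies.

```python
# Minimal, hypothetical sketch: a proportional controller commands a climb rate
# toward a target altitude, but the altitude feedback it sees is stale by a
# fixed number of control cycles (a crude stand-in for datalink latency).
from collections import deque

def mean_tracking_error(delay_steps, steps=300, dt=0.1, gain=1.5, target=100.0):
    """Average absolute altitude error when feedback arrives delay_steps late."""
    altitude = 0.0
    # Model the datalink as a FIFO buffer of stale measurements.
    link = deque([0.0] * (delay_steps + 1), maxlen=delay_steps + 1)
    total_error = 0.0
    for _ in range(steps):
        observed = link[0]                       # oldest (delayed) measurement
        climb_rate = gain * (target - observed)  # control based on stale data
        altitude += climb_rate * dt              # vehicle responds immediately
        link.append(altitude)                    # newest state enters the link
        total_error += abs(target - altitude)
    return total_error / steps

if __name__ == "__main__":
    # Increasing latency produces overshoot, oscillation, and eventual divergence.
    for delay in (0, 5, 15):  # 0.0 s, 0.5 s, and 1.5 s of simulated latency
        print(f"latency {delay * 0.1:.1f} s -> mean tracking error "
              f"{mean_tracking_error(delay):.1f}")
```

Even in this toy model, a few tenths of a second of added latency turns smooth convergence into oscillation, mirroring the degraded manual control reported in the literature.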
Because a GCS controls an aircraft, a number of philosophies exist as to what type of crew should man it. The USAF takes the position that a military-trained pilot is required. The FAA, Navy, and Marines simply require a basic Private Pilot certificate. The US Army, however, trains what it calls simply “operators,” who are not actually pilots (McCarley & Wickens, 2005). The success of military UAS suggests that
neither approach is inherently incorrect. A significant finding in analyzing UAS mishaps is that, across services, there are comparable rates of mishaps
resulting from judgment and decision-making errors, as well as those involving
crew resource management errors (Tvaryanas, Thompson, & Constable, 2005).
This evidence points to the aircraft and GCS system design as the contributing
source of error, and also implies the need for extensive prerequisite training
is misguided. The USAF utilizes many manual control processes and believes that using pilots of manned aircraft to control UAS allows the transfer of skills and knowledge to this enterprise (Cantwell, 2009). But because the system those pilots are placed in is significantly lacking in overall human factors design, even the thorough training they received as pilots has been insufficient to overcome those shortfalls. Further complicating the issue is the question of crew makeup. Military UAS typically utilize two crewmembers, separating
flight control from payload control tasks. While research has shown poor
performance on current systems if one operator must control all the functions,
other research suggests improvements to the GCS can mitigate performance loss
under increased workload (McCarley & Wickens, 2004). A further complicating
aspect of UAS operation compared to manned cockpits is that the crew is not
likely to remain with the aircraft for the duration of a mission. Some UAS are
capable of 40+ hour flights, and it is common for control of an aircraft to
pass from crew to crew, or even from one GCS to another GCS. While this migration of control is beneficial
for mitigating operator fatigue, other aspects are potentially detrimental.
What is not clear is the implication for display design when operators must interact with a system that is already executing a mission (Tvaryanas, 2006). There is
potential for degraded situational and system awareness and resultant impacts
on performance and decision making.
Many negative human
factors issues typically manifest in problems with information presentation and
processing by the pilot. Divided attention is an issue when too much
information is presented in a single channel (Wickens & Dixon, 2002). That
is, the human can only differentiate and process a finite number of cues with
one method of perception. There is a significant risk of error due to cognitive
saturation and tunneling (Cummings & Mitchell, 2007; Dixon, Wickens, & Chang, 2003). Typically these cues are exclusively visual in the GCS. Because so
much information is presented in a single channel, the operator is easily
overwhelmed. Wickens and Dixon (2002) demonstrated benefits of utilizing
multi-channel attention modes that allow humans to perceive and assimilate
disparate information all at once. In this way, workload can be maintained or
increased with less risk of error. Further, the limited cues existent in
current GCS have negative impacts on manual control of the aircraft (Elliott & Stewart, 2006). The lack of cues leaves the operator with inaccurate or incomplete perceptions of aircraft states and behavior. The consequence is poor
decision making and increased errors.
Automation, however, has
been demonstrated to improve performance in objective measurements of UAS
operation where it can limit the amount of information the pilot must process
(Wickens & Dixon 2002). Automation is not necessarily the complete
self-control of a system by computers. Automation exists along a scale, with
many levels (Wickens et al., 1998; Elliott & Stewart, 2006; Drury & Scott, 2008). Although automation solves some issues of excessive workload and control
limitations, new problems must be addressed. Complete automation relegates the
operator to the task of passive monitoring, which humans are not particularly
adept at accomplishing (Tvaryanas, 2006). As an automated system makes decisions
and performs actions, there is a risk the operator is not aware of these
changes. As the operator becomes out-of-the-loop, there is potential for
experiencing mode confusion, where the operator does not understand the state
of the system or why that state exists (Wickens & Hollands, 2000). Once mode
confusion takes place, the risk of error and degraded performance increases
significantly. Additionally, the operator can experience distrust of automated
systems. The effect may be increased workload and decreased performance when
the operator does not believe the automation will operate correctly.
Alternatively, an overly trusting operator becomes complacent and suffers from
automation bias, blindly trusting the system even when the consequences are
negative (Cummings & Mitchell, 2007). Although automation design issues are
known human factors problems, their interaction with other unique aspects of
GCS design is a compounding problem and thus unique to UAS.
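One mitigation often discussed for mode confusion is to make every automation state transition explicit, annunciated, and reviewable rather than silent. The sketch below is a hypothetical illustration of that idea; the mode names and the ModeManager class are invented for this example and do not represent any fielded GCS software.

```python
# Hypothetical sketch of one mitigation for mode confusion: every automation
# state change is logged and annunciated with its reason, so the operator can
# always answer "what mode am I in, and why?" All names here are invented.
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List

class Mode(Enum):
    MANUAL = auto()         # operator flies the aircraft directly
    ALTITUDE_HOLD = auto()  # automation maintains commanded altitude
    ROUTE_FOLLOW = auto()   # automation flies a preplanned route
    LOST_LINK = auto()      # automation executes the lost-link profile

@dataclass
class ModeManager:
    mode: Mode = Mode.MANUAL
    history: List[str] = field(default_factory=list)

    def transition(self, new_mode: Mode, reason: str) -> None:
        """Change modes and annunciate the change rather than doing it silently."""
        entry = f"{self.mode.name} -> {new_mode.name}: {reason}"
        self.history.append(entry)
        self.mode = new_mode
        print(f"MODE CHANGE  {entry}")  # stand-in for a salient GCS annunciation

    def why(self) -> str:
        """Answer the operator's 'why is the system in this state?' question."""
        return self.history[-1] if self.history else "initial mode"

if __name__ == "__main__":
    mgr = ModeManager()
    mgr.transition(Mode.ROUTE_FOLLOW, "operator engaged route following")
    mgr.transition(Mode.LOST_LINK, "command link lost for more than 30 s")
    print("Current mode:", mgr.mode.name, "|", mgr.why())
```

The design point is not the code itself but the guarantee it encodes: the operator can always retrieve what mode the automation is in and why it entered that mode.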
Alternative Actions
Simply copying and continuing with current design requirements is possible and will yield a usable product, as it has for many years, but safety and efficiency will be limited, perhaps excessively so. The long-term success of UAS is not probable under this paradigm. Human factors deficiencies often result in mishaps eventually, and the potentially exponential growth of UAS into the civilian and private sectors makes the risk untenable. Trial and error is also an insufficient
design methodology for a technological sector. Growth of capabilities and
emerging tasks for UAS to complete will outpace the ability of designers to
iterate improvements based on experienced deficiencies.
Recommendation
Further research needs to be conducted to define what is truly necessary for GCS systems to operate effectively. Such research will be a summation of current human factors research combined with basic cockpit design precepts, but it must also include the unique environment and cognitive challenges of aircraft teleoperation. A particular difficulty in defining appropriate standards is the numerous variations of unmanned aircraft, their task requirements, and their automation capabilities. Drury and
Scott (2008) compiled a framework of types of awareness required for UAS
operation. Notably, they include the vehicle itself as one of the system
components which needs information. The four categories of awareness they
identify are Human-UAV, Human-Human, UAV-Human, and UAV-UAV (Drury & Scott, 2008). Importantly, the Human-Human category includes individuals who indirectly influence the air vehicle, described as mission stakeholders. This concept brings completeness to the design of the GCS as a
whole system, and includes all the supporting elements of HFACS analysis. This
framework should be the starting point for more detailed analysis of GCS
systems, and a baseline for determining operator use-cases. The task-based
analysis will then be able to apply the more abstract human factors principles
to determine effective design strategies. An operational assessment of the MQ-9
Aircraft and Control Station rated the overall system satisfactory in terms of
system usability and design, but a number of areas were rated unsatisfactory in
broader terms of mission and task completion (AFOTEC, 2013). This further
supports the concept that design standards in their current form are
insufficient to address the complicated use and unique problems facing UAS
control. UAS are the culmination of decades of technological innovation. Remote operation of a flying vehicle is an incredibly complex task, one that requires more detailed study and the development of UAS-specific standards.
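As a small illustration of how the Drury and Scott (2008) categories could anchor such a task-based analysis, the sketch below treats each awareness category as a checklist item and flags any category with no supporting design element. The example display elements are assumptions made purely for illustration and are not part of the published framework.

```python
# A minimal sketch of how the Drury and Scott (2008) awareness categories could
# seed a use-case checklist for GCS display analysis. The mapped display
# elements are illustrative assumptions only.
from enum import Enum

class Awareness(Enum):
    HUMAN_UAV = "what the operator must know about the vehicle"
    HUMAN_HUMAN = "what crew and mission stakeholders must know about each other"
    UAV_HUMAN = "what the vehicle must know about operator intent"
    UAV_UAV = "what cooperating vehicles must know about each other"

# Hypothetical mapping of each category to design elements in a GCS under review.
coverage = {
    Awareness.HUMAN_UAV: ["fuel and energy state page", "link quality indicator"],
    Awareness.HUMAN_HUMAN: ["chat window shared with mission stakeholders"],
    Awareness.UAV_HUMAN: ["command acknowledgement messages"],
    Awareness.UAV_UAV: [],  # e.g., no inter-vehicle awareness in a single-ship GCS
}

def uncovered(coverage_map):
    """Return awareness categories with no supporting design element assigned."""
    return [cat for cat, elements in coverage_map.items() if not elements]

if __name__ == "__main__":
    for cat in uncovered(coverage):
        print(f"Gap: {cat.name} ({cat.value}) has no supporting design element")
```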
References
Air Force Operational Test and Evaluation Center (AFOTEC). (2013). MQ-9 Reaper Increment 1 (Block 5 Aircraft/Block 30 Ground Control Station) Operational Assessment (OA)-3 Report. Kirtland Air Force Base, NM: Author.

ANSI/HFES-100. (2007). Human Factors Engineering of Computer Workstations. Santa Monica, CA: Human Factors and Ergonomics Society.

Calhoun, G., Draper, M., Abernathy, M., Delgado, F., & Patzek, M. (2005). Synthetic Vision System for Improving Unmanned Aerial Vehicle Operator Situation Awareness. Proceedings of SPIE, 5802, 219-230. doi:10.1117/12.603421

Cantwell, H. (2009). Operators of Air Force Unmanned Aircraft Systems. Air & Space Power Journal, 23(2), 67-77.

Cummings, M., & Mitchell, P. (2007). Operator Scheduling Strategies in Supervisory Control of Multiple UAVs. MIT Humans and Automation Lab. Retrieved from http://web.mit.edu/aeroastro/labs/halab/papers/CummingsMitchell_AST06.pdf

Department of Defense. (2012). Mil-Std-1472G: Department of Defense Design Criteria Standard, Human Engineering. Redstone Arsenal, AL: Author.

Department of Defense. (2001). Mil-Std-1787C: Department of Defense Interface Standard, Aircraft Display Symbology. Wright-Patterson AFB, OH: Author.

Dixon, S., Wickens, C., & Chang, D. (2003). Comparing Quantitative Model Predictions to Experimental Data in Multiple-UAV Flight Control. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 47(1), 104-108.

Drury, J., & Scott, S. (2008). Awareness in Unmanned Aerial Vehicle Operations. The International C2 Journal, 2(1), 1-10. Retrieved from http://www.cs.uml.edu/~holly/91.550/papers/Drury_Scott_UAV-Awareness.pdf

Elliott, L., & Stewart, B. (2006). Automation and Autonomy in Unmanned Aircraft Systems. In N. Cooke, H. Pringle, & H. Pedersen (Eds.), Human Factors of Remotely Operated Vehicles (Vol. 7). New York: JAI Press.

Federal Aviation Administration (FAA). (2015). Federal Aviation Regulation Part 61. Retrieved from http://www.ecfr.gov/cgi-bin/text-idx?SID=9fbfcef04c89d713fb11ea739ad865db&mc=true&node=pt14.2.61&rgn=div5

Jones, E., Harron, G., Hoffa, B., Lyall, B., & Wilson, J. (2012). Research Project: Human Factors Guidelines for Unmanned Aircraft System (UAS) Ground Control Station (GCS) Design, Year 1 Report. NASA Ames Research Center. Retrieved from http://www.researchintegrations.com/publications/Jones_etal_2012_Human-Factors-Guidelines-for-UAS-GCS-Design_Year-1.pdf

McCarley, J., & Wickens, C. (2004). Human Factors Concerns in UAV Flight. University of Illinois at Urbana-Champaign, Institute of Aviation, Aviation Human Factors Division. Retrieved from https://www.researchgate.net/profile/Christopher_Wickens/publication/241595724_HUMAN_FACTORS_CONCERNS_IN_UAV_FLIGHT/links/\00b7d53b850921e188000000.pdf

McCarley, J., & Wickens, C. (2005). Human Factors Implications of UAVs in the National Airspace. University of Illinois at Urbana-Champaign, Institute of Aviation, Aviation Human Factors Division. Retrieved from http://www.tc.faa.gov/logistics/Grants/pdf/2004/04-G-032.pdf

Tvaryanas, A. (2006). Human Factors Considerations in Migration of Unmanned Aircraft Systems (UAS) Operator Control. USAF Performance Enhancement Research Division. Retrieved from http://www.wpafb.af.mil/shared/media/document/AFD-090121-046.pdf

Tvaryanas, A., Thompson, W., & Constable, S. (2005). The U.S. Military Unmanned Aerial Vehicle (UAV) Experience: Evidence-Based Human Systems Integration Lessons Learned. Retrieved from http://www.rto.nato.int/abstracts.asp

Tvaryanas, A., Thompson, W., & Constable, S. (2006). Human Factors in Remotely Piloted Aircraft Operations: HFACS Analysis of 221 Mishaps Over 10 Years. Aviation, Space, and Environmental Medicine, 77(7), 724-732.

Waraich, Q. (2013). Heterogeneous Design Approach for Ground Control Stations to Marginalize Human Factors Mishaps in Unmanned Aircraft Systems (Doctoral dissertation). Retrieved from https://www.sintef.no/globalassets/project/hfc/documents/waraich-qaisar-ehf-in-uas.pdf

Wickens, C., & Dixon, S. (2002). Workload Demands of Remotely Piloted Vehicle Supervision and Control. Savoy, IL: University of Illinois, Aviation Research Lab.

Wickens, C., Gordon, S., & Liu, Y. (1998). An Introduction to Human Factors Engineering. New York: Addison Wesley Longman.

Wickens, C., & Hollands, J. (2000). Engineering Psychology and Human Performance (3rd ed.). New Jersey: Prentice Hall.