Sunday, December 20, 2015

Ethical and Moral Issues in UAS


Ethical and Moral Issues in UAS
Shawn M. Wyne
ASCI 638 – Human Factors in Unmanned Systems
Embry-Riddle Aeronautical University
Abstract
           Over the last decade, Unmanned Aircraft Systems (UAS) have become a significant part of warfare. They are a tool for accomplishing a number of military objectives, including employing weapons. This paper compares and contrasts the moral and ethical uses of UAS with those of more traditional weapons.
Keywords: UAS, remote warfare, ethics
Morality in UAS
Unmanned aircraft have been in use almost since the beginning of flight. Early versions were even weaponized, turned into remote-controlled suicide machines. But for decades their use was limited by the technology of the day; they were never employed in measurable numbers, nor were they particularly effective. The biggest gains in unmanned systems came in satellites throughout the Cold War. In the last two decades, however, this niche technology has grown into a useful industry, and unmanned aircraft have become the norm for governments and militaries across the globe. As with any new technology, its emergence raises new concerns as to whether it should be used.
Whether or not to use a weapon requires first a discussion of what is legally allowed. UAS are not themselves a weapon but a platform for employing weapons. In their current state, they employ the same weapons that are employed from other aircraft. Since the weapons themselves are allowed, UAS are technically allowed as well. The second point to consider is the targeting employed by UAS. At least for the US military, targeting requirements are identical for manned and unmanned aircraft. The primary rules for legal targeting are the principles of discrimination, proportionality, and military necessity (Johansson, 2011). The laws of armed conflict apply regardless of the type of weapon or the platform that employs it. If a target is legal, it can be struck by UAS and manned aircraft alike.
The real concern over the use of UAS is the ease with which force can be applied. The evolution of military technology has always embraced methods of inflicting damage on the enemy while limiting risk to oneself. From archers, to rifles, to artillery, to cruise missiles, distance affords the attacker a level of safety. The risk of harm to one's own forces is often a deterrent to warfare, so there is a risk that making warfare easier will make it more frequent. I think this is a false assumption, however, for two reasons. First, not all forces abide by this logic in the first place. Dictators may send their forces to battle with no regard for their losses, and nations like Japan in WWII, or North Korea today, will fight without question if told to do so. Second, nations considered more reasonable in modern times must also consider the open nature of the global community. Even if going to war becomes easier, there are still consequences that cannot be avoided, even when those consequences are not direct military retaliation. I do not think UAS themselves will lead nations to engage in warfare more readily.
An additional concern is the risk of abuse of the new technology. I think that is a legitimate concern, but not one that requires any special care. All military technologies can be abused by those who wield them. Moreover, the states inclined to use this type of technology against others, or even against their own people, are not particularly considerate of rules anyway; some states already abuse their power with more traditional tools. I do not believe an otherwise just government will begin to abuse its citizens simply because of this additional technology.
The US Air Force has made a deliberate effort to re-brand UAS by calling them Remotely Piloted Aircraft (RPAs). This is an important distinction: even though no human inhabits the aircraft, it is still under the deliberate control of an operator. Aircrew are trained to fly UAS in the same ways they are trained to fly manned aircraft, and they follow the same rules and procedures as any other aircraft. But UAS can do some things better than ever before. A fighter jet striking a target may have only minutes to identify it and decide to employ weapons. In some cases the pilot never sees the target at all, but relies on someone else passing a coordinate while the fighter blindly releases a GPS-guided weapon. There are procedures that must be followed, but that is one way they operate. UAS, in contrast, can operate for much longer periods of time. A single UAS might orbit a target area for more than 40 hours, and may be relieved by subsequent UAS in turn for days or weeks of constant coverage. Using very detailed cameras, a UAS can collect information on a target like never before. Not only can the crew be surer of a target's validity, they can be extremely precise in determining the level of collateral damage that might result from an attack. I have personally seen many valid targets go unstruck because a UAS could identify civilians in the area, and have seen UAS wait for days until a clear shot was available. The level of detail provided by UAS allows a level of restraint not possible with manned aircraft, where a pilot may have only minutes to decide whether to strike before the target is lost. Even though collateral damage may be legally acceptable, the global community now insists that modern warfare limit civilian casualties to almost none. Avoiding “international censure” has become a deliberate goal of governments (Kreps & Kaag, 2012). That task is made more difficult without UAS.
UAS, like manned aircraft, missiles, artillery, and tanks, are a tool for accomplishing military objectives in warfare. They follow the same laws of armed conflict applicable to all other fighting, and they allow more precision and certainty in an environment that historically contains neither. As long as they are used responsibly, UAS are no more of a moral risk than any of the long-distance conventional weapons that came before them. The questions of when to use force remain points of argument and discussion, but the choice of tools should not exclude UAS simply because they might be misused.
References
Johansson, L. (2011, June 10). Is it Morally Right to Use Unmanned Aerial Vehicles (UAVs) in War? doi: 10.1007/s13347-011-0033-8
Kreps, S., & Kaag, J. (2012). The Use of Unmanned Aerial Vehicles in Contemporary Conflict: A Legal and Ethical Analysis. Polity, 44(2), 260-285.

ASCI 638 end of course

For the Embry-Riddle Aeronautical University master's course ASCI 638, I wrote a case analysis of UAS GCS design. As a tool, the case analysis was very effective in the course, primarily because of the research it required. In my job, I work on advanced GCS design and have input into many human-factors-related design decisions. One of the most interesting things I found in my research was several papers written by the same author on this topic. I used my research to start discussions with coworkers and found that some of them knew the author I was talking about: he had worked indirectly with our project several years ago and was an early impetus for some of the human factors work we currently do. The research also provided a framework for some of the principles we now follow, since I better understand where they came from. I also found existing work in an area I am just starting on. I have tasks beginning next year to incorporate new technologies into our company's products. Some of these will be difficult, but I found research on the exact topic that will help guide our decisions. In particular, I am going to use some of this research to convince one of our program managers that some of his expectations are not realistic. He has the right intent, but some of his ideas run counter to important, well-researched human factors principles, specifically in the domain in which we work.

Saturday, December 19, 2015

GCS Design Considerations


GCS DESIGN CONSIDERATIONS
Shawn M. Wyne
ASCI 638 – Human Factors in Unmanned Systems
Embry-Riddle Aeronautical University
Abstract
Positive control and coordination of unmanned aerial systems (UAS) is a unique task set that has not been completely explored from a human factors perspective. Aircraft cockpits have been studied and improved over decades of research, but UAS, as an emerging technology, have not had the same level of rigor applied. Is it sufficient to simply re-create a cockpit on the ground and call it complete? An aircraft cockpit, while optimized for the tasks it fulfills, also has limitations in its scope: it has size and weight restrictions, and its equipment must operate in a high-altitude environment that is cold, low-pressure, and possibly subject to sustained g-forces. UAS are not encumbered by the same physical limits. It follows that a renewed analysis should focus on Ground Control Station (GCS) design and development, in order to optimize the system for its intended purposes. System limitations and differences in use need to be fully accounted for in any analysis. UAS are not just another aircraft, and a GCS should not be treated like just another cockpit.
Keywords: UAS, GCS design, GCS systems
Significance of Problem
Design considerations for cockpits are a well-documented area of study. The environment and the information processing and decision-making aspects of a cockpit are well known. But just because a GCS and a cockpit both control an airborne vehicle does not mean they are interchangeable, or should even be similar. A USAF Scientific Advisory Board report noted that “the considerable base of human factors knowledge derived from cockpit experience may have limited applicability to future systems” (as cited in Tvaryanas, 2006). Although the Board was generalizing about the pace of technological advancement, the applicability of this concept to GCS design is particularly salient.
There are numerous standards a cockpit design must adhere to. Most are related to safety, in that any inefficiency or error in pilot action may cause an accident and loss of life. For UAS, however, there is currently no equivalent set of design standards. To assist decision making on rules for integrating UAS into the national airspace, NASA commissioned a study to analyze FAA regulations and determine how they relate to human factors issues applicable to GCS. In short, the 100-page result suggests: nearly all of them (Jones, Harron, Hoffa, Lyall, & Wilson, 2012). This direction implies that the obvious equivalent to GCS design standards is that of manned aircraft cockpits. But this is an imprecise and inappropriate analogy. Although the rules may state so, there are many reasons they should not be the same. For hardware design, the GCS is not limited in the way an aircraft cockpit is. The physical environment of a cockpit is harsher, and cockpit displays and controls are mandated to survive in that environment. Land-bound, a GCS does not have the same limitations. Power supply and computing support have their own limits, as an aircraft often sacrifices system support or redundancy for less weight. Cockpits also have limited ability to fully meet ergonomic ideals; things like ejection seats and crash survivability take priority over crew comfort. Without these limitations, GCS physical layout and design can surpass those of a manned cockpit. Some aspects of cockpit design do carry over, as they apply to all human-machine systems, like those for visual acuity and information display arrangement (Wickens, Gordon, & Liu, 1998). But the design principles for cockpits leave a number of informational avenues uncovered. Proprioceptive cues typically available to pilots physically inside an aircraft are notably absent in a GCS. Sensory input of inherent auditory and vestibular information is crucially lacking.
There is no rush of wind noise to imply excessive airspeed, no engine roar to verify a power setting, no vibration to judge turbulence, and no g-forces to signal a decrease in total energy state. The displays in a cockpit are largely a visual medium for presenting aircraft system information and states. But even this is only partially inclusive of the visual information a pilot receives. Visual environmental cues allow a pilot to gather a significant amount of information important to safe flight simply by looking outside the cockpit. While zero-visibility flight is technically possible for manned aircraft, it is considered the most difficult kind, and requires special training and certification to attempt (FAA, 2015). Research has indicated significant benefit in utilizing multiple sensory input methods to improve response time and accuracy (Elliott & Stewart, 2006). Other technologies, such as speech-based input, have limited applicability in a noisy cockpit but may prove useful in a GCS. Even though the sum of information required of a pilot in a manned aircraft may be the same as that in a GCS, the systems and interface for the UAS pilot should not be the same.
Analyses of military UAS accidents note that up to 69% of all mishaps are attributed to human factors issues (Tvaryanas, Thompson, & Constable, 2006; Waraich, 2013). This is significant for the design of current GCS, as it acknowledges an area of design flaws, but how this knowledge should dictate future design is not known. Another field that has standards is that of computer workstations. As a ground-based unit, a GCS typically has more in common with the basic computer workstations that control equipment such as manufacturing facilities and heavy industry. The commercial standard is the American National Standards Institute/Human Factors and Ergonomics Society-100 (ANSI/HFES-100), which applies to workstations and input/output devices. Waraich (2013) compared several current GCS designs to ANSI/HFES-100 standards and found them mostly compliant. Combining the fact that GCS are compliant with computer workstation human factors standards with the fact that the majority of accidents are still human factors related, one can conclude that those standards alone are insufficient. The Department of Defense (DoD) has its own standards for design. Mil-Std-1472G dictates specific human engineering aspects of design, largely anthropometric (DoD, 2012). Mil-Std-1787C is narrower in focus, describing requirements for aircraft display symbology (DoD, 2001). All of these standards are applicable to UAS GCS design, but none of them address UAS-specific problems. Another influence on design standards should be the framework by which mishaps are evaluated. The DoD utilizes the Human Factors Analysis and Classification System (HFACS) model to determine human factors influences on mishap causation. The model looks at organizational influences, unsafe supervision, preconditions for unsafe acts, and unsafe acts (Waraich, 2013). The latter two aspects are those traditionally considered design issues.
Although the first two categories are not normally addressed through design standards, their potential causal influence on mishaps is clear. Even more concerning, the design of the UAS as a whole system can force organizational influences and supervision to operate in a manner that is detrimental to overall safety and efficiency. These problems are not addressed by standards that are purely anthropometric. GCS standards need to encompass the full realm of operation and investigate how design influences all aspects of it.
Another area of UAS control that is entirely unique is the quality of data received by UAS crews. Being physically separated from the aircraft, a GCS relies on a datalink to receive video and information. This link creates two problems. The first is that bandwidth can be a limiting factor. The pilot's view is already restricted to what onboard cameras can see, and the video itself may be degraded or simply poor; resolution, color, and field of view are all impacted. Poor or missing information leads to low awareness and mistakes. Synthetic, computer-generated graphics can be developed to improve operator efficiency (Calhoun, Draper, Abernathy, Delgado, & Patzek, 2005). Although such a system creates new problems relating to clutter and accuracy, these issues are already addressed by existing display standards. Synthetic vision can be a tool to make up for lost visual cues when the pilot is remotely controlling the aircraft. A second issue unique to GCS design is the temporal distortion caused by latency in datalinks. Simple control inputs can have delayed feedback, which restricts the operator's ability to assess the accuracy and effects of those inputs. Research has shown degraded performance when control is impaired by low temporal update rates or transmission delays (McCarley & Wickens, 2004). This is another area where display design standards are inadequate for GCS; normal design principles advise that feedback lag should be avoided when possible (Wickens, et al., 1998).
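The destabilizing effect of feedback lag can be illustrated with a toy simulation. The following Python sketch is purely illustrative (the gain, target, and delay values are invented, not drawn from any real datalink or the sources cited here); it shows how a simple proportional correction that behaves well with fresh feedback begins to overshoot when the observed state is several control cycles old:

```python
from collections import deque

def track_error(delay_steps, gain=0.4, steps=60):
    """Simulate a proportional controller whose feedback arrives late.

    delay_steps -- how many control cycles old the observed state is.
    Returns the worst overshoot past the target (illustrative units).
    """
    target, state = 100.0, 0.0
    history = deque([0.0] * (delay_steps + 1), maxlen=delay_steps + 1)
    worst_overshoot = 0.0
    for _ in range(steps):
        observed = history[0]           # stale state, as seen over the datalink
        command = gain * (target - observed)
        state += command                # vehicle responds to the command
        history.append(state)           # newest state enters the pipeline
        worst_overshoot = max(worst_overshoot, state - target)
    return worst_overshoot

print(track_error(0))   # no latency: converges smoothly, zero overshoot
print(track_error(5))   # 5-cycle latency: stale feedback drives large overshoot
```

With zero delay the controller approaches the target monotonically; with a five-cycle delay the operator keeps commanding corrections against a state that no longer exists, producing exactly the kind of over-control the research above describes.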
Because a GCS controls an aircraft, a number of philosophies exist as to what type of crew should man it. The USAF takes the position that a military-trained pilot is required. The FAA, Navy, and Marines simply require a basic Private Pilot certificate. The US Army, however, trains what it calls simply “operators,” who are not actually pilots (McCarley & Wickens, 2005). The success of military UAS suggests that neither approach is inherently incorrect. A significant finding from analyzing UAS mishaps is that, across services, there are comparable rates of mishaps resulting from judgment and decision-making errors, as well as from crew resource management errors (Tvaryanas, Thompson, & Constable, 2005). This evidence points to the aircraft and GCS system design as the contributing source of error, and also implies that the need for extensive prerequisite training is misguided. The USAF utilizes many manual control processes and believes that using manned-aircraft pilots to control UAS allows the transfer of skills and knowledge to this enterprise (Cantwell, 2009). But since the design of the system the pilots are placed in is significantly lacking in overall human factors design, even the thorough training they received as pilots has been insufficient to overcome those shortfalls. Even more complicated is the question of crew makeup. Military UAS typically utilize two crewmembers, separating flight control from payload control tasks. While research has shown poor performance on current systems when one operator must control all the functions, other research suggests improvements to the GCS can mitigate performance loss as workload increases (McCarley & Wickens, 2004). A further complicating aspect of UAS operation compared to manned cockpits is that the crew is not likely to remain with the aircraft for the duration of a mission.
Some UAS are capable of 40+ hour flights, and it is common for control of an aircraft to pass from crew to crew, or even from one GCS to another. While this migration of control is beneficial for mitigating operator fatigue, other aspects are potentially detrimental. What is not clear is the implication for display design when operators must interact with a system already in execution (Tvaryanas, 2006). There is potential for degraded situational and system awareness, with resultant impacts on performance and decision making.
Many negative human factors issues manifest as problems with information presentation and processing by the pilot. Divided attention becomes an issue when too much information is presented in a single channel (Wickens & Dixon, 2002); that is, the human can only differentiate and process a finite number of cues through one method of perception. There is a significant risk of error due to cognitive saturation and tunneling (Cummings & Mitchell, 2007; Dixon, Wickens, & Chang, 2003). In the GCS, these cues are typically exclusively visual. Because so much information is presented in a single channel, the operator is easily overwhelmed. Wickens and Dixon (2002) demonstrated the benefits of multi-channel attention modes that allow humans to perceive and assimilate disparate information at once; in this way, workload can be maintained or increased with less risk of error. Further, the limited cues in current GCS have negative impacts on manual control of the aircraft (Elliott & Stewart, 2006). The lack of cues leaves the operator with inaccurate or incomplete perceptions of aircraft states and behavior, with the consequence of poor decision making and increased errors.
Automation, however, has been demonstrated to improve performance in objective measurements of UAS operation where it limits the amount of information the pilot must process (Wickens & Dixon, 2002). Automation is not necessarily the complete self-control of a system by computers; it exists along a scale, with many levels (Wickens, et al., 1998; Elliott & Stewart, 2006; Drury & Scott, 2008). Although automation solves some issues of excessive workload and control limitations, it raises new problems that must be addressed. Complete automation relegates the operator to the task of passive monitoring, which humans are not particularly adept at accomplishing (Tvaryanas, 2006). As an automated system makes decisions and performs actions, there is a risk the operator is not aware of these changes. As the operator falls out of the loop, there is potential for mode confusion, where the operator does not understand the state of the system or why that state exists (Wickens & Holland, 2000). Once mode confusion takes place, the risk of error and degraded performance increases significantly. Additionally, the operator can come to distrust automated systems; the effect may be increased workload and decreased performance when the operator does not believe the automation will operate correctly. Alternatively, an overly trusting operator becomes complacent and suffers from automation bias, blindly trusting the system even when the consequences are negative (Cummings & Mitchell, 2007). Although automation design issues are known human factors problems, their interaction with the other unique aspects of GCS design is a compounding problem and thus unique to UAS.
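The idea that automation exists along a scale can be made concrete with a small sketch. The level descriptions below paraphrase the widely cited Sheridan-style taxonomy rather than quoting any of the sources above, and the mapping of levels to operator roles is the author's illustration, not an established standard:

```python
# Illustrative levels of automation, from manual control to full autonomy.
# Labels paraphrase the common Sheridan-style scale; wording is the author's.
AUTOMATION_LEVELS = {
    1: "Human performs the task; computer offers no assistance",
    3: "Computer suggests alternatives; human selects and executes",
    5: "Computer executes a human-approved action",
    7: "Computer acts, then necessarily informs the human",
    10: "Computer acts autonomously, ignoring the human",
}

def operator_role(level):
    """Rough, illustrative mapping from automation level to operator task."""
    if level <= 3:
        return "active control"
    if level <= 7:
        return "supervisory control"
    # Passive monitoring is the role humans perform worst (Tvaryanas, 2006).
    return "passive monitoring"

print(operator_role(2))   # active control
print(operator_role(9))   # passive monitoring
```

The point of the mapping is that the design problems change with the level: at the low end the GCS must support manual control cues, while at the high end it must fight the known weaknesses of passive monitoring, mode confusion, and automation bias.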
Alternative Actions
Simply copying and continuing with current design requirements is possible, and will yield a usable product, as it has for many years. But safety and efficiency will be limited, perhaps excessively so, and the long-term success of UAS is not probable under this paradigm. Human factors deficiencies often result in mishaps eventually, and the potentially exponential growth of UAS into the civilian and private sector makes that risk untenable. Trial and error is also an insufficient design methodology for a technological sector: the growth of capabilities and emerging tasks for UAS will outpace designers' ability to iterate improvements based on experienced deficiencies.
Recommendation
Further research needs to be conducted to define what is truly necessary for GCS systems to operate effectively. It will be a summation of current human factors research combined with basic cockpit design precepts, but it must include the unique environment and cognitive challenges of the teleoperation of aircraft. A particular difficulty in defining appropriate standards is the numerous variations of unmanned aircraft, their task requirements, and their automation capabilities. Drury and Scott (2008) compiled a framework of the types of awareness required for UAS operation. Notably, they include the vehicle itself as one of the system components that needs information. The four categories of awareness they identify are Human-UAV, Human-Human, UAV-Human, and UAV-UAV (Drury & Scott, 2008). Importantly, the Human-Human category includes individuals indirectly influencing the air vehicle, described as mission stakeholders. This concept brings completeness to the design of the GCS as a whole system and includes all the supporting elements of HFACS analysis. The framework should be the starting point for more detailed analysis of GCS systems and a baseline for determining operator use cases. Task-based analysis will then be able to apply the more abstract human factors principles to determine effective design strategies. An operational assessment of the MQ-9 aircraft and control station rated the overall system satisfactory in terms of system usability and design, but a number of areas were rated unsatisfactory in broader terms of mission and task completion (AFOTEC, 2013). This further supports the conclusion that design standards in their current form are insufficient to address the complicated use of, and unique problems facing, UAS control. UAS are the culmination of decades of innovation in technology. Remote operation of a flying vehicle is an incredibly complex task, one that requires more detailed study and the development of UAS-specific standards.
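The four awareness categories from Drury and Scott (2008) lend themselves to a simple data model that could serve as a starting checklist when auditing what a GCS actually displays. The category names are from the paper; the example information flows attached to each category are the author's illustrations, not items taken from the original framework:

```python
# Drury & Scott's (2008) four awareness categories for UAV operations.
# The example flows under each category are illustrative only.
AWARENESS_CATEGORIES = {
    "Human-UAV": ["vehicle health and fuel state", "sensor pointing angle"],
    "Human-Human": ["handover briefings between crews", "mission stakeholder intent"],
    "UAV-Human": ["operator commands pending execution", "manual override status"],
    "UAV-UAV": ["deconfliction with other vehicles", "relative positions in a swarm"],
}

def uncovered_categories(displayed_flows):
    """Return the categories with no representation among displayed flows."""
    return [cat for cat, flows in AWARENESS_CATEGORIES.items()
            if not any(f in displayed_flows for f in flows)]

# A hypothetical GCS that shows only vehicle telemetry:
print(uncovered_categories({"vehicle health and fuel state"}))
```

A design audit structured this way makes the framework's point visible: a display that covers Human-UAV telemetry perfectly can still leave three whole categories of awareness unsupported.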
References
Air Force Operational Test and Evaluation Center (AFOTEC). (2013). MQ-9 Reaper Increment 1 (Block 5 Aircraft/Block 30 Ground Control Station) Operational Assessment (OA)-3 Report. Kirtland Air Force Base, New Mexico: Author.
ANSI/HFES-100. (2007). Human Factors Engineering of Computer Workstations. Santa Monica, CA: Human Factors and Ergonomics Society.
Calhoun, G., Draper, M., Abernathy, M., Delgado, F., & Patzek, M. (2005). Synthetic Vision System for Improving Unmanned Aerial Vehicle Operator Situation Awareness. Proceedings of SPIE vol. 5802, 219-230. doi:10.1117/12.603421
Cantwell, H. (2009). Operators of Air Force Unmanned Aircraft Systems. Air & Space Power Journal, 23(2), 67-77.
Cummings, M., and Mitchell, P. (2007). Operator Scheduling Strategies in Supervisory Control of Multiple UAVs. MIT Humans and Automation Lab. Retrieved from: http://web.mit.edu/aeroastro/labs/halab/papers/CummingsMitchell_AST06.pdf
Department of Defense. (2012). Mil-Std 1472G: Department of Defense Design Criteria Standard, Human Engineering. Redstone Arsenal, AL: Author.
Department of Defense. (2001). Mil-Std-1787C: Department of Defense Interface Standard, Aircraft Display Symbology. Wright-Patterson AFB, Ohio: Author.
Dixon, S., Wickens, C., & Chang, D. (2003) Comparing Quantitative Model Predictions to Experimental Data in Multiple-UAV Flight Control. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 47(1), 104-108.
Drury, J., & Scott, S. (2008). Awareness in Unmanned Aerial Vehicle Operations. The International C2 Journal, 2(1), 1-10. Retrieved from: http://www.cs.uml.edu/~holly/91.550/papers/Drury_Scott_UAV-Awareness.pdf
Elliott, L., & Stewart, B. (2006). Automation and Autonomy in Unmanned Aircraft Systems. In N. Cooke, H. Pringle, & H. Pedersen (Eds.), Human Factors of Remotely Operated Vehicles, Volume 7. New York: JAI Press.
Federal Aviation Administration (FAA). (2015). Federal Aviation Regulation Part 61. Retrieved from: http://www.ecfr.gov/cgi-bin/text-idx?SID=9fbfcef04c89d713fb11ea739ad865db&mc=true&node=pt14.2.61&rgn=div5
Jones, E., Harron, G., Hoffa, B., Lyall, B., & Wilson, J. (2012). Research Project: Human factors Guidelines for Unmanned Aircraft System (UAS) Ground Control Station (GCS) Design, Year 1 Report. NASA Ames Research Center. Retrieved from: http://www.researchintegrations.com/publications/Jones_etal_2012_Human-Factors-Guidelines-for-UAS-GCS-Design_Year-1.pdf
McCarley, J., & Wickens, C. (2004). Human Factors Concerns in UAV Flight. University of Illinois at Urbana-Champaign Institute of Aviation, Aviation Human Factors Division. Retrieved from: https://www.researchgate.net/profile/Christopher_Wickens/publication/241595724_HUMAN_FACTORS_CONCERNS_IN_UAV_FLIGHT/
McCarley, J., & Wickens, C. (2005). Human Factors Implications of UAVs in the National Airspace. University of Illinois at Urbana-Champaign Institute of Aviation, Aviation Human Factors Division. Retrieved from: http://www.tc.faa.gov/logistics/Grants/pdf/2004/04-G-032.pdf
Tvaryanas, A. (2006). Human Factors Considerations in Migration of Unmanned Aircraft Systems (UAS) Operator Control. USAF Performance Enhancement Research Division. Retrieved from: http://www.wpafb.af.mil/shared/media/document/AFD-090121-046.pdf
Tvaryanas, A., Thompson, W., & Constable, S. (2005). The U.S. Military Unmanned Aerial Vehicle (UAV) Experience: Evidence-Based Human Systems Integration Lessons Learned. Retrieved from: http://www.rto.nato.int/abstracts.asp
Tvaryanas, A., Thompson, W., & Constable, S. (2006). Human Factors in Remotely Piloted Aircraft Operations: HFACS Analysis of 221 Mishaps Over 10 Years. Aviation, Space, and Environmental Medicine, 77(7), 724-732.
Waraich, Q. (2013). Heterogeneous Design Approach for Ground Control Stations to Marginalize Human Factors Mishaps in Unmanned Aircraft Systems (Doctoral Dissertation). Retrieved from: https://www.sintef.no/globalassets/project/hfc/documents/waraich-qaisar-ehf-in-uas.pdf
Wickens, C. & Dixon, S. (2002). Workload Demands of Remotely Piloted Vehicle Supervision and Control. Savoy, IL: University of Illinois, Aviation Research Lab.
Wickens, C., Gordon, S., & Liu, Y. (1998). An Introduction to Human Factors Engineering. New York: Addison Wesley Longman, Inc.
Wickens, C., & Holland, J. (2000). Engineering Psychology and Human Performance (3rd ed.). New Jersey: Prentice Hall.

Sunday, December 6, 2015

Automatic Take-off and Landing


Automatic Takeoff and Landing
Shawn M. Wyne
ASCI 638 – Human Factors in Unmanned Systems
Embry-Riddle Aeronautical University
Abstract
        Airplanes require a host of skills for pilots to operate them safely. During the cruise phase, there is a larger margin for error than in flight near terrain. By obvious necessity, aircraft must come near the ground at a minimum for takeoff and landing. These regimes are the most dangerous, and many mishaps that could otherwise be recovered from instead cause an accident. As with all aspects of flight, technology has been applied to alleviate anything with the potential to cause accidents. Manned and unmanned aircraft systems (UAS) both benefit from the use of automatic takeoff and landing systems.
        Keywords: UAS, autoland, automatic takeoff and landing

Auto Takeoff and Landing
        Automatic takeoff and landing systems are employed on a number of aircraft, for a very simple reason: many accidents happen during the takeoff and landing phases of flight. Often these accidents are the result of poor flying conditions. Low visibility makes landing significantly difficult, as there is minimal visual reference with which to align the aircraft to the runway. A number of instruments have been developed to provide the pilot with this information; the localizer and the instrument landing system have been around for decades. But even their accuracy is limited, and bad weather may preclude even their use. The FAA defines weather minimums that determine which systems may be used to land. The worst allowable landing weather is called Category IIIB, which means a decision height of less than 50 feet and runway visibility between 150 and 700 feet (FAA, 1999). These conditions are such that an airliner may not actually see the runway until as few as five seconds before touchdown. Category III weather minima essentially dictate that an automatic landing system be used; manual landing is only approved in Category II or better weather. Another reason for automatic takeoff and landing procedures is their use in UAS. Datalink latency may make manual control of the aircraft dangerous, or the control station may not even have the controls to allow manual flight at all. All of these scenarios present a situation where an autopilot may be preferred, or required.
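The "as few as five seconds" figure can be checked with rough numbers. The approach speed and glidepath angle below are assumed typical values, not taken from the FAA reference; the calculation simply converts a 50-foot decision height into time remaining before touchdown:

```python
import math

# Assumed typical values -- not from the FAA source cited above.
approach_speed_kt = 140        # airliner final approach speed
glidepath_deg = 3.0            # standard ILS glidepath angle
decision_height_ft = 50        # Category IIIB decision height (< 50 ft)

ground_speed_fps = approach_speed_kt * 1.68781   # knots -> feet per second
descent_rate_fps = ground_speed_fps * math.tan(math.radians(glidepath_deg))

seconds_to_touchdown = decision_height_ft / descent_rate_fps
print(round(descent_rate_fps, 1))      # roughly 12 ft/s (~740 ft/min)
print(round(seconds_to_touchdown, 1))  # on the order of 4 seconds
```

At those assumed values a crew first seeing the runway at the decision height has about four seconds before the wheels touch, consistent with the five-second figure in the text.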
        One aircraft that uses a variety of automatic systems is the Boeing 737-600 series and higher. The 737 actually utilizes a system capable of three different automatic approach and landing levels. A Boeing publication by Craig, Houck, and Shomber (2003) describes the different capabilities of the aircraft. The actual autoland is the only one certified for Category IIIB operations. It works by utilizing the existing instrumentation, plus details from the aircraft Flight Management System (FMS) like runway width and length, to manage aircraft parameters such as pitch, engines, and brakes as appropriate. Once set up by the pilot, it is a hands-off system as long as it is working. The aircraft maintains course and descent path all the way through the flare and touchdown, applies brakes and spoilers, and brings the aircraft to a complete stop. Use of the thrust reversers remains a pilot-controlled item. Since it is a fail-safe design (as required by regulation), a failed system automatically reverts to manual flight; the pilot can also revert to manual flight by choice. A second method of landing is a GPS-based landing system, which relies on both GPS signals and a special ground-based system to provide positional information to the aircraft. This too can be reverted to manual flight, and as of the publication date it was only approved for Category I approaches with a 200-foot decision height, although the automatic landing function remains available. There is also a third automatic approach system, but it is only for non-precision approaches and requires pilot control of altitude and a manual descent and landing.
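The fail-safe reversion behavior described above can be sketched as a small piece of mode logic. This is a hypothetical illustration of the principle only; the mode names and fields are invented, not Boeing's actual design.

```python
# Hypothetical sketch of fail-safe autoland reversion logic: any system fault,
# or a pilot disconnect, reverts control to manual flight. Names and fields
# are illustrative, not the actual 737 implementation.

from dataclasses import dataclass

@dataclass
class AutolandState:
    system_healthy: bool
    pilot_disconnect: bool

def flight_mode(state: AutolandState) -> str:
    """Return the active control mode for the current state."""
    if state.pilot_disconnect or not state.system_healthy:
        return "MANUAL"    # fail-safe: any fault or disconnect reverts to the pilot
    return "AUTOLAND"      # hands-off through flare, touchdown, and rollout

print(flight_mode(AutolandState(system_healthy=True, pilot_disconnect=False)))
print(flight_mode(AutolandState(system_healthy=False, pilot_disconnect=False)))
```

The key design property is that the automated mode is the one that must be continuously earned: the logic defaults to the pilot whenever any condition is not met.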
        Another aircraft that utilizes automatic takeoff and landing capability is the General Atomics MQ-1C Gray Eagle. This aircraft uses automatic systems primarily because the Army does not use pilots to control it. The operators are primarily sensor managers and control the aircraft's position by manipulating the autopilots. The takeoff and landing capability was originally enabled by the Tactical Automatic Landing System (TALS), a system of ground-based radars and aircraft equipment that orients the aircraft's location in relation to the runway ("TALS," n.d.). However, because that system relies on a ground element being precisely installed and aligned, it proved impractical for military use. The system was converted to an Automatic Takeoff and Landing System (ATLS) that is wholly onboard the aircraft. This system is strictly GPS-based and therefore more flexible. Although it would not be approved for Category IIIB-type approaches, the military considers that irrelevant, as no personnel are at risk in a low-visibility landing, just equipment. As of 2013, the new system had surpassed 20,000 safe landings (Pocock, 2013).
        The success of autoland systems shows their viability in both manned and unmanned aircraft. However, complete reliance on such technology is not appropriate. A number of conditions and situations, both internal and external, can make the autoland features unusable, so there must still be a backup method for manual landing. This presents a challenge for training and for keeping pilots' manual skills current. Complacency is always an issue with automated systems, and they must be designed with human interaction in mind.
References
        Craig, R., Houck, D., and Shomber, R. (2003). 737 Approach Navigation Options. Boeing Aero Magazine. Retrieved from: http://www.boeing.com/commercial/aeromagazine/aero_22/737approach.pdf.
        FAA (1999, July 13). Advisory Circular 120-28D Criteria for Approval of Category III Weather Minima For Takeoff, Landing, and Rollout. US Department of Transportation Federal Aviation Administration.
        Pocock, C. (2013, October 25). Improved Gray Eagle UAS Flies for 45 Hours. Retrieved from: http://www.suasnews.com/2013/10/25700/gray-eagle-completes-20000-automated-takeoffs-landings/
         TALS Tactical Automatic Landing System. (n.d.). Sierra Nevada Corporation. Retrieved from http://sncorp.com/Pdfs/BusinessAreas/TALS%20Product%20Sheet.pdf

Saturday, November 28, 2015

UAS Shift Work Schedule

UAS Shift Work Schedule
Shawn M. Wyne
ASCI 638 – Human Factors in Unmanned Systems
Embry-Riddle Aeronautical University
 Abstract
The long endurance of Unmanned Aircraft Systems (UAS) creates a new phenomenon for aviation. With aircraft that can stay aloft for as much as 40 hours at a time, pilots and flight crews must operate in shifts to accomplish a mission. In a military unit that controls multiple UAS at a time, personnel manning must be sufficient to cover 24/7, 365-day operations. Like other professions that have no down time, such as police work and hospital nursing, a schedule must be created for employees to cover multiple shifts. This paper analyzes a current UAS squadron's shift schedule and proposes changes intended to decrease fatigue among flight crews.
Keywords: UAS, fatigue, shift work, schedules
UAS Shift Work Schedule
A United States Air Force (USAF) UAS squadron operating MQ-1B Medium Altitude, Long Endurance (MALE) aircraft operates 24/7, 365 days a year. The crews in this squadron are assigned to four teams on a continuous shift work schedule of six days on, two days off (Figure 1). Unfortunately, crews have been reporting excessive levels of fatigue and complain of inadequate sleep due to their schedule. Fatigue is a problem, especially when operating aircraft, because it has been shown to adversely affect performance (Wickens, Gordon, & Liu, 1998). Previous research indicates nearly 50% of UAS operators meet the threshold for levels of daily sleepiness expected to negatively affect job performance and safety (Tvaryanas, Platte, Swigart, Colebank, & Miller, 2008). A new schedule is presented to accommodate known issues related to fatigue resulting from shift work.
The current schedule causes excessive fatigue for several reasons. The first is that it disrupts crews' circadian rhythms. Circadian rhythms are physiological responses to the wake/sleep cycle, and when they are desynchronized, the body physically feels sleepy when the worker is trying to stay awake (Wickens, et al., 1998). A certain amount of disruption is inevitable when work must be accomplished overnight. The body can adapt, but adaptation takes 4-5 days to occur (Wickens, et al., 1998). Unfortunately, the current schedule provides only six days on a particular shift before the shift is altered. The crews are not given sufficient time to adapt before their schedule changes and they must start over. Longer periods on a particular shift will give crews the chance to adapt to their schedule.
The second problem with the current schedule is that it allows repeated states of sleep loss. The six-on, two-off schedule provides insufficient time to recover from a sleep-deprived state. A typical Monday-through-Friday worker averages 21.8 days of work and 8.7 days off per calendar month; the current UAS squadron shift results in 24 days of work and 6 days off each month. The sleep deprivation that occurs when working against circadian rhythms can build up over time, resulting in a cumulative sleep debt (Wickens, et al., 1998). More days off, or more frequent days off, will allow crews to recover from lost sleep during the work week.
The first change in the new schedule is to alter the team arrangement. The four teams will be consolidated into three, so that each team can be assigned to a single shift at a time (Figure 2), and each team will hold that shift for a longer period. Fast shift rotations are poor for circadian rhythm management, but staying on a single shift for excessively long periods can also be bad: long stretches on one shift are only effective if crews can maintain their sleep schedule even on days off (Tvaryanas, et al., 2008). That might be achievable in a deployed scenario where there is little to do outside of work, but realistically it is improbable, considering crews have families and other priorities outside of work. The new schedule will rotate shifts every six weeks.
The second change is to alter the number of days each crew works. Since a single team covers a single shift, crews within the team must alternate work and off days to ensure enough workers are present each day. The previous six-on, two-off schedule will be replaced with a four-on, two-off rotation (Figure 3). This schedule yields a monthly total of 20 work days and 10 days off. The increased frequency and quantity of off days allows more opportunities to recover from sleep debt, while the rotation still keeps the same number of workers on shift on any particular day as the six/two schedule. This type of manning schedule, called a waterfall, has overlap during the transition week when one team moves onto its new shift (Figure 4), but the shifts are still covered by the same number of crews.
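The day counts quoted for the two rotations can be verified with a short sketch. One assumption is needed: the 30-day month is aligned to start on the first day of a work block, which is the alignment that yields the figures cited above.

```python
# Count work and off days in a 30-day month for a repeating on/off rotation,
# assuming the month begins on the first day of a work block (the alignment
# that reproduces the figures quoted in the text).

def month_counts(days_on: int, days_off: int, month_len: int = 30):
    cycle = [True] * days_on + [False] * days_off   # one full rotation
    worked = sum(cycle[day % len(cycle)] for day in range(month_len))
    return worked, month_len - worked

print(month_counts(6, 2))  # current six-on/two-off  -> (24, 6)
print(month_counts(4, 2))  # proposed four-on/two-off -> (20, 10)
```

The four-on/two-off rotation trades four work days per month for four additional recovery days, without changing how many crews are on shift on a given day.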
        Any change to the shift schedule is only a partial solution. Increased manning would provide significantly more opportunity for breaks in the schedule for individual crewmembers, allowing greater opportunities for vacation or alternate duties that do not require shift work. The new schedule gives longer periods on a particular shift to acclimate, and also provides more frequent breaks. Both of these should reduce problems with fatigue and have an overall positive impact on crew performance compared to the old schedule.
References
        Tvaryanas, A. P., Platte, W., Swigart, C., Colebank, J., & Miller, N. (2008). A Resurvey of Shift Work-Related Fatigue in MQ-1 Predator Unmanned Aircraft System Crewmembers. Monterey, CA: Naval Postgraduate School.
        Wickens, C. D., Gordon, S. E., & Liu, Y. (1998). An Introduction to Human Factors Engineering. New York: Addison Wesley Longman, Inc.
Appendix
Figure 1. Current Squadron Shift Schedule.
Figure 2. New Team Arrangement.
Figure 3. New Crew Schedule.

Figure 4. Transition Weeks.
 

Sunday, November 22, 2015

UAS Beyond Line of Sight Operations


UAS Beyond Line of Sight Operations
Shawn M. Wyne
ASCI 638 – Human Factors in Unmanned Systems
Embry-Riddle Aeronautical University
Abstract
    The operation of Unmanned Aircraft Systems (UAS) normally requires a connection to the aircraft through a datalink. This paper explores the method and technique of datalink through a beyond line of sight (BLOS) connection. The technical and procedural implications are described, along with an analysis of the benefits and drawbacks of such a system. Finally, several applications are suggested that may benefit from BLOS operations.
Keywords: UAS, BLOS, beyond line of sight, datalink
UAS Beyond Line of Sight Operations
        Unmanned Aircraft Systems (UAS) utilize a number of techniques to receive control inputs and return data to the controller. The MQ-9 Reaper aircraft is manufactured by General Atomics Aeronautical Systems (GA-ASI) with the express capability to operate beyond line of sight (BLOS) from the control station (GA-ASI, 2015). Radio waves are limited in that the curvature of the Earth blocks their transmission beyond the horizon. To extend this range, the radio waves need a path to the aircraft that is not blocked by the planet. The technique used by the MQ-9 is to bounce the signal off a satellite in geosynchronous orbit. For many technical and practical reasons, the signal currently chosen is in the Ku-band of the radio spectrum. The command signal departs the ground station, arrives at a Ku satellite communications (SATCOM) antenna, and is directed at the satellite. The satellite then re-transmits the signal down to the aircraft, which has an 18-inch satellite dish in the nose (GA-ASI, 2015). The return link follows the same path back to the Ground Control Station (GCS). This extends the range of the aircraft to anywhere within the footprint of the satellite, which is hundreds of miles wide. Figure 1 shows the arrangement of systems required to achieve this range. Of course, this process is neither simple nor cheap. The aircraft is only built in one configuration, but the GCS requires extra equipment to utilize the BLOS link: a dedicated link manager assembly and ground modem are needed to convert signals into the required digital form. The next complicated piece is the SATCOM antenna. This is not a small piece of equipment, and it requires independent power and controls for operation. It must be precisely aimed and calibrated by experts before use. The most expensive pieces of the puzzle are the satellites themselves. Ku-band satellites are a limited resource, used by many commercial enterprises.
The MQ-9, like most UAS, is unlikely to have dedicated satellites, so bandwidth is typically purchased on existing ones. Bandwidth is limited, and cost is directly related to the amount used.
        Ongoing operations require a dedicated planning team to schedule the appropriate times and frequencies. Operating in this manner also requires communication specialists to coordinate frequencies and the numerous other details needed to establish a good satellite connection. Government operations also encrypt their signals, so security personnel are needed to keep cryptologic devices current. The last pieces are the ones not unique to UAS operations: the maintenance personnel and the flight crew. For the flight crew, the procedure is actually fairly simple. The communications specialists inform the crew what satellite settings to configure, and beyond that it is simply a matter of turning the equipment on. Because the satellite link is so exacting, it either works or it does not; if not, the crew can do little besides changing settings.
        The most significant advantage of this mode of operation is the greater range achievable by the aircraft. Because the signal is digital, it allows the inclusion of voice radio and high-definition video, all in an encrypted, secure transmission. There are also limitations. Notably, there is a greater time delay between the sending and receiving of signals. Ku radio waves, like all electromagnetic waves, travel at the speed of light. Over a short distance, the delay is extremely small, but the distances up to and back down from the satellite are cumulative. Control and response of the aircraft are affected, since the signal must make two round trips between the pilot commanding a turn and seeing the resulting bank. Additionally, since the signal is digital, there is an unavoidable delay in processing it. The cumulative effect of these delays can exceed 1.5 seconds (GA-ASI, 2015). While that may seem short, it is long enough to make landing the aircraft in this mode difficult enough that it is prohibited as a rule.
        Depending on the goals and performance of the UAS, this delay does not have to impact its use negatively. But human factors issues do arise from the delays inherent in BLOS operations. Time delays between control inputs and observed responses induce a margin of instability that is prone to error (Wickens, Gordon, and Liu, 1998). Pilot-induced oscillations are a continual risk, and the emergency procedures in the technical manual contain an explicit boldface procedure to recover from this dangerous condition (GA-ASI, 2015). The delay is not insurmountable through training, but it is always present. Due to the long flight durations of larger UAS, BLOS operation is a useful tool even for commercial applications. A single aircraft could monitor hundreds of miles of pipeline in a single flight. Where uneven terrain might block LOS signals, an aircraft could search for wildfires across the entire state of Colorado. Even where the land is flat, a UAS could survey coastal damage along the entire coast of Florida after a hurricane. In aviation terms, the distance allowed by a LOS signal is extremely small, and many of the uses of UAS large enough to accept BLOS equipment would benefit from the expansion of range it provides.
References
        General Atomics Aeronautical Systems, Inc. (2015, August 4). Flight Manual, USAF Series MQ-9 Aircraft, Serial Numbers 004, 006, 008, and Above.  California: General Atomics – ASI.
        Wickens, C. D., Gordon, S. E., & Liu, Y. (1998). An Introduction to Human Factors Engineering. New York: Addison Wesley Longman, Inc.
Appendix

Figure 1. UAS Command and Control Diagram reprinted from “Flight Manual, USAF Series MQ-9 Aircraft, Serial Numbers 004, 006, 008, and Above,” by GA-ASI, 2015.
 

Friday, November 20, 2015

UAS Integration into NAS


UAS Integration into the National Airspace System
Shawn M. Wyne
ASCI 638 – Human Factors in Unmanned Systems
Embry-Riddle Aeronautical University
 
The challenges of managing and coordinating large metal machines moving through the air at high speed have always been daunting. In the beginning, pilots simply used their eyesight to avoid hitting obstacles and each other. Over time, technology like radar and airborne radios allowed a measure of direct coordination between aircraft. In spite of these changes, mid-air collisions between aircraft continued through the 1950s and 60s (Kochenderfer, Holland, and Chryssanthacopoulos, 2012). New technology was required to overcome the obstacles of increasing air traffic. The Federal Aviation Administration (FAA) enacted stricter rules and procedures, but this alone did not solve the problem. The introduction of the Traffic Alert and Collision Avoidance System (TCAS) helped, and TCAS II equipment became mandatory in the United States in 1990 (Rosenkrans, 2014). This technology substantially increased flight safety, just as earlier technologies had 60 years before. However, just as before, the technology is reaching its limit of usefulness (Wyne, 2015). The continued increase in air traffic, and in particular the introduction of unmanned aircraft systems (UAS) into the same airspace, requires another leap in technological capability. The National Airspace System (NAS) requires an upgrade of systems and procedures to not only improve safety, but also increase efficiency and maximize the use of resources.
The process by which the FAA is pushing this change is the implementation of the Next Generation Air Transportation System (NextGen). The goals of NextGen cover several areas. The first, as an extension of current systems, is to improve collision avoidance. A new technology helping push NextGen and the inclusion of unmanned systems is Automatic Dependent Surveillance-Broadcast (ADS-B) equipment. ADS-B is a replacement for the transponders currently used for traffic control (FAA ADS-B, 2015). A significant problem with current TCAS, radar, and transponders is their lack of precise information. Radar gives position, but requires a Mode C transponder to return altitude data. Ground radars typically interrogate every 12 seconds, and only request altitude data on every other interrogation (Richards, O'Brian, and Miller, 2010). So an average aircraft's location is updated only five times per minute, and its altitude only two or three times per minute. ADS-B takes a completely new approach. It uses Global Positioning System (GPS) signals to generate a precise location of itself in three dimensions, then broadcasts that detailed position data to other users (Richards, et al., 2010). Because it does not rely on a rotating radar dish, it can send information more frequently and always include all the data it has. Improved data quality and update rates will be crucial for collision avoidance among heterogeneous airborne systems in increasingly crowded airspace. ADS-B concepts, and the GPS technology behind them, are not particularly recent; however, they have yet to be fully implemented. The FAA now requires ADS-B compliant equipment by 2020 in all airspace that currently requires Mode C transponders (Babbitt, 2010).
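The update-rate gap can be made explicit with a short comparison. The radar figures come from the text above; the one-second ADS-B broadcast interval is a commonly cited nominal rate and is an assumption here, not a figure from the sources cited.

```python
# Updates per minute implied by the surveillance figures above: a ground radar
# sweep every 12 s, with altitude returned only on alternate sweeps, versus
# ADS-B broadcasting its full state nominally about once per second (assumed).

RADAR_SWEEP_S = 12
ADSB_PERIOD_S = 1        # assumed nominal ADS-B broadcast interval

radar_position_per_min = 60 // RADAR_SWEEP_S          # position only
radar_altitude_per_min = radar_position_per_min / 2   # altitude on alternate sweeps
adsb_full_state_per_min = 60 // ADSB_PERIOD_S         # full 3-D state every time

print(f"Radar position updates/min: {radar_position_per_min}")
print(f"Radar altitude updates/min: {radar_altitude_per_min}")
print(f"ADS-B full-state updates/min: {adsb_full_state_per_min}")
```

Under these assumptions, ADS-B delivers an order of magnitude more surveillance data per minute, and every update carries the complete position rather than a partial return.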
As the supporting systems and airborne equipment are fully developed and implemented, they will provide the structure for improved automated collision avoidance systems (Wyne, 2015). This is also a significant benefit for UAS lost-link behavior: a UAS not under positive control would still broadcast its position to controllers, and perhaps more importantly to other aircraft, giving them a means to avoid it.
Another important goal of NextGen is increasing the overall efficiency of air traffic, especially in the departure and terminal approach areas of airports. The significant increase in overall air traffic is overwhelming the current arrival and departure procedures. Current flight paths in terminal areas are published, fixed routes. Since wake turbulence spacing cannot be decreased, there is a maximum throughput of aircraft along a single route. NextGen is implementing satellite-based arrival and departure procedures which, instead of being singular, can be unique to different aircraft (Carey, 2015). This means more traffic through a smaller space, since it adds lateral separation from following aircraft. It also means more aircraft departing from a single runway, a potential increase of 8-12 departures every hour (FAA NextGen Experience, 2015). That cuts down taxi and holding time for aircraft on the ground, which saves time and fuel. The monetary savings are also significant: US Transportation Secretary Anthony Foxx stated that the savings already realized under the limited rollout of new procedures exceed $2 billion, and are expected to surpass $130 billion over the next 15 years if the system is fully implemented (Carey, 2015). Unique arrival and departure procedures also provide a tool to keep UAS positively separated from manned traffic, negating some sense-and-avoid limitations.
Another goal of the NextGen system is to increase efficiency in the enroute portions of flight (FAA NextGen Experience, 2015). Similar to arrivals and departures, enroute flight paths also follow published routes, much like roads on the ground. This is convenient for controllers because, with the current limited position data, aircraft have an expected location that can be inferred. Improved position information allows controllers to send aircraft on more direct paths to their destinations. These improved courses are a major part of the time and fuel savings air traffic can realize. As a secondary effect, the fuel savings from better routing mean a corresponding decrease in emissions. With about 85,000 daily flights within the NAS, this is no small reduction (FAA NextGen Experience, 2015).
There are other, more technical pieces of the NextGen improvements, such as replacing analog voice systems, improving data dissemination among ground-based controllers, and better integrating weather data into controller decisions (FAA NextGen Experience, 2015). All of these are the backbone for the front end: better flight coordination and planning. The ability to control aircraft at a much more detailed level, send unique and specific flight plan changes, and keep data consistent among many users will bring significant improvements to the NAS. Importantly, it provides the level of control needed to integrate UAS into already congested airspace.
  References
Babbitt, J. (2010, May 28). ADS-B Out Performance Requirements to Support Air Traffic Control Service; Final Rule. Department of Transportation Federal Aviation Administration. Retrieved from: http://www.gpo.gov/fdsys/pkg/FR-2010-05-28/pdf/2010-12645.pdf
Carey, B. (2015, November 2). US Transportation, Industry Officials Upbeat on NextGen. Retrieved from http://www.ainonline.com/aviation-news/air-transport/2015-11-02/us-transportation-industry-officials-upbeat-nextgen
Federal Aviation Administration (2015). Automatic Dependent Surveillance-Broadcast. Retrieved November 8, 2015 from: https://www.faa.gov/nextgen/programs/adsb/
Federal Aviation Administration (2015). NextGen Experience. Retrieved November 7, 2015 from https://www.faa.gov/nextgen/experience/?episode=2
Kochenderfer, M., Holland, J., and Chryssanthacopoulos, J. (2012). Next-Generation Airborne Collision Avoidance System. Lincoln Laboratory Journal. Retrieved from: https://www.ll.mit.edu/publications/journal/pdf/vol19_no1/19_1_1_Kochenderfer.pdf
Richards, W., O’Brian, K., and Miller, D. (2010). New Airborne Surveillance Technology. Boeing Aeromagazine. Retrieved from: http://www.boeing.com/commercial/aeromagazine/articles/qtr_02_10/pdfs/AERO_Q2-10_article02.pdf
Rosenkrans, W. (2014, October). ACAS X. AeroSafety World Magazine. Retrieved from: http://flightsafety.org/aerosafety-world-magazine/october-2014/acas-x.
Wyne, S. (2015). Collision Avoidance in Unmanned Aerial Systems. Embry-Riddle Aeronautical University, Unmanned Systems 610.

 

Sunday, November 15, 2015

Ground Control Station Human Factors Issues


Ground Control Station Human Factors Issues


Unmanned Aerial Systems (UAS) have a variety of control mechanisms. Small systems may simply use a handheld remote control. Larger aircraft, however, utilize a more significant mechanism that is essentially a land-based cockpit, generally called a Ground Control Station (GCS). The United States Air Force flies MQ-1 and MQ-9 UAS from a GCS built by General Atomics Aeronautical Systems, Inc. (GA-ASI). This system has been around for many years, however, and it was not optimally designed. According to a US Air Force Predator commander, the GCS was never given the time to integrate human factors principles into its design (Freedberg, 2012). The control station itself (Figure 1) is a tool to control the aircraft. Stick and throttle, along with rudder pedals and command screens, turn physical inputs into computer commands. The data is then transmitted to the aircraft, which processes the commands and reacts to them. The UAS itself has minimal onboard control logic; it relies on direct commands to perform its functions. There are many human factors problems with this design. The first is the manner in which the pilot receives attitude information about the aircraft. The aircraft sends attitude information down to the GCS, where it is displayed on a HUD similar to those found in fighter aircraft cockpits (Figure 2). Although the HUD itself is a proven concept, its execution in this context is problematic. An important principle of human factors in display design is the absence of excessive clutter (Wickens, Gordon, & Liu, 1998). The HUD has lots of information, but it is not in itself excessively cluttered. The problem is the placement of the symbology. While in a manned aircraft the HUD is a glass panel with the outside world behind it, in the GCS the HUD must have streaming video behind it.
The primary video source on the aircraft is a nose camera fixed in the forward perspective (GA-ASI, 2015). This provides a view similar to a manned aircraft's, with the HUD superimposed over the natural horizon. But this is not the only camera on the aircraft, nor the only video that can be displayed. The aircraft also has a targeting camera that rotates and points at the Earth, providing a detailed view of the scene below. This is the primary purpose of the aircraft: to view and collect information with this camera. When conducting operations, the pilot wants to view this camera, and the only place to put its video is in place of the nose camera's. Now a problem exists where the HUD attitude and horizon are at odds with the video view. During many maneuvers the video changes angle rapidly, even when the aircraft maintains a straight and level attitude. The potential for disorientation is very high. The best way to eliminate this problem is to separate the HUD from the video. In the current system this is difficult: not only can the pilot view just one video source at a time, but datalink limits prevent receiving both videos from the aircraft without degrading the quality of both. A second significant human factors issue is the presentation of other data. The aircraft downlinks hundreds of pieces of information, all displayed in tables (Figure 3). There are more than 65 tables available. This information is helpful, but extremely limited in usefulness. Perception improves when a display supports the user's mental model through pictorial realism and visible motion of the data (Wickens, et al., 1998). The variable information tables, of which only two can be viewed at a time, have no pictorial aspect. They are simply collections of words and numbers that require significant attention to ascertain their meaning, and the purely numeric presentation limits the perception of change in the values.
When a number is changing, such as a temperature rising, it is difficult to ascertain its direction or rate of change, which delays the perception of changes within the system. These, among many others, are some of the problems inherent in GCS designs. There is a significant amount of data and information not typically available in an aircraft cockpit. How to present it all is a challenge, and one not currently met by the existing system.
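One way to surface the missing rate information is to annotate each raw readout with a trend estimated from recent samples. The sketch below is hypothetical; the class and field names are invented for illustration and do not reflect the GA-ASI GCS design.

```python
# Hypothetical sketch of a trend annotation for a raw telemetry readout:
# estimate a rate of change from recent samples so the display can show the
# direction and speed of movement instead of a bare number. Names are
# illustrative, not the actual GCS implementation.

from collections import deque

class TrendedValue:
    def __init__(self, window: int = 5):
        self.samples = deque(maxlen=window)  # (time_s, value) pairs

    def update(self, time_s: float, value: float) -> None:
        self.samples.append((time_s, value))

    def rate_per_minute(self) -> float:
        """Simple rate estimate: slope between the oldest and newest sample."""
        if len(self.samples) < 2:
            return 0.0
        (t0, v0), (t1, v1) = self.samples[0], self.samples[-1]
        return 60.0 * (v1 - v0) / (t1 - t0)

# A temperature sampled every 5 seconds, rising 1.5 degrees per sample
temp = TrendedValue()
for t, reading in [(0, 71.0), (5, 72.5), (10, 74.0)]:
    temp.update(t, reading)
print(f"{temp.samples[-1][1]:.1f} degC, trending {temp.rate_per_minute():+.1f}/min")
```

Pairing each value with a signed rate (or an arrow derived from it) gives the operator the perceived motion that a bare numeric table cannot.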
 
References
Freedberg, S. (2012, August 07). Too Many Screens: Why Drones Are So Hard To Fly, So Easy To Crash. Retrieved from: http://breakingdefense.com/2012/08/too-many-screens-why-drones-are-so-hard-to-fly-and-so-easy/
General Atomics Aeronautical Systems, Inc. (2015, August 4). Flight Manual, USAF Series MQ-9 Aircraft, Serial Numbers 004, 006, 008, and Above.  California: General Atomics – ASI.
Wickens, C. D., Gordon, S. E., & Liu, Y. (1998). An Introduction to Human Factors Engineering. New York: Addison Wesley Longman, Inc.
Appendix
Figure 1
Figure 2
Figure 3