Abstract
Unmanned aircraft are unique in that the pilot does not occupy any space within the airframe, so a separate cockpit, known as a ground control station, must be used. This research traces the development of the ground control stations used to operate these systems, including the human factors issues that have driven changes in their design over time. Military and commercial stations, both fixed and portable, are discussed, along with their impact on operational safety and how their design has contributed to the loss of aircraft. Accidents caused by human factors issues in the design of the ground control station are investigated, together with the changes made once the causes were identified. Design improvements such as haptic feedback and augmented reality are examined, including how they affect the operation of the aircraft and the operator, and which improvements can realistically be implemented.
Introduction
Unmanned aircraft systems (UAS) pose unique operating problems because the pilot is not physically connected to the aircraft through a flight deck. This creates distinct human factors issues, as the ground control station rarely resembles a cockpit and is instead a bank of computers, monitors, and associated controls. There have been several high-profile incidents directly attributable to human factors issues that resulted in the loss of expensive vehicles (Carrigan, n.d.). Most of the problems in these events are related to a failure of the operator to follow established checklists and procedures, but the underlying issue is the design of the control station itself. Unlike manned aircraft, which share a commonality of basic controls and are regulated by the certificating authority, UAS have no such standards; their controls are instead dictated by the capability of the aircraft, its intended role, and the manufacturer (Hobbs, 2015). This lack of standardization can cause human factors issues across platforms, and switching to a different airframe requires a significant amount of retraining before the operator can competently fly it. As with any complicated system, the initial designs are established and then, through trial and error, accident analysis, and user input, the interface is refined to eliminate errors and increase safety. This has been the case with some of the more advanced ground control stations: the latest generations require only one operator instead of two and have a high degree of automation to assist the operator and alleviate the load on the visual sense imposed by the previous generation (Vasile, 2019). Smaller commercial units have followed the same trend as advances in computing power have increased portability and reduced the size and weight of the equipment while increasing functionality.
Literature Review
Detection of nuclear sources by UAV teleoperation using a visuo-haptic augmented reality interface.
This research investigates the use of haptic feedback and augmented reality to reduce errors associated with the teleoperation of a UAS. It is relevant because it describes methods used to minimize human factors issues associated with ground control stations. Augmented reality and haptic feedback can give the operator expanded situational awareness without requiring them to divert attention away from operating the unit. Close-quarters operation can be particularly stressful for the pilot and increase the likelihood of mistakes, so reducing distracting elements improves the operator's diligence. By investigating new and novel technologies to improve the human-machine interface (HMI), the research aims to simplify the control interface and increase interactivity by giving active feedback to the pilot.
Human Factors Analysis of Predator B Crash
This report focuses on the 2006 crash of a Predator B operated by the Department of Homeland Security (DHS) that was directly attributable to human factors and the design of the ground control station. The aircraft requires a pilot and a payload operator; both sides of the station are configured identically but function differently depending on who is operating them. When the pilot's side of the ground control station malfunctioned and the pilot attempted to regain control from the payload operator's station, some switches were not configured correctly. This resulted in the fuel cut-off valve actuating and starving the engine. The primary cause was the pilot's failure to follow the established checklist before control was transferred from the pilot's station to the payload operator's station, which highlights the importance of procedures. This complacency caused the crash of an expensive asset.
Tactile display design for flight envelope protection and situational awareness
This research investigates the use of tactile feedback on the displays of a UAS ground control station. Teleoperation of a UAS is particularly demanding because the pilot must rely almost entirely on the visual sense. This increased visual load could cause the pilot to press the wrong button or control on the touch screen; haptic feedback can confirm that the correct button has been actuated for a given aircraft configuration. Activating a control outside a particular aircraft configuration could generate haptic feedback in addition to an aural warning alerting the pilot. Using modern technologies to improve the HMI will reduce errors attributable to human factors and keep the UAS within its flight envelope, for the safety of the vehicle and the completion of its mission.
A haptic interface with adjustable feedback for unmanned aerial vehicles
Haptic feedback is a useful tool for giving feedback to the operator of a system. This research investigates the use of adjustable feedback based on nearby obstacles and obstructions. This method of warning could be useful for ground control stations when UAS operate in close proximity to manned aircraft or in urban areas between buildings and other structures. The system can be configured to increase the frequency and amplitude of the haptic signal as a warning to the operator. This addition to a ground control station could increase the spatial awareness of the pilot.
Assessment of UAV operator workload in a reconfigurable multi-touch ground control station environment
This research investigates the use of configurable displays and gestures to operate a UAS. These aircraft are primarily operated through computer keyboards and other peripherals such as a mouse and joystick. These can represent a steep learning curve and can be responsible for errors by operators with little experience. Gesture- and vision-based systems can reduce the number of errors and human factors issues associated with unfamiliar controls, and humans can be trained in gestures more quickly than in specific controls because the muscle memory forms rapidly. This type of system may require specialized monitors and cameras that would increase the complexity of the ground control station. Advances in technology and miniaturization could allow them to be integrated into ground stations still in development, as retrofitting current units may be cost prohibitive and would require retraining current operators. It may reduce the effects of human factors while reducing errors associated with the HMI.
Human factors guidelines for unmanned aircraft ground control stations
This presentation by NASA outlines potential guidelines for the design of UAS ground control stations and how they affect pilot performance. It is relevant because it shows the problems associated with the HMI and how human factors influence the design of the control station. Although UAS are aircraft, their operation requires a different layout for the controls; they may still have the same basic instruments as an aircraft, but their displays and information overlays are vastly different. Furthermore, previous research has determined that manned aircraft pilots have more difficulty operating a UAS than avid video game players, even though the first generation of ground control stations was modeled after aircraft. This research and presentation attempt to streamline ground control station development to reduce human factors errors.
New artificial intelligence approaches for future UAV ground control stations
This research investigates the use of artificial intelligence (AI) to assist with the operation of multiple UAS from the same ground control station. It delves into the process required for a single operator to handle multiple aircraft and follow procedures in case of an emergency. The paper also investigates the use of AI to assist with training and to rate the operator objectively. This is relevant because of the expanding use of UAS in both military and commercial operations. As their use expands, a single pilot will be operating multiple aircraft, and the use of AI reduces the workload on the pilot and decreases the chance of error and other human factors affecting the flight performance of the vehicle. Tasks such as route planning, target selection, and replanning are completed for approval by the operator.
Augmented reality tool for the situational awareness improvement of UAV operators
This research investigates the use of augmented reality (AR) to improve the situational awareness of the pilot for larger UAS such as medium-altitude, long-endurance platforms. During flight, most of these aircraft operate in automatic mode and execute the instructions set forth in the initial planning stages of the mission. This tool allows the operator to visualize on the display the three-dimensional position of the aircraft relative to items such as the terrain, distance to target, and other aircraft in the area. Visualizing this information gives the operator a better understanding of where the aircraft is in relation to the ground and other obstacles. On the surface it appears to increase the load on the visual sense, but it actually allows the portion of the brain that processes spatial information to assist the operator and reduce workload. This information could also be used with AR glasses on smaller platforms to assist the operators of smaller commercial units.
The Effects of Commercial Video Game Playing: A Comparison of Skills and Abilities for the Predator UAV
This master's thesis for the Air Force Academy compared UAS pilot candidates who were not avid video game players with those who were. It found that avid video game players were better at spatial awareness and strategic thinking than their non-player counterparts. This is important in the design of a ground control station, which must display the relevant information in a way the operator can easily distinguish. Involving avid gamers can help design better ground control stations that reduce fatigue and increase safety, and the impact can already be seen in the more modern ground control stations.
Consideration about UAV command and control. Ground control station
This research focuses on the use of open-source tools and the information presented by commercial and military ground control stations. The configuration of control stations differs with the size of the aircraft, their mission, and their cost. Most commercial-grade UAS operate from a computer interface; although familiar to most users, it requires special training, and aircraft of this size must be operated within visual line of sight. The operator must divide their time between flying the unit and maintaining visual contact, so the design of the control station must not interfere with the operation of the vehicle and the information presented must be available at a glance. Military long-range UAS have different requirements and usually custom software from the manufacturer of the control station, so commercial operations are the focus of this research. It is relevant because, as these platforms proliferate further, the control stations may need to be standardized by the Federal Aviation Administration through regulatory means, much like manned aircraft.
Problem Statement
UAS ground control stations have a myriad of human factors issues related to their design and control schemes that have affected performance. The use of the visual sense as the primary source of information places a burden on the operator that increases the chance for error.
Hypothesis
Existing technologies, including haptic feedback, augmented reality, configurable displays, and gesture recognition, can address these human factors issues by reducing the load on the visual sense while improving the safety of the aircraft.
Early ground control stations
Early ground control stations (GCS) attempted to replicate an aircraft cockpit as much as was practicable at the time, since most of the pilots tasked to operate them were manned aircraft pilots in the military (Hobbs, 2015). The aircraft required a pilot and a payload operator, and the same controls functioned differently depending on who was operating them. This duplicated control system may have been an excellent economic choice for governments, but in cases where the pilot had to switch stations due to a problem, a complicated checklist had to be completed and the payload operator's station reconfigured (Carrigan, n.d.). The first generation of GCS was typically operated by the military, the earliest adopter of this technology. These stations were large, required redundant power and cooling systems, were typically installed in transportable shipping containers, and required the pilot to use the visual sense as the primary means of information gathering. The early designs assumed that manned aircraft pilots would make suitable UAS operators, an assumption that has since been proven wrong because the skills do not usually transfer (Triplett, 2008). Pilots of manned aircraft typically use all of their senses to build situational awareness of the aircraft, which cannot be done when the operator is detached from it. Teleoperation of any system over long distances also introduces latency that must be accounted for, whereas pilots of manned aircraft are used to an aircraft responding to control inputs immediately.
Human Factors Issues with the Control Interface
This economy of configuration may have initially been good for manufacturing purposes, but lost-link scenarios and hardware problems with the station create a complicated switch-over procedure (Carrigan, n.d.). This cross-utilization of controls is anathema to effective human factors mitigation because it creates a complexity that hinders the rapid transfer of control from the pilot's station to the payload operator's station. For example, the General Atomics Predator requires two operators; although the stations are identical, the throttle for the pilot also serves as the camera control for the payload operator. In times of high stress the pilot may not complete the required checklist items to transfer control, resulting in erroneous inputs to the aircraft. This very scenario caused the loss of a U.S. Department of Homeland Security aircraft while on patrol of the southern border (Carrigan, n.d.).
The larger ground control stations do not give sensory feedback to the pilot, which can result in a loss of situational awareness of the aircraft (Fu, 2016). Because the pilot relies primarily on the visual sense, it can become overloaded and cause them to miss cues from the readouts if the screens are crowded with information. The stationary nature of the control station does not allow the pilot to feel the aircraft, which could result in operation outside its performance envelope because they cannot feel an approaching stall or other indications of problems (Vasile, 2019). Ground control stations are also model specific, much like transport category aircraft, so extensive retraining is required to operate a different model. This type rating can be rendered useless by the rapid development cycles of unmanned aircraft, software upgrades, and obsolescence of the platform.
This lack of sensory input while relying solely on vision can fatigue the operator, and if the displays are not laid out in the optimum manner, information can be missed (Hobbs, 2015). The GCS displays also have a depth-of-field issue: the operator may not be able to properly judge distances relative to the aircraft, which could cause problems navigating mountainous terrain or flying over water. The pilot's view is also limited to the direction the camera is pointing; unless synthetic 3D rendering is used with multiple cameras, other aircraft may be missed when zoomed in on an area of interest. The evolution of commercial units has further compounded these problems because of the many configurations and manufacturers of portable ground control stations (Ruano, 2017). The size and range of these aircraft dictate that they be kept within visual line of sight of the operator, forcing the operator to divide their attention between the aircraft and the controls used to monitor vehicle health such as remaining battery or fuel, telemetry, and other important parameters. The use of a spotter is required for these operations and, depending on the weather conditions, can add additional human factors issues. These laptop-based GCS are portable and use a familiar platform, but the different available software configurations can become quite complicated and require specialized training to operate effectively.
Most of the human factors issues with GCS arise partly from their complexity and the short development cycles in the commercial sector, whereas the military sector tends to use equipment for as long as possible, which requires backwards compatibility with newer aircraft (Vasile, 2019). The military will also duplicate the stations between payload operator and pilot for economy of scale, even though the same controls have different functions depending on who is operating them. Standardization will be the first step in solving many of these issues. Much like aircraft, this will require the intervention of a trade group with binding agreements or regulation by the government. Voluntarily adopting common control schemes and layouts across platforms would reduce training times, standardize the controls, and prevent regulation from complicating the design process through bureaucratic approval.
Current Ground Control Station Evolution
The U.S. military was an early adopter of unmanned aircraft once the technology advanced to the point where it became reliable, as it allows the operation of an asset without putting a pilot in danger. It was originally believed that manned aircraft pilots would be the prime candidates for unmanned operators, but the GCS itself rarely resembles a cockpit (Triplett, 2008). The early Predator GCS consisted of two banks of computer racks, one for the pilot and one for the payload operator, containing the computers, communication equipment, and controls needed to operate the aircraft through a satellite link. The operators had a limited field of view, seeing only what the camera did, and the arrangement of displays required a considerable amount of visual scanning. The aircraft instrumentation is overlaid onto the camera feed, and the course and map information are displayed above it (Carrigan, n.d.).
The Block 50 GCS has an expanded field of view and a more ergonomic layout; the operator does not have to move their head as much to complete a scan, and situational awareness of the aircraft is increased by the wider field of view and three-dimensional mapping. The information available to the operator has increased significantly compared with previous generations, but the layout reduces operator fatigue and the screens can be reconfigured. Commercial GCS are also adopting some of these lessons, but on a limited scale because of computing power, line-of-sight operations, and portability issues. Some of the larger commercial units use multiple screens and have a higher degree of automation to alleviate the problems of operating these types of vehicles.
Because of their limited range, the smaller units must be kept within line of sight during operations, which divides the operator's attention. Increasing autonomy helps alleviate these problems, but the GCS must also be designed to give the operator some feedback. Current small systems do not do this, but ongoing research has combined haptic feedback for situational and obstacle awareness with augmented reality (AR) using off-the-shelf technology adopted from other systems (Aleotti, 2017). With the proliferation and expanding use of unmanned aircraft, a single station will be required to control multiple vehicles, which could further complicate operations and increase the stress on the pilot. Increasing autonomy will be required, relegating the pilot to a system operator who only exerts direct control over the aircraft if it exceeds its flight envelope or other problems develop (Ramirez-Atencia, 2017). The operators will instead occupy a supervisory role, directing the vehicle toward completion of the task while leaving the majority of path planning and other decisions to the onboard logic.
New Technologies to Reduce Human Factors Errors
The adaptation of new technologies to the current infrastructure of ground control stations usually occurs when research attempts to use an emerging or current technology in a novel way not related to the designer's original intent. Current research varies; it includes the use of haptic feedback, reconfigurable displays, and augmented reality to improve the human-machine interface. Some technologies are easy to adapt to current infrastructure, such as haptic feedback and some limited augmented reality (Ruano, 2017). Reconfigurable displays will have to be integrated into the next generation of GCS, and gesture recognition will require the system to have sufficient visual sensors to recognize the commands of the pilot, so it will most likely appear on advanced platforms (Hu, 2019).
Haptic Feedback
Haptic feedback is loosely defined as using the sense of touch to communicate; a basic example is the vibration a phone gives when a user presses a button on the screen (Fu, 2016). In a UAS, haptic feedback is used to inform the operator of a condition or alert that requires their attention. This is useful when the aircraft approaches the edge of its flight envelope: haptic feedback through the controls can alert the pilot if they are exceeding the maximum angle of attack for takeoff, since they cannot experience the forces the aircraft does. Current GCS can be adapted to use haptic feedback with minimal physical modifications, allowing the operator to gather more information about the aircraft while reducing reliance on the visual sense (Zhang, 2018). If the aircraft is configured for landing and the speed drops below minimum requirements, the throttle and control column could vibrate, alerting the pilot without the startle effect of an aural warning (Fu, 2016). Smaller aircraft are more susceptible to sudden movements of the controls, and haptic feedback provides a warning that avoids the sudden involuntary muscle movement caused by an unexpected noise. The haptic sensation can focus the pilot's attention on the area of immediate concern and reduce scanning of the instruments.
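To make this concrete, the following Python sketch shows one way a GCS might trigger such a tactile cue when the aircraft is configured for landing and slows below a minimum approach speed. The telemetry fields, threshold values, and actuator interface are hypothetical placeholders chosen for illustration, not the interface of any particular ground control station.

```python
# Minimal sketch of an envelope-protection haptic cue. Telemetry fields,
# thresholds, and the actuator interface are assumed for illustration only.
from dataclasses import dataclass


@dataclass
class Telemetry:
    airspeed_kt: float   # current indicated airspeed
    gear_down: bool      # landing gear extended
    flaps_deg: float     # flap deflection


class HapticActuator:
    """Stand-in for a vibration motor in the throttle or control column."""
    def pulse(self, amplitude: float, frequency_hz: float) -> None:
        print(f"haptic pulse: amplitude={amplitude:.2f}, frequency={frequency_hz} Hz")


MIN_APPROACH_SPEED_KT = 65.0  # placeholder value; airframe specific


def envelope_check(t: Telemetry, actuator: HapticActuator) -> None:
    """Vibrate the controls instead of sounding a startling aural alarm when
    the aircraft is in a landing configuration and below minimum speed."""
    in_landing_config = t.gear_down and t.flaps_deg > 20
    if in_landing_config and t.airspeed_kt < MIN_APPROACH_SPEED_KT:
        # Scale intensity with how far the aircraft is below the minimum speed.
        deficit = (MIN_APPROACH_SPEED_KT - t.airspeed_kt) / MIN_APPROACH_SPEED_KT
        actuator.pulse(amplitude=min(1.0, 0.3 + deficit), frequency_hz=25)


envelope_check(Telemetry(airspeed_kt=58.0, gear_down=True, flaps_deg=30), HapticActuator())
```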
Augmented Reality
Augmented reality (AR) is a recent technology that overlays information onto a lens worn by the operator. This type of system will be most useful on the smaller commercial GCS, where the operator's attention must be divided between keeping the aircraft in sight and operating the controls (Aleotti, 2017). It can also be effective when flying in confined areas, such as between buildings, or when avoiding obstacles such as power lines. Smaller commercial UAS can benefit from this technology because it allows the operator to maintain visual contact with the vehicle while operating the controls; information such as heading, artificial horizon, fuel, and battery life can be displayed in the pilot's field of view. AR could also be adapted to fixed GCS, but because of the heavy visual load already placed on the operators of these stations, it could have the opposite effect and cause more problems than it solves, with motion sickness and information overload becoming concerns.
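As a simple illustration of the overlay concept, the sketch below assembles the text elements that might be projected into the pilot's field of view from basic telemetry. The field names, formatting, and low-battery advisory threshold are assumptions made for illustration; the display hardware and data link are abstracted away.

```python
# Minimal sketch of composing an AR overlay for a small-UAS operator.
# Telemetry fields and the advisory threshold are assumed for illustration.
def build_overlay(heading_deg: float, pitch_deg: float, roll_deg: float,
                  battery_pct: float) -> list:
    """Return the text elements to render in the pilot's field of view so the
    operator can keep their eyes on the aircraft rather than a separate screen."""
    elements = [
        f"HDG {heading_deg:03.0f}",                # magnetic heading
        f"ATT {pitch_deg:+.0f}/{roll_deg:+.0f}",   # simple attitude readout
        f"BAT {battery_pct:.0f}%",                 # remaining battery
    ]
    if battery_pct < 20:                           # assumed advisory threshold
        elements.append("RETURN TO HOME ADVISED")
    return elements


for element in build_overlay(heading_deg=87, pitch_deg=2, roll_deg=-5, battery_pct=18):
    print(element)
```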
Reconfigurable and Tactile Feedback Displays
The use of reconfigurable displays would allow the pilot to arrange the information in the way that best suits their personal preference and performance (Haber, 2016), similar to how a person arranges their personal computer screens. The displays should also give tactile feedback when a button is pushed to confirm the command; this feedback reinforces that the command has been received by the GCS, much like the click and resistance of a physical switch. The ability of the panel to recognize gestures on its surface can also reduce errors and enhance controllability (Fellah, 2019). For example, to zoom in or out on a display, instead of searching for a button on a keyboard, the operator touches the screen and moves two fingers together or apart and the system responds accordingly. Gestures are easier to learn, and muscle memory forms more quickly when they are used in conjunction with the other senses (Hu, 2019).
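The pinch-to-zoom example can be expressed compactly. The sketch below computes a zoom factor from the change in spread between two touch points; the touch-point tuples are hypothetical inputs, and the windowing toolkit that would deliver them is omitted.

```python
# Minimal sketch of interpreting a two-finger pinch on a multi-touch display.
# Touch points are assumed (x, y) tuples supplied by the display framework.
import math


def pinch_zoom_factor(prev_points, curr_points) -> float:
    """Map the change in distance between two touch points to a zoom factor,
    so the operator zooms the map directly instead of hunting for a key."""
    def spread(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    before, after = spread(prev_points), spread(curr_points)
    return after / before if before > 0 else 1.0


# Fingers moving apart -> factor > 1 (zoom in); moving together -> factor < 1.
factor = pinch_zoom_factor(prev_points=[(100, 100), (200, 100)],
                           curr_points=[(80, 100), (240, 100)])
print(f"zoom factor: {factor:.2f}")
```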
Artificial Intelligence
Utilizing AI can reduce the workload of the pilot, who is relegated to a mostly supervisory role in the operation of the aircraft, establishing the goal and allowing the machine logic to decide how to accomplish it (Ramirez-Atencia, 2017). This reduces control input errors because the control system will not exceed established vehicle parameters during operations. The machine logic will, however, carry out its instructions as directed even if they are incorrect; for example, entering the wrong waypoints and altitudes for navigation could cause flight into terrain or fuel starvation of the engine. The operator will still be required to input the relevant information and commands, and the controlling logic will carry out those instructions.
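The supervisory workflow described here can be sketched as a simple plan-review step: the machine logic proposes a route, basic checks flag terrain-clearance and fuel problems, and nothing is executed until the operator approves. The waypoint fields, clearance margin, and fuel reserve below are hypothetical values chosen for illustration.

```python
# Minimal sketch of a supervisory route review. Waypoint data, the clearance
# margin, and the fuel reserve are assumed values for illustration only.
from dataclasses import dataclass


@dataclass
class Waypoint:
    name: str
    altitude_ft: float      # planned altitude at the waypoint
    terrain_elev_ft: float  # terrain elevation beneath the waypoint
    leg_fuel_lb: float      # fuel required to reach this waypoint


MIN_CLEARANCE_FT = 1000.0   # assumed terrain-clearance margin
FUEL_RESERVE_LB = 150.0     # assumed required fuel reserve


def review_route(route, fuel_onboard_lb: float):
    """Return warnings for the operator to review before approving the plan."""
    warnings = []
    fuel_remaining = fuel_onboard_lb
    for wp in route:
        if wp.altitude_ft - wp.terrain_elev_ft < MIN_CLEARANCE_FT:
            warnings.append(f"{wp.name}: terrain clearance below margin")
        fuel_remaining -= wp.leg_fuel_lb
        if fuel_remaining < FUEL_RESERVE_LB:
            warnings.append(f"{wp.name}: fuel reserve violated")
    return warnings


route = [Waypoint("WP1", 5500, 4200, 60), Waypoint("WP2", 5200, 4500, 80)]
print(review_route(route, fuel_onboard_lb=300) or ["route acceptable"])
```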
Hand Gestures
This technology is still in development and will require cameras on the GCS and an AI to recognize the hand gestures of the operator. Much like touch gestures, it requires vision-based sensors for the AI to correctly interpret the commands of the operator (Hu, 2019). Because of their cost and complexity, such systems will most likely be useful on the larger platforms operated by the military, and their use may also require augmented reality to operate effectively. Gestures may decrease training time because muscle memory forms more quickly when the visual sense and the sections of the brain that handle fine motor control are involved together (Hu, 2019). Haptic gloves could also be used with this technology to give feedback to the operator (Zubrycki, 2017). This control mechanism would work concurrently with AI, making the operator a supervisory agent over the system rather than giving them direct control, which may reduce errors because the AI operates the aircraft based on the needs of the pilot. This type of control scheme will require further research and development; it may be effective in a controlled environment, but due to the control methods and visual sensors required for the GCS, it will most likely only be available on advanced systems operated by large corporations or governments because of the complexity of the equipment. The gestures will also need to become standardized, and operators must demonstrate a level of proficiency before they can operate a live vehicle (Hu, 2019).
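A final sketch shows the last step of such a pipeline: translating a classified gesture into a supervisory command only when the classifier is sufficiently confident. The gesture labels, command names, and confidence threshold are invented for illustration; the recognition network itself is out of scope here.

```python
# Minimal sketch of mapping recognized hand-gesture labels to vehicle commands.
# Labels, commands, and the confidence threshold are assumed for illustration.
from typing import Optional

GESTURE_COMMANDS = {
    "palm_up": "climb",
    "palm_down": "descend",
    "fist": "hold_position",
    "swipe_left": "orbit_left",
}

CONFIDENCE_THRESHOLD = 0.85  # assumed minimum classifier confidence


def gesture_to_command(label: str, confidence: float) -> Optional[str]:
    """Translate a classified gesture into a supervisory command, rejecting
    low-confidence or unknown gestures so the system never acts on a guess."""
    if confidence < CONFIDENCE_THRESHOLD:
        return None  # ask the operator to repeat the gesture
    return GESTURE_COMMANDS.get(label)


print(gesture_to_command("palm_up", 0.93))  # -> climb
print(gesture_to_command("fist", 0.60))     # -> None (below threshold)
```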
Conclusions and Recommendations
The development of UAS has led to specialized equipment for command and control of the vehicle. Human factors issues have led to losses of aircraft that were exacerbated by the equipment in use. There have been significant developments between the early Block 10 GCS and the Block 50 models, where efforts have been made to reduce human factors problems. The problems stem from the overload of the visual sense and the separation of the pilot from the aircraft. The early systems have been refined, but there is still room for improvement; the use of additional technologies, primarily haptic, can enhance the operator's situational awareness of the aircraft while increasing safety and reducing errors. Because of the speed at which commercial units are developed, most of the advances will appear in those smaller GCS first before being implemented in the larger aircraft, owing to their size and complexity. Separating the pilot from the aircraft has increased the utility of these platforms while introducing a new set of human factors issues that must be overcome.
Further research is needed, and it should focus on reducing the load on the visual sense and incorporating haptic feedback for the operator. Advances in miniaturization have led to AI with a very small footprint that can be incorporated into the aircraft with minimal weight penalty. Allowing the machine logic to handle most of the decisions regarding aircraft control will reduce operator errors while increasing the efficiency of the aircraft. In addition, the control interface should become standardized, either through government regulation above a certain weight class or through an ICAO treaty. Smaller commercial units, such as multirotor vehicles used for inspections and fixed-wing vehicles, should have a control scheme agreed to as an industry standard without government intervention to keep costs low. The rapid rate at which these platforms are developed should allow the incorporation of wearable haptics and AR within the next few design cycles, further reducing human factors errors. Much like the development of the layout of aircraft instrumentation, progress will come in incremental steps until best practices are discovered and implemented to reduce the human factors issues with GCS.
References
Aleotti, J. M. (2017). Detection of nuclear sources by UAV teleoperation using a visuo-haptic augmented reality interface. Sensors. doi:10.3390/s17102234
Carrigan, G. L. (n.d.). Human factors analysis of Predator B crash. Retrieved from https://hal.pratt.duke.edu/sites/hal.pratt.duke.edu/files/u13/Human%20Factors%20Analysis%20of%20Predator%20B%20Crash%20.pdf
Fellah, K. &. (2019). Tactile display design for flight envelope protection and situational awareness. IEEE Transactions on Haptics, 87-98. doi:10.1109/TOH.2018.2865302
Fu, S. S. (2016). A haptic interface with adjustable feedback for unmanned aerial vehicles (UAVs) - model, control, and test. 2016 American Control Conference (ACC) (pp. 467-472). Boston, MA: American Automatic Control Council (AACC). doi:10.1109/ACC.2016.7524958
Haber, J. C. (2016). Assessment of UAV operator workload in a reconfigurable multi-touch ground control station environment. Journal of Unmanned Vehicle Systems, 4(3), 203+. Retrieved from https://link-gale-com.ezproxy.libproxy.db.erau.edu/apps//A463514960/AONE?u=embry&sid=AONE&xid=1d00ac6c
Hobbs, A., & Lyall, B. (2015). Human factors guidelines for unmanned aircraft ground control stations. San Jose: National Aeronautics and Space Administration. Retrieved from NASA Human Systems Integration Division: https://hsi.arc.nasa.gov/publications/GCS_HF%20_Prelim_Guidelines_Hobbs_Lyall.pdf
Hu, B. W. (2019). Deep learning based hand gesture recognition and UAV flight controls. International Journal of Automation and Computing, 17-29. doi:10.1007/s11633-019-1194-7
Ramirez-Atencia, C. (2017). New artificial intelligence approaches for future UAV ground control stations. 2017 IEEE Congress on Evolutionary Computation (CEC). doi:10.1109/CEC.2017.7969645
Ruano, S. C. (2017). Augmented reality tool for the situational awareness improvement of UAV operators. Sensors, 297-314. doi:10.3390/s17020297
Triplett, J. E. (2008). The effects of commercial video game playing: A comparison of skills and abilities for the Predator UAV. Defense Technical Information Center. Retrieved from https://apps.dtic.mil/dtic/tr/fulltext/u2/a483256.pdf
Vasile, P. C. (2019). Consideration about UAV command and control. Ground control station. Journal of Physics: Conference Series, 1297, 012007. doi:10.1088/1742-6596/1297/1/012007
Zhang, S. &. (2018). Workspace analysis for haptic feedback manipulator in virtual cockpit system. Virtual Reality, 321-338. doi:10.1007/s10055-017-0327-y
Zubrycki, I. Z. (2017). Novel haptic device using jamming principle for providing kinaesthetic feedback in glove-based control interface. Journal of Intelligent & Robotic Systems, 413-429. doi:10.1007/s10846-016-0392-6