Engineering-psychological and ergonomic design of the aircraft crew workplace

The article discusses various aspects of the engineering-psychological design of the cockpit of advanced aircraft as an integral part of the overall design of aviation technology. The design process and the factors that should be taken into account when designing the crew workplace are described. The construction of the information control field of the cockpit, various ways of receiving and entering information, and promising methods of human-machine interaction are discussed. It is concluded that the reception and entry of information by the crew during the performance of a flight task should be addressed with multimodality in mind.


Introduction
When creating a new aircraft, considerable time must be devoted to the development and study of the crew workplace. One of the most important conditions for the effective functioning of an aircraft is the combination of its technical capabilities with the crew's ability to interact seamlessly with all on-board systems. Sound engineering-psychological and ergonomic design of the cockpit helps to achieve this interaction. A well-considered arrangement of the controls extends the pilot's capabilities, which in turn supports the normal functioning of the entire aircraft.
The design and layout of crew workstations is a complex process that must account for a large number of factors capable of preventing the pilot from working efficiently. The main task of engineering-psychological design of the cockpit is to make the crew's work as comfortable as possible and to ensure that nothing distracts the crew from the flight task.
A large number of different instruments and controls are used in the cockpit, and each of them should be in its place. All this forms an information and control field, thanks to which pilots interact with the aircraft.
Today, new and promising ways of entering and receiving information are being developed, for example control by voice commands, gaze control, and reading brain activity with neural interfaces. In the future, these will help reduce the pilot's workload and improve the quality of his interaction with aircraft systems, increasing the functionality of the "pilot-aircraft" system.

Engineering and psychological design in the product life cycle
The product life cycle (PLC) of aviation technology (AT), such as an airplane, is directly tied to human participation at every stage. Conception, design, production, testing, operation and even disposal cannot be carried out without a large number of specialists of various fields and levels of training [1, 2].
Today, no technical system can function effectively without human involvement in one capacity or another. The flight crew implements the functional purpose of the aircraft, controlling its movement and the operation of its onboard systems [3]. The ground crew provides preparation for work and ground maintenance. During the operation of the aircraft, effective interaction between personnel and equipment must be maintained. Under such conditions, it is impossible not to take into account the influence of psychophysiological factors on the design, testing and, especially, operation processes.
For the effective design of AT, the methods of engineering-psychological design (EPD) must be applied. The main goal of EPD is to create a mechanism for human-machine interaction and to minimize the adverse influence of the human factor on the functioning of the machine, so that the person and the machine work as one unified system [4].
EPD includes technological and methodological support. Technological support means the availability of a broad scientific, experimental and bench base that makes it possible to conduct research and development at various stages of the project.
Methodological support is the set of rules and methods used for such work. EPD accompanies the aircraft almost all the way through its life cycle, from design to disposal. It is carried out in parallel with the design of the aircraft and does not stop during operation. As a rule, the operation of aviation technology is a long process, often exceeding 25 years. During this time the aircraft may be upgraded more than once, including the crew workplace, owing to the development of human-machine interaction technologies.
The content of the work on EPD at the stages of the product life cycle is presented in Table 1 [5]. The first stage is associated with taking the human factor into account throughout the life cycle of the human-machine system (HMS); its content is determined by the problems of engineering psychology. The second stage concerns the organizational and methodological support of work on human-factor accounting. The last stage covers the study of the influence of the human factor on the functioning of the system during operation.

Designing the crew's workplace
From the point of view of engineering psychology and ergonomics, particular attention should be paid to the direct process of operating the aircraft: the work of the crew when performing a flight task. Creating the conditions for this work is the main task of designing the cockpit as the crew workplace. The work of the crew during the flight is the most typical example of the functioning of an HMS.
The design of the cabin as a crew workplace should be carried out taking into account the ergonomic requirements and psychophysiological characteristics of the human operator.
When designing the crew's workplace, such aspects of cabin ergonomics as the anthropometric characteristics of the human operator, the required view inside and behind the cabin, and the issues of the reach of controls in all flight modes should be taken into account. In addition, an analysis of information models, the human-machine interface and the pilot's workload should be carried out when performing piloting and control tasks of the on-board complex.
The design is carried out at all stages, from the initial stages to ground and flight tests. Various technological equipment is widely used, such as mock-ups and stands for exploratory and semi-natural modeling [6]. Without such a bench base, it is impossible to fully solve the tasks of crew cockpit design.
Modern aviation is characterized by a tendency toward ever higher aircraft performance. This complicates the cockpit equipment and "saturates" it with a large amount of information, which in turn raises the requirements for the flight crew. It is currently impossible to develop and operate aviation technology and air traffic control systems effectively without taking ergonomics and engineering psychology into account [7].
Interaction of the crew with the aircraft equipment occurs through the information control field of the cabin. Ergonomic design of information display devices and controls, their location in the aircraft cabin should be based on studying the activities of pilots not only in normal flight modes, but also in emergency ones [7].
The psychophysiological capabilities of a person serve as the basis for placing in the cockpit the controls and control systems of the aircraft, as well as the numerous devices of the display, control and alarm systems. These devices are grouped according to their functional purpose. The most important and most frequently used groups of instruments and levers are placed in clearly visible and accessible areas of the cabin and instrument boards. Rarely used instruments are positioned so as not to distract the pilot's attention; they are usually equipped with a light or sound alarm that attracts the attention of the pilot or another crew member only at the right moment. A typical workplace arrangement provides for the placement of basic instruments, controls and other equipment in defined areas of the cabin and instrument boards, which significantly reduces the possibility of erroneous actions [8].
According to their purpose and the type of work, the equipment of most aircraft can be divided into several clearly delineated complexes [9]:
- navigation and flight complex,
- radio and radar complex,
- automatic onboard control system equipment,
- electrical equipment complex,
- power plant equipment complex,
- experimental control and recording equipment.
When arranging equipment in the cockpit, it is necessary to achieve the highest possible level of immersion for the flight crew. Figure 1 shows an example of the organization of the crew workplace.

Ways to get and enter information
An important part of the crew's activity is obtaining various information about the state of the aircraft. The process of information perception can be divided into four stages [10]:
1. Detection: the operator picks an object out of the general background but cannot yet judge its shape or features.
2. Discrimination: the operator separately perceives two adjacent objects and can distinguish their details.
3. Identification: the object is matched against a reference stored in memory.
4. Recognition: the operator identifies the essential features of the object and assigns it to a specific class.
There are quite a lot of ways for a person to get information. A person perceives information with the help of the senses: sight, hearing, taste, touch, smell.
The methods of entering information in flight can also vary [11]; the options are presented in Table 2. Considering these options, the following conclusions can be drawn.

Multimodal interaction
In the information control field of modern aircraft, various methods of contact interaction are used very actively. Contact interaction is far from the only method, however: there are many promising technologies for crew-machine interaction, including voice control, gaze and gesture control, and control via a neurocomputer interface [12].
The most effective perception is achieved when methods of obtaining information are combined: multimodal perception.
Advantages of multimodal interfaces:
- natural human-machine interaction for the operator,
- the possibility of simultaneous input of information,
- a choice of convenient ways to enter and output information,
- flexibility in using the interface,
- improved overall accuracy of the system.
The main advantage of multimodal interfaces is the simultaneous provision of several ways of human-machine interaction, which naturally increases the flexibility and reliability of the human-machine system.
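As an illustration of this idea, below is a minimal sketch of a multimodal input dispatcher: events from whichever modality the operator finds convenient are routed to the same command handler. The names `MultimodalDispatcher` and `InputEvent` are hypothetical and not part of any avionics API.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class InputEvent:
    modality: str   # e.g. "touch", "voice", "gaze"
    command: str

class MultimodalDispatcher:
    """Routes events from several input modalities to one command handler,
    so the operator may use whichever channel is convenient (hypothetical)."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[str], str]] = {}
        self.log: List[str] = []

    def register(self, modality: str, handler: Callable[[str], str]) -> None:
        # Each modality gets its own handler; several may map to one command set.
        self._handlers[modality] = handler

    def dispatch(self, event: InputEvent) -> str:
        handler = self._handlers.get(event.modality)
        if handler is None:
            return "unsupported modality"
        result = handler(event.command)
        self.log.append(f"{event.modality}:{event.command}")
        return result
```

Because every modality resolves to the same command vocabulary, the crew can, for example, issue the same "gear up" command by voice or by touch, which is the flexibility the text describes.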
The crew receives most of its information through vision. The information obtained allows the operator to carry out conscious, purposeful activity. With the help of vision, the operator can recognize colour, light, and shape.
In aviation, visual information comes to the pilot from instruments, they can be both analog and digital, using various light indications. In modern aircraft, as a rule, all flight information is output on displays, which can also serve as a way to enter information.
Next in importance after vision is hearing. The sensitivity of the auditory analyzer, like that of the visual one, is close to absolute and allows the operator to distinguish selected sounds against the background of general noise.
In aviation, this can be a distinct sound, such as a beep, that informs the pilot of a particular situation. Audio signals can be transmitted either as tonal sounds (beeps, bells, sirens, buzzers, etc.) indicating certain events, or in speech form. At present, the voice notification system is more common. It is designed to deliver voice messages, hints or prescriptive commands to the crew in difficult or critical situations.

Gaze control
One of the most obvious ways to use this technology in aviation is to move the pointer on the display to select the desired function. The selection can be made either by pressing a physical button or by holding your gaze. This type of control is more suitable for selecting simple functions that do not affect flight safety.
Let us consider some ways of tracking the direction of a person's gaze. Contact eye tracking: a special contact lens with a mirror or a magnetic field sensor is attached to the eye. This method is not accurate, because the displacement of the contact lens during eyeball movement cannot be taken into account.
Optical tracking uses a non-contact optical method of following eye movement. Most often, the reflection of infrared light from the surface of the eye is recorded by a video camera or a special optical sensor (Fig. 2). The direction and speed of eye movement are then determined from the analysis of changes in this reflection. Video-based eye trackers typically use the corneal reflection and the center of the pupil as the features tracked over time. A more sensitive type, the dual-Purkinje eye tracker, uses the reflections from the front of the cornea and the back of the lens. An even more sensitive method is to image features inside the eye, such as retinal blood vessels, and follow them as the eye rotates. Optical techniques, especially video-based ones, are widely used for gaze tracking and are considered non-invasive and inexpensive.
Electrical potential measurement uses potentials recorded by electrodes placed around the eyes. The eyes are the source of a steady electric potential field that can be detected even in complete darkness and with the eyes closed; it can be modeled as generated by a dipole with its positive pole at the cornea and its negative pole at the retina. The signal obtained from two pairs of contact electrodes placed on the skin around one eye is called an electrooculogram (EOG). If the eye moves from the center toward the periphery, the retina approaches one electrode while the cornea approaches the opposite one. This change in dipole orientation, and hence in the electric potential field, changes the measured EOG signal; conversely, by analyzing these changes, eye movements can be tracked. Thanks to the spatial separation provided by the electrode setup, two separate motion components can be identified: horizontal and vertical.
The third component of EOG is the radial EOG channel, which is the average of the EOG channels referenced to a posterior scalp electrode. This radial channel is sensitive to the saccadic spike potentials that arise in the extraocular muscles at the onset of saccades and allows reliable detection of even miniature saccades. This is a very lightweight approach which, unlike modern video eye trackers, requires little processing power, works under different lighting conditions, and can be implemented as an integrated autonomous wearable system.
For use in aviation, the optical tracking method is better suited, since it does not require various sensors to be attached to the human body.
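The video-based approach described above can be sketched in a few lines: the vector from the corneal glint to the pupil center is mapped to screen coordinates by an affine transform obtained during a prior calibration. This is a simplified illustration that assumes a fixed head position; `estimate_gaze` and its parameters are hypothetical names, not taken from any real tracker SDK.

```python
import numpy as np

def estimate_gaze(pupil_center, glint_center, calib_matrix, calib_offset):
    """Pupil-center / corneal-reflection (PCCR) sketch: the glint-to-pupil
    vector is mapped to screen coordinates by an affine transform found
    during calibration. Real trackers also compensate for head movement
    and typically use a nonlinear mapping."""
    v = np.asarray(pupil_center, dtype=float) - np.asarray(glint_center, dtype=float)
    return calib_matrix @ v + calib_offset
```

In practice the matrix and offset would be fitted by least squares from the operator fixating a few known calibration points on the display.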
Gaze control can operate in two modes [12]:
- cursor mode: as the gaze moves across the display, the cursor continuously follows the point of regard,
- jump mode: as the gaze moves across the display, the cursor jumps to the element closest to the point of regard.
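A minimal sketch of these two modes, with on-screen elements given as 2-D points (the function name and representation are assumptions for illustration):

```python
import math

def cursor_position(gaze, elements, mode="cursor"):
    """Two gaze-control modes:
    'cursor' - the cursor follows the gaze point directly;
    'jump'   - the cursor snaps to the on-screen element nearest the gaze."""
    if mode == "cursor" or not elements:
        return gaze
    # Jump mode: pick the element with the smallest Euclidean distance.
    return min(elements, key=lambda e: math.dist(e, gaze))
```

Jump mode is the more forgiving of the two for coarse gaze estimates, since small tracking errors are absorbed by snapping to the nearest selectable element.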
Using the gaze control function in the cockpit of advanced aircraft will allow for a significant improvement in the information control field of the cockpit. It will also reduce the load on the pilot and expand his functionality, which, with further development of the technology, will make it possible to reduce the number of crew members.

Voice control
This way of interacting with machines has already entered everyday life in the form of the voice assistants found in any smartphone. But it is one thing to give commands to a voice assistant, where the consequences of a system error are insignificant, and quite another to execute a pilot's commands, which must be carried out unfailingly. This method of control can greatly reduce the pilot's workload and, in the future, make it possible to reduce the number of crew members.
Today, voice control is used for simple tasks such as entering route points, setting radio frequencies, or changing pages [13]. More demanding tasks are hindered by the accuracy of speech recognition in various settings: commands are not always pronounced cleanly, and speech defects or vaguely worded commands are possible. Addressing this requires machine-learning systems that can quickly adapt to particular conditions and recognize commands accurately. The difficulty in aviation is that space is always limited, so huge servers cannot be used to process the information; a compact system that recognizes a limited set of commands is needed.
The next problem is the large amount of interference caused by extraneous noise, which complicates speech recognition. The machine must be able to isolate human speech from all the sounds it captures, and this capability must be learned. A database of both commands and typical interference can be pre-recorded in the machine's memory; the machine then compares the received signal with the database and removes unnecessary noise. But it is impossible to account for everything, so the computer may one day fail to filter out an extraneous sound, leading to a system error.
Most modern automated speech recognition systems use a modular architecture with an interference-elimination (speech enhancement) block, a voice activity detector (VAD), a converter of the signal into feature vectors (front end), and a main module (search engine) that includes the keyword recognition algorithm [14]. The incoming audio is first sent to the speech enhancement module, where signal quality is improved by removing noise and distortion. The voice detector then finds the parts of the signal that contain speech. These sections are converted into sets of coefficients and sent to the main module, where command recognition takes place. As a result, the machine detects the presence of a command and understands what the person requires.
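The modular architecture described above can be sketched as a chain of toy stages. The frame format (label, energy, token) and the energy threshold are invented solely for illustration and are not taken from any real recognizer:

```python
def recognize_command(audio, vocabulary):
    """Sketch of the modular pipeline: speech enhancement, voice activity
    detection (VAD), feature extraction (front end), and a keyword search
    engine. All stages here are toy placeholders."""
    def enhance(frames):
        # Speech enhancement stand-in: drop frames below a noise floor.
        return [f for f in frames if abs(f[1]) > 0.1]
    def vad(frames):
        # VAD stand-in: keep only frames pre-labelled as speech.
        return [f for f in frames if f[0] == "speech"]
    def front_end(frames):
        # Front-end stand-in: the "feature" is simply the token itself.
        return [f[2] for f in frames]
    def search(tokens):
        # Keyword spotting against a fixed command vocabulary.
        return [t for t in tokens if t in vocabulary]
    return search(front_end(vad(enhance(audio))))
```

In a real system each stage is far more elaborate (spectral subtraction, statistical VAD, MFCC features, an HMM or neural search), but the data flow between blocks is the same.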
To date, voice control is mainly used in military fighter aircraft (Dassault Rafale, Saab JAS 39 Gripen, Eurofighter Typhoon) to reduce the pilot's workload in combat conditions.

Contact interaction
Contact interaction refers to the activation of various functions by direct contact of the pilot with the control panel: these can be simple buttons and toggle switches, or more functional touch displays that show the necessary information and through which certain functions can be controlled.
The use of touch-screen displays in aviation makes it possible to abandon outdated analog instruments and various display systems and to consolidate this information on displays. This reduces the pilot's workload by concentrating all the necessary information on one or several displays.
There are many types of touch displays, but the most common is the capacitive type. This type, however, has a number of disadvantages:
- no reaction to a gloved hand or to most foreign objects,
- the conductive coating lies in the top layer, making it vulnerable to mechanical damage,
- it is suitable mainly for terminals located in enclosed spaces.
It is therefore better to use projected-capacitive touch displays. An electrode grid is applied to the inner surface of the device, which forms a capacitance on contact with human skin. After the display is touched with a finger, sensors and a microcontroller process the readings, and the results are sent to the main processor. Such a touch display has the following advantages:
- it retains all the capabilities of capacitive sensors,
- it can be covered with a protective film up to 18 millimeters thick, providing additional protection against mechanical impact,
- dirt on hard-to-reach conductive parts is removed by software methods.
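As a rough illustration of how a touch coordinate can be recovered from the electrode grid, the sketch below takes the capacitance-weighted centroid of the readings. This is a common approach, though real touch controllers differ in detail; the function name and grid representation are assumptions.

```python
def touch_centroid(grid):
    """Locates a touch on a projected-capacitive electrode grid as the
    capacitance-weighted centroid of the readings. `grid` is a list of
    rows of capacitance deltas; returns (x, y) in grid units, or None
    when no touch is present."""
    total = sum(sum(row) for row in grid)
    if total == 0:
        return None  # no capacitance change anywhere: no touch
    x = sum(c * v for row in grid for c, v in enumerate(row)) / total
    y = sum(r * v for r, row in enumerate(grid) for v in row) / total
    return (x, y)
```

Weighting by the readings rather than picking the single strongest electrode gives sub-electrode resolution, which is why the grid pitch can be much coarser than the display pixels.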

Gesture interface
A gesture interface involves human-machine interaction using gestures that are read by various sensors and interpreted by mathematical algorithms. The movements can be made by different parts of the body, most often the hands and head. A gesture interface is not well suited to aviation: space in the cockpit is limited, so performing gesture commands is difficult, and the pilot's hands are busy controlling the aircraft most of the time, leaving no opportunity to gesture. In this case a gesture interface can, on the contrary, complicate the crew's work, so its use in the cockpit is not advisable.

Tactile and myointerfaces
Tactile feedback from touch displays is most often provided as vibration, which helps a person understand the result of interacting with the display. Vibration signals can differ in duration and frequency, serving as a signaling mechanism. For example, if a function is successfully activated via the touch screen, a short vibration is produced; if the result is unsatisfactory, the vibration is long. This reduces the person's reaction time to various situations without distracting them from their main tasks.
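The convention described above (short pulse for success, long for failure) can be expressed as a simple lookup. The event names and millisecond durations here are illustrative assumptions, not values from any avionics standard:

```python
def vibration_pattern(event):
    """Maps a touch-interaction outcome to a haptic pattern of
    (on_ms, off_ms) pairs: a short pulse for success, longer and
    repeated pulses for failure or warning. Durations are illustrative."""
    patterns = {
        "success": [(50, 0)],                           # one short pulse
        "failure": [(400, 100), (400, 0)],              # two long pulses
        "warning": [(100, 50), (100, 50), (100, 0)],    # three medium pulses
    }
    return patterns.get(event, [])
```

Encoding the pattern as data rather than code makes it easy to tune pulse lengths per actuator type (eccentric flywheel, linear resonant drive, piezo) without changing the interface logic.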
The vibration signal can be supplied by various types of vibration devices: rotating eccentric flywheel, linear resonant drive, piezoelectric drives.
A rotating eccentric flywheel is a flywheel with an offset center of gravity that is fixed to the shaft of a small electric motor, and the motor itself is rigidly fixed in the touch display housing. As a result of the rotation of the motor shaft, the eccentric flywheel mounted on it creates a vibration that is transmitted to the touch screen housing.
A linear resonant drive is a drive that performs linear reciprocating movements with a resonant frequency. The most common design is an electromagnet. A copper winding is wound on the magnetic core, and when current is applied to the winding, the electromagnet is attracted to the body of the vibrating device. The magnet is reversed by means of a spring.
In a piezoelectric drive, a potential difference applied to the piezoelectric element causes it to deform. This is usually a strip that bends at different frequencies under the control current, transmitting vibration to the touch display case. The advantage of such a drive is that it can operate efficiently at any frequency and amplitude.

Neurocomputer interface
A neurocomputer interface is a device that allows you to decipher neural signals from the brain related to some part of the body -say, an arm or leg [15].
The most common method of reading brain activity is electroencephalography (EEG): sensors placed on the scalp pick up the brain's electrical signals from its surface. These signals are not completely accurate, since part of them is lost or distorted in passing through the bones of the skull and the skin.
To use this method of human-machine interaction in aviation, it is necessary to conduct serious research, since the data obtained using EEG can give an idea of the functioning of individual brain regions, which then need to be decoded into a command that is understandable to the machine, while avoiding false positives.
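As a toy example of the kind of feature such decoding starts from, the sketch below computes the mean spectral power of one EEG channel in a given frequency band. Real BCI pipelines add artifact rejection, spatial filtering and a trained classifier on top of features like this; the function name is an assumption.

```python
import numpy as np

def band_power(signal, fs, band):
    """Mean spectral power of `signal` (sampled at `fs` Hz) within the
    frequency `band` = (lo, hi). A minimal EEG feature: e.g. alpha-band
    (8-13 Hz) power rises when the operator closes their eyes."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs < band[1])
    return float(power[mask].mean())
```

A decoder would compare such band powers across channels and time windows and map characteristic patterns to machine commands, which is exactly the translation step the text says still requires serious research.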
To date, there have been cases where good results were achieved; for example, people were able to move a cursor on a screen along simple trajectories. In these cases, however, electrodes were implanted invasively, directly into brain areas. This method is risky, since any surgical intervention, especially in the human brain, poses a threat to health. At the same time, it has its advantages: the clarity of the received signals, which are free of interference and strong enough to simplify their processing.

Conclusion
The design of aviation technology as a complex human-machine system is one of the most complex and demanding stages of the product life cycle. At this stage, engineering-psychological design plays a significant role. Unfortunately, this method is still not practiced in the aircraft industry to the required extent. The introduction of EPD into the aircraft development process will create conditions for the development of more efficient pilot-aircraft systems.
The development of the cabin design, the construction of the information and control field cannot be carried out at a high level of requirements for aviation technology without taking into account and applying the methods of engineering psychology and ergonomics.
When designing an aircraft, it should be considered as a pilot-aircraft-environment complex; otherwise positive results cannot be achieved. Particular attention should be paid to the interaction of the pilot with the aircraft. This system should be flexible and adapt to sudden changes in flight modes, which is most pronounced in combat aviation.
The reception and entry of information by the crew during a flight task should be addressed with multimodality in mind: a combination of various methods of reception (visual, auditory and, with some limitations, others) and of input (tactile input as well as gaze control, voice and, possibly, a neurointerface). All these promising types of pilot-machine interaction should be regarded as complementary tools; this will help the crew respond quickly to rapidly changing conditions and flight modes and will increase the efficiency of the entire complex.