Max Scheck

Captain A320 & Member RPAS Working Group, Vereinigung Cockpit e.V.

Captain Scheck completed his pilot training through the Lufthansa Flight Academy in 1993. He has accumulated over 15,500 flight hours on various aircraft, including the Boeing B737 and B747 and the Airbus A320 series, which he currently flies as Captain for Lufthansa German Airlines out of Frankfurt, Germany.

He also serves as a Trainer/Facilitator for Crew Resource Management (CRM) training, having trained over 2,500 cockpit and cabin crew members. He assisted Lufthansa Aviation Training with the development of one of the current CRM training modules.
Additionally, he served with the US Air Force Reserve in the Judge Advocate General’s Corps. In this capacity he participated in various aircraft accident investigations, in particular on aircraft accident investigation boards convened in accordance with Air Force Instruction (AFI) 51-503 and DoDI 6055.7. He retired in the grade of Chief Master Sergeant as the IMA Senior Paralegal to the Air Force Legal Operations Agency.

Captain Scheck holds a Master of Aeronautical Science degree from Embry-Riddle Aeronautical University and is the Vice Chairman of the Research Network for Academic Pilot Training.

He is also actively involved in the Qualification & Training Committee and the Remotely Piloted Aircraft Systems (RPAS) Committee of the German Airline Pilots’ Association (Vereinigung Cockpit - VC), as well as the RPAS working group of the European Cockpit Association (ECA). In this capacity Captain Scheck has represented VC and ECA, including through numerous presentations, at various venues, among others the German Ministry of Transportation, the German Society for Aeronautics and Astronautics, Unmanned Vehicle Systems International, the European Organization for Civil Aviation Equipment and UAV DACH.

Captain Scheck has published several papers in journals and periodicals, among others Aviation Security International, The International Journal of Business and Social Sciences, The Judge Advocate General’s Reporter and VC-Info.



Humans have a practical need to be free from harm, as well as a psychological need to feel free from harm. Over the course of human evolution this has manifested itself in the concept of “safety”, which is broadly understood as “the condition of being protected against harmful conditions or events, or the control of hazards to reduce risk”.

With the Industrial Revolution at the end of the 18th century, humans began to introduce more and more technology into their daily lives. As a result, accidents and incidents related to this technology began to increase as well. Safety engineering and safety management established themselves as important disciplines, dealing with the causes of accidents (the why) as well as the mechanisms of accidents (the how).

Since then, safety engineering and safety management have gone through several stages, introducing many valuable concepts, such as accident prevention programs, fault-tree analysis, probabilistic risk assessment, human reliability assessment and organizational risk analysis, to name a few.

In the 1990s James Reason introduced the “Swiss cheese model”, with which he illustrated that a “safe system” needs adequate barriers against threats. In this context he also emphasized the need for a “Just Culture” as “…an atmosphere of trust in which people are encouraged, even rewarded, for providing essential safety-related information…”.

With the emergence of increasingly complex systems, it has become ever more difficult to assess “safety” in both qualitative and quantitative terms. Many experts in the field feel that the “traditional view of safety” (i.e. simply looking at accidents or incidents and then adjusting the system to prevent those accidents from happening again) is no longer adequate. Instead, the focus should be not only on “what goes wrong”, but also on “what goes right”.

Erik Hollnagel coined the term Safety II as the logical extension of the traditional view of safety (Safety I). While Safety I focuses on accident and incident analysis (i.e. what went wrong), Safety II focuses more on what went right in order to draw conclusions for the enhancement of safety. Hollnagel believes both Safety I and Safety II are important, but that, due to the increase in overall complexity and interconnectedness, processes have become significantly more dynamic and non-linear. He holds that it therefore makes more sense to draw conclusions from the abundance of “good outcomes” and the (probably) fairly high number of “near misses” than to look only at the few “bad outcomes”, which may have occurred under extremely isolated circumstances.

For both Safety I and Safety II, a just culture and an open reporting culture are essential. Only if all stakeholders openly and freely share their experience (good and bad), without fear of retribution or of someone taking advantage of the report, can safety be developed to its maximum. A project team at UAV DACH is working on a reporting system for RPAS, open to everyone, which aims to establish the platform for such an open and just culture.