The software for aircraft systems is developed to DO-178B standards. DO-178B primarily concerns itself with safety. Safety analysis carried out at the systems level determines the level to which the software shall be developed and verified. Thus, for example, software whose failure could result in a catastrophic condition is developed to Level A. Software that has no impact on safety is assigned Level E and no longer falls under the purview of DO-178. An example of Level A software would be the software that resides in an engine controller. An example of Level E software would be the software in an in-flight entertainment system.
But can we call a system safe if it is not secure? For a long time the aero world did not concern itself much with the security of its systems. A search of DO-178B for the word “security” reveals exactly ZERO results. Hacking into a system was something that banking and financial organizations needed to worry about. Security was a concern for those worried about their privacy. But why would anyone hack into an aircraft, and how?
The situation has of course changed rapidly. A business traveller wants to move seamlessly from home to office. She sees the aircraft as a flying office, especially on long-haul flights. Another passenger wants to keep abreast of the latest golfing news as he flies. He might also want to talk to his ailing mother while flying.
Then we have the pilots communicating with ATC and ground stations over an SMS-like service, carried on radio or satellite, called ACARS (Aircraft Communications Addressing and Reporting System).
Finally we have the aircraft version of the Internet of Things: subsystems communicating with each other and with the OEM/airlines to report health and potential hazards.
Suddenly we have three independent networks on the aircraft that cannot be allowed access to one another. In ascending order of security we have Passenger Network < Crew Network < Systems Network. Unless these networks are physically separated, there is always a possibility that someone might hack into the Systems Network. The safety of the aircraft would be immediately compromised.
We have now answered the question we asked ourselves earlier: an unsecured system cannot be called a safe system, no matter how robust the development and verification process is.
DO-178C has now succeeded DO-178B. Let’s run the same search on it. What do you think a search for the word “security” in DO-178C reveals? Exactly ONE result! Para 2.1 includes the following text: “As part of the system life cycle processes, system requirements are developed from the system operational requirements and other considerations such as safety-related, security, and performance requirements.”
We can conclude that the aero experts are certainly becoming wise to security threats, but considering that DO-178C mentions the word “security” only once, not much thought has gone into including secure software development life cycle (S-SDLC) processes in DO-178C. Merely including security requirements at the system level may not be sufficient. At present there is a process that ensures that defined artifacts from the software life cycle process flow back to the systems level to enable safety assessment. There is no equivalent security assessment process. Clearly, software development for airborne systems is still largely safety oriented.
It is this author’s strong recommendation that a DO-178C supplement be released as soon as possible to address S-SDLC issues. It is essential to standardize aspects of secure software development and verification, such as security risk assessment, threat modeling, vulnerability assessment, fuzzing, and server and network configuration review.
Aero systems’ software developed to DO-178B standards is safe. We need to make it secure too. HCL Technologies has a dedicated Aerospace Software Engineering unit that specialises in aerospace safety.