Aerospace

In the aerospace domain, the following use cases are defined:

TOUCH AND GO ASSISTANT

The arrival of an airplane at the airport involves manual work, such as attaching and running ground units that provide power, fuel, water, cooling or heating. Within this project, the Touch and Go Assistant will be applied to a specific ground unit that supplies pre-conditioned air to the airplane. Operating these devices at high efficiency currently requires two operators to be present the whole time, one in the airplane and one on the ground. The operators communicate the necessary adjustments of the air flow via walkie-talkie. This “human control loop” is too cumbersome for some airlines, so the devices run in an unsupervised and much less efficient mode.

The Touch and Go Assistant will simplify this work: an initial association of the airplane and its ground unit, in line with current procedures, is sufficient. After that, the machines communicate and work independently at high efficiency, with no need for further interaction between ground and airplane personnel. The assistant can also serve in other situations, such as refuelling or general maintenance connectivity, and it lays the foundation for future services like automatic luggage retrieval.
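
To make the idea concrete, the following is a minimal sketch of the machine-to-machine control loop that replaces the “human control loop” described above. All names, message fields and the simple proportional adjustment are hypothetical illustrations, not the project's actual protocol.

    # Hypothetical sketch: a pre-conditioned air unit adjusting its
    # airflow from periodic aircraft status reports after a one-time
    # manual association.
    from dataclasses import dataclass

    @dataclass
    class AircraftReport:
        """Periodic status message sent by the airplane after association."""
        aircraft_id: str
        cabin_temp_c: float       # measured cabin temperature
        target_temp_c: float      # temperature requested by the crew

    class GroundUnit:
        """Pre-conditioned air unit adjusting its airflow autonomously."""

        def __init__(self, gain: float = 0.5):
            self.associated_aircraft: str | None = None
            self.airflow_kg_s = 1.0   # current airflow setpoint
            self.gain = gain          # proportional control gain (assumed)

        def associate(self, aircraft_id: str) -> None:
            # One-time manual association, in line with current procedures.
            self.associated_aircraft = aircraft_id

        def on_report(self, report: AircraftReport) -> None:
            # Ignore messages from aircraft we are not associated with.
            if report.aircraft_id != self.associated_aircraft:
                return
            # Proportional adjustment: more airflow when the cabin is too
            # warm, less when the target has been overshot.
            error = report.cabin_temp_c - report.target_temp_c
            self.airflow_kg_s = max(0.0, self.airflow_kg_s + self.gain * error)

    unit = GroundUnit()
    unit.associate("D-ABCD")
    unit.on_report(AircraftReport("D-ABCD", cabin_temp_c=28.0, target_temp_c=22.0))
    print(f"airflow setpoint: {unit.airflow_kg_s:.2f} kg/s")

Once associated, the loop runs without any radio call between the two operators, which is exactly the efficiency gain the assistant targets.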

As the assistant is part of the on-board communication system of the airplane, the whole system must be taken into consideration and all of it must be secured against external and internal attacks. Adding functionality to this existing system opens a new interface to the external world and therefore requires thorough security and validation investigations.
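
One baseline measure for such a new external interface is to authenticate every message with a shared-secret MAC, so that injected or tampered commands are rejected. The sketch below illustrates this with Python's standard hmac module; the key provisioning and message format are assumptions for illustration, not the project's actual security concept.

    # Hedged sketch: message authentication on the new interface.
    import hmac, hashlib, os

    SHARED_KEY = os.urandom(32)  # would be provisioned during association

    def sign(payload: bytes) -> bytes:
        return hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()

    def verify(payload: bytes, tag: bytes) -> bool:
        # compare_digest avoids timing side channels on the comparison
        return hmac.compare_digest(sign(payload), tag)

    msg = b"SET_AIRFLOW 1.25"
    tag = sign(msg)
    assert verify(msg, tag)                      # authentic message accepted
    assert not verify(b"SET_AIRFLOW 9.99", tag)  # forged message rejected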

Shown below is a screenshot of an early version of the PhyWiSe tool, used for communication simulation and jamming detection in UC7:

The demonstration is based on a formal mathematical model of the ‘Touch and Go Assistant’ system. The model is constructed using iUML-B, a diagrammatic front-end for Event-B, which provides automatic theorem provers, model-checkers and animation tools. The demonstration shows how the tools can be used to analyse a security threat by:

  • examining unproven proof obligations,
  • running the model-checker to produce a sequence of actions leading to a breach of security, and
  • animating the model to validate its behaviour.

Formal modelling has been used or proposed extensively for the verification of safety properties, but less so for security at the systems level.
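
The model-checking step in the second bullet can be illustrated in miniature: a breadth-first search over an explicit-state model until a state violating the security invariant is reached, returning the sequence of events as a counter-example. The toy model below (an unauthenticated ‘open_link’ event) is invented for illustration; the real analysis is performed on the Event-B model with its associated tools.

    # Minimal sketch of counter-example search over a toy state machine.
    from collections import deque

    # State: (associated, authenticated, data_link_open)
    INITIAL = (False, False, False)

    def events(state):
        associated, authenticated, link = state
        yield "associate", (True, authenticated, link)
        if associated:
            yield "authenticate", (associated, True, link)
        # Flawed guard: opening the data link does not require authentication.
        if associated:
            yield "open_link", (associated, authenticated, True)

    def invariant(state):
        # Security invariant: the data link is only open when authenticated.
        associated, authenticated, link = state
        return authenticated or not link

    def find_breach(initial):
        queue, seen = deque([(initial, [])]), {initial}
        while queue:
            state, trace = queue.popleft()
            if not invariant(state):
                return trace            # counter-example: sequence of events
            for name, nxt in events(state):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, trace + [name]))
        return None

    print(find_breach(INITIAL))  # -> ['associate', 'open_link']

The returned trace is precisely the kind of evidence the demonstration uses to suggest a mitigating requirement, here a guard that forbids opening the link before authentication.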

In UC7, models of the proposed ‘Touch and Go Assistant’ system are developed with a focus on analysing its security properties, using tool-generated verification obligations and counter-example traces to analyse flaws and to suggest requirements that mitigate security threats.

RECONFIGURABLE VIDEO PROCESSOR FOR SPACE

The main objective of this Use Case is a detailed description of the conditions and requirements for a system used to navigate and observe space conditions in a dependable and reliable way. The partners focus on:

  • Describing the space environment in which a video processor system operates
  • Defining functional requirements for the system under test from actual mission scenarios
  • Defining non-functional requirements to operate in space
  • Describing certification and validation procedures

In this Use Case, the system under test is an industrial system consisting of multicore architectures capable of in-flight reconfiguration, as found in current payload data processing equipment for video processing and in camera-based navigation sensors. These video processors are in charge of conditioning, processing, compressing and encrypting the images acquired by an Earth Observation satellite before their transmission to ground for navigation purposes.
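
The on-board chain described above can be sketched as a sequence of stages applied to each acquired image before downlink. Every stage here is a trivial stand-in (the encryption in particular is a placeholder, not an approved cipher), since the real stages run on reconfigurable multicore hardware:

    # Hedged sketch of the condition -> process -> compress -> encrypt chain.
    import zlib, hashlib

    def condition(raw: bytes) -> bytes:
        # e.g. dark-level correction, bad-pixel masking (placeholder: identity)
        return raw

    def process(img: bytes) -> bytes:
        # e.g. feature extraction for navigation (placeholder: identity)
        return img

    def compress(img: bytes) -> bytes:
        return zlib.compress(img)

    def encrypt(data: bytes, key: bytes) -> bytes:
        # Placeholder XOR "cipher" for illustration only; a real system
        # would use an approved cipher such as AES-GCM.
        stream = hashlib.sha256(key).digest()
        return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(data))

    def downlink_frame(raw_image: bytes, key: bytes) -> bytes:
        for stage in (condition, process, compress):
            raw_image = stage(raw_image)
        return encrypt(raw_image, key)

    frame = downlink_frame(b"\x00\x01\x02" * 1000, key=b"mission-key")
    print(len(frame), "bytes ready for transmission to ground")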

A monitoring system based on a graphical user interface is used to validate the Autonomous Vision-Based Navigation system under test. The system is composed of a reconfigurable Image Processing (IP) unit in charge of generating navigation information from images of the target asteroid's surface. Three different algorithms provide navigation information in the different phases of the mission. They are integrated on a programmable HW architecture that reconfigures the IP for each mission phase and is robust to radiation-induced failure events. A Guidance, Navigation & Control (GNC) unit is executed on the same target board as the IP. A camera is the key sensor; additional required spacecraft sensors are modelled. Real-world dynamics and kinematics of the descent and landing trajectory to the target asteroid are also modelled and interfaced to the SUT board. In this way the system is fed with images, navigation solutions, sensor input data and actuator commands to the GNC, enabling autonomous navigation to small bodies in space.
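
The phase-dependent reconfiguration of the IP unit can be sketched as selecting one of three algorithms per mission phase. The phase names and algorithm behaviour below are illustrative assumptions; on the real system this step corresponds to reconfiguring programmable hardware rather than swapping software functions:

    # Hypothetical sketch: phase-based selection of the navigation algorithm.
    from enum import Enum, auto

    class Phase(Enum):
        APPROACH = auto()
        DESCENT = auto()
        LANDING = auto()

    def approach_nav(image):   # e.g. asteroid centroid tracking at distance
        return {"phase": "approach", "solution": "line-of-sight vector"}

    def descent_nav(image):    # e.g. feature tracking on the surface
        return {"phase": "descent", "solution": "relative velocity"}

    def landing_nav(image):    # e.g. hazard detection near touchdown
        return {"phase": "landing", "solution": "touchdown point"}

    ALGORITHMS = {
        Phase.APPROACH: approach_nav,
        Phase.DESCENT: descent_nav,
        Phase.LANDING: landing_nav,
    }

    class ImageProcessingUnit:
        def __init__(self):
            self.algorithm = ALGORITHMS[Phase.APPROACH]

        def reconfigure(self, phase: Phase) -> None:
            # On the target board this triggers a partial HW reconfiguration.
            self.algorithm = ALGORITHMS[phase]

        def navigate(self, image) -> dict:
            # Navigation information handed over to the GNC unit.
            return self.algorithm(image)

    ipu = ImageProcessingUnit()
    ipu.reconfigure(Phase.DESCENT)
    print(ipu.navigate(image=None))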
