At present, surveillance systems in the maritime domain consist of radar and visual sensors. Whereas radar is used to detect and track vessels, visual sensors are used to secure borders in and around large infrastructures, such as along a coast or in a harbour. These sensors are never used in conjunction to their full capacity and have severe limitations. Radar can only detect large vessels, without providing details about their type and identity, whereas visual sensors are too static and lack 3D capabilities. Future surveillance systems will therefore differ significantly from today's systems in several important ways by exploiting the benefits of different sensor modalities. They will integrate high-quality (HD and 3D video) multi-sensory data inputs taken from multiple viewpoints, exchange multi-streamed data between subsystems, and take action in a plug-and-play fashion, with the multidimensional data analysed in real time. This will place unprecedented demands on networks for high-capacity, low-latency, and low-loss communication paths. The APPS project will contribute to this transition by advancing the state of the art in surveillance systems in three key areas:
(1) it will enable the development of plug-and-play solutions;
(2) it will enhance the sensor processing, intelligent decision-making capabilities, and intelligent operator aids of such systems to achieve smart surveillance in large spaces such as coastal areas and harbours with critical infrastructures; and
(3) it will develop a robust communication layer over heterogeneous technologies.