
Getting ready for the future and its challenges through ITEA

By ITEA Chairwoman Zeynep Sarılar and Programme Coordinator Erik Rodenbach

As ITEA celebrates its 20th year, let us look back at what has been achieved and walk through some of the accomplishments of ITEA projects that have a direct impact on future challenges.

Artificial Intelligence, big data, security, smart industry, modelling and simulation are some of the buzzwords that we hear every day, even in our daily conversations. But how ready are we for these challenges? What today’s technology actually achieves still needs clarification and explanation.

Here we present a set of project outcomes that have a direct impact on resolving a challenge or that identify potential solutions. These projects are just a sample of ITEA achievements; many more outcomes like these are shared on the ITEA website, e.g. in the ITEA Impact stream and Success stories.


Big data... how does it impact our lives?

A set of ITEA projects provides direct solutions to big data challenges. Knowledge of the potential of HPC, the newest sensor technology, new analytic tools and platforms to store and understand big data are some of the building blocks needed for a sustainable big data solution. Below are some outcomes that may speed up big data solutions to create a better tomorrow faster, more reliably and more sustainably.

High-Performance Computing

High-Performance Computing (HPC) is essential in meeting the demand for increased processing power for future research and development in many domains, such as aircraft and automotive design or multimedia. The goal of the ITEA project H4H (Hybrid for HPC) was to provide a highly efficient, hybrid programming environment for heterogeneous computing clusters to enable easier development of HPC applications and to optimise application performance.

Extensive collaboration and workshops during the project generated significant innovations such as new programming approaches (e.g. OpenACC, OpenMP 3.0 tasks, PGAS), faster development phases for application developers and more energy-efficient use of application resources, along with a promising new cooling technology for future extreme computing needs. The project developed a new HPC architecture, including new accelerators based on GPU and MIC technologies, and a customised development environment with optimisation tools and libraries supporting the new hardware architecture, allowing performance improvements that accelerate application execution time by a factor of up to several tens. Furthermore, 3D combustion simulation software benefited from this extremely fast software, which can perform a large number of simulations to parameterise new models and verify accuracy, reducing the simulation time for furnace optimisation from 12 to 1.5 hours and so boosting production and preventing slag growth.
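
As a hedged illustration of the kind of parameter study such acceleration makes practical (the solver, parameters and 'slag' score below are hypothetical and not taken from the project), many furnace configurations can be evaluated in parallel on a single HPC node:

```python
from itertools import product
from multiprocessing import Pool

def run_combustion_case(params):
    """Hypothetical stand-in for one 3D combustion simulation run.

    In a real setting this would launch the accelerated solver; here we
    only compute a toy 'slag growth' score from the inputs.
    """
    fuel_rate, air_ratio = params
    score = abs(air_ratio - 1.05) * fuel_rate  # illustrative objective only
    return params, score

if __name__ == "__main__":
    # Sweep a small grid of hypothetical furnace parameters in parallel.
    grid = list(product([0.8, 0.9, 1.0, 1.1],       # fuel rate (relative)
                        [0.95, 1.00, 1.05, 1.10]))  # air/fuel ratio
    with Pool() as pool:
        results = pool.map(run_combustion_case, grid)
    best_params, best_score = min(results, key=lambda r: r[1])
    print("best configuration:", best_params, "score:", best_score)
```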

Making Big Data a valuable asset

By positioning the target open-source architecture to support Big Data, ecosystems and value chains, the ITEA CAP (Collaborative Analytic Platform) project contributed to the development of new yet sustainable business models and laid the foundation for a market value proposition of ‘Big Data as a Service’. While the arrival of enabling technologies has made a wealth of public and organisational data available for analytic processing, access to the data and to efficient analytic tools is often difficult. Furthermore, combining such sources of massive data can yield much richer applications and greater insights for intelligence reporting. This requires a collaborative platform which makes it easy for participants to share data securely and to gain access to the latest technology tools.

In cloud computing, new service models that take advantage of virtualisation and remote access have broadened the significance of multi-tenancy architecture. A Software-as-a-Service (SaaS) provider, for example, can run a particular application on a specific database and provide web access to multiple customers. In such a scenario, each tenant’s data is isolated and remains invisible to other tenants. In the CAP project, a concrete CAP platform with multi-tenant architecture enabled the development of an interactive CCTV monitoring service that analyses CCTV metadata together with data from external systems (e.g. weather, traffic or accidents) and recommends key CCTV recordings and situations to focus observer attention on them. In another use case of the CAP project, several terabytes of data were analysed to qualify the quality of the data and then to extract useful conclusions about the processes, with a focus on franking fraud and data visualisation of the real process inside a sorting centre.
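
A minimal sketch of this tenant isolation idea, assuming a single shared table with a hypothetical tenant_id column (not the CAP platform's actual schema or API):

```python
import sqlite3

# Shared database with one table used by all tenants; every row is
# tagged with the tenant it belongs to (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (tenant_id TEXT, camera TEXT, note TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("city_a", "cctv-01", "traffic incident"),
     ("city_b", "cctv-07", "weather alert")],
)

def events_for_tenant(tenant_id):
    """Every query is scoped to one tenant, so tenants never see each
    other's rows -- the essence of multi-tenant data isolation."""
    cur = conn.execute(
        "SELECT camera, note FROM events WHERE tenant_id = ?", (tenant_id,)
    )
    return cur.fetchall()

print(events_for_tenant("city_a"))  # only city_a's CCTV metadata
```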

In another Big Data platform, the Wind Power Icing Atlas (WIceAtlas), data from over 4,500 meteorological stations worldwide, with over 20 years of observations and 35 years of MERRA reanalysis data, makes it possible to estimate, for example, the resulting long-term production losses of iced turbines and to provide valuable Annual Energy Production (AEP) estimates for financial calculations.
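
As a rough sketch of how such icing statistics feed an AEP estimate (the monthly production figures and loss fractions below are invented for illustration and are not WIceAtlas outputs):

```python
# Hypothetical gross monthly production of one turbine (MWh) and the
# long-term icing loss fraction estimated for each month of the year.
gross_mwh = [310, 280, 300, 290, 270, 250, 240, 250, 270, 290, 300, 310]
icing_loss = [0.12, 0.10, 0.06, 0.02, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.05, 0.10]

net_mwh = [g * (1.0 - loss) for g, loss in zip(gross_mwh, icing_loss)]
aep_gross = sum(gross_mwh)
aep_net = sum(net_mwh)

print(f"gross AEP: {aep_gross:.0f} MWh")
print(f"net AEP after icing losses: {aep_net:.0f} MWh")
print(f"long-term icing loss: {100 * (1 - aep_net / aep_gross):.1f} %")
```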


Smart Maritime Surveillance

Current surveillance systems in the maritime domain consist of radar (to detect and track vessels) and visual sensors (for securing borders in and around large infrastructures, e.g. along a coast or in a harbour). These sensors are never used in conjunction to their full capacity and have severe limitations: radar can only detect large vessels, without details of their type and identity, whereas visual sensors are too static and lack 3D capabilities.

The APPS project developed a plug & play solution that improves interoperability of surveillance activities, effectiveness of operations at sea and implementation of relevant legislation and policies. The APPS platform provides data flow between an application and the APPS Data Distribution Service (DDS), interoperability and mediation with other systems or interoperability components as well as algorithms and devices to boost recognition performance. At device level, the technology allows sensors to plug & play into a surveillance system whose layers can reconfigure themselves and operate uninterrupted. At the other end of the stack, surveillance systems operate as a system of systems, exchanging and fusing information and sharing situational awareness.
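
A minimal sketch of the plug & play idea, using a toy publish/subscribe bus rather than the actual APPS DDS interfaces (all names and topics below are hypothetical):

```python
from collections import defaultdict

class ToyBus:
    """Toy publish/subscribe bus: sensors publish on topics, fusion
    layers subscribe; a new sensor can join at runtime without the
    rest of the system being reconfigured by hand."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

bus = ToyBus()
# A fusion layer subscribes to vessel tracks regardless of which sensor produced them.
bus.subscribe("vessel.track", lambda m: print("fused track:", m))
# A radar and, later, a newly plugged-in camera both publish on the same topic.
bus.publish("vessel.track", {"sensor": "radar-1", "pos": (58.2, 11.4)})
bus.publish("vessel.track", {"sensor": "camera-3", "pos": (58.2, 11.4), "class": "fishing"})
```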

The project generated several innovations such as acoustic sensors, collision detection algorithms, automatic filter selection, and vessel detection and classification.

Security is mandatory for daily life with the Internet

Our lives are surrounded by the Internet and Internet-based solutions. This opens the door to a crucial need for security solutions, not only for the Smart city challenge but also for Smart healthcare, Smart industry and all the other ITEA challenges.

The dilemma between efficiency and security in intelligent buildings

Sustainable, reliable, user-friendly, efficient, safe and secure Building Management Systems are a must in the context of Smart Critical Sites. Commercial and government buildings are subject to increasingly stringent regulations and policies in terms of safety, energy efficiency, facility management, information systems and security. Identifying the limitations of existing sensor chains, optimising the network infrastructure, enabling cross-domain building models and analytics, unifying building management interfaces and enabling cyber-physical security management are the key challenges.

The FUSE-IT project solves the dilemma between efficiency and security in intelligent buildings by combining Building Management Systems (BMS) with Security Management Systems (SMS) and stimulating cross-domain innovation between activities that are traditionally very segmented. FUSE-IT developed a Core Building Data Processing & Analysis module that processes data reported by secured shared sensors, effectors and devices that are robustly interconnected through trusted federated energy and information networks.
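
A toy sketch of the cross-domain idea, correlating a security-domain event with an energy-domain reading (the sensor data, zones and threshold below are made up and are not part of the FUSE-IT module):

```python
from datetime import datetime

# Hypothetical readings from shared sensors: badge accesses (security
# domain) and power draw per zone (energy domain); values are invented.
accesses = [("zone-3", datetime(2018, 5, 4, 23, 40))]
power_kw = {("zone-3", 23): 4.1, ("zone-3", 22): 0.6}   # (zone, hour) -> kW
BASELINE_KW = 0.8

def cross_domain_alerts(accesses, power_kw, baseline):
    """Flag after-hours accesses that coincide with abnormal power draw,
    a toy example of cross-domain (security + energy) analysis."""
    alerts = []
    for zone, ts in accesses:
        draw = power_kw.get((zone, ts.hour), 0.0)
        if ts.hour >= 22 and draw > baseline:
            alerts.append((zone, ts.isoformat(), draw))
    return alerts

print(cross_domain_alerts(accesses, power_kw, BASELINE_KW))
```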

Safety

The SAFE project developed new concepts to model safety and architecture as well as methods for safety analysis, variant management and safety code generation based on the existing modelling languages EAST-ADL and AUTOSAR. It enabled an effective and compliant application of the new ISO 26262 safety standard in automotive industry processes. Innovations were made on three levels: concept, tools and process.

An exchange format compliant with the existing standards and enriched with the SAFE meta-model formats allows a major step to be made towards integrated, model-based design in the automotive industry's tool market. It provides functionality for integrated development and safety analysis at each of the abstraction levels: requirements, architecture, HW design, SW modelling and coding. Finally, a guideline developed by the SAFE project and formalised in a process model that includes an assessment model provided the industry with a unique, commonly agreed interpretation of the standard.

Smart Engineering is a necessity for Smart Industry

Smart Industry, or Industry 4.0, needs a set of new components, including software tools, sensors and testing methodologies. In ITEA, all of these subjects are defined as important challenges, and below are some of the outcomes of ITEA projects that address them.

A quantum leap for technical authors

The complexity of software systems in safety-critical domains (e.g. avionics and automotive) has significantly increased over the years. A text- and model-synchronised document engineering platform that provides a generic framework for automated traceability analysis is a necessity for complex systems. This platform needs to allow the integration of two types of reasoning: about the meaning of text and about document structure. The ModelWriter project delivered such a platform, turning the improved quality (consistency, completeness) of documents produced by technical authors (such as software or systems engineers) into enhanced quality of companies’ products.

ModelWriter envisions an integrated authoring environment which combines a Semantic Parser (the "Writer" part), capable of "understanding" pieces of text and transparently creating models from them, with a Knowledge Capture Tool (the "Model" part) that understands the semantics of industry-standard notations and languages such as UML, ReqIF and Java. This allows technical authors to interactively configure and analyse traceability among the different parts of the work products produced in the system development process, in order to perform review activities such as consistency checking, change impact analysis, structural coverage analysis and repairing broken traces.
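
A hedged sketch of one such review activity, detecting broken trace links and coverage gaps between requirements and model elements (the data structures below are hypothetical and not ModelWriter's metamodel):

```python
# Hypothetical work products: requirements extracted from text and
# model elements captured from UML/Java, with trace links between them.
requirements = {"REQ-1": "The brake controller shall ...",
                "REQ-2": "The logger shall ...",
                "REQ-3": "The display shall ..."}
model_elements = {"BrakeController", "SpeedSensor"}
trace_links = [("REQ-1", "BrakeController"),
               ("REQ-2", "EventLogger")]   # EventLogger was renamed/removed

def broken_traces(links, reqs, elements):
    """Return links whose source requirement or target element no longer
    exists -- a simple form of consistency checking over trace links."""
    return [(r, e) for r, e in links if r not in reqs or e not in elements]

def uncovered_requirements(links, reqs):
    """Requirements with no trace to any model element (a coverage gap)."""
    traced = {r for r, _ in links}
    return sorted(set(reqs) - traced)

print("broken traces:", broken_traces(trace_links, requirements, model_elements))
print("uncovered requirements:", uncovered_requirements(trace_links, requirements))
```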

Smart Manufacturing

While Europe has a high level of automation and high-quality products, its production systems are still highly complex and need flexible production system design, optimised time to market and extremely high product quality. Against this background the ITEA project AVANTI developed a virtual commissioning test methodology to help leading European OEMs, component and tool providers to gain a competitive edge through two key innovations: (1) virtualisation of the testing process for industrial production lines and (2) the combination of different models and tools for simulating production to create and perform tests for virtual commissioning and industrial application.

The AVANTI project developed a co-simulation framework that covers behaviour models, the modelling and simulation of mechatronic components, fast and lightweight FMI-based (Functional Mock-up Interface) co-simulation of physical behaviour models, and the integration of co-simulation approaches into existing processes. The Virtual Commissioning Test Generation and Execution Tool developed for users in the manufacturing sector automatically generates detailed test cases, executes them and provides a detailed overview of the results.
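
As a hedged illustration of the co-simulation master idea behind such a framework (a toy fixed-step loop over two hypothetical components that exchange outputs, not the project's FMI implementation):

```python
class ToyComponent:
    """Stand-in for an FMI-style simulation unit: it advances its own
    state by one communication step and exposes an output."""

    def __init__(self, gain):
        self.state = 0.0
        self.gain = gain

    def do_step(self, u, dt):
        # Very simple first-order dynamics, purely illustrative.
        self.state += dt * (self.gain * u - self.state)
        return self.state

def cosimulate(plant, controller, setpoint, dt=0.01, steps=500):
    """Fixed-step co-simulation master: at each communication point the
    two components exchange their latest outputs."""
    y = 0.0
    for _ in range(steps):
        u = controller.do_step(setpoint - y, dt)  # controller sees the error
        y = plant.do_step(u, dt)                  # plant sees the control signal
    return y

plant, controller = ToyComponent(gain=1.0), ToyComponent(gain=5.0)
print("plant output after 5 s:", round(cosimulate(plant, controller, setpoint=1.0), 3))
```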

Advanced Co-simulation Open System Architecture

In the key area of virtual system development (‘frontloading’), aimed at reducing development times, stranded costs and time-to-market, co-simulation is a particularly promising approach for interoperable modular development. However, the coupling and integration of real-time systems into simulation environments (especially of distributed HiL (Hardware in the Loop) systems and simulations) still requires enormous effort. The aim of ACOSAR was to develop both a non-proprietary "Distributed Co-simulation Protocol" (DCP) for the integration of simulation and testing environments and a corresponding integration methodology, which will be a substantial contribution to international standardisation (FMI). The result is a modular, considerably more flexible and shorter system development process for numerous industrial domains that will also enable the establishment of new business models.

Modelling and simulation are key methods for the successful development of technical devices and machines. While FMI is an existing standard used for the exclusive integration of simulation models, there was no standardisation for the integration of real-time systems and no standard for distributed co-simulation before ACOSAR. Within ACOSAR, the development of the DCP specification aimed to reduce the time and effort spent by OEMs.
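
As a toy illustration of what 'distributed' means here, a co-simulation master and a remote simulation node can exchange step inputs and outputs over the network; the message layout, port and dynamics below are invented and are not the DCP wire format:

```python
import json
import socket
import threading

PORT = 15001  # arbitrary local port for this toy example

# The remote node binds first so no messages from the master are lost.
plant_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
plant_sock.bind(("127.0.0.1", PORT))

def plant_node(sock, steps=3):
    """Remote simulation node: receive an input, advance one toy step,
    send the resulting output back to the master."""
    state = 0.0
    for _ in range(steps):
        data, addr = sock.recvfrom(1024)
        u = json.loads(data)["u"]
        state += 0.1 * (u - state)  # toy first-order dynamics
        sock.sendto(json.dumps({"y": state}).encode(), addr)

threading.Thread(target=plant_node, args=(plant_sock,), daemon=True).start()

# Co-simulation master: drives the remote node one communication step at a time.
master = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
master.settimeout(2.0)
for step in range(3):
    master.sendto(json.dumps({"u": 1.0}).encode(), ("127.0.0.1", PORT))
    reply, _ = master.recvfrom(1024)
    print(f"step {step}: remote output = {json.loads(reply)['y']:.3f}")
master.close()
```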

The DCP was intended as an open standard for fast industry adoption: with DCP-compliant integration, component providers no longer need to negotiate protocols with their customers, which gives them a competitive boost. System integrators also benefit from the DCP integration of subsystems. The ACOSAR project also ensures that safety engineers benefit from systematic development methods and well-defined interfaces between system components – the DCP gives them confidence in the reliability of their system testing results so that they can focus on what to test rather than how to test. In this way, they can run more tests in less time, which ultimately makes products safer.

With many different components provided by different partners, the required integration is reduced to an interface specification for exchanging IP-protected components. This supports the fast and smooth integration of heterogeneous IP blocks and leads to an open market. A publicly available, industry-accepted interface not only facilitates the horizontal approach that is needed but also opens up possibilities for SMEs to gain significant market share.

All in all, over the past twenty years the results of ITEA project collaborations have had a significant impact on the challenges of the future, from technological, business and societal perspectives, and on the need for innovative software solutions. There is, then, every reason to be confident that in the coming two decades this impact will continue to grow for the good of society.