MARVEL - Multimodal Extreme-Scale Data Analytics for Smart Cities Environments

Handling, processing, and delivering data from millions of devices worldwide is a complex and remarkable feat that hinges on edge computing. Edge computing brings computation and data storage closer to the data sources, while fog computing brings analytic services to the edge of the network; both are alternatives to purely cloud-based processing. The EU-funded MARVEL project will develop an Edge-to-Fog-to-Cloud ubiquitous computing framework to enable multimodal perception and intelligence for audio-visual scene recognition, event detection, and situational awareness in a smart city environment. It will collect, analyze, and data-mine multimodal audio-visual streaming data to improve the quality of life and services for citizens within the smart city paradigm, without violating ethical and privacy limits, in an AI-responsible manner.

Goals: 
The “Smart City” paradigm aims to support new forms of monitoring and managing resources and to provide situational awareness for decision-making, serving citizens while meeting the needs of present and future generations in economic, social, and environmental terms. The city is a complex and dynamic system involving interconnected spatial, social, economic, and physical processes that change over time and are continually modified by human actions.

Big Data, fog, and edge computing technologies have significant potential in various scenarios, considering each city's individual tactical strategy. One critical challenge, however, is to capture a city's complexity and support accurate, cross-scale, and timely predictions based on ubiquitous spatio-temporal data of high volume, high velocity, and high variety.

To address this challenge, MARVEL delivers a disruptive Edge-to-Fog-to-Cloud (E2F2C) ubiquitous computing framework that enables multimodal perception and intelligence for audio-visual scene recognition, event detection, and situational awareness in a smart city environment. MARVEL aims to collect, analyze, and data-mine multimodal audio-visual data streams of a smart city and help decision-makers improve the quality of life and services for citizens without violating ethical and privacy limits, in an AI-responsible manner. This is achieved by (i) fusing large-scale distributed multimodal audio-visual data in real time; (ii) achieving fast time-to-insights; (iii) supporting automated decision-making at all levels of the E2F2C stack; and (iv) delivering a personalized federated learning approach, in which joint multimodal representations and models are co-designed and continuously improved through privacy-aware sharing of personalized fog and edge models among all interested parties.
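The personalized federated learning approach in point (iv) can be illustrated with a minimal sketch. This is not MARVEL code: the function names, weight values, and blending parameter `alpha` are hypothetical, and real deployments would train neural models rather than plain weight vectors. The key idea it shows is that nodes share only model parameters, never raw audio-visual data, and each node keeps a personalized blend of the shared model and its own.

```python
# Illustrative sketch (hypothetical, not MARVEL's implementation) of
# personalized federated averaging across edge/fog nodes.
from typing import List

def federated_average(models: List[List[float]]) -> List[float]:
    """Average the model weights contributed by all participating nodes.
    Only weights cross the network; raw sensor data stays on each node."""
    n = len(models)
    return [sum(m[i] for m in models) / n for i in range(len(models[0]))]

def personalize(global_model: List[float], local_model: List[float],
                alpha: float = 0.7) -> List[float]:
    """Blend the shared global model with a node's own model so each
    edge/fog node keeps a personalized variant (alpha = local weight)."""
    return [alpha * l + (1 - alpha) * g
            for l, g in zip(local_model, global_model)]

# Three hypothetical edge nodes share their locally trained weights.
node_models = [[0.2, 0.4], [0.4, 0.6], [0.6, 0.8]]
global_model = federated_average(node_models)         # ≈ [0.4, 0.6]
node0_model = personalize(global_model, node_models[0])
```

In a continuous loop, each node would retrain its personalized model on fresh local streams and periodically contribute updated weights, so the shared representation improves without any party exposing its data.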
Date: 
Sunday, 10 January 2021 to Sunday, 31 December 2023
Duration: 
3 years
Partners: 

Coordinated by IDRYMA TECHNOLOGIAS KAI EREVNAS, Greece

16 partners overall

Funding: 
€5,998,086.25