AI-enabled Baseband Algorithms for High-Fidelity Measurements

DESCRIPTION

Critical applications such as autonomous vehicles and machine control require high-fidelity raw measurements in challenging environments. Despite significant progress in recent years, GNSS performance for these applications remains unsatisfactory in terms of reliability in challenging environments and therefore requires improvement.
The main challenge lies in the transformation of baseband samples into high-quality, high-fidelity raw measurements (i.e. with minimal impairment from the local environment), which would then feed highly hybridized PNT engines such as those already available today (e.g. Kalman and particle filters, DPE, PPP/RTK).
In very challenging environments, the traditional approach of deriving algorithms from theoretical models and optimal estimators reaches its limits: the targeted environments cannot be modelled properly and their changes cannot be predicted effectively (e.g. new buildings or large vehicles obstructing the line of sight to satellites that are key to a good geometry and generating multipath, or seasonal variation in canopy coverage and hence in signal attenuation). Hence, prominent experts and academics in PNT technologies consider that the only way to design improved algorithms is to process real data collected in the field, learn from its behaviour and tune the algorithms accordingly. In the field of sensor fusion for autonomous driving, artificial intelligence (AI) and deep learning are already used to catalyse human engineering.
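
The shift from model-driven to data-driven design described above can be illustrated with a minimal sketch: instead of deriving a multipath/NLOS detector from a channel model, a classifier is fitted to labelled field data. Everything below is a hypothetical illustration with synthetic stand-ins for quantities a receiver could log (C/N0 in dB-Hz, satellite elevation in degrees); it is not part of the activity's specified design.

```python
import numpy as np

# Synthetic stand-in for labelled field data: 400 line-of-sight (LOS) and
# 400 NLOS/multipath observations, each described by C/N0 and elevation.
rng = np.random.default_rng(0)
n = 400
cn0 = np.concatenate([rng.normal(45, 3, n), rng.normal(32, 4, n)])
elev = np.concatenate([rng.normal(55, 15, n), rng.normal(20, 10, n)])
X = np.column_stack([np.ones(2 * n), cn0, elev])  # bias column + features
y = np.concatenate([np.ones(n), np.zeros(n)])     # 1 = LOS, 0 = NLOS

# Standardise the feature columns so plain gradient descent behaves well.
X[:, 1:] = (X[:, 1:] - X[:, 1:].mean(0)) / X[:, 1:].std(0)

# Logistic regression trained by batch gradient descent: the "model" is
# learned from the data rather than derived from signal theory.
w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))   # predicted LOS probability
    w -= 0.1 * X.T @ (p - y) / len(y)  # logistic-loss gradient step

pred = (1.0 / (1.0 + np.exp(-X @ w))) > 0.5
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In a real activity the labels would come from reference trajectories and the features from the baseband processor itself; the point of the sketch is only that the detector's behaviour is tuned by the collected data.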
Meanwhile, AI-based capabilities are becoming more accessible, in particular thanks to key technologies made available as open source by major players in the field (Google, IBM, etc.). The GNSS community is collecting more and more raw data worldwide, including raw RF samples, along with high-quality reference trajectories: this may pave the way for efficient machine-learning processes.
In this context, artificial intelligence and machine learning can catalyse the empirical design of baseband algorithms, fed by the ever-growing volume of real data collected by an expanding user base.
The objectives of the proposed activity are to:

  • establish a new paradigm in the design of GNSS algorithms, leveraging on artificial intelligence and fed by collected data in field trials;
  • design algorithms to provide high-fidelity raw measurements, along with quality indicators related to the local environment.
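
The second objective pairs each raw measurement with a quality indicator that a downstream positioning engine can exploit. The following sketch, with purely illustrative numbers and variable names (nothing here is specified by the activity), shows one hypothetical use: an AI-derived indicator in [0, 1] acts as a weight that suppresses an NLOS-biased measurement.

```python
import numpy as np

# Four redundant range measurements of the same unknown quantity (metres);
# the third one is corrupted by an NLOS/multipath bias.
truth = 100.0
meas = np.array([100.2, 99.9, 108.0, 100.1])
quality = np.array([0.9, 0.95, 0.1, 0.9])  # hypothetical AI-derived indicator

naive = meas.mean()                                  # ignores quality
weighted = np.sum(quality * meas) / np.sum(quality)  # quality-weighted

print(f"naive: {naive:.2f} m, weighted: {weighted:.2f} m")
```

The quality-weighted estimate lands much closer to the truth than the naive average, which is the intended benefit of exporting quality indicators alongside the measurements themselves.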

The algorithms are intended to be designed using AI and machine-learning techniques (e.g. open-source frameworks), fed with massive data sets (such as the raw GNSS measurements and samples available at ESA) and with new sets of opportunity data becoming available during the product’s lifetime.
The tasks to be performed will include:

  • consolidation of state-of-the-art in Artificial Intelligence and machine learning processes;
  • definition of suitable AI architecture(s) for the GNSS baseband processor - one or more architectures, depending on the results of the trade-offs;
  • design and implementation of AI-based algorithms and associated learning processes;
  • gathering and formatting data to feed learning processes and tune algorithms;
  • field trials to test algorithms and assess preliminary performance;
  • update of the AI algorithms, complementing the learning with additional test data;
  • comparison of performance with state-of-the-art receivers.
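
The "gathering and formatting data" task above can be sketched as a small preprocessing step: per-satellite, per-epoch log records are flattened into aligned feature rows and labels for the learning process. The field names (`cn0`, `elev`, `residual`, `los`) are illustrative assumptions, not a format defined by the activity.

```python
# Hypothetical raw field-log records, one per satellite per epoch.
records = [
    {"svid": "G05", "cn0": 44.1, "elev": 62.0, "residual": 0.4, "los": 1},
    {"svid": "G12", "cn0": 30.5, "elev": 18.0, "residual": 7.9, "los": 0},
    {"svid": "E07", "cn0": 46.3, "elev": 71.5, "residual": -0.2, "los": 1},
]

FEATURES = ("cn0", "elev", "residual")

def to_training_arrays(records):
    """Flatten records into (features, labels), dropping incomplete rows."""
    X, y = [], []
    for rec in records:
        if all(k in rec for k in FEATURES) and "los" in rec:
            X.append([float(rec[k]) for k in FEATURES])
            y.append(rec["los"])
    return X, y

X, y = to_training_arrays(records)
print(len(X), "training rows")
```

Keeping this formatting step explicit and tolerant of incomplete records matters when opportunity data from heterogeneous sources is folded in during the product's lifetime.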

The results of the activity will provide:

  • innovative baseband algorithms, enabled by Artificial Intelligence, and able to provide high-fidelity GNSS raw measurements for applications operating in challenging environments;
  • a breadboard implementing the AI-based baseband processor, integrated with a state-of-the-art positioning engine;
  • a machine-learning platform and environment.

Data and results from other ESA activities, as well as from GSA studies (e.g. the ‘ESCAPE’ project for autonomous driving), will be duly considered, assessed and used.