Wayfinding, Day or Night

At Acubed, the Wayfinder team’s mission is to build scalable and certifiable autonomy systems for future commercial aircraft. In our previous post, we described the expansion of our data acquisition campaign from daytime to nighttime and degraded visual conditions. In this post, we explore our recent work in Vision-Based Landing (VBL), culminating in a machine learning model that handles both day and night visual conditions. As shown below, nighttime imaging not only means less overall visibility, but also a completely different set of visible features, such as runway lights in place of the painted edges visible in daytime imagery. Moreover, runway approach lighting systems come in a variety of patterns.

Top: differences between visual features for day and nighttime imagery. Bottom: an illustration of the variety of approach lighting systems.

Per one of our core strategies, we leverage simulation to generate synthetic night imagery, both to bootstrap and to augment training data for machine learning. The images below show synthetic views of runway 30L at San Jose Airport. Producing them required careful rendering of airport and environmental lighting, as well as improved runway models and data-processing algorithms that include runway lights and other nighttime features.
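Full scene rendering is the heart of this step, and a toy example helps show why simpler tricks fall short. The hypothetical sketch below (all names and values illustrative, not part of our pipeline) darkens a daytime image with a gamma-and-gain curve: it reduces visibility, but adds none of the runway lights or reflections that true nighttime rendering provides.

```python
# Illustrative only: a crude low-light augmentation (gamma + gain) that
# darkens a daytime image. It cannot add runway lights, approach lighting,
# or reflections -- which is why full scene rendering is needed.

def darken(pixels, gamma=2.2, gain=0.3):
    """pixels: 2-D list of grayscale values in [0, 1]."""
    return [[gain * (p ** gamma) for p in row] for row in pixels]

# A one-row "image" spanning black, mid-gray, and white.
image = [[0.0, 0.5, 1.0]]
night = darken(image)  # every pixel is darkened, nothing new appears
```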

Examples of synthetic images of runway 30L at San Jose airport (KSJC)

Given our improved software and expanded data set, we enhanced our machine learning models to detect runways and their features in both day and nighttime imagery with a single, combined model, trained on all of our data: day and night, real and synthetic. These new models also interface cleanly with our overall VBL pipeline, which localizes the aircraft with respect to the runway from onboard camera imagery. Below is an example output from this new VBL model.
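To give a flavor of how detected runway features can anchor localization, the sketch below estimates distance to the runway threshold and lateral offset from the centerline using a simple pinhole-camera model. It is a minimal illustration under idealized assumptions (threshold roughly perpendicular to the optical axis, calibrated camera), not Wayfinder's actual pipeline, and all names and numbers are hypothetical.

```python
# Illustrative sketch (not the Wayfinder pipeline): estimate aircraft
# position relative to a runway from the pixel locations of the two
# runway threshold corners, assuming a calibrated pinhole camera.

def localize_from_threshold(left_px, right_px, runway_width_m,
                            focal_px, cx):
    """Estimate distance to threshold and lateral offset from centerline.

    left_px, right_px: x-coordinates (pixels) of the threshold corners.
    runway_width_m:    known runway width in meters (e.g. 45 m).
    focal_px:          camera focal length in pixels.
    cx:                principal point x-coordinate in pixels.
    """
    width_px = right_px - left_px
    # Pinhole model: apparent width shrinks linearly with distance.
    distance_m = focal_px * runway_width_m / width_px
    # Pixel offset of the runway center from the principal point maps
    # to a lateral offset at that distance.
    center_px = 0.5 * (left_px + right_px)
    lateral_m = (center_px - cx) * distance_m / focal_px
    return distance_m, lateral_m

# Example: a 45 m wide runway imaged 90 px wide with a 900 px focal
# length puts the aircraft about 450 m from the threshold, on centerline.
d, y = localize_from_threshold(595.0, 685.0, 45.0, 900.0, 640.0)
```

In practice a full pose estimate would use many more features (edges, lights, markings) and a robust solver, but the geometry above is the core idea behind localizing from camera imagery.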

An example of the localization output of the Vision-Based Landing (VBL) system

Broadening our VBL technology to nighttime has once again demonstrated our collective team expertise in advancing this functionality. We expanded our Onboard Wayfinder Laboratory platform and perception software with new cameras and processing to handle nighttime imagery. The flight test team planned and successfully executed nighttime data acquisition missions at virtually all local airports. These real data, along with the synthetic nighttime images, were then marshalled and curated by the data team for machine learning. Finally, the machine learning team transformed this wealth of data into a single, unified VBL deep learning model capable of working seamlessly in both day and night conditions.

Safety is paramount in aviation, and increasing automation, such as by applying Vision-Based Landing (VBL), promises to further enhance safety. However, it is vital that the operational environment, or flight service envelope, of any VBL system include both day and nighttime visibility conditions.

Expanding our operations and capabilities to night imagery is a critical milestone in the development of a comprehensive VBL system. The work validates both our strategy and our technology, while also highlighting areas for improvement, as we continue on the journey toward scalable and certifiable autonomy for aviation.

Where do we go from here?

Synthetic and Real Data Fusion: We are developing best-practice techniques for fusing photo-realistic synthetic data with high-quality, labeled real-world data for training deep learning models.
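One simple fusion strategy can be sketched as follows: compose each training batch from real and synthetic examples at a fixed ratio, so that scarce real nighttime data is never drowned out by abundant synthetic data. This is a hypothetical illustration of the idea, not a description of our actual training setup.

```python
import random

# Hypothetical sketch of one fusion strategy: build each training batch
# by sampling real and synthetic examples at a fixed ratio.

def mixed_batch(real, synthetic, batch_size, real_fraction=0.5, rng=None):
    rng = rng or random.Random(0)  # fixed seed here for reproducibility
    n_real = round(batch_size * real_fraction)
    batch = (rng.choices(real, k=n_real) +
             rng.choices(synthetic, k=batch_size - n_real))
    rng.shuffle(batch)  # avoid any ordering by data source
    return batch

# Toy data: 100 real samples vs. 10,000 synthetic ones.
real = [("real", i) for i in range(100)]
synthetic = [("synthetic", i) for i in range(10_000)]
batch = mixed_batch(real, synthetic, batch_size=32, real_fraction=0.25)
```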

Generalizable AI: Compelling performance on unseen data is the critical goal of machine learning systems, and to that end, we are developing metrics and methods to measure and ensure good generalization.
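One common way to probe generalization in this setting is to hold out all imagery from an entire airport, so the model is evaluated on a runway it has never seen. The sketch below shows this leave-one-airport-out split; the field names and sample records are illustrative assumptions, not Wayfinder's data schema.

```python
# Hypothetical sketch: evaluate generalization by holding out every
# sample from one airport (a "leave-one-airport-out" split).

def leave_airport_out(samples, held_out_airport):
    train = [s for s in samples if s["airport"] != held_out_airport]
    test = [s for s in samples if s["airport"] == held_out_airport]
    return train, test

# Toy dataset with illustrative fields.
samples = [
    {"airport": "KSJC", "runway": "30L", "time": "night"},
    {"airport": "KSJC", "runway": "30L", "time": "day"},
    {"airport": "KSQL", "runway": "30", "time": "night"},
]
train, test = leave_airport_out(samples, "KSJC")
```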

Degraded Visual Conditions: A key next milestone is to further expand our flight service envelope to degraded visual conditions such as sun flare, clouds and rain.

Stay tuned for future posts on continued Wayfinder progress in VBL for day, night, rain or shine!

- Kinh Tieu and Ashish Tawari