Deep learning for Earth Observation

http://www.onera.fr/en/dtim https://www-obelix.irisa.fr/

This repository contains code, network definitions and pre-trained models for working on remote sensing images using deep learning.

We build on the SegNet architecture (Badrinarayanan et al., 2015) to provide a semantic labeling network able to perform dense prediction on remote sensing data. The implementation uses the PyTorch framework.

Motivation

Earth Observation consists of visualizing and understanding our planet using airborne and satellite data. With the release of large volumes of both satellite imagery (e.g. Sentinel and Landsat) and airborne imagery, Earth Observation has entered the Big Data era. Many applications could benefit from automatic analysis of these datasets: cartography, urban planning, traffic analysis, biomass estimation and so on. Consequently, much progress has been made in using machine learning to better understand Earth Observation data.

In this work, we show that deep learning allows a computer to parse and classify objects in an image, and can be used for automatic cartography from remote sensing data. In particular, we provide examples of deep fully convolutional networks that can be trained for semantic labeling of airborne images of urban areas.

Content

Deep networks

We provide a deep neural network based on the SegNet architecture for semantic labeling of Earth Observation images.
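
The core idea of SegNet is an encoder-decoder network in which the decoder reuses the max-pooling indices computed by the encoder to upsample its feature maps. The snippet below is a minimal PyTorch sketch of that mechanism only; the depth, layer sizes and class count are illustrative and do not match the full network provided in this repository.

import torch
import torch.nn as nn

class MiniSegNet(nn.Module):
    """Toy SegNet-style encoder-decoder: the decoder reuses the
    max-pooling indices computed by the encoder for upsampling."""
    def __init__(self, in_channels=3, n_classes=6):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 64, 3, padding=1),
            nn.BatchNorm2d(64), nn.ReLU(inplace=True))
        self.pool = nn.MaxPool2d(2, stride=2, return_indices=True)
        self.unpool = nn.MaxUnpool2d(2, stride=2)
        self.decoder = nn.Sequential(
            nn.Conv2d(64, 64, 3, padding=1),
            nn.BatchNorm2d(64), nn.ReLU(inplace=True),
            nn.Conv2d(64, n_classes, 3, padding=1))

    def forward(self, x):
        x = self.encoder(x)
        x, indices = self.pool(x)      # remember where the maxima were
        x = self.unpool(x, indices)    # place activations back at those positions
        return self.decoder(x)         # one score map per class, full resolution

logits = MiniSegNet()(torch.randn(1, 3, 256, 256))
print(logits.shape)  # torch.Size([1, 6, 256, 256])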

All the pre-trained weights can be found on the OBELIX team website (backup link).

Data

Our example models are trained on the ISPRS Vaihingen and ISPRS Potsdam datasets. We use the IRRG tiles (8-bit format) and build 8-bit composite images from the DSM, NDSM and NDVI.
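
As a rough sketch of what such a composite looks like in code, the snippet below stacks DSM, NDSM and NDVI bands into a three-channel 8-bit image using a simple min-max rescaling; this is only an assumption for illustration, and the exact preprocessing used for our experiments may differ (see the notebook).

import numpy as np

def to_uint8(band):
    """Linearly rescale a float band to the 0-255 range."""
    band = band.astype(np.float64)
    lo, hi = np.nanmin(band), np.nanmax(band)
    return np.uint8(255 * (band - lo) / (hi - lo + 1e-12))

def make_composite(dsm, ndsm, ndvi):
    """Stack DSM/NDSM/NDVI as an (H, W, 3) 8-bit composite image."""
    return np.dstack([to_uint8(dsm), to_uint8(ndsm), to_uint8(ndvi)])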

You can either use our script from the OSM folder (based on the Maperitive software) to generate OpenStreetMap rasters from the images, or download the OSM tiles from Potsdam here.

The nDSM for the Vaihingen dataset is available here (courtesy of Markus Gerke, see also his webpage). The nDSM for the Potsdam dataset is available here.

How to start

Just run the SegNet_PyTorch_v2.ipynb notebook using Jupyter!
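
For example, from the repository root (assuming Jupyter is installed):

jupyter notebook SegNet_PyTorch_v2.ipynb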

Requirements

Find the right version for your setup and install PyTorch.

Then, you can use pip or any package manager to install the packages listed in requirements.txt, e.g. by using:

pip install -r requirements.txt

References

If you use this work for your projects, please take the time to cite our ISPRS Journal paper:

Nicolas Audebert, Bertrand Le Saux and Sébastien Lefèvre, Beyond RGB: Very High Resolution Urban Remote Sensing With Multimodal Deep Networks, ISPRS Journal of Photogrammetry and Remote Sensing, 2017. (preprint: https://arxiv.org/abs/1711.08681)

@article{audebert_beyond_2017,
title = "Beyond RGB: Very high resolution urban remote sensing with multimodal deep networks",
journal = "ISPRS Journal of Photogrammetry and Remote Sensing",
year = "2017",
issn = "0924-2716",
doi = "https://doi.org/10.1016/j.isprsjprs.2017.11.011",
author = "Nicolas Audebert and Bertrand Le Saux and Sébastien Lefèvre",
keywords = "Deep learning, Remote sensing, Semantic mapping, Data fusion"
}

License

Code (scripts and Jupyter notebooks) is released under the GPLv3 license for non-commercial and research purposes only. For commercial purposes, please contact the authors.

The network weights are released under the Creative Commons BY-NC-SA license (https://creativecommons.org/licenses/by-nc-sa/3.0/). For commercial purposes, please contact the authors.

See LICENSE.md for more details.

Acknowledgements

This work has been conducted at ONERA (DTIM) and IRISA (OBELIX team), with the support of the joint Total-ONERA research project NAOMI.

The Vaihingen data set was provided by the German Society for Photogrammetry, Remote Sensing and Geoinformation (DGPF).
