Conference Agenda

Overview and details of the sessions of this conference. Select a date or location to show only the sessions on that day or at that location, or select a single session for a detailed view (with abstracts and downloads where available).

 
 
Session Overview
Session
S2: Soil and Crop monitoring
Time:
Tuesday, 14/May/2024:
9:00am - 10:30am

Session Chair: Kristof Van Tricht, VITO
Session Chair: Martin Claverie, JRC
Location: Big Hall


Presentations
9:00am - 9:12am

WaPOR Accounter: a web app to easily and interactively monitor Water Productivity at field scale

Bert Coerver, Livia Peiser

FAO, Rome

The FAO has developed the WaPOR database, a publicly accessible, near real-time database containing evaporation, transpiration, interception and net primary production maps at different scales, using data gathered by a range of satellites, including Sentinel-2 among others, together with meteorological data from ERA5. This database is the backbone of the WaPOR project, which, now in its second phase, works with ten partner countries to build their capacity in the use of WaPOR data for its different applications and to generate solutions to local challenges linked to water and land productivity as well as water management.

WaPOR Accounter bridges the gap between this database and end-users, by enabling them to quickly and easily convert its data into actionable information at two different scales.

At field scale, users can select agricultural fields and instantly see WaPOR data linked to the selected fields. The app helps users identify crop seasons for the selected fields and aggregates data accordingly, allowing comparisons of water consumption, biomass production and water productivity between neighbouring farms and/or between seasons. Additionally, users can compare a field's water consumption to its optimal consumption, helping them identify water-stressed or waterlogged fields.

At basin scale, the app lets users see a basin's water balance and related statistics and indicators, such as the basin's exploitable water, utilized flows and water consumption per land-use class.
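As a rough illustration of the field-scale comparison described above (this is not WaPOR Accounter code; the function name, inputs and units are assumptions), water productivity can be expressed as seasonal biomass produced per unit of water consumed:

```python
# Illustrative sketch only: field-scale biomass water productivity,
# i.e. seasonal biomass produced per cubic metre of water consumed.
def water_productivity(biomass_kg_per_ha: float, et_mm: float) -> float:
    """Biomass water productivity in kg/m3.

    1 mm of evapotranspiration over 1 ha corresponds to 10 m3 of water,
    so et_mm * 10 converts seasonal ET into m3 of water per hectare.
    """
    if et_mm <= 0:
        raise ValueError("water consumption must be positive")
    return biomass_kg_per_ha / (et_mm * 10.0)
```

With such a per-field number, neighbouring fields or different seasons can be ranked directly, which is the kind of comparison the app exposes interactively.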



9:12am - 9:24am

Plant water monitoring in Africa – Expanding hyperspectral (PRISMA and EnMAP) analysis capacity by exploiting free and open-source software

Veronika Otto1, Silke Migdall1, Jeroen Degerickx2, Heike Bach1

1Vista, Germany; 2Vito, Belgium

ARIES is exploring the potential of spaceborne hyperspectral data (PRISMA and EnMAP) to address water management and food security in Africa. Within the project, prototype EO products are being created and recommendations for the design of future Copernicus missions (CHIME) are being made. The product design process takes place in close collaboration with several African partners from Southern, Western and Eastern Africa in order to ensure usefulness and applicability as well as knowledge and capacity transfer.

Getting accurate and timely information on plant water content has been identified as one of the main challenges across the continent, especially for farmers. Plant Water, Leaf Area Index and Canopy Water have thus been chosen as the key information products to be produced within the project. We derive these using the agricultural applications available within the free and open-source QGIS plugin EnMAP-Box, which is also implemented on the FS-TEP platform, allowing anyone to access the chosen approaches at any time, even after the project lifetime.

The parameters are retrieved from EnMAP and PRISMA data collected from March 2023 onwards over six test sites located in Senegal, Mali, Niger and Zambia, using a dual approach that relies on radiative transfer modelling (PROSAIL) for leaf area retrieval and on canopy water retrieval based on a water absorption feature between 930 and 1060 nm of the electromagnetic spectrum (Wocher et al. 2018). Combining the two products, plant water content can be calculated.
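A minimal sketch of the kind of continuum-removal computation behind such a water absorption feature (illustrative only — the actual retrieval follows Wocher et al. 2018, and the shoulder wavelengths and band spacing depend on the sensor):

```python
import numpy as np

def absorption_feature_area(wavelengths, reflectance, lo=930.0, hi=1060.0):
    """Continuum-removed area of the water absorption feature between the
    shoulders at `lo` and `hi` nm; a larger area indicates more canopy water.
    """
    wl = np.asarray(wavelengths, dtype=float)
    r = np.asarray(reflectance, dtype=float)
    mask = (wl >= lo) & (wl <= hi)
    wl_f, r_f = wl[mask], r[mask]
    # Straight continuum line between the reflectances at the two shoulders.
    continuum = np.interp(wl_f, [wl_f[0], wl_f[-1]], [r_f[0], r_f[-1]])
    depth = continuum - r_f
    # Trapezoidal integration of the absorption depth over wavelength.
    return float(np.sum((depth[1:] + depth[:-1]) / 2.0 * np.diff(wl_f)))
```

A flat spectrum yields an area near zero, while a dip inside the 930–1060 nm window yields a positive area.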

During the workshop we would like to present our approach and results from our test site in Zambia, where knowledge about plant water is essential for decision making with regard to irrigation and harvest.

The project runs from September 2022 until September 2024 and is funded by ESA (ESA Contract No: 4000139191/22/I-DT).



9:24am - 9:36am

Boosting Crop Classification by Hierarchically Fusing Satellite, Rotational, and Contextual Data

Valentin Barriere1,2, Martin Claverie3, Maja Schneider4, Guido Lemoine3, Raphael d'Andrimont3

1Universidad de Chile, DCC, Chile; 2CENIA, Chile; 3JRC-Ispra, Italy; 4TUM, Germany

Accurate early-season crop type classification is crucial for crop production estimation and the monitoring of agricultural parcels. However, the complexity of plant growth patterns and their spatio-temporal variability presents significant challenges.
While current deep-learning-based methods show promise in crop type classification from single- and multi-modal time series, most existing methods rely on a single modality, such as satellite optical remote sensing data or crop rotation patterns. We propose a novel approach that fuses multimodal information into a single model for improved accuracy and robustness across multiple crop seasons and countries.
The approach relies on three modalities: remote sensing time series from Sentinel-2 and Landsat 8 observations, parcel crop rotation, and local crop distribution.
To evaluate our approach, we release a new annotated dataset of 7.4 million agricultural parcels in France (FR) and the Netherlands (NL). We associate each parcel with time-series of surface reflectance (Red and NIR) and biophysical variables (LAI, FAPAR). Additionally, we propose a new approach to automatically aggregate crop types into a hierarchical class structure for meaningful model evaluation and a novel data-augmentation technique for early-season classification.
Performance of the multimodal approach was assessed at different aggregation levels in the semantic domain, with class counts ranging from 151 down to 8 crop types or groups. It resulted in accuracies ranging from 91% to 95% for the NL dataset and from 85% to 89% for the FR dataset.
Pre-training on one dataset improves transferability between countries, allowing for cross-domain and cross-label prediction, and robust performance in a few-shot setting from FR to NL, i.e., when the domain changes and significantly different labels appear.
Our proposed approach outperforms comparable methods by enabling deep learning methods to use the often overlooked spatio-temporal context of parcels, resulting in increased precision and generalization capacity.
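To illustrate the hierarchical evaluation idea described above (a sketch under assumed names — the crop hierarchy and mapping here are invented for illustration, not the paper's actual class structure):

```python
# Hypothetical fine-to-group mapping; the real work aggregates 151 crop
# types into coarser groups automatically.
HIERARCHY = {
    "soft_wheat": "cereals",
    "durum_wheat": "cereals",
    "barley": "cereals",
    "potato": "root_crops",
    "sugar_beet": "root_crops",
}

def accuracy_at_level(y_true, y_pred, mapping=HIERARCHY):
    """Accuracy after aggregating fine crop types into parent groups.

    Labels absent from the mapping are kept as-is, so passing an empty
    mapping evaluates at the finest level.
    """
    agg = lambda labels: [mapping.get(l, l) for l in labels]
    t, p = agg(y_true), agg(y_pred)
    return sum(a == b for a, b in zip(t, p)) / len(t)
```

Confusing soft wheat with barley counts as an error at the fine level but not at the group level, which is why accuracy rises as classes are aggregated.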



9:36am - 9:48am

In-season Crop Type Mapping: An accuracy evaluation at European scale using the CHEAP Database

Martin Claverie1, Valentin Barriere2, Raphaël d'Andrimont1, Renate Koeble1, Marijn Van der Velde1

1European Commission, Joint Research Centre (JRC), Ispra, Italy; 2Centro Nacional de Inteligencia Artificial (CENIA), Santiago, Chile

Timely crop type mapping is crucial to inform actionable interventions for food security, serving as a key input for estimating crop area and deriving information relevant for yield forecasting. While traditional approaches rely on in-season Ground-Truth (GT) data collection, recent methods utilize deep-learning models trained solely on data from previous years, which can reduce the costs of data collection. To perform well, such models require a substantial amount of training data. In Europe, one source of such data is the annual GeoSpatial Application (GSA) datasets, i.e., the farmers' declarations supporting aid applications under the Common Agricultural Policy, which include agricultural parcel boundaries with cultivated crop types. When made public, these datasets become valuable sources of extensive GT data for crop type mapping, but they lack harmonization across national datasets.

Leveraging this GT data source, we have established the CHEAP (Common Harmonized European Agricultural Parcels) database. It consists of a multi-annual parcel dataset with harmonized crop types across years and countries, spanning 2008 to 2022 and covering 13 countries (more than 22M parcels). Here, we directly integrate the CHEAP dataset with multi-annual Sentinel-2 time series to evaluate the capacity for in-season crop type prediction at parcel level. Using stratified random sampling, we obtained raw and smoothed multi-annual time series (TS) of Sentinel-2 (S2) spectral data for 5 million parcels per year. This Satellite Image Time Series, which stands out for its distinctive spatial and temporal coverage, was employed to assess the capability of a recently published multi-modal deep-learning model (Barriere, Claverie et al. 2024, RSE) to predict crop type at parcel level during the early stages of the season for ten countries and four seasons (2019-2022).
This unique and vast evaluation highlights the potential of the algorithms to perform over a large variety of cropping systems.



9:48am - 10:00am

Earth Observation for estimating and predicting crop nutrients

Mariana Belgiu1, Michael Marshall1, Gabriele Candiani2, Mirco Boschetti2, Monica Pepe2, Francesco Nutini2, Micol Rossini3, Chiara Ferrè3, Luigi Vignali3, Cinzia Panigada3, Roberto Colombo3, Stephan Haefele4, Murray Lark4, Alice Milne4, Grace Kangara4, Tobias Hank5, Stefanie Steinhauser5, Rain Vargas Maretto1, Chris Hecker1, Alfred Stein1, Andy Nelson1

1University of Twente, Faculty of Geo-Information Science and Earth Observation (ITC), the Netherlands; 2Institute for Electromagnetic Sensing of the Environment, Italian National Research Council (CNR); 3University of Milano-Bicocca, Department of Earth and Environmental Sciences; 4Rothamsted Research; 5Ludwig-Maximilians University of Munich, Department of Geography

Timely information on nutrient concentrations (micro-nutrients, macro-nutrients, protein) in staple crops over large areas is lacking, which limits our understanding of how nutrients vary across geographic areas. In the absence of this information, we cannot efficiently guide research activities dedicated to alleviating potential nutrient deficiencies through genetic biofortification or agronomic biofortification by applying fertilizers. Conventional methods for measuring grain nutrient levels typically consist of collecting grains at harvest and performing wet chemical analysis, near-infrared spectroscopy, or hyperspectral imaging of the crop grains in the laboratory. These methods are time-consuming and cost-prohibitive and, consequently, unsuitable for consistent quantification of nutrients across large spatial extents. In addition, because nutrients are only measured after harvest, this approach precludes effective intervention with fertilizers while the crop is still growing.

To overcome the scale and cost limitations of laboratory analysis, the EO4Nutri team is investigating the potential of various Earth Observation data, including hyperspectral, multispectral, and thermal data, to estimate wheat grain protein content, as well as Calcium (Ca), Iron (Fe), Magnesium (Mg), Nitrogen (N), Phosphorus (P), Potassium (K), Selenium (Se), Sulphur (S), and Zinc (Zn) in wheat, maize and rice grains and soil. The team collected soil, plant, and grain samples together with field spectroscopy data and EnMAP and PRISMA satellite images at the Jolanda di Savoia and Munich-North-Isar (MNI) test sites, and assessed the lifecycle of nutrients of wheat, maize, and rice from the soil to the crop canopy to the crop grains with state-of-the-art ML analytical techniques. Preliminary results show the potential of hyperspectral data to predict nutrients and protein in the final agricultural production.
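The shape of the estimation task can be sketched as a regression from canopy spectra to a nutrient concentration (an assumption-laden illustration — the team uses state-of-the-art ML, not this plain least-squares fit, and all names here are invented):

```python
import numpy as np

def fit_spectra_to_nutrient(spectra, nutrient):
    """Least-squares fit mapping reflectance spectra, shape (n_samples,
    n_bands), to a nutrient concentration vector, shape (n_samples,)."""
    X = np.column_stack([spectra, np.ones(len(spectra))])  # add intercept
    coef, *_ = np.linalg.lstsq(X, nutrient, rcond=None)
    return coef

def predict_nutrient(spectra, coef):
    """Apply the fitted coefficients to new spectra."""
    X = np.column_stack([spectra, np.ones(len(spectra))])
    return X @ coef
```

Real retrievals use far more bands and non-linear models, but the input/output structure — spectra in, per-sample concentration out — is the same.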



10:00am - 10:12am

A Rapid Assessment Framework to monitor harvest progress in Ukraine

Shabarinath Sreedharan Nair1,3, Sergii Skakun2,3, Josef Wagner1,3, Yuval Sadeh4,3, Mehdi Hosseini2,3, Saeed Khabbazan5,3, Sheila Baber2,3, Blake Munshell2,3, Fangjie Li1,3, Eric Duncan2,3, Inbal Becker Reshef1,2,3

1Laboratoire ICube, Team TRIO, Université de Strasbourg, France; 2Department of Geographical Sciences, University of Maryland, MD, USA; 3NASA Harvest; 4Department of Geography, Monash University, Australia; 5Delft University of Technology

Russian forces invaded Ukraine on 24 February 2022, leading to widespread disruption of Ukraine's agricultural system. Ukraine is a major exporter of crops, so the invasion poses a significant risk to global food security. Total production is one of the critical indicators in this regard, and it is directly proportional to the total harvested area. The majority of remote-sensing-based harvest detection studies require a complete satellite phenological time series and assume that senescence implies harvest. Neither condition holds in this case, since not all planted fields are harvested.

Given these constraints and challenges, we developed a method to monitor crop harvest in near real time using high-resolution Planet satellite imagery. Our method trains a model to cluster change patterns on historical, window-based, spatio-temporally sampled data and then identifies harvest patterns in the current season. The sampling approach ensures we capture a complete representation of the change patterns that exist. Clusters are assigned as 'harvested' or 'non-harvested' by visually inspecting imagery at a higher temporal resolution, in which harvest can be seen as a change event. Our method works in the absence of training labels.

In free Ukraine we found 94% of planted winter crops to be harvested, and in occupied Ukraine 88%, as of 19 September 2022. Strong visual patterns of non-harvested crops were observed along the occupation borders. We visually interpreted satellite imagery at a higher temporal frequency to generate statistically significant validation data for model accuracy calculation. We obtained an overall accuracy of 85%, with an F1-score of 90% for the harvested class and 73% for the non-harvested class. Our assessments and analyses were provided to different organizations and agencies dealing with the Ukraine crisis and led to several key insights and derived interpretations.
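The core signal the clustering picks up can be sketched as a window-based NDVI drop per field (a toy stand-in — the actual workflow clusters change patterns without labels and assigns cluster meaning by visual inspection; the threshold here replaces that manual step purely for illustration):

```python
import numpy as np

def max_window_drop(ndvi_ts, window=3):
    """Largest NDVI decrease across a sliding window of `window` steps;
    harvest typically appears as a sharp drop within a short window."""
    ts = np.asarray(ndvi_ts, dtype=float)
    drops = ts[:-window] - ts[window:]
    return float(drops.max()) if drops.size else 0.0

def label_harvest(fields, threshold=0.3):
    """Toy replacement for the visual cluster-labelling step: call a field
    'harvested' if it shows a sufficiently sharp drop."""
    return ["harvested" if max_window_drop(f) > threshold else "non-harvested"
            for f in fields]
```

A field whose NDVI collapses mid-series is flagged, while a field that stays green (unharvested, e.g. near the front line) is not — mirroring the change event the visual inspection looks for.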



10:12am - 10:30am

Discussion
