Short Term Scientific Mission (12/2022)

Working Group 2 – Data Fusion

Systematic literature review of data fusion for lidar

Duration: 4-6 weeks. 

Timing: February-March 2023. 

Location: Leiden, the Netherlands

Total estimated grant: 3000-4000 euros, depending on the duration and travel expenses

Description: This Short-Term Scientific Mission (STSM) is an important part of the WG2 (data fusion) activities. During this STSM, the successful candidate will perform a systematic literature review of the literature on fusing lidar data with other data sources (e.g. optical, microwave, in-situ). The candidate will use specific search terms to make a first selection of scientific papers, and will then apply a coding scheme to classify the papers based on the information in their abstracts. After this classification, a smaller selection of papers will be chosen for detailed analysis, carried out in part by the STSM candidate. The results of this STSM will serve as input for a manuscript on the state of the art of data fusion techniques for lidar-based assessment of forest structure. The candidate is expected to have 4-6 weeks of full-time availability for this project, experience with lidar, (some) knowledge of data fusion approaches, and ideally experience with systematic literature reviews.

Rules of the COST Action: a Short-Term Scientific Mission consists of a visit to a host organization located in a different country than the candidate's country of affiliation. In this case, it is therefore not possible to offer the STSM to scientists affiliated with institutions in the Netherlands.

If you are interested, please send your CV and motivation to the following email addresses: Suzanne Marselis (s.m.marselis@cml.leidenuniv.nl), Markus Hollaus (markus.hollaus@geo.tuwien.ac.at), and 3DForEcoTech (3dforecotech@gmail.com).

The deadline for applying is 18 December 2022.
