Interview with Flora Incognita: Innovation in Citizen Science Using Machine Learning

Images: All images courtesy Flora Incognita https://floraincognita.com/de/pressemappe/

Authors
Jana Wäldchen ORCiD 0000-0002-2631-1531
Patrick Mäder ORCiD 0000-0001-6871-2707

Cite as:

Wäldchen, Jana & Mäder, Patrick. ‘Interview with Flora Incognita: Innovation in Citizen Science Using Machine Learning’, 2019. https://doi.org/10.25815/xp2t-w456.

DOI: 10.25815/xp2t-w456

Citation format: The Chicago Manual of Style, 17th Edition

An interdisciplinary team has developed a mobile app that identifies plants from a photo taken with the user's phone. The Flora Incognita app applies machine learning to identify plant species (flowers, plants, and trees) in near real time. Simplicity and innovation are both hard to accomplish, and achieving both is where Flora Incognita has excelled. Currently the app suite covers the flora of Germany and the Central European region: 4,800 species, trained on 1.7 million images, of which 100,000 came from users in 2018 alone. For Citizen Science, the enthusiastic engagement of the public with Flora Incognita shows a clear path forward for more widespread use of machine learning in public participation in science and scholarship, and in knowledge creation.


Image: Screenshots of the Flora Incognita app

Download Flora Incognita for iOS and Android

Download Flora Capture for iOS and Android

GenR: Would you like to give a short introduction to yourselves and your roles in the Flora Incognita research?

Jana Wäldchen: I am leading the biological part of the Flora Incognita project at the Max Planck Institute for Biogeochemistry, Jena (BGC). I studied Landscape Management and Nature Conservation and did my PhD in ecology. Since my studies, I have been aware that accurate species identification is an important component of workflows in ecological research. Many activities, such as studying the biodiversity richness of a region, monitoring populations of endangered species, determining the impact of climate change on species distribution, and weed control, depend on accurate identification skills. Accurate identification is a necessity for farmers, foresters, taxonomists, conservation biologists, and the technical personnel of environmental agencies, and it is simply fun for laypersons. Automating the task and making it feasible for non-experts is highly desirable, especially considering the continuous loss of biodiversity, and, ironically, of the taxonomists needed to monitor biodiversity.

Patrick Mäder: I’m a professor in software engineering and data-intensive systems at the Technical University of Ilmenau. Automating and simplifying processes has always been a major objective of my work, initially as a consultant and developer in industrial projects, later in developing methods for safer software development, and today in developing methods for interactive classification processes. Recent boosts in data availability and new and improved computer hardware, accompanied by substantial progress in machine learning algorithms, have pushed automated, image-based species identification into reality. Jana and I had the idea together in 2011. We wrote a project proposal and started the Flora Incognita project, which I coordinate at the TU Ilmenau, in 2014. Right from the start, this was a very interdisciplinary project in which computer scientists and biologists worked together.

What can people do with the Flora-APPS suite of mobile apps?

Jana: The goal of the Flora Incognita project is to develop a semi-automated plant identification tool for mobile devices. To achieve this goal we have developed three different apps (two are publicly available), which people can freely download from the Google Play Store and the iOS AppStore. With the Flora Capture App (Flora Incognita n.d.), users can capture observations of wild plants consisting of images from different perspectives. Botanists at the MPI-BGC validate and identify these observations. The user receives feedback about the identified plant species directly on their device. This means the user gets a species identification from a human expert botanist in exchange for high-quality images depicting the plant. A steadily growing number of interested people support our research by capturing such observations.

Patrick: In return, we as computer scientists get a lot of structured plant images through the Flora Capture App, which are necessary for training the neural networks used for automatic species identification in our second app, the Flora Incognita App (Flora Incognita n.d.). The Flora Incognita App allows for identifying plants automatically. The identification process adapts to the situation: depending on the unknown plant’s life form (herb, tree, grass, or fern) and the current date, we ask the user to take an initial image of a particular organ of the plant. If this image allows for an accurate identification, the process completes; otherwise we follow up and ask for an additional image of the plant, and so on. Underneath, we utilize a sophisticated cascade of the latest deep neural networks and many additional data science technologies.
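The adaptive process described above can be sketched as a simple loop: request one organ image at a time and stop as soon as the classifier is confident enough. This is a minimal illustration with invented names and a toy classifier, not the project's actual code or thresholds.

```python
# Sketch of an adaptive, cascaded identification process:
# ask for images of specific organs until the classifier is confident.
from typing import Callable, Dict, List

def identify_plant(
    classify: Callable[[List[str]], Dict[str, float]],
    organ_sequence: List[str],
    confidence_threshold: float = 0.9,
) -> str:
    """Request one organ image at a time; stop once confident enough."""
    images: List[str] = []
    best_species = "unknown"
    for organ in organ_sequence:
        images.append(f"photo_of_{organ}")   # stand-in for a captured image
        scores = classify(images)            # maps species -> probability
        best_species, best_score = max(scores.items(), key=lambda kv: kv[1])
        if best_score >= confidence_threshold:
            break                            # accurate enough, stop early
    return best_species

# Toy classifier: confidence grows with each additional image.
def toy_classifier(images: List[str]) -> Dict[str, float]:
    p = min(0.5 + 0.25 * len(images), 1.0)
    return {"Bellis perennis": p, "Taraxacum officinale": 1.0 - p}

print(identify_plant(toy_classifier, ["flower", "leaf", "whole plant"]))
```

With the toy classifier, two images suffice to cross the threshold, so the loop stops early instead of requesting the third photo, mirroring the behavior described in the interview.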

How are Flora-APPS being used in Citizen Science projects? For example, do people use them as individuals or as part of an organized project, such as a club or a nature initiative?

Jana: At the moment, people use our apps mostly for their individual purposes. The Flora Incognita App is currently used by a wide range of people, e.g., pupils, students, parents teaching their children, laypersons, and even expert ecologists and botanists. We found that Flora Capture users are often people who want to combine their photography hobby with gaining a better understanding of nature. However, a growing number of initiatives and organizations also use or recommend our apps for student camps, conservation initiatives aiming to protect certain areas, guided nature walks, and so on.

What was the germ of the Flora Incognita research and how did your two institutions come to work together on the project?

Jana: Actually it was really simple. In 2011, a biologist asked a computer scientist if it could be possible to automate plant identification by developing a smartphone app.

Patrick: I was immediately enthusiastic about the idea, and we worked together on a project proposal. The joint funding program of the German Federal Ministry of Education and Research (BMBF) and the German Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (BMU), launched in 2012 to implement the National Strategy on Biodiversity (BMU 2007), gave us a suitable opportunity to submit our research idea. In particular, the combination of science and implementation is what makes this project so successful.

What has been the timeline of the development of the research, significant steps, and how has the uptake of the app suite by the public developed?

Patrick: At the beginning we spent a lot of time developing and filling a data repository. In fact, we collected more than one million images from expert collections, individuals, and through Flora Capture to train the neural networks, and we developed methods to predict which species can occur at a given position and at a given time. Last but not least, a considerable part of the work went into the multi-platform app development.

Can you tell us about the technology being used in the research, how you brought the different parts together, and what insights and innovations have you made over the course of the development?

Patrick: The latest machine learning methods are used to analyze multimodal data. Images are classified using deep neural networks. For this purpose, a machine-optimized network architecture consisting of more than ninety million parameters is used, which was trained with deep learning algorithms on a very rich dataset of already more than 1.7 million plant images. The location of each observation is analyzed in terms of environmental, geographic, and climatological characteristics and is also used for identification. Similarly, the current date is used to predict regionally specific observation periods and weather-dependent flowering seasons. Using the date and location factors, such as soil type, slope, and average temperature, a probability model allows us to infer plant species that are likely to occur naturally at the current location. The results of the species prediction are then fused with the results of the image analysis. On a single image, without any of the other information sources such as location and time, we currently reach an average identification accuracy of 86.5% across the 4,800 supported species, including all 2,770 naturally occurring German plant species. In 95.7% of cases, the correct species is among the first five recognized plants. Using all available data and multiple images, the actual accuracy in the app is substantially higher.
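The fusion step described above, combining image evidence with a location- and date-based occurrence prior, can be illustrated by multiplying the two probability distributions and renormalizing. This is a hedged sketch with invented species names and numbers; the project's actual fusion model is not published in this interview.

```python
# Sketch: fuse image-classifier scores with a location/date-based
# species occurrence prior (illustrative numbers only).
def fuse_scores(image_scores, location_prior):
    """Multiply image evidence by the occurrence prior, then renormalize."""
    fused = {s: image_scores[s] * location_prior.get(s, 0.0)
             for s in image_scores}
    total = sum(fused.values())
    if total == 0:
        return image_scores  # prior rules out everything; fall back to image only
    return {s: v / total for s, v in fused.items()}

# The image alone slightly favors Species A ...
image_scores = {"Species A": 0.6, "Species B": 0.3, "Species C": 0.1}
# ... but Species B is far more likely at this location and time of year.
location_prior = {"Species A": 0.1, "Species B": 0.8, "Species C": 0.1}

fused = fuse_scores(image_scores, location_prior)
print(max(fused, key=fused.get))  # Species B now ranks first
```

The point of the sketch is that a plausible-looking image match can be overruled by a species' implausibility at the given place and season, which is exactly the benefit of combining the two information sources.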

How have you designed your citizen science programme of research: how did you arrive at deciding on an app suite to bring the public into contact with your subject area, and what type of UX, UI, or design research methods and processes have been put together?

Jana: Originally, the computer scientists in Ilmenau designed and implemented Flora Capture for the field recordings of the biologists in Jena. After using the Flora Capture App exclusively within the project team in 2016, we decided that we would get many more pictures if we made it publicly available. The more images we collect, the better the recognition the Flora Incognita App can later provide. To be honest, we were surprised by how many observations we received. In 2018, citizen scientists collected more than one hundred thousand images through the Flora Capture App. In 2019, the community of helpers grew further, and we now receive many Flora Capture observations every day.

Patrick: Our interdisciplinary team has, among other things, a substantial background in software engineering and machine learning. From the beginning, we have employed modern development practices such as agile development, continuous integration (CI), and test-driven development. We use a CI setup in which each major code change automatically results in a new build of the changed apps, which is then automatically distributed to a closed group of testers who continuously evaluate the latest app versions and report issues. UX and UI are very important to us. We conducted several studies with users to identify requested features and, especially, to understand how they interact with the app. We try to continuously improve the UX, but we have also learned that different users sometimes completely disagree about what constitutes good or bad UX, and that it is almost impossible to satisfy everybody.

You are combining existing taxonomies and methods from your area of the study of flora with new technologies of machine learning and I assume a complex technology stack. Can you tell us about the experiences of marrying these two knowledge areas and skill sets?

Patrick: Automated plant species identification has so far been a topic mostly driven by academics specialized in computer vision, machine learning, and multimedia information retrieval. Only a few studies have been conducted by interdisciplinary groups of biologists and computer scientists during the last two decades. Increasingly, research is moving towards more interdisciplinary endeavors. As our project shows, effective collaboration between people from different disciplines and backgrounds is necessary to reap the benefits of joint research activities and to develop widely accepted approaches.

What are your views on the value and uses of Citizen Science as researchers who have embraced its use and as practitioners of Citizen Science? And what do you think other researchers can learn from you in developing their Citizen Science projects in other disciplines?

Jana: For me as a scientist, it is a new and inspiring experience to work in a research project with a large number of volunteers willing to support our work. I have learned that there are many citizens who want to participate actively and who see themselves as citizen scientists within our project. We invest a lot of resources in communicating with these people through social media and e-mail, and we also receive many phone calls from interested people. I think it is important to be approachable and to share the scientific knowledge resulting from the project with the public. Furthermore, it is very rewarding to see how people use, and often love, the technology we invented, or to give a presentation to an audience that comes prepared with many concrete questions and is interested in every new development step.

What are your future plans for Flora Incognita?

We are continuously working on improving the apps and especially the underlying machine intelligence services. In the near future, this will include features for organizing and working with observations, adding more species, and making the whole process even more intelligent and intuitive. We would like Flora Incognita to become a standard tool for the identification of plants. This will require not only further technical advances but also a stable community of users who support our work with Flora Capture observations and contributions to species information.

References

Flora Incognita. ‘Flora Capture App’. Accessed 6 June 2019. https://floraincognita.com/de/apps/plant-image-capture/.

Flora Incognita. ‘Flora Incognita App’. Accessed 6 June 2019. https://floraincognita.com/apps/flora-incognita-app/.

BMU. ‘Nationale Strategie zur Biologischen Vielfalt – BMU-Publikation’ [National Strategy on Biodiversity]. 2007. https://www.bmu.de/PU225.
