Is the project heavily dependent on the weather?
The biggest problem at the moment is the wind. The drone flies around a tree for several minutes, and the images it takes during that time are combined into the 3D model, so we need the tree to stay motionless. So we need to find out: How strong does the wind need to be before it poses a problem? When can we fly our drone and be sure that the data it records is actually usable? And of course: What can we do to still reconstruct useful data in an emergency? Another problem is heavy rainfall, since we can’t fly the drone in that either. Fortunately, the drone has something called RTK GPS, which allows us to locate it to within a few centimeters. That’s important, if only because it lets us steer the drone safely around the trees. Normal GPS isn’t sufficient, because its deviation can be as large as one meter.
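The weather question above amounts to a go/no-go decision before each flight. A minimal sketch of such a pre-flight gate is shown below; the threshold value and the function name are illustrative assumptions, since the project still has to determine experimentally how much wind the reconstruction tolerates.

```python
# Illustrative sketch only, not the project's actual flight software.
# MAX_WIND_M_S is a placeholder threshold; the real limit would come
# from experiments on how much tree movement the 3D model tolerates.

MAX_WIND_M_S = 5.0  # assumed placeholder value, in meters per second

def safe_to_fly(wind_m_s: float, heavy_rain: bool) -> bool:
    """Return True if conditions allow a usable recording flight."""
    return wind_m_s <= MAX_WIND_M_S and not heavy_rain

print(safe_to_fly(3.2, heavy_rain=False))  # True
print(safe_to_fly(7.5, heavy_rain=False))  # False: too windy
print(safe_to_fly(3.2, heavy_rain=True))   # False: raining
```

In practice the check could draw on a local anemometer or a weather service, and the threshold would likely depend on how long the tree must remain still for a full circuit.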
How is the project progressing?
The project started in late 2021 and is expected to continue into the third quarter of 2024, so we’re currently right in the middle of it. In the first year, we were delayed because the drone and its accessories were either unavailable or had very long delivery times. We improvised a bit and tried to recreate the whole setup by standing on a ladder with a camera, just to generate any data at all. If you don’t have the data, you can’t move forward on anything else. Now we have the equipment, and this season we reached the point where we could record the development of the buds right up to their flowering. In the meantime, we’re already seeing the first fruits and are preparing for the harvest.
Then it’s a matter of evaluating the data. After all, it’s about more than just making this cherry tree look beautiful in three dimensions; we’re also interested in things that would be helpful for the fruit grower to know. For instance: How is my tree developing structurally? What yield can I expect? When would be the best time to harvest? Furthermore, we’d like to see whether we can easily detect diseases, pest infestation or stress conditions such as drought in order to present this to the fruit grower as usable information.
Who’s involved in the project, and what does Fraunhofer IIS bring to the table?
Fraunhofer IIS is represented by its Development Center X-ray Technology and Communications Systems divisions. The consortium also includes the Friedrich-Alexander-Universität Erlangen-Nürnberg, which is responsible for 3D reconstruction as an expert in this field. Weihenstephan-Triesdorf University of Applied Sciences has expertise in fruit growing. And the district of Forchheim grants us access to the Franconian Switzerland Fruit Information Center in Hiltpoltstein. Here, trials are conducted with cherries as well as with other types of fruit.
Colleagues from the Communications Systems division have given us access to their mobile 5G campus network, which we’re using to transmit data from the drone to the ground. The Development Center X-ray Technology provides expertise in sensor technology so that the drone can fly smoothly and automatically. Furthermore, our know-how in plant typing and modeling plays a role. Data evaluation is done through image processing; here, our knowledge of conventional algorithms and models, such as a leaf model, helps us. AI methods are also playing an increasingly large role in breaking the tree down into its component parts and categorizing them. As a result, you can find out exactly where the cherries are, where the leaves are, where the trunk is, and so on. That’s my department’s job. Other colleagues in the division are trying to develop a suitable data model for the digital twin, so that all this information can be collected in one place and read out again in a useful way.
What is the vision for the project? What might a future look like in which the digital twin is fully developed and in use?
We’re already in the process of testing what we’re now developing for wild cherries as an example and transferring it to apples. One possible option is that the drone will automatically take off at regular intervals, fly over the plantation and collect data. Using the methods we’re developing, this data is then reconstructed and analyzed to provide the orchardist with a digital overview of their plantation. They can see the condition of the entire plantation on a tablet, but also get information on individual trees that may be showing signs of disease or drought stress. The main point here is not to blindly apply pest control to all trees or water them all equally, but to deploy pesticides and scarce resources exactly where they’re needed.
In addition, with the crop forecast function, you can figure out how much yield to expect and when would be a good time to harvest it.
If you take this even further, you could combine this technology with, say, a field robot. The drone can quickly fly over large areas and collect data, which it then provides to the field robot, showing it where to go. The robot’s own sensor technology then takes over for precision work: it is designed to collect data much closer to an object and can therefore capture completely different characteristics. That would be one sensible combination for the future.