Date of Award

January 2023

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Biology

First Advisor

Susan N. Ellis-Felege

Abstract

Technology is rapidly improving and being incorporated into field biology, with survey methods such as machine learning and uncrewed aircraft systems (UAS) headlining efforts. UAS paired with machine learning algorithms have been used to detect caribou, nesting waterfowl and seabirds, marine mammals, white-tailed deer, and more in over 19 studies within the last decade alone. Simultaneously, UAS and machine learning have been implemented for infrastructure monitoring at wind energy facilities as wind energy construction and use have skyrocketed globally. As part of both pre-construction assessment and regulatory compliance at newly constructed wind energy facilities, impacts to wildlife are monitored through ground surveys following the USFWS Land-based Wind Energy Guidelines. To streamline efforts at wind energy facilities and improve efficiency, safety, and accuracy in data collection, UAS platforms may be leveraged not only to monitor infrastructure but also to assess impacts to wildlife through both pre- and post-construction surveys.

In this study, we train, validate, and test a machine learning approach, a convolutional neural network (CNN), in the detection and classification of bird and bat carcasses. Further, we compare the trained CNN to the currently accepted and widely used method of human ground surveyors in a simulated post-construction monitoring scenario. Lastly, we establish a baseline comparison of manual image review of waterfowl pair surveys against currently used ground surveyors that could inform both pre-construction efforts at energy facilities and long-standing federal and state breeding waterfowl surveys. For the initial training of the CNN, we collected 1,807 images of bird and bat carcasses that were split into 80.0% training and 20.0% validation image sets. Overall detection was extremely high at 98.7%. We further explored the dataset by evaluating the trained CNN's ability to identify species and the variables that impacted identification. Classification of species was successful in 90.5% of images and was associated with sun angle and wind speed. Next, we performed a proof of concept to determine the utility of the trained CNN against ground surveyors across ground covers and with species both included in the initial training of the model and novel to it. Ground surveyors performed similarly to those surveying at wind energy facilities with 63.2% detection, while the trained CNN fell short at 28.9%. Ground surveyor detection was weakly associated with carcass density within a plot and strongly with carcass size. Similarly, detection by the CNN was associated with carcass size, ground cover type, and visual obstruction of vegetation, and weakly with carcass density within a plot. Finally, we examined differences in breeding waterfowl counts between ground surveyors and UAS image reviewers and found that manual review of UAS imagery yielded similar or slightly higher counts of waterfowl.
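The 80/20 training/validation partition described above can be sketched in a few lines. This is an illustrative example only, not the dissertation's actual pipeline; the file names, function name, and random seed are hypothetical stand-ins.

```python
import random

def split_dataset(image_paths, train_frac=0.8, seed=42):
    """Shuffle a list of image paths and split it into training
    and validation subsets (train_frac goes to training)."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    paths = list(image_paths)
    rng.shuffle(paths)
    cutoff = int(len(paths) * train_frac)
    return paths[:cutoff], paths[cutoff:]

# 1,807 placeholder file names, matching the study's dataset size.
images = [f"carcass_{i:04d}.jpg" for i in range(1807)]
train, val = split_dataset(images)
print(len(train), len(val))  # 1445 362
```

A fixed random seed keeps the partition reproducible across runs, which matters when the same validation set must be reused to compare model versions.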

Significant training, testing, and repeated validation on novel image data sets should be performed prior to implementing survey methods reliant upon machine learning algorithms. Additionally, further research is needed to determine potential biases of counting live waterfowl in aerial imagery, such as bird movement and double counting. While our initial results show that UAS imagery and machine learning can improve upon current techniques, extensive follow-up is strongly recommended in the form of proof-of-concept studies and additional validation to confirm the utility of the application in new environments and with new species, allowing models to be generalized. Remotely sensed imagery paired with machine learning algorithms has the potential to expedite and standardize monitoring of wildlife at wind energy facilities and beyond, improving data streams and potentially reducing costs for the benefit of both conservation agencies and the energy industry.
