Mapping uncertainty: sensitivity of wildlife habitat ratings to expert opinion.

Published online
19 Jan 2005
Content type
Journal article
Journal title
Journal of Applied Ecology

Johnson, C. J. & Gillingham, M. P.
Location
British Columbia & Canada


Expert opinion is frequently called upon by natural resource and conservation professionals to aid decision making. Where species are difficult or expensive to monitor, expert knowledge often serves as the foundation for habitat suitability models and resulting maps. Despite the long history and widespread use of expert-based models, there has been little recognition or assessment of uncertainty in predictions. Across British Columbia, Canada, expert-based habitat suitability models help guide resource planning and development. We used Monte Carlo simulations to identify the most sensitive parameters in a wildlife habitat ratings model, the precision of ratings for a number of ecosystem units, and variation in the total area of high-quality habitats due to uncertainty in expert opinion. The greatest uncertainty in habitat ratings resulted from simulations conducted using a uniform distribution and a standard deviation calculated from the range of possible scores for the model attributes. For most ecological units, the mean score, following 1000 simulations, varied considerably from the reported value. When applied across the study area, assumed variation in expert opinion resulted in dramatic decreases in the geographical area of high- (-85%) and moderately high-quality habitats (-68%). The majority of habitat polygons could vary by up to one class (85%) with smaller percentages varying by up to two classes (9%) or retaining their original rank (7%). Our model was based on only four parameters, but no variable consistently accounted for the majority of uncertainty across the study area. We illustrated the power of uncertainty and sensitivity analyses to improve or assess the reliability of predictive species distribution models. Results from our case study suggest that even simple expert-based predictive models can be sensitive to variation in opinion. 
The magnitude of uncertainty that is tolerable to decision making, however, will vary depending on the application of the model. When presented as error bounds for individual predictions or maps of uncertainty across landscapes, estimates of uncertainty allow managers and conservation professionals to determine if the model and input data reliably support their particular decision making process.
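The Monte Carlo procedure the abstract describes can be sketched in a few lines. The sketch below is illustrative only: the attribute names, the 0–6 scoring scale, and the mean-of-scores rating function are assumptions, not the authors' actual model. It follows the case that produced the greatest uncertainty in the study, perturbing each expert score with uniform noise whose standard deviation is derived from the range of possible scores, then recomputing the rating 1000 times.

```python
import random
import statistics

# Hypothetical four-attribute habitat model (names and 0-6 scale are
# illustrative assumptions, not taken from the paper).
expert_scores = {"food": 5.0, "cover": 4.0, "terrain": 3.0, "snow": 4.0}
SCORE_MIN, SCORE_MAX = 0.0, 6.0

# Standard deviation calculated from the range of possible scores:
# a uniform distribution over [MIN, MAX] has sd = (MAX - MIN) / (2 * sqrt(3)).
sd = (SCORE_MAX - SCORE_MIN) / (2 * 3 ** 0.5)


def rating(scores):
    """Habitat rating = mean attribute score (illustrative model)."""
    return statistics.mean(scores.values())


def simulate(n=1000, seed=42):
    """Perturb each attribute with uniform noise and recompute the rating."""
    rng = random.Random(seed)
    half_width = sd * 3 ** 0.5  # uniform(-w, w) has sd = w / sqrt(3)
    results = []
    for _ in range(n):
        perturbed = {
            k: min(SCORE_MAX, max(SCORE_MIN, v + rng.uniform(-half_width, half_width)))
            for k, v in expert_scores.items()
        }
        results.append(rating(perturbed))
    return results


sims = simulate()
print(f"reported rating: {rating(expert_scores):.2f}")
print(f"simulated mean:  {statistics.mean(sims):.2f}  sd: {statistics.stdev(sims):.2f}")
```

Because perturbed scores are truncated at the ends of the scale, the simulated mean can drift away from the reported rating, which is the kind of divergence the authors observed for most ecological units.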
