Document Type : Research Paper
MSc Graduate in Environmental Science, University of Gorgan, Gorgan, Iran
Associate Professor, Faculty of Agriculture and Natural Resources, University of Gorgan, Gorgan, Iran
Assistant Professor, Faculty of Agriculture and Natural Resources, University of Gorgan, Gorgan, Iran
Assistant Professor, Faculty of Agriculture and Natural Resources, University of Gorgan, Gorgan, Iran
Several methods have been developed to aid in selecting a network of protected areas for biodiversity. One of these is artificial intelligence, which comprises a number of computer algorithms that use an objective function to find the best solution. An algorithm is a mathematical process or a set of rules used for problem solving. Recently, site selection algorithms have been widely used to identify areas of high conservation value. The two main types of site selection algorithms are optimal and heuristic. In this research, three heuristic algorithms, namely simulated annealing, greedy, and rarity, were compared in selecting protected areas to find the best regions for efficient environmental protection in Mazandaran Province. The effects of different parameters, including conservation targets, scale, algorithm choice, and compactness of zones, on the results were also investigated.
Materials and methods
In this research, 26 forest cover types, the habitats of 8 mammal species, and the prominent distribution areas of 4 bird groups were used as input criteria to select candidate areas for environmental protection in Mazandaran Province. A multi-criteria evaluation method and echelon analysis were used to model the mammal habitats and the important bird distribution areas, respectively. Watersheds larger than 50 ha were delineated and used as planning units in the process.
Simulated annealing, greedy, and rarity algorithms were applied to select the best regions using the Marxan software. Marxan provides decision support for reserve system design. Its real strength lies in its use of simulated annealing; however, Marxan can also use simpler but faster methods such as the greedy, rarity, and iterative improvement algorithms. The main aim of this research is to select a conservation network with the minimum possible area that achieves all targets. The conservation target is to preserve a minimum area of each forest cover type and each species habitat.
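The selection problem can be illustrated with a minimal sketch (not Marxan's actual implementation): each planning unit has an area (its cost) and an amount of each conservation feature, and simulated annealing searches for the cheapest set of units that meets every target. All function names and parameter values below are illustrative assumptions.

```python
import math
import random

def score(selected, areas, features, targets, penalty=1e6):
    """Objective: total selected area plus a heavy penalty for each
    unit of unmet conservation target (a simplified Marxan-style cost).
    features[i][j] is the amount of feature j in planning unit i."""
    total_area = sum(areas[i] for i in selected)
    shortfall = sum(max(0.0, t - sum(features[i][j] for i in selected))
                    for j, t in enumerate(targets))
    return total_area + penalty * shortfall

def anneal(areas, features, targets, iters=20000, t0=100.0, seed=1):
    """Flip one random planning unit per iteration; accept worse moves
    with probability exp(-delta/temp) under a linear cooling schedule."""
    rng = random.Random(seed)
    current = set(range(len(areas)))          # start with all units selected
    best = set(current)
    s_cur = s_best = score(current, areas, features, targets)
    for k in range(iters):
        temp = t0 * (1.0 - k / iters) + 1e-9  # linear cooling toward zero
        i = rng.randrange(len(areas))
        cand = current ^ {i}                  # toggle unit i in or out
        s_cand = score(cand, areas, features, targets)
        if s_cand <= s_cur or rng.random() < math.exp((s_cur - s_cand) / temp):
            current, s_cur = cand, s_cand
            if s_cur < s_best:                # keep the best state seen so far
                best, s_best = set(current), s_cur
    return best, s_best
```

A boundary-length term weighted by the BLM can be added to this score to penalize fragmented networks, which is what Marxan's compactness control does.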
In the first scenario, the effect of different conservation targets, namely 30, 40, 50, and 60 percent of the areal coverage of each protection criterion, was investigated. In this scenario, Marxan was run with 100 repeat runs, 10,000,000 iterations of simulated annealing, and five different values of the boundary length modifier (BLM): 0, 10, 30, 60, and 100. The BLM controls the level of fragmentation in the proposed conservation network, such that an increase in BLM results in a decrease of the network's total boundary length.
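The role of the BLM can be shown with a small numeric sketch (illustrative numbers, not the study's data): the score adds BLM times the total boundary length to the network cost, so larger BLM values favor compact networks even when their total area is larger.

```python
def blm_score(total_area, total_boundary, blm):
    """Simplified Marxan score for a network that already meets all
    targets: cost (area) plus the boundary term weighted by the BLM."""
    return total_area + blm * total_boundary

# Two hypothetical candidate networks:
fragmented = {"area": 100.0, "boundary": 80.0}  # smaller but scattered
compact = {"area": 120.0, "boundary": 40.0}     # larger but contiguous

def preferred(blm):
    """Return which network scores lower (better) at a given BLM."""
    f = blm_score(fragmented["area"], fragmented["boundary"], blm)
    c = blm_score(compact["area"], compact["boundary"], blm)
    return "fragmented" if f < c else "compact"
```

At BLM = 0 the fragmented network wins on area alone; by BLM = 1 the boundary term dominates and the compact network is preferred, mirroring the area/perimeter trade-off reported in the first scenario.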
In the second scenario, the result of simulated annealing was compared with those of the greedy and the best and average rarity algorithms. In this scenario, the target was 30 percent of each protection criterion with a BLM of 60.
In the third scenario, the effect of scale was investigated in six sub-basins; the target was 30 percent of each protection criterion with a BLM of 60.
Results and discussion
The results of the first scenario showed that, for all targets, an increase in BLM increases the total area of the proposed conservation network and decreases its total perimeter. Moreover, the results showed that for targets of 30, 40, 50, and 60 percent, respectively, 18.43, 23.95, 31.29, and 38.82 percent of the province is needed for conservation. Selecting the appropriate target is therefore one of the preliminary steps in designing the best protected area network.
Figure 1 shows total reserve boundary length versus total area in the second scenario. According to this figure, simulated annealing provides the best result in terms of both area and perimeter. Using the greedy algorithm, the total perimeter of the proposed conservation network increased and the zones were much more dispersed. In contrast, as Figure 2 shows, the rarity algorithms produced more compact zones but with much larger total area.
Figure 1. Total reserve boundary length versus total area
Figure 2. The proposed conservation networks by a) the simulated annealing, b) the greedy, c) the best rarity, and d) the average rarity algorithms. Data mapped in Marxan with a 30% conservation target, 100 repeat runs, 10,000,000 iterations, and BLM = 60.
Moreover, the run times for the simulated annealing, greedy, and best and average rarity algorithms were 00:28:40, 1:49:43, 5:00:06, and 2:12:25, respectively. Running at a smaller scale with fewer planning units reduced the run times of the greedy and rarity algorithms. Simulated annealing provided the best result in terms of area and perimeter.
The results of the third scenario showed that the sub basin-scale analysis identifies a larger area of high conservation value than the province-scale analysis does. This was expected because all conservation targets needed to be met in each sub-basin. The overlap between the proposed conservation networks at the province scale and the sub-basin scale is 42.33 percent. This result shows that spatial scale affects the distribution and number of zones considered high priority for conservation within an area.
In this research, irreplaceability analysis was performed on the proposed conservation networks generated by the different algorithms. Following Leslie et al. (2003), we defined irreplaceability as the number of times a planning unit was included in the proposed conservation network out of 100 runs. For simulated annealing, 7 planning units were chosen in every one of the 100 runs and were thus completely irreplaceable, while 29.84 percent of the planning units were never chosen. For the greedy algorithm, 10 percent of the planning units were chosen in every run and 76.65 percent were never chosen. For the best and average rarity algorithms, 21.13 and 36.75 percent of the planning units, respectively, were chosen in every run; for the average rarity algorithm, 63.19 percent of the planning units were never chosen. These results show that the greedy and rarity algorithms produced far fewer distinct solutions than simulated annealing.
Following Leslie et al. (2003), the efficiency of representation was defined as the number of protection criteria whose representation ratio is close or equal to 1.0, indicating that the proposed conservation network meets, without greatly exceeding, the conservation targets for those criteria. The results showed that, for simulated annealing, the targets of 29 to 32 criteria were met in the proposed conservation network. For the greedy algorithm this value was 25 criteria. For the best and average rarity algorithms, 24 and 32 criteria, respectively, were overrepresented in the proposed conservation network.
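The efficiency measure can likewise be sketched: for each criterion, divide the amount held in the network by its target; criteria with a ratio at or just above 1.0 are efficiently represented, while much larger ratios indicate overrepresentation. The code and tolerance value are illustrative assumptions, not the paper's exact computation.

```python
def representation_ratios(held, targets):
    """Ratio of the amount of each criterion held in the network to
    its conservation target; a ratio >= 1.0 means the target is met."""
    return [h / t for h, t in zip(held, targets)]

def count_efficient(ratios, tol=0.25):
    """Count criteria whose ratio lies between 1.0 and 1.0 + tol,
    i.e. the target is met without large overrepresentation."""
    return sum(1 for r in ratios if 1.0 <= r <= 1.0 + tol)
```

For instance, with hypothetical amounts held of 30, 55, and 80 against targets of 30, 50, and 40, the ratios are 1.0, 1.1, and 2.0: two criteria are efficiently represented and the third is overrepresented.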
The results of this study show that conservation targets, scale, algorithm choice, and compactness of zones all influence the selection of the best regions for efficient environmental protection. Consequently, determining appropriate values for these parameters is among the most important steps in conservation planning. Across all the cases investigated, the simulated annealing algorithm provided plausible results, and its application helps identify the best protection zones.