
Using satellite imagery to evaluate precontact Aboriginal foraging habitats in the Australian Western Desert

‘Foraging habitat suitability’ refers to the favorability of a patch of land for day-to-day subsistence. Here, suitability is an index value ascribed to each potential foraging patch (grid cell) captured in a raster image, based on terrain movement costs and the proximity of each patch to water and green vegetation. We constructed our foraging habitat suitability model from satellite-derived environmental data, digital terrain information, and anthropological field data on foraging range (Fig. 3). The model’s environmental foundation is more than two decades of continuous, near bi-weekly Landsat-5 satellite observations, which allow water recurrence and vegetation condition to be systematically detected and measured for every 30 × 30 m image pixel. This observation period is long enough to capture multiple fluctuations in this highly variable environment and is not restricted to a single short-term environmental state, such as a bushfire or drought. The time frame therefore provides a reliable measurement of maximum vegetation greenness, regardless of temporary drops in NDVI. Similarly, the maximum extent and occurrence of surface water is systematically measured through long-term satellite observations, avoiding measurements confined to phases of drought or irregular rainfall. For this reason our model focuses on maximal values, representing the best environmental conditions that would have been available for past foraging activities since the last glacial, under the contemporary climatic regime.
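
As a concrete illustration of this use of the long-term record, the sketch below (Python with NumPy; the stacked input arrays and file names are hypothetical, assuming per-date NDVI and water-detection rasters have already been derived from the Landsat-5 archive) computes per-pixel maximum greenness and long-term water occurrence:

```python
import numpy as np

# Hypothetical inputs for one Landsat tile, stacked over the full record:
# ndvi_stack:  (n_dates, rows, cols) float array of per-date NDVI values,
#              with cloud-masked observations stored as NaN
# water_stack: (n_dates, rows, cols) boolean array of per-date water detections
ndvi_stack = np.load("ndvi_stack.npy")    # assumed pre-computed per-date NDVI
water_stack = np.load("water_stack.npy")  # assumed per-date water masks

# Maximum greenness: the best vegetation condition each pixel reached
# across the multi-decadal record, unaffected by temporary NDVI drops.
max_ndvi = np.nanmax(ndvi_stack, axis=0)

# Water occurrence: the fraction of valid observations in which each
# pixel was classified as surface water (its long-term recurrence).
n_valid = np.sum(~np.isnan(ndvi_stack), axis=0)
water_occurrence = water_stack.sum(axis=0) / np.maximum(n_valid, 1)
```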

Figure 3

A satellite-derived model of foraging habitat suitability for the Australian Western Desert. Foraging habitat suitability is highly variable within IBRA boundaries and throughout the Western Desert. Several massive areas of low-ranked foraging habitats are evident throughout the region. IBRA codes and excavated rockshelter sites (lime green, numbered) are defined in the Fig. 1 caption. Map created in ESRI ArcGIS Desktop 10.5.1 (https://desktop.arcgis.com), linear stretch (1.0%) visualization. See “Methods” section for source raster information.


The model also uses the ALOS World 3D 30 m digital elevation data product to quantify terrain ruggedness across the study area46. Terrain ruggedness is a geomorphometric measure of land-surface variability, in which local changes in elevation are used to infer the ease of walking between locations in the landscape. It is suggestive of potential energy expenditure, assuming that increasingly rugged terrains demand higher levels of physical activity and caloric intake. Here, we integrated measures of ruggedness with the environmental satellite data, indicating which patches of vegetation and water are most easily accessed with minimal changes in elevation.
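
The text does not state which geomorphometric formulation was used; one widely used option, assumed here purely as a stand-in, is Riley et al.’s Terrain Ruggedness Index, sketched below in Python with NumPy on a DEM array such as the ALOS World 3D product:

```python
import numpy as np

def terrain_ruggedness_index(dem: np.ndarray) -> np.ndarray:
    """Riley et al. (1999) TRI: for each cell, the square root of the
    summed squared elevation differences to its eight neighbors."""
    # Pad the DEM edges so border cells have a full 3 x 3 neighborhood.
    padded = np.pad(dem, 1, mode="edge")
    sq_diff = np.zeros_like(dem, dtype=float)
    rows, cols = dem.shape
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            neighbor = padded[1 + dr : 1 + dr + rows, 1 + dc : 1 + dc + cols]
            sq_diff += (dem - neighbor) ** 2
    return np.sqrt(sq_diff)

# Example with a small synthetic DEM (elevations in meters):
dem = np.array([[100.0, 102.0, 101.0],
                [103.0, 110.0, 104.0],
                [101.0, 105.0, 102.0]])
tri = terrain_ruggedness_index(dem)
```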

Walking time to observed surface water is the final spatial parameter incorporated into the model. It is calculated using Tobler’s47 hiking algorithm and information on daily foraging practices. Historical anthropological data indicate that Western Desert foraging activities typically operated for 4–6 h each day1,48, with foragers moving up to a day from ephemeral water sources in their food quest1. In accordance with these ethnographic statements, we spatially delineated land areas where regular foraging activities may have occurred by first calculating the walking time from water and then weighting all areas less than an 8 h walk from water more heavily in the input to our final suitability model. Since resources are said to be permanent in uplands4,5, we assume mountainous refugia were always suitable foraging habitats; these refuge areas have been masked and removed from consideration (see mountain ranges in Fig. 3).
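
Tobler’s hiking function itself is a simple exponential relation between slope and walking speed. The sketch below (Python with NumPy; the variable names are our own) gives the core formula and the per-cell traversal time for 30 m cells; accumulating these costs outward from mapped water sources into a full walking-time surface requires a least-cost-path algorithm that is omitted here:

```python
import numpy as np

def tobler_speed_kmh(slope_gradient: np.ndarray) -> np.ndarray:
    """Tobler's hiking function: walking speed (km/h) as a function of
    slope (rise over run). Peak speed of 6 km/h occurs on a gentle
    downhill gradient of about -5%."""
    return 6.0 * np.exp(-3.5 * np.abs(slope_gradient + 0.05))

# Per-cell traversal time in hours for 30 m (0.03 km) grid cells.
cell_size_km = 0.03
slope = np.array([[0.0, 0.1], [-0.05, 0.3]])  # illustrative slope values
hours_per_cell = cell_size_km / tobler_speed_kmh(slope)

# A full model would accumulate these per-cell costs from every mapped
# water source (e.g., with Dijkstra's algorithm) and then apply the 8 h
# walking-time threshold described in the text:
# foraging_zone = accumulated_hours_from_water < 8.0
```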

Appropriate elements from all of the aforementioned satellite datasets were combined to produce our foraging habitat suitability model (Fig. 3). The ~ 30 m spatial resolution of the data facilitates the construction of a spatially explicit, geographically broad, yet fine-grained ecological model with which to visually observe and critically appraise foraging habitat suitability at a variety of scales, offering new perspectives on regional human behavioral ecology. The model provides a continuous ranking of the relative foraging value of each landscape patch (or 30 m grid cell in this instance). Interpretation of patch values is based on the proposition that foragers knew the conditions in all parts of the landscape they visited, and organized their daily foraging movements in accordance with the factors outlined above.

Our habitat suitability model illustrates the highly varied favorability of foraging patches across the Western Desert (Fig. 3), as calculated from data on natural resource distribution, terrain attributes, and daily foraging range. The model is conceptual, based on quantitative environmental variables that are well documented to influence desert foraging activity. With regard to the model’s robustness, the input variables are equally weighted and statistically independent (see “Methods” section). The equal weighting reflects the concepts and assumptions of earlier research, particularly existing landscape mapping, offering a coherent and consistent modelling approach. Advanced mathematical modelling incorporating sensitivity analysis49,50 could be used to modify the weighted contribution of each variable, and such modelling will be the subject of future papers. Until more detailed knowledge of past forager land use and contemporary resources becomes available, there is little benefit in arbitrarily substituting other input values in our model.
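
To make the equal-weighting scheme concrete, the following sketch (Python with NumPy; the layer names, orientations, min-max normalization, and synthetic stand-in data are our assumptions for illustration, with the actual procedure described in the paper’s Methods) averages three favorability-oriented layers into a single index:

```python
import numpy as np

rng = np.random.default_rng(0)

def minmax(x: np.ndarray) -> np.ndarray:
    """Rescale a raster layer to the 0-1 range."""
    lo, hi = np.nanmin(x), np.nanmax(x)
    return (x - lo) / (hi - lo)

# Synthetic stand-ins for three aligned 30 m input layers.
max_ndvi = rng.random((100, 100))             # maximum vegetation greenness
hours_to_water = rng.random((100, 100)) * 48  # walking time to water (h)
tri = rng.random((100, 100)) * 50             # terrain ruggedness

# Orient each layer so that higher values mean more favorable foraging
# conditions, then average with equal weights as described in the text.
greenness = minmax(max_ndvi)                # greener is better
water_prox = 1.0 - minmax(hours_to_water)   # nearer water is better
smoothness = 1.0 - minmax(tri)              # flatter terrain is better
suitability = (greenness + water_prox + smoothness) / 3.0
```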

The model comprises a matrix of nearly 1.3 billion data cells, each of which has been individually analyzed and ascribed a value indicative of potential habitat suitability. The computational power required to statistically analyze the dataset is massive, so to simplify computing and broadly characterize intraregional variation, we scaled up using nationally defined IBRA subregions. We used IBRA boundaries to group and rank the patch values into low, moderate, and high foraging habitat suitability classes and then calculated the land area occupied by each class (Fig. 4 and Table S1). Higher-ranked localities are well positioned in relation to suitable resources and easily traversed terrains. Lower-ranked patches are considered poorly-suited habitats due to their considerable distance from water and plant resources and their comparatively rugged terrains. Areas deemed to have moderate foraging suitability have mixed accessibility to resources and variable terrain ruggedness.
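
A minimal sketch of this per-subregion ranking step (Python with NumPy; the tertile class breaks and the 0.0009 km2 cell area are illustrative assumptions, since the paper’s exact break method is given in its Methods) might look like:

```python
import numpy as np

def classify_by_region(suitability: np.ndarray, region_ids: np.ndarray) -> np.ndarray:
    """Rank suitability values into low/moderate/high classes (1/2/3)
    within each IBRA subregion, using tertile breaks (an assumption)."""
    classes = np.zeros_like(region_ids, dtype=np.uint8)
    for rid in np.unique(region_ids):
        mask = region_ids == rid
        vals = suitability[mask]
        t1, t2 = np.nanpercentile(vals, [33.3, 66.7])
        classes[mask] = np.where(vals <= t1, 1, np.where(vals <= t2, 2, 3))
    return classes

# Land area per class, with 30 m cells covering 0.0009 km^2 each:
# area_km2 = {c: (classes == c).sum() * 0.0009 for c in (1, 2, 3)}
```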

Figure 4

Percentage of land area (km2) occupied by low, moderate, and high-ranked habitat suitability patches for the eleven largest IBRA bioregions of the Western Desert (Table S1). The histogram is ordered left to right based on the percentage of high-ranked foraging habitat within each bioregion. The percentages for the entire Western Desert are presented on the far right.


The results show that during times of maximum water abundance and vegetation greenness, 36.6% of the Western Desert has high-ranked habitat suitability (Fig. 4 and Table S1). Moderately suitable areas constitute 48.9% and low-ranked patches encompass 13.1%. Breaking these findings down further, we calculated the ranked land areas for the eleven largest IBRA subregions (> 10,000 km2) of the Western Desert (Fig. 4 and Table S1). At a broad bioregional level, intra-upland zones (Fig. 4; CER01) and desert plains (Fig. 4; GAS02, GVD01, and NUL01) offer a greater percentage of high-ranked foraging habitats. Bioregions dominated by dunefields have considerably less high-ranked land area than uplands, plains, and areas of low relief, although it is important to note that there is also considerable patchiness amongst suitable foraging areas in sandridge desert regions (Fig. 3). For instance, the centrally located Gibson Desert dunefield area (Fig. 4; GID02) has very little high-ranked habitat (10.8%), far less than other sandridge desert bioregions (Fig. 4; GSD02, LSD02, GVD02, GVD03, and GVD04), where high-ranked suitability areas range between 22.3 and 39.9%. Similarly, the Gibson Desert stony desert bioregion (Fig. 4; GID01), which is dominated by lateritic surface gravels, records only 25.5% high-ranked habitat area. Thus, at a coarse-grained scale, it seems that some central core regions of the Western Desert are more environmentally hostile and offer fewer high-ranked foraging opportunities than more peripheral bioregions. This generality does not imply such areas were unutilized by desert peoples, but rather that some areas were on average volatile and had low productivity.

Foraging potential is highly varied amongst bioregions and land systems

When viewed at a fine-grained scale, our model clearly shows that there is an uneven gradient of suitable foraging habitats across the Western Desert, and foraging suitability trends are not pervasive throughout particular bioregions or land systems (Fig. 3). Away from montane uplands, surface water is only ever temporary, and land systems with low topography, such as plains, stony plains, and sandridge desert, have highly varied foraging suitability, even when characterized under the best environmental conditions.

The implications of this variation are important to understanding human ecology of the ethnohistoric period and the late Holocene archaeological record of the past 2000 years, when climatic conditions and landscapes were much like the present day36,40,42. Many scholars have noted that the historic desert peoples were familiar with the distribution of regional natural resources1,5,7. It has been argued that resource knowledge was articulated with socioeconomic strategies, and that groups routinely utilized all areas of the Western Desert during times of good rainfall and resource abundance. However, our suitability model reveals that there are large, expansive areas of the desert landscape that would have presented substantial challenges for survival, even in the best environmental circumstances (Fig. 3).

Our model further suggests that low-ranked locations of foraging suitability were always of below-average productivity and were always comparatively unsuited as foraging habitats. To assess this, we required an independent indicator of land productivity: NDVI. We used satellite observations of maximum vegetation greenness to quantify how land productivity differs amongst low, moderate, and high-ranked foraging habitats (Table S2). Variation in mean (µ) NDVI for each habitat class illustrates how land productivity differs within and amongst the most prominent Western Desert bioregions (Fig. 5). Given the below-average NDVI of all low-ranked desert lowlands, we hypothesize that broad clusters of extremely unsuitable localities would be unlikely to provide adequate returns (Fig. 5), even when foragers were pursuing low-variance or lower quality resources51. Based on the distribution of low-ranked patches (Fig. 3), we agree with earlier research that the entire desert region was not equally economically viable for foraging, and that substantial tracts of land were not economically attractive to resident populations4,5,14,32. We also recognise that the distribution of massive sub-optimal patches may be an important factor shaping patterns of movement through the landscape, with foragers potentially preferring movement along high suitability corridors. However, unlike earlier research, our suitability model shows that unfavorable foraging areas are not correlated with large units of biogeography alone. Our model depicts the environmental variability of the Western Desert at a much higher resolution than its predecessors, revealing several massive land tracts where unfavorable foraging conditions occur (Fig. 3). If ethnographic patterns of land use were in place, we predict that many of these large areas would have been rarely utilized, and perhaps some were purposefully avoided due to known deficiencies in the resource energy base12. This proposition is readily testable because it predicts that archaeological sites with poorly sorted, low densities of artefacts will be found in these places12. Defining the appropriate scale will be key to testing our model, since we have demonstrated that broad biogeographic units are heterogeneous, and yet at a fine-grained scale, small areas of low suitability, which are often a local geographic feature (e.g., a sand dune, bare rock outcrop, or erosional area), need not have been obstacles. Model testing will need an intermediate scale commensurate with daily foraging radii.
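
For reference, NDVI is the normalized difference of near-infrared and red reflectance. The sketch below (Python with NumPy; the per-class summary dictionary and variable names are illustrative assumptions) shows the index and the kind of per-class mean comparison reported in Table S2:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index from near-infrared and red
    surface reflectance (bands 4 and 3 on Landsat-5 TM)."""
    return (nir - red) / (nir + red + 1e-9)  # epsilon guards against 0/0

# Mean of maximum greenness per suitability class (1 = low, 2 = moderate,
# 3 = high), mirroring the per-class productivity comparison in Table S2:
# class_means = {c: np.nanmean(max_ndvi[classes == c]) for c in (1, 2, 3)}
```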

Figure 5

Boxplot of mean NDVI values and one standard deviation for low, moderate, and high-ranked suitability classes for the eleven largest IBRA subregions (a–k) of the Western Desert (l). Mean NDVI for individual bioregions and the spatial bounds of the Western Desert study area are denoted by a dashed black line and a solid green line, respectively. IBRA subregion boxplot groups (a–k) are presented in order of increasing percentage of high-ranked foraging habitat, after Fig. 4. Table S2 gives the precise summarised NDVI values for each bioregion and suitability class.


At present, the archaeological land use pattern of low-ranked foraging habitats is not well understood for the Western Desert, although periodic, short-term use of impoverished, low-productivity patches has been predicted12. Studies of contemporary Western Desert groups indicate that human-induced firing of the landscape enhances biodiversity and land productivity51,52,53,54, so low-productivity patches may occasionally have benefited from anthropogenic burning, especially in the past 1500 years51. However, research also suggests that cultural burning practices did not have widespread regional impacts51,52,53,54,55. Human influence on landscape modification is localized within day-range foraging areas around residential camps and frequently traversed pathways51,52,53,54. Low-productivity patches away from residential camps were probably unlikely targets for either anthropogenic burning or foraging if higher-ranked patches were closer.

Elsewhere, in the eastern Australian arid zone, periodic use of climatically harsh desert localities is known from archaeological sequences. While in some cases preservation may explain chronological discontinuities56, there is compelling evidence for irregular occupation in several desert areas10,57,58,59,60. For instance, in the western Strzelecki Desert broad portions of dunefield landscapes were periodically abandoned for centuries or even millennia57,60, while in semi-arid portions of southeastern Australia sequences of occupation were separated by decades or centuries of local or regional abandonment58,59. Fluctuations in local foraging suitability may well be a factor producing discontinuous land use across the Australian arid lands, and we suggest that in the Western Desert there were patches with chronologically varying foraging potential. The key test of this prediction would be to investigate whether archaeological sites in locations of fluctuating habitat suitability also display histories of discontinuous visitation. Such sites could be identified through local palaeoenvironmental records, but we suggest that selections based on time-series analysis of vegetation greenness over the past few decades would be more practical for establishing samples, and would facilitate comparison of archaeological sites in terms of local foraging suitability and NDVI values, as well as archaeological records of continuous or discontinuous visitation.
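
As one hedged illustration of such a time-series screen (Python with NumPy; the coefficient-of-variation metric, the quartile threshold, and the file name are our assumptions, not the authors’ method), per-pixel greenness variability could be summarized like this:

```python
import numpy as np

# Per-pixel variability of greenness through time: pixels whose NDVI
# fluctuates strongly across the multi-decadal record are candidate
# locations of chronologically varying foraging potential.
# ndvi_stack: (n_dates, rows, cols), as in the earlier sketch.
ndvi_stack = np.load("ndvi_stack.npy")  # hypothetical stacked NDVI record
mean_ndvi = np.nanmean(ndvi_stack, axis=0)
std_ndvi = np.nanstd(ndvi_stack, axis=0)
ndvi_cv = std_ndvi / np.maximum(mean_ndvi, 1e-9)  # coefficient of variation

# A simple (assumed) screen: flag the most variable quartile of pixels.
fluctuating = ndvi_cv > np.nanpercentile(ndvi_cv, 75)
```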

Satellite data reveals a more nuanced understanding of land use

Australian archaeological research has relied heavily on biogeographic principles to distinguish the ‘barriers and boundaries’ of Aboriginal subsistence and settlement in the arid zone4,5,61. While equating particular land use practices with specific bioregional areas was initially useful for generalized conceptualizations of traditional foraging behaviors, the coarse analytical scale of earlier approaches is now problematic. Subsequent research has shown the dynamics of Aboriginal occupation and land usage in the Western Desert to be more complex and variable across spatial and temporal scales than originally conceived9,24,30,33. To gain a more nuanced understanding of past land use and foraging patterns, finer-scale methods of analysis are required.

We used satellite imagery to tackle the issue of scale, allowing for a sharper and more spatially explicit examination of desert environments and landscapes. For example, as we focus at higher resolution on various areas of the Western Desert, our model clearly shows that foraging suitability is highly varied across all desert lowlands (Figs. 3, 4 and 6). In sandridge desert areas, proposed to have been a barrier at times in the past4, the model shows there are many well-watered and amply vegetated localities where good foraging is possible when rainfall is high and surface water is abundant (Fig. 6a). In this context, interdunal swales are hardly barriers to occupation because they can be lush with water, plant, and wildlife resources after local rain, and the energy expenditure required to walk along interdunal swales is low in comparison to the requirements needed to repeatedly scramble across a sea of loose sands and undulating dunes. Thus, it seems entirely plausible that resident groups could navigate and forage in many dunefield areas by following a well-resourced network of swales during times of good environmental conditions. The fine-grained nature of this observation opens up the possibility that many sandridge deserts were not necessarily broad barriers to occupation and that precontact land use behaviors varied in different dunefield contexts.

Figure 6

High resolution perspectives of various Western Desert landforms (e.g., sandridge, stony plain and sandy plain contexts) with generally higher-ranked and lower-ranked areas of foraging suitability. This figure illustrates the fine-grained scale of our habitat suitability model (Fig. 3), which has implications for better understanding localized land use behaviors. Juxtaposed areas, as mapped in Fig. 3 are: (a) Higher-ranked sandridge habitats vs. (b) lower-ranked sandridge land system. (c) Higher-ranked stony desert habitat vs. (d) lower-ranked stony desert areas. (e) Higher-ranked sandplain land systems vs. (f) lower-ranked plain habitats. Maps created in ESRI ArcGIS Desktop 10.5.1 (https://desktop.arcgis.com), linear stretch (1.0%) visualization. See “Methods” section for source raster information.


We also highlight that the resource-rich swale pattern is not found in all dune systems (Fig. 6b), and it is plausible that some of these areas were periodic barriers to occupation, as previously suggested in more generalized ecological models4. There are substantial areas of sandridge desert, especially within central areas of the Western Desert (e.g., GID02), where survival would have always been extremely difficult, even during times of abundance (Fig. 6b). This variability is also expressed in stony desert contexts, where southern areas of the lateritic Gibson Desert (GID01) offer better habitat suitability (Fig. 6c) than the northern areas (Fig. 6d). At a fine scale, plain land systems also exhibit a wide range of habitat suitability: high-ranked habitat appears fairly widespread in some areas of the Nullarbor Plain (NUL01; Fig. 6e), yet other areas of the plain were poorly suited for foraging (Fig. 6f).

In previous ecological models, stony desert and plain land systems are considered more favorable than sandridge desert4; however, as shown above, the modelled data clearly illustrate that there are substantial areas of plains and stony desert landscapes that vary considerably between high and low-ranked habitat suitability (Figs. 3, 4 and 6). The fine-grained scale of our model adds to a growing body of research5,9,24,30,33 that demonstrates how previous pan-continental characterizations of deserts as ‘corridors’ and ‘barriers’ for foragers oversimplify the link between human behavior and biogeography. When scrutinized at high resolution, extremely unsuitable foraging areas and very well-suited foraging areas can potentially occur in any part of the Western Desert, regardless of biogeography or other physical characteristics. Thus, fine-grained ecological models allow for a more nuanced and spatially explicit understanding of the past land use behaviors that led to the formation of the desert archaeological record.

Using environmental remote sensing to infer LGM habitat suitability

There is no doubt that the Western Desert environment has changed and evolved over time, through both natural and human-induced processes8,36,37,51. The region has undergone considerable environmental fluctuations, resulting in landform transformations (dune aggradation, in particular) and changes in vegetation cover. The long-term physical impact of these environmental changes clearly places limitations on how modern satellite data can be used to interpret deep-time patterns of occupation and land use. However, because it is built on maximal values, our model still indicates the likely distribution of low-ranked foraging habitats when climatic conditions were much drier than at present.

Ethnoarchaeological accounts depict resident populations as low density and highly mobile, frequently moving and foraging across vast expanses of territory, thereby necessitating intermittent patterns of settlement1,2,3,4. Such a mobile strategy means that large swathes of the desert could not have been continuously occupied. Our habitat suitability model (Fig. 3) makes sense of the impermanent and mobile land use strategy seen historically. For example, we document several massive areas of the Western Desert where, in combination, surface terrains are physically challenging, the nearest surface water is more than a 2 day walk away, and vegetation cover, density, and condition are substandard, even in the best documented environmental conditions. These exceptionally large areas were poorly suited to foraging and could not have been permanently occupied in the historical period. They would have been visited only rarely, perhaps only during atypical short-term climatic events, and may even have constrained forager movement between more favorable parts of their territories. Given current palaeoclimatic evidence, we infer that in the Pleistocene these low-ranked habitats would have been even more inhospitable to foragers than in recent times. During the LGM, resource yields in such areas would have been lower than at present, making conditions for survival even more difficult than today. Consequently, we predict that, unless radically different economic strategies were being employed in the Pleistocene, those areas would have been only rarely visited since the peak of the last glacial cycle, ~ 24–18 ka, even though adjacent desert areas may have supported regular or at least sporadic visitation.

Our hypothesis is clear, detailed, and framed to be testable by archaeological fieldwork. The number of Western Desert sites with old archaeological sequences is growing, but the sample is small, site distribution is widely scattered, and none are located in the harsher core areas identified in this study (Figs. 1 and 3). Archaeological fieldwork in those impoverished landscapes, as well as in environmentally richer and more reliable landscapes, is therefore necessary to understand historical land use patterns and to make statements about earlier phases of regional occupation. Our work highlights how future models of forager land use across Australia’s desert regions can capture the environmental complexity and fine-scale resource variability of these vast, remote and diverse places.

