Department of Computer Science, The University of Montana, Missoula, Montana, USA.
Larry D. Hinzman, Elizabeth K. Lilly
Water and Environmental Research Center, University of Alaska, Fairbanks, Fairbanks, AK, USA.
Hydrologic Research Lab, NOAA/NWS, Silver Spring, MD, USA.
Engineering Research Center and Laboratory, University of Alaska, Fairbanks, AK, USA.
The biological and physical processes that characterize the arctic ecosystem are all affected by the thermal and hydrologic boundary conditions of a shallow zone above the permafrost called the active layer. The thermal and hydrologic regimes are so intricately interrelated, both to each other and to the character and structure of the active layer, that any modification to one parameter will ultimately affect the entire system.
Understanding the various interactions that take place in arctic
ecosystems is important for a better comprehension of the arctic's
role in affecting global climate, and the possible consequences of
global climatic change on the arctic. The present knowledge of hydrologic
and thermal processes in the arctic is deficient, so it is necessary to
construct and verify computer models for simulating the numerous
interactions that take place (see Figure 1).
Researchers at the University of Alaska's Water Research Center have developed independent thermal and hydrologic computer models for simulating processes in a northern Alaska watershed (see Figure 2). The thermal model reads spatially distributed meteorological data, such as wind and temperature values, and applies finite element methods to predict vertical temperature profiles and, subsequently, thaw depth. The topography of the modeled region is represented as a two-dimensional collection of evenly spaced nodes with varying elevations. This model concentrates on the thermal regime in the vertical direction and does not rigorously treat the hydrologic influences. For example, soil moisture, which affects thermal conductivity, is assumed constant in both the temporal and spatial domains.
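The vertical thermal calculation can be illustrated with a much-reduced sketch. The block below solves the one-dimensional heat equation in a soil column with an explicit finite-difference scheme (the actual model uses finite elements and spatially distributed forcing) and reports thaw depth as the depth at which the temperature profile crosses 0 C. All parameter values here are illustrative assumptions, not the model's calibration.

```python
# Minimal 1D sketch of a vertical thermal calculation: an explicit
# finite-difference solution of the heat equation in a soil column,
# with thaw depth taken as the depth where temperature crosses 0 C.
# Illustrative only -- the real model is finite element based.

def simulate_column(surface_temp_c, depth_m=2.0, nodes=41,
                    days=30, diffusivity=1e-7, init_temp_c=-5.0):
    """Return (temperature profile, thaw depth in meters) after `days` days."""
    dz = depth_m / (nodes - 1)
    dt = 3600.0                        # one-hour steps, in seconds
    r = diffusivity * dt / dz ** 2     # must be < 0.5 for stability
    assert r < 0.5, "explicit scheme unstable for these parameters"
    temp = [init_temp_c] * nodes
    for _ in range(days * 24):
        new = temp[:]
        new[0] = surface_temp_c        # prescribed surface temperature
        for i in range(1, nodes - 1):
            new[i] = temp[i] + r * (temp[i - 1] - 2 * temp[i] + temp[i + 1])
        temp = new                     # bottom node stays at init_temp_c
    # thaw depth: linearly interpolate the first 0 C crossing
    thaw = 0.0
    for i in range(nodes - 1):
        if temp[i] > 0.0 >= temp[i + 1]:
            frac = temp[i] / (temp[i] - temp[i + 1])
            thaw = (i + frac) * dz
            break
    return temp, thaw
```

With a 5 C surface held over a -5 C column for a month, the sketch predicts a thaw front a few tens of centimeters deep, which is the kind of vertical profile and thaw-depth output the thermal model produces at each surface node.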
The hydrologic model also reads spatially distributed
meteorological data and
uses various balance equations to predict snowmelt, soil moisture, and
evapotranspiration, as well as discharge at points of interest.
Through a series of timesteps, the model accounts for the disposition
of water in the form of energy and flow processes. Flow processes
include the movement of water over the surface, through channels, and
the subsurface (Hinzman et al., 1991). The hydrologic model deals
mostly with the movement of water and does not rigorously treat
certain aspects such as thaw depth. Thaw depth determines how much
water the underlying soil can absorb and how much will run off, so
its accuracy is important in a hydrologic model. Currently, however,
the hydro model treats thaw depth as spatially constant, varying
temporally as a function of the day of the year.
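The empirical treatment described above can be sketched as a simple ramp: zero thaw before an onset day, deepening toward a late-summer maximum, with the same value applied at every point in the basin. The onset day, maximum day, and maximum depth below are illustrative assumptions, not the hydro model's actual parameters.

```python
# Sketch of the uncoupled hydro model's treatment of thaw depth:
# spatially constant, varying only with the day of year.
# The ramp parameters are illustrative, not the model's calibration.

def empirical_thaw_depth(julian_day, onset_day=140, max_day=240,
                         max_depth_m=0.5):
    """Spatially constant thaw depth (m) as a function of day of year."""
    if julian_day <= onset_day:
        return 0.0                     # ground still frozen
    if julian_day >= max_day:
        return max_depth_m             # late-summer maximum reached
    frac = (julian_day - onset_day) / (max_day - onset_day)
    return max_depth_m * frac          # linear deepening in between
```

Every grid cell in the basin receives this same value at a given timestep, which is precisely the limitation that coupling with the thermal model removes.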
The thermal model had recently been modified for parallel execution on the Cray T3D by making use of Cray-specific compiler directives, and had been tested on 1 to 32 processors with respectable performance gains (Li, 1996). The hydrologic model had been parallelized more generally, using the Message Passing Interface (MPI), for execution on architectures ranging from clusters of Linux PCs to the Cray T3E (Morton et al., 1998). Thus, both models were independently mature, and the scope of our investigation focused on coupling these two existing models to produce more realistic simulations of the hydrologic and thermal processes in an arctic regime.
In a realistic model, the hydrological processes would be influenced
by a spatially varying thaw depth throughout the problem domain, and,
likewise, the thermal processes which affect the thaw depth would
be influenced by changing moisture content throughout the problem
domain. A coupling of the two models, in which certain parameters
are transmitted from one model to another, rather than being estimated
through crude empirical data, would provide a more realistic
representation of the actual processes occurring in the hydrologic
and thermal regimes of an arctic environment (see Figure 3).
The primary emphasis in this project was to take the existing thermal and hydrologic models and couple them so that they ran simultaneously and synchronized at specified intervals to exchange pertinent coupling parameters. As much as possible, the original codes would remain intact, with coupling implemented through the introduction of new subroutines rather than modification of the original source code. Additionally, both codes had been parallelized, and it was desirable that they continue to run coupled, in parallel, on the same architecture, without excessive modification to the original parallel code. Finally, funding for this coupling project came from Cray Research, Inc., so primary emphasis was placed on implementation for Cray MPP architectures, though there was interest in making the code portable enough to run on a broad class of architectures, ranging from clusters of Linux PCs and workstations to state-of-the-art supercomputers.
The infrastructure that needed to be created was one in which two originally independent models, each capable of execution on a parallel computer, could run simultaneously on the same computer and synchronize at specified intervals to exchange certain "coupling" data. Because some supercomputer architectures and software (the Cray T3E, for example) support only the execution of identical executables on parallel processors (known as Single Program Multiple Data, in which each processor runs the same program, but with different data), it was necessary to first combine the two separate codes into a single code. The first step, then, was to transform the main program in each code into a subroutine, and to construct a master driving program that calls the appropriate subroutine. Figure 4 depicts the code structure of the previously independent thermal and hydro models, and Figure 5 shows the combined structure.
      program thermal
      /****************
       self-contained
       program for
       modeling thermal
       processes.
      *****************/
      end

      program hydro
      /*****************
       self-contained
       program for
       modeling hydro
       processes.
      ******************/
      end

Figure 4: Separate programs for thermal and hydro models.
      program hydrotherm
      NumHydroProcesses = 8
      NumThermProcesses = 4
      /*******************
       Driving program for
       the hydro and thermal
       subroutines.
      ********************/
      if (MyProcessNumber < NumHydroProcesses) then
         call hydro
      else
         call thermal
      endif
      end

      subroutine thermal
      /****************
       Self-contained
       subroutine for
       modeling thermal
       processes.
      *****************/
      return
      end

      subroutine hydro
      /*****************
       Self-contained
       subroutine for
       modeling hydro
       processes.
      ******************/
      return
      end

Figure 5: Coupled program for thermal and hydro models.
Additionally, because the two codes were originally parallel, it was necessary to ensure that each code would still run in parallel, just as before, communicating amongst its processors. At the same time, it would also be necessary to construct a mechanism that allowed the two separate codes to communicate with each other (see Figure 6).
The use of the Message Passing Interface (MPI) (Pacheco, 1997) for interprocessor communication allowed us to achieve many of our goals. MPI provides an environment and an associated collection of functions that allow identical copies of a program to run simultaneously and communicate with each other. Because MPI has been implemented on numerous architectures and operating systems, we could be assured that the parallel mechanisms in our code would port to systems ranging from Linux PCs to Cray T3Es.

Additionally, MPI allowed us to construct "communication groups" of related processes. By default, all processes in the parallel implementation belong to the "world" communication group, MPI_COMM_WORLD, and are all able to communicate with each other. However, we can partition the processes so that some go to one group and the rest go to the other. Thus, we can collect the thermal processes in one group and the hydrologic processes in a separate group. Under this scenario, the processes within each group communicate with each other as if they were running alone on a parallel machine. With MPI we can also construct intercommunicators for exchanging messages between different communication groups. This mechanism, therefore, provides the capability to exchange coupling parameters between the thermal and hydrologic communication groups. A sample "skeleton" code is provided in the appendix. A full explanation of this sample code is beyond the scope of this paper, but it is provided to aid others who may be interested in pursuing similar work.
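The partitioning rule can be mirrored in plain Python to show how world ranks map onto the two groups. In the real code this split is performed with MPI group and communicator calls (see the appendix); the rule itself is the one used by the driving program: the first NumHydroProcesses ranks become hydro processes, and the rest become thermal processes.

```python
# Plain-Python mirror of the MPI group partitioning: the first
# num_hydro world ranks form the hydro group, the remainder form
# the thermal group, and each process also gets a local rank
# within its own group (as MPI_Comm_split would assign).

def split_world(world_size, num_hydro):
    """Return {world_rank: (group_name, local_rank)} for every rank."""
    assignment = {}
    for rank in range(world_size):
        if rank < num_hydro:
            assignment[rank] = ("hydro", rank)               # local rank in hydro group
        else:
            assignment[rank] = ("thermal", rank - num_hydro)  # local rank in thermal group
    return assignment
```

For example, with 12 processors and NumHydroProcesses = 8, world ranks 0 through 7 become hydro ranks 0 through 7, and world ranks 8 through 11 become thermal ranks 0 through 3; within each group, communication then proceeds as if the group were the whole machine.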
The initial coupling between the two models was set up so that the thermal model would periodically provide thaw depth data to the hydro model, and the hydro model would provide soil moisture storage values to the thermal model. The thermal model ran on daily timesteps, whereas the hydro model ran on hourly timesteps. Thus, the hydro and thermal models would "rendezvous" every time the hydro model had completed another 24 timesteps and the thermal model had completed one more timestep.
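The rendezvous schedule can be sketched as a pair of nested loops: the hydro model takes 24 hourly steps for every daily step of the thermal model, and the two exchange coupling parameters at the end of each day. The physics is reduced to placeholders here (the constant thaw increment is purely illustrative); only the step counting and exchange cadence mirror the coupled code.

```python
# Sketch of the coupled time-stepping: hourly hydro steps, daily
# thermal steps, and a rendezvous (parameter exchange) after every
# 24 hydro steps / 1 thermal step. In the real code the exchange is
# an MPI message between the two groups; the thaw increment below
# is a placeholder, not model physics.

def run_coupled(total_days):
    """Return (hydro steps, thermal steps, exchanges) for a run."""
    thaw_depth = 0.0                   # last value received from the thermal model
    hydro_steps = thermal_steps = exchanges = 0
    for _day in range(total_days):
        for _hour in range(24):        # hydro: hourly water-balance steps,
            hydro_steps += 1           # which would use the current thaw_depth
        thermal_steps += 1             # thermal: one daily temperature step
        # rendezvous: thermal sends thaw depth, hydro sends soil moisture
        thaw_depth += 0.005            # placeholder deepening per exchange
        exchanges += 1
    return hydro_steps, thermal_steps, exchanges
```

Over the 109-day simulation period described below, this schedule yields 2616 hydro steps, 109 thermal steps, and 109 rendezvous points.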
The region being modeled was the Kuparuk River basin extending
from the north slope of Alaska's Brooks Range to the Arctic coast.
The topography comes from digital elevation
data for a 111 km x 237 km region with a
resolution of 1000 meters. The region is bordered on the south
by the Brooks Range and slopes down northward to the Arctic Ocean
(see Figure 7).
The simulation period is a span of 109 days, beginning in mid May.
Results are still quite preliminary. However, coupling has been
achieved, and test runs have been made with the models running in
coupled mode and uncoupled mode. The graphics below show the predicted soil
moisture storage (expressed as a percentage) at Julian day 208
(approximately late July) of the simulation. Figure 8 shows values
obtained from the hydro model running independently, receiving no
data from the thermal model. In Figure 9, the values are a result of
the hydro model using predicted thaw depths sent to it by the thermal
model. Recall that the original, uncoupled hydro model used thaw depths
that were based on empirical data, and these values were held constant
through the whole region at each timestep.
The results of these preliminary uncoupled and coupled simulations suggest that coupling the hydrologic and thermal models will indeed produce substantially different results. The addition of spatially varying thaw depth into the hydrologic model should, and does, result in greater heterogeneity within the problem domain. Clearly, coupling of the two models better captures the many interacting processes that occur within the hydrologic and thermal systems.
The work described here is quite evolutionary in nature. The central idea has been to take two completely independent computer models and couple them in order to capture some of the feedback mechanisms that exist in the real world. It is anticipated that the models will be modified extensively in the future, so it is desirable that the two models remain as independent and modular as possible; yet, at the same time, each model must be modified somewhat in order for coupling to occur.
In the future, more ambitious coupling mechanisms will be explored, and each model will be enhanced in order to make better use of the data coming from the other model. For example, the flow of moisture from one region to another carries latent heat, which in turn may be released downstream to promote thawing, and this should be more accurately represented in the thermal model. With time-varying soil moisture data from the hydro model, this is now possible.
This work is still preliminary, and more effort will be put into verification of modeled results. However, the work has resulted in a computer model capable of coupling two intricately interacting systems. Such a model, with proper controls and verification, will allow us to investigate more intricate issues of the arctic ecosystem.
This work was supported by Cray Research, Inc. and the University of Alaska's Arctic Region Supercomputing Center under a University Research and Development Grant. Additional support was provided by the National Science Foundation, Arctic System Science Program, grant number OPP-9318535.
Hinzman, L.D., Kane, D.L., Gieck, R.E., Everett, K.R.: Hydrologic and Thermal Properties of the Active Layer in the Alaskan Arctic, Cold Regions Science and Technology, 19 (1991) 95-110.
Li, Shu: Development of Parallel Simulation Software For the Study of Thermal Processes in Kuparuk River Basin of Alaska. Masters Thesis, University of Alaska Fairbanks, 1996.
Morton, D.J., Zhang, Z., Hinzman, L.D. and O'Connor, S.: The Parallelization of a Physically Based, Spatially Distributed Hydrologic Code for Arctic Regions. In Proceedings of the 1998 ACM Symposium on Applied Computing, Atlanta, GA, (1998) 684-689.
Pacheco, Peter S.: Parallel Programming with MPI, Morgan Kaufmann Publishers, Inc., 1997.
Zhang, Z: Development of a Spatially Distributed Model of Arctic Thermal and Hydrologic Processes (MATH). PhD Dissertation, University of Alaska Fairbanks, May 1998.
The code provided below is fully functional Fortran77/MPI code that implements the basic infrastructure needed to couple two existing parallel codes. Execution begins in the main program, with all processes enrolling in MPI. Next, several MPI routines are called to create two separate groups of processors: one group for the hydro code, the other for the thermal code. Finally, the hydro processors call the subroutine for the hydro code and, likewise, the thermal processors execute the code in subroutine thermal.
The hydro and thermal subroutines illustrate some basic communication between processors in the same group and between the two groups. In the hydro subroutine, each hydro processor sends its processor number to the master hydro process, which sums these values and sends the sum to the master thermal process in the other group. The hydro master process then waits for a message from the thermal master process and, when it is received, broadcasts it to the rest of the hydro processes.
Likewise, the thermal master process begins by waiting on the value sent by the hydro master process, then sends its own value back to the hydro master process.
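The message pattern just described can be traced in plain Python, with the MPI sends, receives, and broadcasts replaced by ordinary assignments. The numbers exchanged follow the skeleton code's logic: each hydro process contributes its processor number, the two masters swap values, and the hydro master broadcasts the thermal reply to its group.

```python
# Trace of the skeleton code's message pattern, with MPI calls
# replaced by ordinary Python assignments:
#   1. every hydro process sends its rank to the hydro master,
#   2. the hydro master sums them and sends the sum to the
#      thermal master (intergroup message),
#   3. the thermal master sends its own value back,
#   4. the hydro master broadcasts the reply within its group.

def trace_exchange(num_hydro, thermal_reply):
    """Return (value received by thermal master, values after broadcast)."""
    contributions = list(range(num_hydro))     # each hydro rank's "message"
    hydro_sum = sum(contributions)             # reduction at hydro master
    received_by_thermal = hydro_sum            # hydro master -> thermal master
    received_by_hydro = thermal_reply          # thermal master -> hydro master
    broadcast = [received_by_hydro] * num_hydro  # hydro master broadcasts reply
    return received_by_thermal, broadcast
```

With eight hydro processes, the thermal master receives 0 + 1 + ... + 7 = 28, and every hydro process ends up holding the thermal master's reply, matching the behavior described for the Fortran77/MPI skeleton.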
The full source code is available online.