Hydrology is rich in methods that use information theory to evaluate monitoring networks. Yet most existing studies use only the available data set as a whole, which neglects the intra-annual variability of the hydrological system. In this paper, we demonstrate how this variability can be considered by extending monitoring evaluation to subsets of the available data. To this end, we separately evaluated time windows of fixed length, which were shifted through the data set, and successively extended time windows. We used basic information theory measures and a greedy ranking algorithm based on the criterion of maximum information/minimum redundancy. The network investigated monitored surface and groundwater levels at quarter-hourly intervals and was located at an artificially drained lowland site in the Spreewald region in north-east Germany. The results revealed that some of the monitoring stations were permanently valuable while others were needed only temporarily. The prevailing meteorological conditions, particularly the amount of precipitation, affected the degree of similarity between the water levels measured. The hydrological system tended to act more individually during periods of no or little rainfall. The optimal monitoring setup, its stability, and the monitoring effort necessary were influenced by the meteorological forcing. Altogether, the methodology presented can help achieve a monitoring network design that has a more even performance or covers the conditions of interest (e.g., floods or droughts) best.
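The greedy maximum information/minimum redundancy ranking mentioned above can be illustrated with a minimal sketch: marginal entropy selects the first station, and each subsequent station is the one that maximizes the joint entropy of the already-selected set. This is an assumed, simplified implementation (histogram-based entropy estimation, hypothetical function names), not the exact procedure used in the study.

```python
import numpy as np

def entropy(x, bins=10):
    # Shannon entropy (bits) of a quantized series via histogram binning
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def joint_entropy(series_list, bins=10):
    # Joint Shannon entropy (bits) via a multidimensional histogram
    counts, _ = np.histogramdd(np.column_stack(series_list), bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def greedy_rank(series, bins=10):
    """Rank monitoring stations greedily: start with the station of
    maximum marginal entropy, then repeatedly add the station that
    yields the largest joint entropy with those already selected
    (i.e., maximum added information, minimum redundancy)."""
    remaining = list(series.keys())
    first = max(remaining, key=lambda k: entropy(series[k], bins))
    ranked = [first]
    remaining.remove(first)
    while remaining:
        best = max(
            remaining,
            key=lambda k: joint_entropy(
                [series[s] for s in ranked] + [series[k]], bins
            ),
        )
        ranked.append(best)
        remaining.remove(best)
    return ranked
```

Evaluating this ranking on shifted fixed-length windows, or on successively extended windows, would then reveal how the ranking changes through the year, as described in the abstract.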
Decreasing groundwater levels in many parts of Germany and decreasing low flows in Central Europe have created a need for adaptation measures to stabilize the water balance and to increase low flows. The objective of our study was to estimate the impact of ditch water level management on stream-aquifer interactions in small lowland catchments of the mid-latitudes. The water balance of a ditch-irrigated area and fluxes between the subsurface and the adjacent stream were modeled for three runoff recession periods using the Hydrus-2D software package. The results showed that the subsurface flow to the stream was closely related to the difference between the water level in the ditch system and the stream. Evapotranspiration during the growing season additionally reduced base flow. It was crucial to stop irrigation during a recession period to decrease water withdrawal from the stream and enhance the base flow by draining the irrigated area. Mean fluxes to the stream were between 0.04 and 0.64 L s⁻¹ for the first 20 days of the low-flow periods. This only slightly increased the flow in the stream, whose mean was 57 L s⁻¹ during the period with the lowest flows. Larger areas would be necessary to effectively increase flows in mesoscale catchments.
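The close relation between subsurface flow to the stream and the ditch-stream water level difference can be pictured with a Darcy-type exchange flux. The sketch below is purely illustrative: the function name, parameter values, and one-dimensional geometry are assumptions, not the Hydrus-2D setup used in the study.

```python
def darcy_flux_l_per_s(k_sat, head_ditch, head_stream, distance, area):
    """Darcy-type exchange flux from the ditch system toward the stream.

    k_sat       saturated hydraulic conductivity [m/s]
    head_ditch  water level in the ditch system [m]
    head_stream water level in the stream [m]
    distance    flow path length from ditch to stream [m]
    area        cross-sectional flow area [m^2]
    Returns the flux in litres per second (positive toward the stream).
    """
    specific_flux = k_sat * (head_ditch - head_stream) / distance  # [m/s]
    return specific_flux * area * 1000.0  # m^3/s -> L/s
```

For example, with hypothetical values k_sat = 1e-4 m/s, a head difference of 0.5 m over 50 m, and a flow area of 600 m², the flux is 0.6 L s⁻¹, i.e. on the order of the 0.04–0.64 L s⁻¹ range reported in the abstract and only about 1 % of the 57 L s⁻¹ mean streamflow.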