Irrigation Efficiency Series (Part 1 of 2): Is efficiency the answer to managing water shortages?

By John Stevenson

After a particularly wet winter and spring and an above-average snowpack, it’s easy to put the past behind us and forget the several years of drought our region recently experienced. But drought happens, as they say, and will certainly happen again. So it is worth reflecting on how irrigators can better cope when drought returns. Across the West, irrigation efficiency has gained attention in this context, as a way to stretch the number of days that irrigation water is available when drought hits.

Furrow irrigation, traditionally used in the Northwest, is one of the targets for improving irrigation efficiency. Photo by Flickr user Hanna 3232, under CC BY-NC-ND 2.0.

In general, there are two important places where efficiency plays a role. The first is in moving water from stream to farm, known as ‘conveyance efficiency.’ The second is ‘application efficiency,’ or how much of the water conveyed to a field is available to the plants themselves. Both types of efficiency are being considered for improvement. In this article I discuss the technological changes that can lead to improved efficiency, and in the second article in this series I will discuss whether those improvements translate readily into solving water shortages.

There is no single number for the overall efficiency of a ‘traditional’ irrigation system, like those built in the early to mid-20th century. But generally, an open, unlined canal running through sandy or loamy soils may deliver to a field about 60 to 70 percent of the water diverted from a stream or river. Of course, you should take these figures with a grain of salt, because they will vary widely depending on the soil type and the length of the diversion. Traditionally, fields were irrigated using furrow irrigation, which directs the water delivered to the field into furrows that channel it across the field, saturating the soil. With furrow irrigation, about 50 to 60 percent of that water is actually consumed by the crop being grown.

A system with 60 percent conveyance efficiency (CE) and 60 percent application efficiency (AE) ends up with an overall efficiency (CE × AE) of about 36 percent. In other words, only about a third of the water diverted for irrigation actually makes it to the plants. And while that is a lot of water that never reaches the crop, it’s important to note that the remaining two-thirds of diverted water are not necessarily ‘wasted.’ For example, tailwater runoff from a field may make it back to the ditch and be used by other irrigators lower in the system, while water lost to ditch seepage will in most cases return to the river downstream as shallow groundwater return flows. That said, most of us would prefer to hold on to a higher portion of our share, whether it’s water, whiskey, or a paycheck. So there has been an understandable interest in improving this overall efficiency.
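If you like to plug in your own numbers, the arithmetic is simple enough to script. Here is a minimal sketch in Python, using the same illustrative efficiencies as above (they are assumptions for illustration, not measurements from any particular district):

    # Overall irrigation efficiency: the fraction of water diverted
    # from the stream that actually reaches the crop.
    def overall_efficiency(conveyance, application):
        return conveyance * application

    ce = 0.60  # conveyance efficiency: stream to farm (illustrative)
    ae = 0.60  # application efficiency: field to plant (illustrative)

    print(f"Overall efficiency: {overall_efficiency(ce, ae):.0%}")
    # Prints: Overall efficiency: 36%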

With technological improvements, irrigation efficiency has increased. Some of the ‘success’ stories during several seasons of drought (if we can call them that without overlooking the hardships) were grounded in years of effort to modernize conveyance and on-field application of water. The Vale and Three Sisters irrigation districts in eastern and central Oregon exemplify some of the changes being made in the West. They have lined canal bottoms with materials that reduce seepage and, in some cases, replaced open canals with piping, which can eliminate seepage altogether. In 2015, this allowed the districts to stretch their water supplies two to three weeks past the early-July shut-off that other irrigators faced. While that is still much earlier than mid-October, when irrigation usually ends, it was enough to help finish early-maturing crops such as cereal grains that many growers rely on during drought years.

Similarly, improvements in application efficiency are frequently achieved through advances in sprinkler technology. For example, the University of Idaho and Washington State University have made strides in this area by testing Low Elevation Precision Application (LEPA) sprinkler systems. By delivering water from a center pivot modified with hoses that reach below the crop canopy, these systems reduce water losses from wind drift and evaporation. Results suggest that they may improve efficiency by 20 to 50 percent and reduce energy use. Their use has been expanding in the Pacific Northwest as producers take advantage of federal programs that help retrofit existing center pivot systems (here is an informative article from the Capital Press if you want to know more).
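To see how conveyance and application upgrades compound, here is a rough comparison in the same spirit. The ‘modernized’ figures are hypothetical assumptions for a piped canal paired with a LEPA retrofit, not published results:

    # Compare a traditional system with a hypothetically modernized one.
    # All efficiency values are illustrative assumptions.
    scenarios = {
        "traditional (unlined canal, furrow)": (0.60, 0.60),
        "modernized (piped canal, LEPA pivot)": (0.95, 0.85),
    }

    for name, (ce, ae) in scenarios.items():
        print(f"{name}: {ce * ae:.0%} of diverted water reaches the crop")

Even under these optimistic assumptions, some diverted water never reaches the crop, which is part of why the question I take up in Part 2 matters.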

Testing a Low Elevation Precision Application sprinkler system. Once the crop grows, the canopy helps reduce water losses due to wind drift and evaporation. Photo by Kay Ledbetter/Texas A&M AgriLife Research, under CC BY-NC-ND 2.0.

These efficiency success stories were part of a video series that CIRC and Oregon Sea Grant produced about water use last summer. So we have the technology and engineering know-how to improve both conveyance and application efficiency, and we know that this allows individual growers to keep a higher proportion of their water share. The catch? Do these efficiency improvements actually reduce overall water use? There is an intellectual kerfuffle about this question, which I will cover, with some important caveats, in Part 2 of this series. Stay tuned.
