Using average net returns and risk measures to compare irrigation management strategies

Date

2017-08-01

Publisher

Kansas State University

Abstract

Risk and uncertainty are inherent in agriculture, especially where the precipitation needed for crop production is often lacking. Precipitation in the High Plains is highly variable. To supplement precipitation, the Ogallala Aquifer, a large underground water storage reservoir, was developed for irrigation. However, as the saturated thickness of the aquifer decreases, the rate at which water can be extracted (i.e., well capacity) also decreases. Limited well capacities introduce risk into agricultural production because producers may not be able to irrigate sufficiently in dry years.
This study’s objective was to develop a method to help producers compare alternative irrigation management strategies in the face of risk from limited well capacity. The objective was accomplished by simulating average net returns for 172 different irrigation strategies across 30 years (1986-2015) of historical weather (Kansas Mesonet 2016). Management strategies included different combinations of corn and wheat production under full irrigation, moderate irrigation, deficit irrigation, and dryland production. Three risk measures were computed: Value at Risk (VaR), expected shortfall, and standard deviation.
The risk-return tradeoff is estimated for each management strategy at two well capacities, 300 GPM (gallons per minute) and 600 GPM. Estimating these risk measures can help producers identify the optimal management strategy more effectively than comparing average net returns alone.
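
As a minimal illustrative sketch (not taken from the thesis itself), the three risk measures could be computed in Python from a series of simulated annual net returns for a single strategy and well capacity; the variable names, the example values, and the 5% tail probability are assumptions made for illustration:

import numpy as np

def risk_measures(net_returns, alpha=0.05):
    # net_returns: simulated net returns ($/acre), one per weather year
    # alpha: tail probability for VaR and expected shortfall (assumed 5% here)
    returns = np.asarray(net_returns, dtype=float)

    # Value at Risk: the alpha-quantile of net returns; returns fall below
    # this level only in the worst alpha share of simulated years.
    var = np.quantile(returns, alpha)

    # Expected shortfall: average net return in years at or below the VaR.
    expected_shortfall = returns[returns <= var].mean()

    # Standard deviation of net returns across the simulated weather years.
    std_dev = returns.std(ddof=1)

    return {"VaR": var,
            "expected shortfall": expected_shortfall,
            "standard deviation": std_dev}

# Hypothetical net returns ($/acre) for one strategy at one well capacity
simulated = [120.5, 340.0, 210.3, -45.2, 88.7, 305.1, 150.0,
             22.4, 275.9, 190.6]
print(risk_measures(simulated))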

Keywords

Aquifer, Irrigation, Well capacity, Risk management

Graduation Month

August

Degree

Master of Agribusiness

Department

Department of Agricultural Economics

Major Professor

Nathan P. Hendricks

Date

2017

Type

Thesis
