A digital model of the Bayou Bartholomew aquifer-stream system in Arkansas was calibrated to predict hydrologic responses to water-development stresses. The simulated time span for model calibration was 1953 to 1970, during which the system was stressed largely by ground- and surface-water diversions for rice irrigation. The model was calibrated by comparing observed groundwater levels and streamflows with their model-derived counterparts. In the calibrated model, the ratio of model-derived to observed streamflow for 17 subbasins averaged 1.1; among the subbasins the ratios ranged from 0.8 to 1.6. The differences between model-derived and observed groundwater levels at 47 nodes averaged 0.2; among the nodes the averages ranged from -2.3 to 10.4. The standard deviation of those differences averaged 3.5; among the nodes the standard deviations ranged from 0.4 to 10.5. The model will provide projections of changes in the potentiometric surface resulting from (1) changes in the rate or distribution of groundwater pumpage or (2) changes in the stage of streams and reservoirs. It will provide only approximate projections of streamflow. (USGS)
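The calibration statistics quoted above (streamflow ratios per subbasin, and the mean and standard deviation of head differences per node) can be sketched as follows. This is an illustrative outline only, not the report's method; all data values and function names are hypothetical, and the head units are assumed.

```python
# Hypothetical sketch of the calibration statistics described in the
# abstract. The numbers below are illustrative, not the report's data.
from statistics import mean, pstdev


def streamflow_ratio(modeled, observed):
    """Ratio of model-derived to observed streamflow for one subbasin."""
    return modeled / observed


def head_difference_stats(modeled_heads, observed_heads):
    """Mean and standard deviation of (model-derived - observed)
    groundwater-level differences at one node over the calibration span."""
    diffs = [m - o for m, o in zip(modeled_heads, observed_heads)]
    return mean(diffs), pstdev(diffs)


# Illustrative record for a single node (heads in feet, units assumed):
modeled = [112.0, 110.5, 109.8, 111.2]
observed = [111.5, 110.9, 109.2, 110.8]
mean_diff, std_diff = head_difference_stats(modeled, observed)
```

A per-subbasin ratio near 1.0 and a small, near-zero mean head difference are the kinds of targets the calibration comparison implies.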