A Comparison of Screening Tests for Soil Pb
Soil has been identified as a significant source of lead (Pb) exposure for both children and adults. Identifying potentially contaminated soils by soil testing is therefore important to protect public health. Soil Pb test results are usually reported as total Pb (mg kg⁻¹), determined by a concentrated nitric acid digestion on a hot plate (EPA method 3050) or in a microwave (EPA method 3051), followed by inductively coupled plasma atomic emission spectrometry to measure total Pb in the digest. However, this procedure is both time-consuming and expensive, sometimes costing homeowners and gardeners over $50 per sample. To make soil Pb testing more economically accessible to homeowners and gardeners, several university soil-testing laboratories offer less expensive screening tests designed to estimate total soil Pb. The first objective of this study was to compare three commonly used screening tests, modified Morgan (MM), Mehlich 3 (M3), and 1 M nitric acid (HNO₃), to the standard total Pb testing method (EPA method 3051) to determine which extractant is the most reliable predictor of total Pb. The second objective was to investigate the effect that different degrees of soil grinding have on the total Pb test and on the extracted Pb concentration measured by the 1 M HNO₃ test. Results indicate that the strongest predictor of total Pb is 1 M HNO₃, followed by M3 and MM, and that thorough grinding is necessary when using less than five grams of soil in a Pb test, in order to adequately homogenize Pb-contaminated samples and achieve acceptable testing reproducibility.
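The comparison described above amounts to regressing total Pb (EPA 3051) on each screening extractant and asking which gives the tightest fit. A minimal sketch of that analysis, using hypothetical example concentrations (not data from the study), might look like:

```python
import numpy as np

# Hypothetical paired measurements in mg kg⁻¹; illustrative only,
# not values reported in the study.
total_pb = np.array([120.0, 450.0, 900.0, 1500.0, 2300.0, 3100.0])  # EPA 3051
screening = {
    "1 M HNO3":        np.array([110.0, 420.0, 850.0, 1400.0, 2150.0, 2900.0]),
    "Mehlich 3":       np.array([90.0, 330.0, 700.0, 1100.0, 1800.0, 2300.0]),
    "modified Morgan": np.array([40.0, 160.0, 280.0, 500.0, 700.0, 1100.0]),
}

for name, extracted in screening.items():
    # Least-squares calibration line: total Pb ≈ slope·extracted + intercept
    slope, intercept = np.polyfit(extracted, total_pb, 1)
    r = np.corrcoef(extracted, total_pb)[0, 1]
    print(f"{name}: slope={slope:.2f}, intercept={intercept:.1f}, r2={r**2:.3f}")
```

The extractant with the highest r² (and a stable slope across soil types) would be the most reliable predictor of total Pb, which is the criterion the study applies.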
Water relations, gas exchange, and nutrient response to a long-term constant water deficit
Wheat plants (Triticum aestivum) were grown for 43 days in a microporous tube nutrient delivery system. Roots were unable to penetrate the microporous tube, but grew on its surface and maintained capillary contact with the nutrient solution inside the tube through its 5 μm pores. Water potential in the system was controlled at -0.4, -0.8, and -3.0 kPa by adjusting the applied pressure (hydrostatic head) of the nutrient solution flowing through the microporous tubes. A relatively small decrease in applied water potential from -0.4 to -3.0 kPa resulted in a 34% reduction of shoot growth but only a moderate reduction in midday leaf water potential, from -1.3 to -1.7 MPa. Carbon dioxide assimilation decreased and water use efficiency increased at the more negative applied water potentials, while intercellular CO₂ concentration remained constant. This was associated with a decrease in stomatal conductance to water vapor from 1.90 to 0.98 mol m⁻² s⁻¹ and a decrease in total apparent hydraulic conductance from 47 to 12 μmol s⁻¹ MPa⁻¹. Although the applied water potentials were in the -0.4 to -3.0 kPa range, the actual water potential perceived by the plant roots appeared to be in the range of -0.26 to -0.38 MPa, as estimated from the leaf water potential of bagged plants. The amount of K, Ca, Mg, Zn, Cu, and B accumulated per unit of transpired water increased as the applied water potential became less negative; the increase ranged from 1.4-fold for K to 2.2-fold for B. The physiological responses observed in this study in response to small, constant differences in applied water potential were much greater than expected from either the applied water potential or the observed plant water potential.
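The hydraulic conductance figures above can be cross-checked numerically, assuming the common definition of total apparent hydraulic conductance as transpiration divided by the root-to-leaf water-potential drop (a definition the abstract does not state explicitly, so this is a sketch under that assumption):

```python
# Assumed relation: K = E / (Ψ_root − Ψ_leaf), so E = K · (Ψ_root − Ψ_leaf).
def transpiration(k_hydraulic, psi_root, psi_leaf):
    """Back-calculate transpiration E (μmol s⁻¹) from apparent hydraulic
    conductance K (μmol s⁻¹ MPa⁻¹) and water potentials (MPa)."""
    return k_hydraulic * (psi_root - psi_leaf)

# Endpoints reported above: least vs. most negative applied water potential.
e_wet = transpiration(47.0, -0.26, -1.3)  # drop of 1.04 MPa
e_dry = transpiration(12.0, -0.38, -1.7)  # drop of 1.32 MPa
print(round(e_wet, 2), round(e_dry, 2))
```

Under this assumed definition, the implied transpiration falls roughly threefold between treatments, consistent with the reported decline in stomatal conductance.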
Even though the microporous tube may not represent natural conditions and could introduce morphological and physiological artifacts, it enables a degree of control over water potential that facilitates the investigation of many aspects of water relations not practical with other experimental systems.