
Precision Agriculture, 1, 199-216 (1999). © 1999 Kluwer Academic Publishers. Manufactured in The Netherlands.

Vision Guided Precision Cultivation


D. C. SLAUGHTER, P. CHEN AND R. G. CURLEY (dcslaughter@ucdavis.edu)
Biological and Agricultural Engineering, University of California, Davis, California 95616

Abstract. A color machine vision based automatic guidance system was developed for precision guidance of an agricultural cultivator. The guidance system was designed to operate in weedy row crop fields at the time of first cultivation. The performance of the system varied from an RMS guidance error of 7 mm under low weed loads to 12 mm under high weed loads, and the system was capable of operating at travel speeds up to 16 km/h.

Keywords: cultivation, guidance, machine vision, robotics, weed control

Introduction

Agricultural field equipment is still largely guided by manual means despite several decades of effort to develop automatic guidance systems for agriculture (Jahns, 1983; Tillett, 1991). Despite the lack of adoption, the goal of mitigating the drudgery of repetitive manual operation of field equipment remains a worthy one. Automatic guidance also has the potential to increase productivity by eliminating redundant or missed implement and/or chemical coverage between two adjacent passes (Palmer and Matheson, 1988; Palmer, 1984). Successful automatic guidance could also alleviate the stress and tedium of row crop operations and enhance safe and precise practices.

Chancellor (1981) studied the principle of substituting information for energy in agriculture and observed that the efficiency of a technology depends upon its ability to use information to guide and time the application of resources. Chancellor cited examples where the operational processes were not changed but added information was used to actuate control so that optimal energy savings were possible. He observed in his information-for-energy study that microcomputers offered opportunities for efficient handling and processing of more management information than traditional manual methods and that these improvements offered potential energy savings of about 15% in existing agricultural operations. In this regard, if the exact location of each crop plant were known, then cultivation tools could be positioned to remove a higher percentage of the weeds with a minimal increase in energy use. The added energy input to precisely position these tools would be greatly offset by the reduction in manual labor required to remove weeds by hand hoeing.

Kaminaka et al. (1981) studied the steering performance of a tractor operator as the complexity of the task increased from steering only the tractor to steering the tractor plus monitoring an implement located behind the tractor. They found that


operator performance significantly degraded when the operator shared attention between steering and rear monitoring, and that the level of degradation increased under a simulated increase in vehicle speed.

The adverse effect of weed competition on crop yield is well known, with early season weed control being critical to maximizing crop yields (e.g. Buchanan and Burns, 1970; Dowler and Hauser, 1975). Parish et al. (1995) noted that precision guided cultivation using cone guide wheels can allow cultivation of all but a 150 mm band centered about the row, which is then treated with chemical herbicides or by hand hoeing. California farmers, in an effort to reduce the amount of chemical herbicides and hand hoeing costs, attempt to cultivate to within a 75 mm band but frequently experience difficulty in employing workers who have the necessary skills to operate cultivation equipment with the desired precision and at the desired travel speed. Some farmers in the Sacramento valley have attempted to solve this problem by using field equipment that is controlled by two operators; one operator steers the tractor while the second steers the rear mounted implement. There are several commercial systems for automatically guiding the rear mounted implement via an electro-hydraulically controlled shifting three-point hitch (e.g. Navigator or Sukup Slide Guide). These commercial systems typically use an electro-mechanical sensor which is designed to sense the location of a salient feature, such as the edge of a furrow, by contact. These systems are not widely used in California due to problems with performance similar to those noted by Tillett (1991) for other mechanical guidance systems.

Reid and Searcy (1987) and Fehr and Gerrish (1995) have attempted to use run-length encoding on a thresholded image as a means of identifying the crop row edge for the guidance of an agricultural tractor. Reid and Searcy reported offset errors up to 10 cm, and Fehr and Gerrish reported a range of standard deviations from a straight line of 1.6 cm to 3.9 cm depending upon travel speed. Billingsley and Schoenfisch (1997) used linear regression in each of three crop row segments (or viewports) and a cost function analogous to the moment about the best fit line to detect lines fit to outliers (i.e. noise and weeds) as a means of identifying row guidance information, and indicated an accuracy of 20 mm using this technique.

The Hough (1962) transform is a computationally efficient procedure for detecting discontinuous lines in pictures (Duda and Hart, 1972). This technique was proposed by Reid and Searcy (1986) to facilitate row identification for agricultural guidance and was employed by Fujii and Hayashi (1989) in their method for automatic guidance of a combine harvester. Marchant and Brivot (1995) used a custom transputer network to compute the Hough transform for row tracking in real time (10 Hz) and noted that their technique was tolerant to outliers (i.e. weeds) only when the number of outliers was reasonably small compared to the number of true data points. Marchant and Brivot were able to obtain binary images which were relatively free of weeds in a transplanted cauliflower field, where the cauliflower plants were substantially larger than the weeds, using only a single near infrared image. Marchant et al. (1997) reported an overall RMS error of 20 mm in lateral position at a travel speed of 0.7 m/s using this technique to guide an agricultural vehicle. Unfortunately, many California row crops such as processing


tomatoes are direct seeded, and Tian (1995) was unable to differentiate tomato seedlings from similarly sized weeds such as nightshade using either visible or near infrared characteristics alone. In this case, the opportunity to obtain images free of outliers at the time of first cultivation is unrealistic.

The objective of this research was to determine the feasibility of using machine vision to both autonomously and precisely position a tractor-drawn mechanical cultivator with respect to plants in a weedy row at the time of first cultivation. The system was to operate at a real-time rate sufficient to guide a cultivator drawn at a traditional travel speed of 5 km/h and was to be constructed of low cost, off-the-shelf machine vision hardware.

Methods and materials

Mechanical and electronic hardware

The machine vision guidance system was evaluated using a 6 m wide custom tool sled which consisted of two portions, a stationary tool bar and a movable tool bar (Figure 1). The stationary tool bar was supported in the center by an A-frame member which was connected to the tractor's (John Deere, model 7800) three-point hitch and at the two ends by a set of cone wheels (52 cm dia.).

Figure 1. Schematic top view showing the custom tool with guidance cameras attached to the movable toolbar via Alloway cultivation tools.


The tractor's wheel spacing was set to either 1.5 m for tomatoes or 2 m for cotton or lettuce, which allowed it to straddle one or two beds, respectively, during travel. The two cone wheels traveled along the outside shoulder of the beds immediately outside each tractor tire. In addition to the two cone wheels, pneumatic-tired gauge wheels (40 cm dia.) were also attached to the stationary toolbar for height control and traveled along the top of the outside beds adjacent to the rear tractor tires. The movable tool bar was connected to the stationary tool bar via a four-bar linkage which allowed it to move ±15 cm in the direction perpendicular to the row of seedlings and parallel to both the stationary tool bar and the soil surface. Two opposing single-ended hydraulic cylinders (Sawyer Machine Works, 3.8 cm × 30.5 cm) were used to position the movable tool bar relative to the stationary tool bar. The mounting of the hydraulic cylinders and the hydraulic circuit were designed to provide equal velocity travel in either direction. Hydraulic power was supplied by the tractor.

The position of the movable toolbar was controlled using a machine vision guidance system which consisted of two solid-state color video cameras with CCD-Iris (Sony, model SSC-C370), a computer (Macintosh model IIfx, with a 40 MHz Motorola 68030 CPU), two manual-iris video camera lenses (Cosmicar, model C815B (C30811), 8.5 mm focal length, f 1:1.5), two high-speed 24 bit color video frame acquisition boards (RasterOps, model 24xltv), and an electronically controlled, proportional hydraulic control valve (Double A, model VPQF5MCLCY510A1DC12). A linear potentiometer (Waters, model LFW-12/300-0D5) of the same length as the hydraulic cylinders was mounted in parallel to one of the cylinders. The linear potentiometer and associated electronic interface were used by the computer to determine the position of the movable toolbar relative to the stationary toolbar. A portable electric generator (Honda, model EM2500) was used to provide electric power to the computer, video cameras, and all supporting electronics.

A step test of the custom toolbar and hydraulic circuit was conducted using closed loop proportional control to provide background information on system performance prior to development of the machine vision algorithm. In this test, the movable toolbar was required to move from its default centered position to a position 123 mm to one side. This test was designed to characterize the response of the system and exceeded the required position response of the system under actual cultivation (Figure 2).

The cameras were each mounted in protective housings and attached to the standard of a cultivation tool (Alloway) which was clamped to the movable toolbar (Figure 3). Each camera was mounted directly above the center of a pair of cultivation disks in such a way that the centerline of the camera was directly above the center of the two tools below. The camera and the pair of tools moved as a unit. In this configuration, if the camera was centered above the crop the tools below were also centered. This eliminated the additional computational step described by Tillett (1989) to translate from vision coordinates to cultivation tool coordinates, required for other configurations where the camera is not centered above the object being controlled, and it eliminated any errors associated with changes in the alignment between the tractor and the toolbar in systems where the camera is mounted on the tractor and not on the tool itself (e.g. Fujii and Hayashi, 1989; Fehr and Gerrish, 1995; or Marchant et al., 1997).


Figure 2. Step test response of hydraulic control system, position control only.

Figure 3. Schematic side view showing how the camera was mounted on the cultivation tool.


Each camera was mounted to the cultivator standard on its side (i.e. rotated 90° about the optical axis of the lens) so that the aspect ratio of the image sensor more closely matched the long, narrow shape of the row of seedlings being viewed. The cameras were positioned facing the direction of travel and pointed downward at an adjustable angle so that each camera's field of view could observe plants directly in front of the toolbar. The tilt of the camera was adjustable to allow for variations in planting configurations and variations in germination rates or other factors that could affect the number of crop plants per unit length of row. The position shown in Figure 3 is typical for a direct seeded crop with few missing crop plants and would allow all plants from 15 cm to 135 cm directly in front of the toolbar to be viewed.

A passive mechanical shade was designed to provide uniform diffuse illumination of the seedline at all times, independent of tractor orientation (Figure 4). Without illumination control, shadows caused by the tractor or the cultivation tools obscured portions of the image at certain orientations of the tractor relative to the sun. The shade consisted of a rectangular parallelepiped frame 137 cm long by 53 cm wide by 48 cm high with fixed metal louvers mounted on the sides and front. The top of the frame was covered with cotton canvas shade cloth and the rear of the frame had a cotton shade cloth with a custom opening that enclosed the camera. The louvers were mounted so that no direct sunlight could enter the chamber but adequate levels of indirect sunlight provided diffuse illumination of the seedline. On a typical sunny California day the passive shade provided adequate, uniform diffuse illumination for the CCD cameras described previously from sunrise to sunset, independent of tractor orientation.

Figure 4. Schematic front view of passive louvered shade used to provide uniform diffuse illumination of the seedline.


Active light sources were mounted within the frame to provide night-time illumination.

Machine vision guidance algorithm

The machine vision algorithm used to identify the crop location was based upon color segmentation of the 24 bit true color image into a binary image and a stochastic pattern recognition technique to identify a consistent linear region of crop plants among a chaotic (i.e. randomly positioned) group of weeds. When the computer examined an image to determine the location of the crop it was restricted to a trapezoidal region of interest (ROI) which coincided with the perspective view of the region in which the crop was planted. For small plants (one to two true leaf stage) the region of interest was from four to six inches wide. The restricted region of interest reduced the time required to locate the crop since the entire image need not be examined. The computer examined pixels in the ROI and compared the color of each pixel to the corresponding logical value in a predefined color lookup table similar to that described by Slaughter (1987). If the logical value was one (1) then the computer considered the point to be a possible crop point; otherwise the pixel was ignored.

The color segmentation was conducted using a color lookup table which contained the two-class (crop plant and all others) classification for all 16.78 million colors possible in a 24 bit true color system. The classification was conducted off-line using Bayesian probability theory. The probabilities of each class were calculated using Bayes' law, which is expressed using the following discriminant function (Duda and Hart, 1973):
$$ D_i(\mathbf{x}) = -\tfrac{1}{2}\left[ (\mathbf{x}-\boldsymbol{\mu}_i)^{T} C_i^{-1} (\mathbf{x}-\boldsymbol{\mu}_i) + \log\lvert C_i \rvert \right] + \log P(w_i) $$

where $\boldsymbol{\mu}_i$ is the mean color vector of class $w_i$, $C_i$ is the covariance matrix of class $w_i$, $P(w_i)$ is the a priori probability of class $w_i$, and $\mathbf{x}$ is the particular color vector to be classified. The probabilities were converted into logical values, where a zero (0) indicated that the color was not likely to be part of a crop plant and a one (1) indicated that the color was considered likely to be part of a crop plant. The table was used by the computer to rapidly determine which pixels were most likely to be associated with crop plants. The red, green, and blue (RGB) values of tomato seedlings (2nd true leaf) and non-crop objects were characterized from video images collected in commercial processing tomato fields over a two-year period and included a wide range of natural illumination conditions, tomato varieties, weed species and soil types. The range of RGB values observed over this two-year period was condensed into the mean and covariance matrices shown in Tables 1 and 2 and used to implement the color classifier. The framegrabber used in the prototype digitized the RGB levels into discrete values ranging from 0 to 255 (8 bits each).
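As an illustration of how such a lookup table can be built off-line, the following sketch applies the discriminant above to the class statistics of Tables 1 and 2. It assumes equal a priori probabilities and a sub-sampled color cube to keep the example small; the function and variable names are ours, not those of the original implementation.

```python
import numpy as np

# Class statistics as reported in Tables 1 and 2 (crop = tomato foliage, bg = everything else).
MU = {"crop": np.array([140.6, 181.3, 163.8]),
      "bg":   np.array([ 56.0,  57.6,  51.8])}
COV = {"crop": np.array([[1463.4,  826.8,  353.2],
                         [ 826.8, 1572.3, 1217.7],
                         [ 353.2, 1217.7, 1674.4]]),
       "bg":   np.array([[88.1, 50.7, 62.1],
                         [50.7, 65.9, 57.5],
                         [62.1, 57.5, 85.0]])}
PRIOR = {"crop": 0.5, "bg": 0.5}            # assumed equal a priori probabilities

# Pre-compute the parts of the discriminant that do not depend on the pixel color.
INV    = {c: np.linalg.inv(COV[c]) for c in COV}
LOGDET = {c: np.log(np.linalg.det(COV[c])) for c in COV}

def discriminant(x, c):
    """D_c(x) = -1/2 [ (x - mu_c)^T C_c^-1 (x - mu_c) + log|C_c| ] + log P(w_c)."""
    d = x - MU[c]
    return -0.5 * (d @ INV[c] @ d + LOGDET[c]) + np.log(PRIOR[c])

# Off-line build of a binary lookup table over a sub-sampled 24 bit color cube
# (every 8th level per channel here, purely to keep the illustration small; the
# paper's table covers all 16.78 million colors).
STEP = 8
levels = np.arange(0, 256, STEP, dtype=float)
lut = np.zeros((levels.size,) * 3, dtype=np.uint8)
for i, r in enumerate(levels):
    for j, g in enumerate(levels):
        for k, b in enumerate(levels):
            x = np.array([r, g, b])
            lut[i, j, k] = discriminant(x, "crop") > discriminant(x, "bg")

def is_crop(r, g, b):
    """Run-time classification: a single indexed read into the lookup table."""
    return bool(lut[r // STEP, g // STEP, b // STEP])

print(is_crop(140, 180, 165))   # green, crop-colored pixel  -> True expected
print(is_crop(60, 58, 50))      # brown, soil-colored pixel  -> False expected
```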

Table 1. Mean and covariance values used to classify crop (tomato) plants.

                      Covariance
         Mean      Red        Green      Blue
Red      140.6     1463.4      826.8      353.2
Green    181.3      826.8     1572.3     1217.7
Blue     163.8      353.2     1217.7     1674.4

In many cases the color of the crop and that of other plant material are indistinguishable. To identify the seedline in this situation, a stochastic pattern recognition technique was used to estimate where the centerline of the crop plants was located. In a preliminary study, Slaughter et al. (1997) evaluated several stochastic pattern recognition techniques that could be used to identify a crop seedline at the first true leaf stage, prior to cultivation when weeds were present, and which could be conducted at a minimum real-time rate of 10 Hz using only a simple frame grabber and a low cost computer. They found that a technique based upon a spatial median was the least sensitive to wind and to changes in the image due to A/D conversion noise, was less sensitive to skews in the distribution of plants due to outliers (weeds), and could be conducted using only the processing power provided by a 40 MHz Motorola 68030 CPU. The spatial median was also capable of determining the seedline location when there was a gap in the seedline because some of the crop plants were missing. This situation commonly occurs in direct seeded commercial plantings of processing tomatoes in California, where it is not uncommon to see 50 cm or longer gaps in the seedline with the missing crop plants replaced by weeds. For these reasons, the median was selected for development into a machine vision guidance algorithm.

The median is a statistical technique for estimating the central tendency of a population and is less biased by outliers than the mean or linear regression techniques. A spatial median is used in this case as a unique way to estimate the location of the crop row. The key to calculating the median in real time using a general computer platform such as the one used here is to avoid the normal requirement to sort the data by the parameter of interest (in this case the position offset of the seedline). To accomplish this the computer scans the ROI in a systematic manner, starting at the upper left corner, which corresponds to the view directly in front of the camera and to the left side. The computer scans across the image perpendicular to the direction of travel (which is also perpendicular to the crop row).

Table 2. Mean and covariance values used to classify non-crop objects (everything else).

                      Covariance
         Mean      Red       Green     Blue
Red       56.0     88.1      50.7      62.1
Green     57.6     50.7      65.9      57.5
Blue      51.8     62.1      57.5      85.0


Whenever a pixel is identified as a possible crop plant, the computer increments a counter corresponding to the position along the line perpendicular to the crop row. There is one counter for each possible pixel position along this line. For the RasterOps framegrabber this line is digitized into 480 pixels, so the algorithm had 480 counters. The process is then repeated at fixed increments toward the opposite side of the image, always scanning from left to right, until the opposite side of the image is reached. Once the entire image is scanned, the values in the counters represent a cumulative cross-sectional spatial distribution, or histogram, of the row. Since the image was scanned in a systematic manner, the median of this histogram can be calculated directly by adding up the values in each position counter until a value equal to 50% of the population is reached. The position of the corresponding counter that causes the total sum to equal 50% of the population is the median row position. A flowchart of the spatial median algorithm is shown in Figure 5. In this flowchart, hist( ) is an array of counters holding the cumulative cross-sectional spatial distribution, X is the coordinate parallel to the seedline, Y is the coordinate perpendicular to the seedline, K is the space increment in the X direction and J is the space increment in the Y direction.

Figure 5. Flowchart of algorithm to determine the spatial median of a color row crop image.
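A minimal sketch of the counter-based spatial median of Figure 5 is given below. It assumes a binary classification function and a rectangular ROI for simplicity (the prototype used a trapezoidal ROI matched to the perspective view); the names and the ROI geometry are illustrative, not taken from the original code.

```python
def spatial_median(is_crop_pixel, scan_lines, y_min, y_max, J=1):
    """Counter-based spatial median of crop-colored pixels across the row.

    is_crop_pixel(x, y) -> bool : binary color classification of pixel (x, y)
    scan_lines                  : iterable of x-coordinates along the direction of
                                  travel (sub-sampled at increment K)
    y_min, y_max                : ROI limits perpendicular to the row (across the bed)
    J                           : pixel increment perpendicular to the row
    Returns the y-coordinate of the median crop position, or None if no crop was seen.
    """
    hist = [0] * (y_max - y_min)      # one counter per cross-row pixel position
    total = 0

    # Scan each line perpendicular to the direction of travel, left to right,
    # accumulating a cross-sectional spatial histogram of crop-colored pixels.
    for x in scan_lines:
        for y in range(y_min, y_max, J):
            if is_crop_pixel(x, y):
                hist[y - y_min] += 1
                total += 1

    if total == 0:
        return None                   # no plants in view: fall back to potentiometer control

    # Because the scan order is systematic, no sort is needed: walk the counters
    # until half of the crop pixels have been accumulated.
    running = 0
    for offset, count in enumerate(hist):
        running += count
        if running >= total / 2.0:
            return y_min + offset

# Hypothetical usage with a pre-segmented binary image held in a 2-D list binary_img:
# median_y = spatial_median(lambda x, y: binary_img[x][y],
#                           scan_lines=range(0, 480, 16), y_min=100, y_max=380)
```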


The values of K and J are selected to optimize spatial resolution and real-time performance. In practice, the spatial resolution was adjusted so that 3,000 to 15,000 pixels in the trapezoidal ROI were analyzed, depending upon plant size and travel speed. With the computer hardware used in this study, the algorithm shown in Figure 5 could identify the crop row from a 3,000 pixel ROI at a rate of 45 Hz, and at a rate of 10 Hz when the resolution was increased to a 15,000 pixel ROI.

If the field contains weeds that are substantially larger than the crop, or if the crop stand is very poor, the ROI can be subdivided into several smaller regions and a median value calculated for each. The subregion median values are then compared and any portion of the image that is substantially different from the remainder is ignored. This helps the algorithm ignore very large weeds.

Once the location of the crop was known, the deviation between the current position and the desired position was calculated and used as the input, or feedback, to a closed loop control algorithm that controlled the position of the cultivator. The closed loop control system included the computer vision system, an electronically controlled hydraulic valve and a hydraulic actuator. The prototype used proportional velocity control to control the position of the cultivator, where the flowrate of hydraulic fluid to the hydraulic cylinders positioning the toolbar was directly proportional to the position error determined by the computer vision system. When no plants were present in the field of view, or if the field of view was completely obscured by plant material, or if the operator desired to position the cultivator manually, the linear potentiometer was used by the computer in a closed control loop to maintain the desired position. A custom control panel consisting of two joysticks (one for coarse control and one for fine control) was used by the operator to adjust the cultivator's position. The control panel also allowed the operator to offset the desired position of the seedline a fixed amount from the center of the image to compensate for plants that are consistently leaning to one side due to wind or sun angle.
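The following sketch shows the proportional velocity control loop described above, under the assumption of a simple valve interface in which commanded flow is proportional to the vision-derived position error. The gain, saturation limit, loop rate and I/O callables are hypothetical, not values from the paper.

```python
import time

K_P = 2.0          # hypothetical proportional gain: valve command units per mm of error
CMD_LIMIT = 1.0    # hypothetical saturation limit on the valve command

def control_step(vision_error_mm, potentiometer_mm, target_mm, send_valve_cmd):
    """One pass of the closed loop: commanded flow is proportional to the position error.

    vision_error_mm  : lateral offset of the seedline from the image center (mm),
                       or None when no usable view is available
    potentiometer_mm : toolbar position relative to the stationary bar (mm)
    target_mm        : desired toolbar position used for the potentiometer fallback (mm)
    send_valve_cmd   : callable that writes the proportional valve command
    """
    if vision_error_mm is None:
        # No plants in view (or view obscured): hold position using the potentiometer.
        error = target_mm - potentiometer_mm
    else:
        error = vision_error_mm
    u = max(-CMD_LIMIT, min(CMD_LIMIT, K_P * error))   # proportional command, saturated
    send_valve_cmd(u)

def run_loop(vision, pot, valve, rate_hz=10.0):
    """Hypothetical 10 Hz control loop tying the pieces together."""
    while True:
        control_step(vision.error_mm(), pot.read_mm(), 0.0, valve.command)
        time.sleep(1.0 / rate_hz)
```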

Figure 6. Illustration of color segmentation, spatial histogram and median on a color row crop image: (left) spatial histogram and median; (middle) segmented image; (right) color image.


Row crops are often planted several rows at a time (e.g. processing tomatoes are typically planted three beds at a time in California). This system was designed to have one or more of the pairs of cultivation tools equipped with cameras. When the system has one camera, all the pairs of cultivation tools are controlled based upon the view of the single camera. When multiple cameras are present, the position of the tools is based upon the joint view of the cameras involved. Multiple cameras decrease the chance of guidance errors because the computer ignores any camera views that are substantially different from the rest. If two cameras are used, the computer can either ignore both if they do not agree or choose the more conservative of the two.
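A hedged sketch of this outlier-rejection idea, as it applies to both subregion medians and multiple cameras, is shown below: estimates that differ substantially from the rest are ignored before the remaining estimates are combined. The agreement threshold and the fallback behaviour are assumptions, not values from the paper.

```python
def fuse_estimates(estimates_mm, agree_tol_mm=20.0):
    """Combine several row-position estimates (subregion or per-camera medians).

    estimates_mm : list of lateral row positions in mm (None entries are skipped)
    agree_tol_mm : assumed threshold for "substantially different" estimates
    Returns the mean of the agreeing estimates, or None if nothing trustworthy remains.
    """
    valid = [e for e in estimates_mm if e is not None]
    if not valid:
        return None

    # Use the median of the estimates as the reference and drop outliers.
    ref = sorted(valid)[len(valid) // 2]
    agreeing = [e for e in valid if abs(e - ref) <= agree_tol_mm]

    if len(valid) == 2 and len(agreeing) < 2:
        return None          # two cameras that disagree: ignore both
    if not agreeing:
        return None
    return sum(agreeing) / len(agreeing)

# Example: three subregion medians, one pulled off the row by a large weed.
print(fuse_estimates([242.0, 245.0, 310.0]))   # -> 243.5; the weedy subregion is ignored
```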

Characterization of precision planted tomato seedling colinearity

Marchant et al. (1997) described the task of defining the true position of the row, for establishing the accuracy of generic row following, as subjective due to vagaries of individual plant position and rows not being exactly straight. They settled on using the middle of the plant leaf area, as seen from above, as the definition of the row location. This definition is best suited for spraying tasks such as the precision guided spraying task described by Giles and Slaughter (1997). In the case of mechanical cultivation the best definition of individual plant position is the location where the plant stem emerges from the soil, because the foliage may not be centered above the stem and it is possible for a mechanical cultivator to be centered about the foliage while the stem is being cut by the cultivation disk.

To estimate the lack of colinearity of direct seeded processing tomato plants in a typical commercially planted seedline, a 2 m metal straight edge was placed on the soil parallel to the seedline and gently positioned until the straight edge touched the stems of at least two tomato seedlings. The perpendicular offset of each tomato seedling from the straight edge was then measured along the 2 m length. This process was replicated at four random locations in a commercial tomato field in Yolo County, CA. The overall root mean square (RMS) deviation of the seedling position from a straight line was then calculated.

Characterization of manual/cone wheel guided cultivation

To establish a baseline by which to judge the performance of a machine vision guidance system, the performance of several traditional manual/cone wheel guided cultivators was evaluated. Four commercial processing tomato farms near the UC Davis campus were randomly selected and the cultivator guidance performance of their usual cultivator operator was measured. All tests were conducted immediately after cultivation and without the operator's prior knowledge. For each test, 60 m subsections of four different passes were randomly selected, and the distance between the two marks left in the soil by the close cultivation disks and the distance between the mark on one side and the point at which the tomato plant entered the soil were measured at 1 m intervals.


To adjust for variations in cultivation width due to irregular height control, each distance between a tomato plant and the corresponding cultivation mark was multiplied by the ratio of the actual distance between the cultivation disks (when at the proper height) to the distance between the two disks' soil marks. The overall RMS error between the desired cultivation-disk-to-plant distance and the actual distance measured was determined for each operator. On one farm a second operator's performance was also evaluated at a later time, when the usual operator was unavailable due to illness. This second operator had significantly less experience operating the cultivator than the usual operator.

Field testing of machine vision guidance system

A series of four tests was conducted in commercial processing tomato fields (single row/bed) to evaluate the performance of the machine vision guided cultivator under different weed loads. Three of the tests were conducted at the traditional time, with the crop at the second true leaf stage. Three fields were selected in which the weed to tomato ratio (calculated on a top-view area basis) was 1:10 (defined as low), 1:1 (defined as moderate), and 3:1 (defined as high). The fourth test was conducted in a weed free (defined as none) field when the tomato plants were at the cotyledon stage; this represented the ideal visual condition since the stems were well centered below the foliage. All tests were conducted on windless days, using two cameras positioned as shown in Figure 1, and at a travel speed of 8 km/h. The tractor was operated by students with no commercial experience operating a cultivator. The tests were conducted in the middle of the day without the shade shown in Figure 4 because no shadows obscured the seedline at this time. The tilt angle of the cameras was set so that the field of view included approximately 2 m of seedline. The overall RMS error between the desired cultivation-disk-to-plant distance and the actual distance measured was determined for each weed load condition in the same replicated manner described for the human operators.

Additional field tests of the machine vision guided cultivator were conducted in other row crops (Iceberg lettuce and cotton). The weed loads in these fields were low for lettuce and low to moderate for cotton, and both crops were cultivated at the second true leaf stage. All tests were conducted on windless days. The tractor was operated by a person with some tractor experience but with no commercial experience operating a cultivator. The tests were conducted in the middle of the day without the shade shown in Figure 4, as no shadows obscured the seedline at this time. In both cases the uniformity of crop plants in the seedline was very good and only one camera was used. The tilt angle of the camera was set so that the field of view included approximately 2 m of seedline. For the cotton test the travel speed was 8 km/h. The soil conditions in the lettuce field were such that travel speeds of 16 km/h were possible without substantial vibration of the machine vision system, so the tests in lettuce were conducted at this speed. The overall RMS error between the desired cultivation-disk-to-plant distance and the actual distance measured was determined for each crop in the same replicated manner described for the human operators.
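For clarity, the width normalization and RMS error used in these evaluations can be written as follows; the symbols are introduced here for illustration only and do not appear in the original paper:

$$ d_i' = d_i \cdot \frac{W_\mathrm{disk}}{W_\mathrm{mark}}, \qquad \mathrm{RMS\ error} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(d_i' - d_\mathrm{desired}\right)^2} $$

where $d_i$ is the measured distance from plant $i$ to the cultivation mark, $W_\mathrm{disk}$ is the distance between the cultivation disks at the proper operating height, $W_\mathrm{mark}$ is the distance between the two soil marks, and $d_\mathrm{desired}$ is the intended disk-to-plant distance.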


A final test of the machine vision guided cultivator was conducted on the UC Davis campus. For this test the movable toolbar was used to plant tomatoes in a curved row. The purpose of this test was to demonstrate the row-following capability of the guidance system under an extreme condition, one which a standard cone wheel guided cultivator was not capable of following. For this test tomatoes were planted in a row with a single S-shaped curve of 12 cm peak-to-peak amplitude in the middle of a 150 m long test plot. Initially the row was located at the center of a standard 1.5 m wide tomato bed. At the end of the S-shaped curve the row was located 12 cm to the left of the bed center. When the plants were at the 2 to 3 true leaf stage the test bed was cultivated using the machine vision guidance system. The weed load was low at the time of cultivation. The test was conducted at night using the shade frame shown in Figure 4 equipped with lights. There was no wind during the test and the travel speed was 8 km/h. The overall RMS error between the desired cultivation-disk-to-plant distance and the actual distance measured was determined in the same manner described for the human operators; however, the test was not replicated as in the previous tests.

Results

The average RMS deviation of tomato seedlings from a straight line over 2 m subsections of the row in one commercial processing tomato field was found to be 7.5 mm. Among the factors affecting seedling colinearity were soil texture (e.g. clods) and seed bounce during planting. Observation of precision planting showed that, despite precise release, tomato seeds often landed more than 7 cm from the point of initial impact due to bouncing. A lack of colinearity in the seedline introduces a limit to the guidance accuracy that can be achieved by any system which uses data from multiple seedlings to establish the center of the row.

The overall RMS cultivator positioning error for the five operators studied ranged from 6.7 mm to 19.3 mm (Table 3). Operator no. 1 was generally regarded by his supervisor as a very good cultivator operator and had the lowest error rate, traveled the slowest, and had his cultivation tools the closest together of the operators evaluated. Operator no. 4 was the substitute for operator no. 1 when he was unavailable due to illness. The skill level of operator no. 4 was inferior to that of no. 1, with nearly double the RMS error. Generally, operators adjust the spacing between the cultivation tools to match their skill level. In this study only operators 4 and 5 were observed killing crop plants with the cultivation tools. A computer simulation using 10 million random observations, generated from each operator's tool spacing and RMS error and the RMS deviation of the tomatoes reported above, was conducted to determine the probability that the operator would kill crop plants due to errors in steering. In general the probabilities were quite low, with operator no. 4 being the highest at 0.013, indicating that the tool spacings were fairly well matched to the operators' skill levels.
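A minimal sketch of a Monte Carlo simulation of this kind is shown below. It assumes that the lateral guidance error and the seedling offset from the row line are independent, zero-mean normal variables with the reported RMS values; the 10-million-draw count matches the text, while the distributional assumptions and function names are ours.

```python
import numpy as np

def prob_plant_kill(tool_spacing_mm, guidance_rms_mm, seedling_rms_mm=7.5,
                    n_draws=10_000_000, seed=0):
    """Estimate the probability that steering error puts a cultivation disk on a plant stem.

    A plant is 'killed' when the combined lateral deviation of the tool centerline
    (guidance error) and of the plant stem (seedling colinearity) exceeds half the
    clear spacing between the two close-cultivation disks.
    """
    rng = np.random.default_rng(seed)
    guidance = rng.normal(0.0, guidance_rms_mm, n_draws)   # tool centerline error
    seedling = rng.normal(0.0, seedling_rms_mm, n_draws)   # plant offset from the row line
    relative = guidance - seedling                         # plant position relative to tool centerline
    return np.mean(np.abs(relative) > tool_spacing_mm / 2.0)

# Example with the reported values for operator no. 4 (76 mm tool spacing, 13.3 mm RMS error);
# the result should be of the same order as the 0.013 probability reported in Table 3.
print(prob_plant_kill(76.0, 13.3))
```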


Table 3. Cultivator guidance performance of five manual/cone guided cultivators.

Operator  Skill level     Travel speed  Distance between          Overall RMS  Probability of steering
                          (km/h)        cultivation tools (mm)    error (mm)   errors killing crop plants
1         Very good       3             76                        6.7          0.00016
2         Average         5             127                       10.7         0
3         Average         5             127                       10.8         0
4         Below average   4             76                        13.3         0.013
5         Poor            5             127                       19.3         0.0022

In general, the color segmentation of the true color images into crop and background classes was acceptable. The foliage color of many weeds in Northern California is similar to that of processing tomato foliage. A color image of a typical commercial California processing tomato field prior to first cultivation is shown in Figure 6a. In Figure 6b, the performance of the Bayesian color classifier can be seen: the pixels classified into the background class were changed to black and the color of the crop class pixels was not modified. Many of the weeds in this image (6b) were classified as crop because their color was similar to that of the crop. In actual field use no pixels in the digitized image were actually modified as shown in Figures 6b or 6c; these images were created to illustrate the system performance.

The spatial histogram of this moderately weedy tomato seedline is shown in red in Figure 6c. To allow improved real-time performance, only those pixels in a trapezoidal ROI centered about the seedline (shown by the blue or yellow pixels in Figure 6c) were analyzed. The non-linear spacing of the pixels analyzed in the ROI in the direction of travel was selected to subsample the image at roughly equal spacing along the actual seedline. To construct the spatial histogram each pixel in the ROI is classified as background or crop. In Figure 6c those pixels classified as background are shown in blue while those classified as crop are shown in yellow. The yellow pixels of Figure 6c correspond to the ROI subset of the crop plant pixels shown in Figure 6b. Only the crop plant pixels shown in yellow in Figure 6c were used to build the spatial histogram overlaid in red at the bottom of Figure 6c. The spatial histogram shows the distribution of crop colored foliage across the bed for the length of bed shown in the image. The spatial median of the spatial histogram is also shown in Figure 6c as a red line parallel to the direction of travel, overlaid on top of the image, and in this example gives an accurate estimate of the seedline location. Also shown in Figure 6c, as red lines perpendicular to the direction of travel, are the subregion division lines. The subregion medians were also calculated for this example and are shown as fine red lines parallel to the direction of travel within each subregion; however, in 5 of the 7 subregions in this example the subregion medians were identical to the overall median and are hidden by the red line showing the location of the overall spatial median.

The performance of the machine vision guided cultivator under different weed loads is shown in Table 4. The overall RMS error ranged from 4.2 mm under ideal conditions to 11.9 mm under high weed loads.

Table 4. Cultivator guidance performance of the machine vision guided cultivator at various weed loads.

Weed load  Tomato plant size  Weed to tomato  Overall RMS  Minimum cultivation
                              area ratio      error (mm)   tool spacing (mm)
None       Cotyledon          -               4.2          65
Low        2nd true leaf      1:10            6.6          76
Moderate   2nd true leaf      1:1             9.1          89
High       2nd true leaf      3:1             11.9         107

The machine vision guided cultivator performed comparably to the very good human operator under low weed conditions and comparably to the average operators when the weed load was moderate to high. All tests were conducted at a travel speed of 8 km/h, which was more than twice the speed of the very good operator. The guidance error over a 60 m row subsection typical of the low weed condition is shown in Figure 7. The minimum theoretical cultivation tool spacing that would provide the machine vision guided cultivator with the same probability of killing crop plants as the very good operator is also shown in Table 4. These results show that operators with average or below average skills would benefit in both performance and travel speed from the use of a machine vision guidance system.

Figure 7. Guidance performance of the machine vision guided cultivator operating in a field with a low weed load.


The Bayesian color classifier trained on tomato seedlings was found to perform acceptably for both lettuce and cotton. Cotton leaves tended to have a more glossy surface than tomato or lettuce leaves, and images of cotton leaves collected without the shade shown in Figure 4, while traveling toward the sun at sunrise, were subject to substantial amounts of glare. Glare caused the associated pixels to saturate, resulting in white spots in the image and poor color classification. Use of the shade allowed the system to operate normally under these conditions. The performance of the system (Table 5) was comparable to that observed in low to moderately weedy tomato fields.

The soil conditions in the lettuce field were such that it was possible to increase the travel speed to 16 km/h without substantial vibration of the toolbar. In other fields the soil texture (e.g. clods) was such that significant amounts of vibration occurred, causing problems with the quality of the video signal. It was thought that these problems were mainly associated with electrical connections that became erratic under vibration. Excessive vibration also caused problems such as the camera lens coming unscrewed after several hours of operation. These results indicate that significant increases in travel speed could be achieved if the vibration issues could be resolved.

The performance of the machine vision guidance system following a 12 cm peak-to-peak S curve in the seedline under low weed conditions is shown in Figure 8. The RMS error in following the curve was 5.2 mm. Analysis of the error at the peaks of the curve shows a slight amount of overshoot, but the error was comparable to that of a very good operator following a basically straight seedline. This extreme test validates the ability of the spatial median to accurately locate the center of the seedline and illustrates the advantage that machine vision guidance has over conventional cone guided cultivation in following the seedline independent of the condition or location of the edge of the bed.

Conclusions

This study demonstrated that color segmentation and an algorithm based upon the median of the spatial distribution of the seedline can be used with off-the-shelf machine vision hardware to develop a real-time guidance system for row crop cultural practices such as cultivation. The precision of the system was comparable to that of an average manual/cone guided cultivator and was demonstrated at travel speeds up to 16 km/h. Under low weed loads the performance of the system was comparable to that of a very good cultivator operator, but at twice the travel speed.

Table 5. Cultivator guidance performance of the machine vision guided cultivator in cotton and lettuce fields.

Crop     Travel speed (km/h)  Overall RMS error (mm)
Cotton   8                    5.8
Lettuce  16                   7.6


Figure 8. Guidance performance of the machine vision guided cultivator following an S curve in the seedline.

This system creates an opportunity for improved cultivation results using less skilled operators. The ability of the system to operate 24 hours a day provides farmers with added flexibility in completing cultivation tasks when schedules are disrupted by unfavorable weather conditions. The increased travel speed may allow larger farms to reduce costs by reducing the number of cultivators operating at the same time. Additional work is needed to increase the robustness of the system with respect to issues such as vibration and dust, and to develop cultivation tools specifically designed for high speed operation.

Acknowledgments

The authors would like to thank Gene Miyao of the University of California and the California Tomato Research Institute for their support of this research.

References
J. Billingsley and M. Schoenfisch, The successful development of a vision guidance system for agriculture. Computers and Electronics in Agriculture 16, 147-163 (1997).
G. A. Buchanan and E. R. Burns, Influence of weed competition on cotton. Weed Science 18, 149-154 (1970).
W. J. Chancellor, Substituting information for energy in agriculture. Transactions of the ASAE 24(4), 802-807, 813 (1981).
C. C. Dowler and E. W. Hauser, Weed control system in cotton on Tifton sandy loam soil. Weed Science 23, 40-42 (1975).
R. O. Duda and P. E. Hart, Use of the Hough transformation to detect lines and curves in pictures. Communications of the ACM 15(1), 11-15 (1972).
R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis (John Wiley and Sons, Inc., New York, 1973).
B. W. Fehr and J. B. Gerrish, Vision-guided row-crop follower. Applied Engineering in Agriculture 11(4), 613-620 (1995).
Y. Fujii and M. Hayashi, Boundary detecting method and apparatus for automatic working vehicle. US Patent No. 4,868,752 (1989).
D. K. Giles and D. C. Slaughter, Precision band spraying with machine-vision guidance and adjustable yaw nozzles. Applied Engineering in Agriculture 40(1), 29-36 (1997).
P. V. C. Hough, Method and means for recognizing complex patterns. US Patent No. 3,069,654 (1962).
G. Jahns, Automatic guidance in agriculture: a review. ASAE Paper No. 83-404 (ASAE, St. Joseph, MI, 1983).
M. S. Kaminaka, G. E. Rehkugler, and W. W. Gunkel, Visual monitoring in a simulated agricultural machinery operation. Human Factors 23(2), 165-173 (1981).
J. A. Marchant and R. Brivot, Real-time tracking of plant rows using a Hough transform. Real-Time Imaging 1, 363-371 (1995).
J. A. Marchant, T. Hague, and N. D. Tillett, Row-following accuracy of an autonomous vision-guided agricultural vehicle. Computers and Electronics in Agriculture 16, 165-175 (1997).
R. J. Palmer, Energy developments. Energex Conference Proceedings, May, pp. 691-696 (1984).
R. J. Palmer and S. K. Matheson, Impact of navigation on farming. ASAE Paper No. 88-1602 (ASAE, St. Joseph, MI, 1988).
R. L. Parish, D. B. Reynolds, and S. H. Crawford, Precision-guided cultivation techniques to reduce herbicide inputs in cotton. Applied Engineering in Agriculture 11(3), 349-353 (1995).
J. F. Reid and S. W. Searcy, Detecting crop rows using the Hough transform. ASAE Paper No. 86-3042 (ASAE, St. Joseph, MI, 1986).
J. F. Reid and S. W. Searcy, Automatic tractor guidance with computer vision. SAE Technical Paper Series No. 871639 (1987).
D. C. Slaughter, Color Vision for Robotic Orange Harvesting. PhD dissertation, University of Florida (1987).
D. C. Slaughter, P. Chen, and R. G. Curley, Computer vision guidance system for precision cultivation. ASAE Paper No. 97-1079 (ASAE, St. Joseph, MI, 1997).
L. Tian, Knowledge Based Machine Vision System for Outdoor Plant Identification. PhD dissertation, University of California, Davis (1995).
N. D. Tillett, Automatic guidance sensors for agricultural field machines: a review. Journal of Agricultural Engineering Research 50, 167-187 (1991).
R. D. Tillett, A calibration system for vision-guided agricultural robots. Journal of Agricultural Engineering Research 42, 267-273 (1989).
