Consistent annual treatment helps future olive leaf spot control


Authors

Beth L. Teviotdale, Department of Plant Pathology
G. Steven Sibbett, UC Cooperative Extension

Publication Information

California Agriculture 49(5):27-32. https://doi.org/10.3733/ca.v049n05p27

Published September 01, 1995


Abstract

Tulare County research revealed that an orchard's disease history influences how well annual copper fungicide treatments will work in controlling olive leaf spot. Elevated disease levels are not easily reduced in 1 year, and consistent annual treatment is important for future disease control as well as for protection in the current year.

Full text

Above, olive leaf spot, also known as peacock spot and Cycloconium leaf spot, causes brightly colored lesions to appear on olive leaves. Near right, greasy black lesions are seen in winter and spring. Far right, spores are produced on the margins of oversummering lesions. Topical copper applications can prevent these spores from causing further infection.

Olive leaf spot (OLS), also known as peacock spot and Cycloconium leaf spot, is caused by the fungus Spilocaea oleaginea (Cast.) Hughes. Susceptible olive (Olea europaea L.) cultivars grown commercially in California include ‘Mission’, ‘Sevillano’, ‘Manzanillo’, ‘Ascolano’, ‘Barouni’ and ‘Nevadillo’. The principal symptom of OLS is one or more dark green-to-black lesions, sometimes surrounded by a yellow halo, on the upper surface of leaves. Petioles, fruit and fruit stems are susceptible but rarely attacked. Premature drop of infected leaves causes twig death and exposes leaf scars to infection by the olive knot pathogen, Pseudomonas syringae pv. savastanoi. A loss of 10 to 20% of fruiting wood has been attributed to OLS when the disease is severe. Although serious industry-wide damage is not common in California, OLS is a chronic problem in some orchards and a potential threat to others.

In California, infection occurs during the rainy season in late fall, winter and early spring. New lesions and leaf drop become evident in spring, and the disease is inactive during the hot dry summers. The OLS fungus oversummers in lesions on leaves in the tree. In the fall the margins of these lesions expand laterally into adjacent healthy tissue, where spores called conidia are produced. These conidia are the principal inoculum for subsequent infection and are dislodged and moved by water.

Tree-to-tree spread is slow. Continuous free moisture for 24 to 36 hours is required for germination and infection, so wet weather favors the disease. Heavy rainfall years promote more OLS, and the disease is often more prevalent in the lower portions of trees, reflecting the downward movement of waterborne inoculum. In many olive-growing regions of California, dense fog prevails for several weeks during winter. Free moisture frequently is present on leaf surfaces for hours or days. OLS infections can occur under such conditions even when rainfall is not unusually abundant. OLS is more serious in humid olive-growing areas of the world, where extended periods of wet weather permit repeated disease cycles.
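The wetness requirement lends itself to a simple screening rule. Below is a minimal sketch, in Python, of how hourly leaf-wetness records might be scanned for runs long enough to permit infection; the 24-hour threshold is taken from the text, while the record format and the function itself are illustrative assumptions, not part of the original study.

    # Sketch: flag periods when continuous leaf wetness could permit OLS
    # infection. The 24-hour threshold comes from the article; the hourly
    # wetness record (True = free moisture present) is a hypothetical format.
    def infection_risk_periods(wet_hours, threshold=24):
        """Return (start_index, length) of every wet run >= threshold hours."""
        periods = []
        run_start, run_len = None, 0
        for i, wet in enumerate(wet_hours):
            if wet:
                if run_start is None:
                    run_start = i
                run_len += 1
            else:
                if run_len >= threshold:
                    periods.append((run_start, run_len))
                run_start, run_len = None, 0
        if run_len >= threshold:  # run extends to the end of the record
            periods.append((run_start, run_len))
        return periods

    # Example: 30 consecutive wet hours inside a 48-hour record.
    record = [False] * 10 + [True] * 30 + [False] * 8
    print(infection_risk_periods(record))  # [(10, 30)]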

Topical copper applications prevent olive leaf spot. The copper is present in various products that are generally known as either “fixed coppers” or “Bordeaux.” (Fixed copper is an inclusive term for any of several basic copper salts. Bordeaux is a combination of copper sulfate and hydrated lime.) One annual copper fungicide application made after harvest in late fall, usually late October or early November to precede winter rains and fog, is recommended for prophylactic treatment of California olive groves. Coincidentally, this treatment also protects trees from infection by the olive knot bacterium.
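Bordeaux mixtures are conventionally named by their makeup: pounds of copper sulfate, pounds of hydrated lime, and gallons of water (for example, the 10-10-100 formulation used later in this article). A minimal sketch of scaling such a recipe to a given tank size; the function and the example tank volume are illustrative:

    # Sketch: scale an X-Y-100 Bordeaux recipe (X lb copper sulfate, Y lb
    # hydrated lime per 100 gallons of water) to an arbitrary tank volume.
    def bordeaux_charge(tank_gallons, cuso4_lb_per_100=10.0, lime_lb_per_100=10.0):
        factor = tank_gallons / 100.0
        return cuso4_lb_per_100 * factor, lime_lb_per_100 * factor

    # A full 100-gallon handgun-sprayer tank of 10-10-100 Bordeaux:
    cuso4, lime = bordeaux_charge(100)
    print(f"{cuso4:.1f} lb copper sulfate, {lime:.1f} lb hydrated lime")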

Grower treatment is often inconsistent. Some orchards are not treated every year, while others are treated a second time in late winter or early spring for added disease control. Skipping years of treatment saves application costs but risks loss to disease, whereas two annual applications may help reduce olive knot but have little effect on OLS.

Disease and treatment history may have cumulative effects on the severity of OLS in any given year. Early researchers observed that OLS declined in trees treated annually, and in one experiment trees that were treated one season had less OLS than nontreated trees the following year, when all the trees were left untreated. In previous work, we noticed that elevated disease levels are difficult to reduce by annual treatment. In this article we report the results of two experiments that describe the long-term impact of annual and biennial schedules of one and two applications of various copper fungicides on OLS persistence.

Two experiments

Each experiment was conducted in a separate, mature commercial Manzanillo olive orchard in Tulare County, California. Experiment 1 had 45 trees and Experiment 2 had 35 trees. Trees in both orchards were spaced 30 feet by 30 feet, and in most years prior to the experiments had been treated in November with a copper fungicide. One orchard (Experiment 1) was irrigated with flat furrows and the other (Experiment 2) with solid set sprinklers. The orchards were lightly pruned each experimental year. Experimental materials were applied with a 100-gallon-capacity FMC Bean handgun sprayer operated at 300 psi. Approximately 7.5 gallons of fungicide solution were applied per tree.

Unless otherwise specified, rainfall data are reported for the period September through May of each year. Data were procured from the UC Lindcove Field Station, located approximately 4 miles from Experiment 1, and from the California Irrigation Management Information System in Visalia, located 6 miles from Experiment 2.

Experiment 1

In 1985 and 1986 we compared the efficacy of several fixed copper fungicides with Bordeaux for control of OLS. After this experiment was completed, we observed residual control of OLS in the experimental trees following 1 year of no treatment and a second year of commercial treatment.

One and two applications of three fixed copper fungicides and Bordeaux were tested for control of OLS in 1985-86. The fixed coppers, 5.0 pounds product per 100 gallons water, were (1) cupric hydroxide, 53% copper; (2) tribasic copper sulfate, 53% copper; and (3) cuprous oxide, 50% copper. Bordeaux consisted of 10.0 pounds copper sulfate and 10.0 pounds hydrated lime per 100 gallons water, commonly referred to as 10-10-100 Bordeaux. The fungicides were applied to the experimental trees for 2 years. Trees were sprayed on November 2, 1984, and November 7, 1985. Those trees designated to be treated twice each year were sprayed again the following January — on January 10, 1985, and January 6, 1986. A nontreated control treatment was included. There were five single-tree replications of each treatment arranged in a randomized complete block design.
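Because the products differ in copper content, the rates above can be compared on a metallic-copper basis. A worked sketch follows; the product rates and copper percentages are from the text, while the metallic-copper fraction assumed for copper sulfate pentahydrate (63.55 / 249.69 by molar mass, about 25%) is an outside figure, so the Bordeaux line is approximate.

    # Sketch: approximate metallic copper delivered per 100 gallons by each
    # material in Experiment 1. Product rates and copper percentages are from
    # the text; the pentahydrate copper fraction is an outside assumption.
    rates = {
        "cupric hydroxide":        (5.0, 0.53),
        "tribasic copper sulfate": (5.0, 0.53),
        "cuprous oxide":           (5.0, 0.50),
        "10-10-100 Bordeaux":      (10.0, 63.55 / 249.69),  # copper sulfate portion
    }
    for material, (lb_product, cu_fraction) in rates.items():
        print(f"{material}: {lb_product * cu_fraction:.2f} lb metallic Cu per 100 gal")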

In the third winter (1986-87) all trees were left untreated. In November 1987 (winter 1987-88) they were again treated by the grower using 5-10-100 Bordeaux, 500 gallons of solution per acre. The grower treatment was applied by an air-carrier orchard sprayer operated at 2 miles per hour.

After each of two winters of treatments, we evaluated disease levels on April 28, 1985, and April 26, 1986; after a winter of no treatment, on April 22, 1987; and after the winter of grower treatment, on May 5, 1988. In 1985 very little disease developed, and leaf lesions were difficult to find. In that year we used a scale of 1 to 5, based upon ease of finding lesions on a tree, to evaluate disease severity. The ratings were 1 = no lesions; 2 = holdover lesions infrequent, no new lesions present; 3 = holdover and new lesions infrequent; 4 = holdover and new lesions few; and 5 = several holdover and new lesions present. In all other years we arbitrarily selected 20 one-year-old shoots (identified as those having at least 10 nodes [20 leaves] and bloom present on the day of evaluation) from the lower periphery of each tree. Mature leaves at 10 adjacent pairs of leaf nodes on each shoot were classified as healthy or diseased (one or more OLS lesions present), and each category was expressed as percent of total leaves.
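The leaf counts translate into incidence percentages in a straightforward way: 20 shoots times 10 node pairs times 2 leaves gives 400 leaves per tree, and each category is expressed against that total. A minimal sketch with hypothetical counts:

    # Sketch: per-tree disease incidence as described above -- 20 shoots,
    # 10 adjacent leaf-node pairs per shoot (400 leaves per tree), each leaf
    # scored healthy or diseased. The counts below are hypothetical.
    def incidence(diseased_counts, healthy_counts, leaves_per_shoot=20):
        total = len(diseased_counts) * leaves_per_shoot
        pct_diseased = 100.0 * sum(diseased_counts) / total
        pct_healthy = 100.0 * sum(healthy_counts) / total
        return pct_diseased, pct_healthy  # missing leaves make these sum < 100

    # 20 shoots, e.g. 3 diseased and 15 healthy leaves found on each shoot.
    d = [3] * 20
    h = [15] * 20
    print(incidence(d, h))  # (15.0, 75.0); the rest were lost before evaluation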

Data were analyzed by analysis of variance, and two contrasts were performed (all treatments vs. control, and all annual vs. all biennial treatments). In addition, means for materials were separated independently for one and two applications by Duncan's multiple range test.
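For readers who want to reproduce this style of analysis, the sketch below runs a randomized-complete-block ANOVA on hypothetical data with statsmodels. The file name and column names are assumptions; Duncan's multiple range test has no standard statsmodels routine, so Tukey's HSD is shown as a stand-in for the mean-separation step, not as the test the authors used.

    # Sketch: randomized-complete-block ANOVA of the kind described above,
    # on a hypothetical data file with columns: block, treatment, pct_diseased.
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    df = pd.read_csv("ols_leaves.csv")  # hypothetical file
    model = ols("pct_diseased ~ C(block) + C(treatment)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))          # block and treatment effects
    print(pairwise_tukeyhsd(df["pct_diseased"], df["treatment"]))  # stand-in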

Results of Experiment 1

Disease data gathered during the 1985-86 efficacy experiment are repeated here to facilitate discussion of our present objective, which was to observe residual control following a year of no treatment and a second year of uniform commercial treatment.

During the experiment (1985 and 1986), there were significantly fewer diseased and more healthy leaves in the treatments than in the control (table 1). No significant differences were found between one and two applications or among materials for disease rating, or for percent diseased or healthy leaves.

In the first spring after the efficacy experiment ended (1987), following the winter of no grower treatment, there were significantly fewer diseased and more healthy leaves in treatment trees than in control trees (table 1). In spring 1988, after the winter of grower treatment of all trees, there were significantly more diseased and healthy leaves in treatment trees than in control trees. Control of OLS was superior where two applications had been made during the efficacy experiment: percent diseased leaves was less and percent healthy leaves greater in the two-application treatments in 1987 (when none of the trees had been treated the previous winter), and percent healthy leaves also was greater in these two-application treatments in 1988 (when all trees had been treated commercially the previous fall). In 1987, following the winter of no treatment, experimental trees treated once with cuprous oxide had significantly fewer diseased leaves than those treated once with Bordeaux, cupric hydroxide or tribasic copper sulfate. Differences in diseased or healthy leaves were not found among trees that had previously been treated twice. In spring of 1988, trees treated twice during the efficacy experiment with cuprous oxide had the fewest diseased leaves. As measured by percent healthy leaves, cuprous oxide and Bordeaux provided equivalent control; cuprous oxide was superior to cupric hydroxide and tribasic copper sulfate; and Bordeaux was similar to cupric hydroxide and better than tribasic copper sulfate.

Fig. 1. Experiment 1: Extended effects of one and two annual applications of four copper fungicides on incidence of olive leaf spot. Average of 400 leaves evaluated late April or early May each year.

TABLE 1. Experiment 1 — Extended effects of one and two annual applications of four copper fungicides on incidence of olive leaf spot, Tulare County

Experiment 2

Our objectives in Experiment 2 were (1) to compare annual and biennial schedules of one (November or January) and two (November and January) applications per season for control of OLS; and (2) to determine the effects of these strategies on disease levels following 3 years of commercial treatment. Annual applications of 3.0 pounds cupric hydroxide, 53% copper, per 100 gallons of water, approximately 7.5 gallons of solution per tree, were made using the handgun sprayer described earlier. Single applications were made on November 2, 1988, November 6, 1989, and November 5, 1990; or on January 4, 1989, January 11, 1990, and January 9, 1991. Trees that received two applications were sprayed on the November and January dates each year. All biennial treatments were identical except that applications were not made in winter 1989-90. A nontreated control was included. After our experiment was completed, the grower treated all trees annually in November for the next three winters, using 15.0 pounds cupric hydroxide per acre in 500 gallons of water. These commercial treatments were applied by air-carrier orchard sprayer operated at 2 miles per hour.
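The grower's post-experiment rate is consistent with the experimental concentration, which a little arithmetic confirms: 15 pounds in 500 gallons per acre is 3.0 pounds per 100 gallons, the same strength applied by handgun. A sketch of the check; all figures are from the text, and tree density follows from the 30-by-30-foot spacing reported for both orchards.

    # Sketch: check that the grower's rate (15 lb cupric hydroxide in 500 gal
    # per acre) matches the experimental concentration (3.0 lb per 100 gal).
    SQFT_PER_ACRE = 43_560
    trees_per_acre = SQFT_PER_ACRE / (30 * 30)      # about 48 trees per acre
    grower_lb_per_100_gal = 15.0 / 500 * 100        # 3.0 lb per 100 gal
    cu_per_tree = 3.0 / 100 * 7.5 * 0.53            # lb metallic Cu per tree
    print(f"trees per acre: {trees_per_acre:.1f}")
    print(f"grower concentration: {grower_lb_per_100_gal:.1f} lb/100 gal")
    print(f"metallic Cu per handgun-treated tree: {cu_per_tree:.3f} lb")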

There were five single-tree replications of each treatment, arranged in a randomized complete block design.

Prior to the experiment, we selected 10 one-year-old shoots, each bearing 10 adjacent pairs of healthy mature leaves on the north side of each experimental tree. The north side of olive trees in this location often remains shaded and wetter for longer periods and thus is more prone to OLS infections. Tags were attached behind the 10th pair of mature leaves distal to the apex on each selected shoot. Following treatment, these leaves were rated as diseased or healthy on May 3, 1989, May 11, 1990 and May 1, 1991. The severe freeze in December 1990 destroyed several shoots and caused excessive leaf drop on others. Thus data from May 1991 include some replications for which there were fewer than 10 shoots.

After termination of our treatments, we continued disease evaluation following the three winters of grower treatment. Because all leaves on the tagged shoots were lost to disease, weather or natural aging by spring 1992, we used the same method of selecting 1-year-old shoots described for Experiment 1, except that shoots were chosen only on the north side of the tree. We measured disease on each experimental tree on May 15, 1992, May 7, 1993, and May 13, 1994. At the end of this post-experiment observation period in 1994, we assessed the OLS level in the orchard near but outside the experimental area (where commercial annual November treatment continued throughout the course of our activities). The OLS incidence was measured on May 18, 1994, on 20 arbitrarily selected 1-year-old shoots on the north side of each of five adjacent trees located two rows north of the experimental site.

TABLE 2. Experiment 2 — Extended effects of annual and biennial applications of cupric hydroxide on incidence of olive leaf spot, Tulare County

Data were analyzed by analysis of variance, and four contrasts were performed (all treatments vs. control; all annual vs. all biennial treatments; all one-application vs. all two-application treatments; and all November vs. all November and January treatments).
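Each planned contrast amounts to a zero-sum set of coefficients over the seven treatment means (six spray schedules plus the control). A minimal sketch of evaluating one of them, annual versus biennial; the treatment means shown are hypothetical percent-diseased values, not data from the experiment.

    # Sketch: one planned contrast (all annual vs. all biennial treatments)
    # over the seven treatment means. Means are hypothetical; the
    # coefficients sum to zero, as a contrast requires.
    means = {"annual Nov": 5.1, "annual Jan": 6.0, "annual Nov+Jan": 3.2,
             "biennial Nov": 9.4, "biennial Jan": 10.1, "biennial Nov+Jan": 7.0,
             "control": 18.5}
    coef = {"annual Nov": 1/3, "annual Jan": 1/3, "annual Nov+Jan": 1/3,
            "biennial Nov": -1/3, "biennial Jan": -1/3, "biennial Nov+Jan": -1/3,
            "control": 0.0}
    estimate = sum(coef[t] * means[t] for t in means)
    print(f"annual minus biennial, percent diseased: {estimate:.2f}")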

Results of Experiment 2

Our first objective was to compare annual and biennial schedules of one (November or January) and two (November and January) applications per season for OLS control. Percent diseased leaves was significantly lower in treated trees than in nontreated control trees in all 3 years of the experiment (1989, 1990 and 1991), and percent healthy leaves was significantly greater in treated trees in 1990 (table 2). Only in 1991 was annual treatment significantly better than biennial treatment in reducing percent diseased leaves and increasing percent healthy leaves. Overall, two applications were superior to one in lowering percent diseased leaves in all 3 years, and in increasing percent healthy leaves in 1990 and 1991. However, the single November application did not differ significantly from the November and January applications in reducing percent diseased leaves in 1989 and 1990, or in increasing percent healthy leaves in 1989, 1990 and 1991.

The second objective of Experiment 2 was to determine the effects of annual and biennial schedules of one and two applications per season made during our experiment on OLS levels after our experiment, following 3 years (1992, 1993 and 1994) of uniform commercial treatment. In 1992 and 1993, control trees had significantly more diseased and fewer healthy leaves than did experimentally treated trees. Following grower treatments each previous fall, trees experimentally treated annually had significantly fewer diseased leaves in 1992, 1993 and 1994 and more healthy leaves in 1993 and 1994 compared to those experimentally treated biennially. (The contrast of annual vs. biennial treatment was significant at P = 0.079 for percent healthy leaves in 1992.)

Where two experimental applications per season had been made, percent diseased leaves was significantly less in 1992, 1993 and 1994 and percent healthy leaves significantly greater in 1993 and 1994 than where one application per season had been made. The single November treatment did not differ significantly from the November and January treatment in percent diseased leaves in 1992, 1993 and 1994, or in percent healthy leaves in 1992 and 1994. At a less rigorous significance level (P = 0.10), the November and January applications significantly reduced percent diseased leaves in 1993 and increased percent healthy leaves in 1994 compared to the single previous November application.

In the five trees nearby and outside the experimental site, evaluated in 1994, the average percent diseased and healthy leaves were 11.4 and 82.8, respectively. (We believe some natural, healthy leaf drop, similar in all treatments, occurred prior to measurement, leaving somewhat less than 100% of leaves accounted for: here, 100 - 11.4 - 82.8 = 5.8% of leaves were unaccounted for.) These values were comparable to those found in our annual November treatment (the grower standard) in 1994, indicating that the OLS level in the experimental area was similar to that in the surrounding orchard at the completion of the experiment.

Past treatments affect control

Disease patterns established by our experiments in each orchard were not obscured by 2 or 3 years of subsequent uniform treatment by the growers. We believe that the grower applications were constant among the experimental trees, and that the differences in OLS incidence reflected patterns established during the experiments.

Our decision to evaluate disease in late April or early May was a compromise between missing infected leaves in an early observation before some lesions developed and later missing other infected leaves lost to defoliation. We assumed that natural leaf drop was similar among the trees, regardless of treatment, whereas loss to OLS varied according to treatment: more infected leaves would reduce the number of healthy leaves, whether or not infected leaves had fallen when disease was evaluated. For example, in 1988 in Experiment 1, both percent healthy leaves and percent diseased leaves in the control trees were low and differed greatly from those in treatment trees. It is our opinion that OLS was more severe in control trees. Therefore fewer healthy leaves were present, and most diseased leaves had fallen prior to evaluation, so that fewer diseased leaves remained when counted. A similar situation occurred in the nontreated control in Experiment 2: percent diseased leaves was low and essentially unchanged during the 3 years of the experiment, but percent healthy leaves declined during that period, indicating leaf loss to increased infection.

Although significant differences among treatments did not develop early in our experiments, trends were established that in later years became statistically separable. This suggests that past OLS levels, as influenced by treatment strategy, contributed to the success of future control practices.

Past OLS levels determine the amount of potential initial inoculum present in the orchard. Although most infected leaves are shed, some remain, and some lesions on these become sources of inoculum. Most conidia produced in spring in California are not viable by summer; but those that survive may contribute to the initial inoculum, along with new conidia produced in fall. As our experiments show, disease (and inoculum) may gradually increase over time in trees left untreated, or treated sporadically.

Fig. 2. Experiment 2: Extended effects of annual and biennial applications of Kocide 101 (cupric hydroxide) on incidence of olive leaf spot. Disease evaluated in early May each year; averages of 200 leaves per tree during the experiment and 400 leaves per tree after the experiment.

Prior treatment may also play a role in subsequent treatment success. Copper residues, shown in previous research to be higher on leaves treated twice than on those treated once, persisted at least 10 months after application (California Agriculture, Sept-Oct 1989). In that study, the highest levels of copper residues were found on trees that had been treated twice with cuprous oxide. In the work reported here, the best OLS control in spring 1988 occurred on trees that had been treated twice with cuprous oxide 2 and 3 years before. We speculate that longevity of copper fungicide on olive leaves, coupled with redistribution of the material when wetted, provides some protection all year and into the next winter. The current year's shoot growth would receive its first copper with the annual November treatment; thus 2- and 3-year-old leaves may benefit from the accumulated copper applied over their lifetimes. Any reduction in disease that results should also reduce potential sources of inoculum for the following year.

Maintaining low OLS levels is important for future disease control. The continued increase in damage found in the experimental nontreated control trees after the experiments ended and grower treatment resumed indicates that, once elevated, OLS levels are difficult to reverse.

In Experiment 1, OLS was barely detectable when the experiment began but increased dramatically, especially in the nontreated control, in 1986 when rainfall was 16.4 inches (average rainfall is 9.7 inches). In spring 1987, after all experimental trees had been left untreated, OLS was more severe than during the experiment in 1986. Disease again increased in 1988, and OLS patterns established by our experiment persisted even though all trees had been uniformly treated by the grower the previous fall (fig. 1). The winters of 1986-87 and 1987-88 were relatively dry (7.7 and 8.5 inches). However, rainfall was less (3.5 inches) in 1986-87 than in 1987-88 (5.8 inches) during November through January, when most OLS infections occur.

In Experiment 2, OLS levels were much higher in the wet year 1993 (13.9 inches) than in the previous drier year (8.2 inches), and these higher OLS levels were maintained or further increased in 1994, even though all trees were treated by the grower in all 3 years. Benefits of our annual treatments were apparent the last year of the experiment and continued for the 3 additional years of grower treatment (fig. 2).

Summary

Our work demonstrates the importance of treatment consistency as the basis of a management strategy for long-term control of OLS. One annual application (usually late October or early November, to precede rain) should control the disease in most California orchards, but an added treatment in December or early January may be beneficial where conditions favorable to disease development frequently occur. Although the two-application treatments usually did not separate significantly from the single November application treatments in these experiments, OLS levels were consistently lower in the two-application (November and January) treatments from 1990 through 1994. Biennial or other sporadic treatment schedules are not recommended because they promote a gradual increase in OLS, which makes disease control more difficult over time.


Author notes

The authors wish to thank Dennis M. Harper, John Soares, Bret Allen and Nancy Goodell for technical assistance, Carol Adams for statistical advice, and the California Olive Commission for its support of this research.
