Rhizoctonia and Crown Rot status of Western Australian paddocks can be managed with crop rotation


Key Messages

  • A survey of farmers’ fields from 2010 to 2013 found little Rhizoctonia and Crown Rot across the WA Wheatbelt.
  • However, the data demonstrated that even with low levels of disease present, rotation with break crops can manage inoculum levels of Rhizoctonia and Crown Rot.
  • Inoculum levels are likely to build up when continuous cereals are grown, in fertile soils, and in dry summers.
  • Pastures did not provide a break between cereal crops, suggesting that unmanaged pastures can contribute to the Crown Rot problem.

Aims

Continuous cropping and minimum tillage have changed the agricultural landscape in WA, and it is important to monitor the system to determine whether our farming system is under threat. In the context of disease, the concern is that the threat to cereal crops will increase over time. For example, if the soil-borne disease Crown Rot increases from one year to the next over successive years, the farming system will become increasingly risky, as cereal crops will be more vulnerable to terminal drought. In marginal years, such an increase in Crown Rot could turn a moderately profitable season into a loss-making one.

For this reason, it is prudent to monitor the disease status of fields across the wheatbelt and determine whether the dominant diseases, Crown Rot and Rhizoctonia, are increasing, decreasing, or remaining constant. Each disease has two measurable components: the level of inoculum (measured in this survey using the SARDI PreDicta B service) and the incidence of disease (estimated in this instance by a DAFWA plant pathologist). Infection and expression of disease are a consequence of both the level of inoculum and the conduciveness of the season, so it is important to monitor both the pathogen and the disease to determine whether the farming system may be vulnerable to soil-borne diseases. Furthermore, trends may appear at a regional level; if an increase in a particular disease is detected, it is important to determine whether the increase has occurred on specific paddocks or randomly throughout the wheatbelt. Of particular concern is whether break crops are performing their expected function of reducing the expression of disease in both the break crop and subsequent cereal crops.

Here, we explore regional trends in the two major diseases of wheat, Crown Rot and Rhizoctonia, from 2010 to 2013. We then consider whether the problem sustains itself in an individual field, or whether management actions can successfully reduce the disease in the following season. Finally, we examine break crops and pastures and test whether break crops are indeed reducing disease levels.

Method

DAFWA staff monitored 180 selected paddocks annually for four years, from 2010 to 2013 inclusive. These paddocks were predominantly from the medium-rainfall region. Each paddock was monitored each year for crop or pasture type, soil type, nutrient levels and soil-borne disease status. Diseases were assessed four times through the season using a combination of the SARDI PreDicta B test, which measures pathogen DNA, and visual inspection of root health by a trained DAFWA pathologist. PreDicta B sampling took place prior to seeding and at anthesis, and visual assessments were made at the seedling stage and near anthesis. Data were recorded in an MS Access database and interrogated to explore the time trends of the two important diseases, Crown Rot and Rhizoctonia, in wheat crops, based on both the pathologist's visual assessments and the DNA levels. These analyses provide some insight, but simple approaches cannot capture the complex dynamics of disease. A regression and classification tree analysis of Crown Rot and Rhizoctonia was therefore also used to explore which factors lead to an increase in the inoculum of these diseases in wheat plants.
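
To make the analysis concrete, the sketch below (a minimal illustration only; the file and column names are assumptions rather than the actual survey database schema) shows one way the PreDicta B readings could be log-transformed and the year-to-year change in inoculum computed for each paddock.

```python
import numpy as np
import pandas as pd

# Hypothetical extract of the survey database: one row per paddock per year,
# with assumed columns: paddock, year, crownrot_dna, rhizoctonia_dna
survey = pd.read_csv("paddock_survey.csv")

# log10(x + 1) keeps zero readings defined and compresses the skewed DNA scale;
# the exact transform used in the survey is not stated, so this is one common choice
for col in ["crownrot_dna", "rhizoctonia_dna"]:
    survey["log_" + col] = np.log10(survey[col] + 1)

# Wide table: one row per paddock, one column per year of log Crown Rot DNA
wide = survey.pivot(index="paddock", columns="year", values="log_crownrot_dna")

# Year-to-year change in inoculum, e.g. the 2012 to 2013 change analysed below
wide["delta_2012_2013"] = wide[2013] - wide[2012]
print(wide["delta_2012_2013"].describe())
```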

Results

Crown Rot

Visual inspection of crops for Crown Rot identified a small but steady increase in incidence, from just 1% (1 plant/100) to 6% (6 plants/100) in 2012 (Figure 1a). The expression of Crown Rot in the surveyed plants then declined to just 1% in 2013. These changes in the observed expression of Crown Rot differed from those in the inoculum levels detected using SARDI PreDicta B. The log DNA scores increased from very low levels in 2010 and 2011 (~0.2) to slightly higher levels (~0.4) by 2013 (Figure 1b). The contrast between disease expression (the ratings) and inoculum levels suggests that more inoculum does not necessarily lead to more expression. However, inoculum build-up is considered the first step towards substantial infection that could lead to yield losses. The basis for the change in Crown Rot inoculum at the paddock level was therefore explored further using a regression and classification tree.
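
The survey's own regression and classification tree is not reproduced here, but a minimal sketch of the approach, using scikit-learn and assumed predictor names, would look something like this:

```python
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

# Hypothetical merged dataset: one row per paddock with management, soil and
# climate factors plus the change in log Crown Rot DNA from 2012 to 2013
data = pd.read_csv("paddock_factors.csv")

predictors = ["crop_2012", "crop_2013", "summer_rain_mm",
              "nh4_kg_ha", "zinc_kg_ha", "potassium_kg_ha"]

# One-hot encode the categorical land-use columns before fitting
X = pd.get_dummies(data[predictors], columns=["crop_2012", "crop_2013"])
y = data["delta_crownrot_dna"]

# A shallow tree keeps the splits readable as simple rules
tree = DecisionTreeRegressor(max_depth=4, min_samples_leaf=7, random_state=0)
tree.fit(X, y)

# Print the fitted splits as an indented set of if/else rules
print(export_text(tree, feature_names=list(X.columns)))
```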

The decision tree, fitted to the 136 paddocks monitored in both 2012 and 2013, found that inoculum increases were more likely in paddocks sown to cereals in 2012. Dry summers, with rainfall of less than 12 mm, also increased the likelihood of an increase in inoculum from 2012 to 2013. In contrast, highly infertile soils, with less than 11 kg/ha of NH4 and less than 0.41 kg/ha of zinc, tended to reduce the likelihood of an inoculum build-up.

Once any of these predisposing conditions was met (a cereal crop in 2012, a dry summer, or a relatively fertile soil), Crown Rot inoculum tended to increase if the paddock was in wheat or pasture in 2013. Increases were generally not detected in other crops. Again, a dry summer (less than 18 mm of rainfall) was over-represented among the paddocks that experienced an increase in Crown Rot inoculum levels.

Finally, in seven paddocks with potassium levels of 178.5 kg/ha or more, an increase of almost a full unit (log score) in Crown Rot inoculum was detected. Conversely, there was a pronounced decrease in Crown Rot inoculum of 0.41 units in 28 paddocks that had a break crop in 2012 and belonged to the Facey, Fitzgerald or Great Southern groups.

Rhizoctonia

Across all paddocks surveyed, the incidence of Rhizoctonia increased from 17% (17 plants/100) in 2010 to 21% (21 plants/100) in 2012, before declining to 12% (12 plants/100) in 2013 (Figure 1c). While Rhizoctonia ratings fluctuated, the log DNA scores remained relatively constant at about 0.9 across the state, with a slight dip in 2011 when fewer paddocks were sampled (Figure 1d).

Although the surveyed average DNA levels remained constant, the change in DNA from one season to the next did vary; that is, the response of individual paddocks varied across the wheatbelt. The decision tree analysis again split primarily on land use in 2012. Paddocks in cereals (wheat, oats and barley) tended not to experience a decline in DNA, perhaps because DNA levels were already higher in those paddocks in 2012. The influence of the 2012 crop on Rhizoctonia levels in 2013 was similar in magnitude to the influence of the 2013 crop choice, with cereals carrying more Rhizoctonia in 2013 than the break crops. Taken together, these relationships indicate that Rhizoctonia does increase under cereals. More fertile soils, with zinc levels > 435 g/ha and organic carbon levels of 3.01% or more, tended to generate a similar split to crop history; that is, a Rhizoctonia increase from one year to the next was more likely on more fertile soils. Other relationships with the change in Rhizoctonia DNA levels were difficult to detect.
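
Read as plain rules, the Rhizoctonia splits described above amount to a simple screen for paddocks where an inoculum increase is more likely. The function below is only an illustration of that reading; the parameter names and the exact combination of conditions are assumptions, not the survey's fitted model.

```python
# Illustrative reading of the reported Rhizoctonia splits, not the survey's
# fitted model; parameter names and the rule combination are assumptions.
CEREALS = {"wheat", "barley", "oats"}

def rhizoctonia_increase_likely(crop_2012: str, crop_2013: str,
                                zinc_g_ha: float, organic_carbon_pct: float) -> bool:
    """Flag paddocks where a year-to-year increase in Rhizoctonia DNA is more likely."""
    cereal_history = crop_2012.lower() in CEREALS or crop_2013.lower() in CEREALS
    fertile_soil = zinc_g_ha > 435 and organic_carbon_pct >= 3.01  # thresholds reported from the tree
    return cereal_history or fertile_soil

# Example: wheat on wheat on a fertile soil
print(rhizoctonia_increase_likely("wheat", "wheat", zinc_g_ha=500, organic_carbon_pct=3.2))  # True
```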

Figure 1a-d. Mean estimates for Crown Rot (a and b) and Rhizoctonia (c and d) from 2010 to 2013, for the visual score conducted by the DAFWA pathologist and for the SARDI PreDicta B measure of DNA at anthesis, where DNA was log-transformed.

Conclusion

The survey of farmers' fields from 2010 to 2013 demonstrated that climate and crop rotation are important for managing soil-borne diseases in the WA wheatbelt. Similarly, pasture management appears to be critical, as many unmanaged pastures did not provide a break from soil-borne disease. Overall, the levels of disease in the surveyed paddocks were exceptionally low, which made it difficult to find relationships between disease and management. Nevertheless, rotation remains important for managing these diseases; unmanaged pastures often fail to provide a disease break, while fertile soils and dry summers can contribute to the disease problem.

Acknowledgments

The authors acknowledge fieldwork by DAFWA staff in conjunction with Liebe, WANTFA, Facey and MIG, and plant disease ratings by DAFWA plant pathologists. The authors also thank SARDI for the PreDicta B assays and the GRDC for funding the project.

GRDC Project Code: DAW00213