Introduction
Treatment of a dairy cow at drying off is designed to minimize clinical and subclinical mastitis in the subsequent lactation by eliminating pre-existing intramammary infections (IMI) and preventing the development of new ones during the dry period (Bradley and Green, 2004). Conventionally, this has been achieved by the use of long-acting dry cow antibiotics at the end of lactation (Halasa et al., 2009a, 2009b). However, the use of dry cow antibiotic treatment (DCAT), while effective, adds significantly to the use of antibiotics on dairy farms. For example, in New Zealand >40% of the antimicrobial usage by the dairy industry is DCAT (based on mg active ingredient per population correction unit; Bryan and Hea, 2017). Pressure on antibiotic usage, especially prophylactic use, has resulted in increasing emphasis on selective dry cow therapy (SDCT) at the individual cow or quarter level. This can significantly reduce overall antibiotic use (Rowe et al., 2020b) and is supported by the use of internal teat sealants, which are effective at preventing new IMI during the dry period (Rabiee and Lean, 2013; Dufour et al., 2019).
Many different methods have been used to select cows for antibiotic treatment when using SDCT. In New Zealand, cow records, typically somatic cell count (SCC) and clinical mastitis (CM) data, have been used to predict which cows are most likely to have an IMI at drying off and thus require DCAT (Gohary and McDougall, 2018). The most commonly used alternative to cow-level data for allocating cows to DCAT is culture (Rowe et al., 2020b). Bacterial culture may give a more accurate measure of the prevalence of IMI at dry-off than an SCC-guided algorithm (Clabby et al., 2023). It also provides information on the bacteria present at dry-off, which can be used to guide herd-specific mastitis prevention programmes (beyond DCAT) and, potentially, to determine whether an IMI detected prior to dry-off needs antibiotics.
However, outside of the research setting, culture has not been widely used in New Zealand to determine DCAT due to cost and logistical challenges (Gohary and McDougall, 2018). Rapid culture-based techniques for detecting IMI in cattle are, however, increasingly available (Bates et al., 2020; Kabera et al., 2021). Such techniques may make culture-guided DCAT more feasible and more affordable.
Normally, culture-guided DCAT is based simply on a positive culture result (i.e. isolation of any bacterial species; Rowe et al., 2020a). However, this positive/negative approach may not exploit the full value of bacterial culture and may therefore result in unnecessary antibiotic use. Firstly, isolation of a bacterium does not in itself mean that an IMI is present (Kurban et al., 2022); in such cases, antibiotic use is not justified. Secondly, even when a bacterium with a known link to clinical or subclinical mastitis is isolated, DCAT is not automatically justified. Treating dry cows with antibiotics is only justified if there is a reasonable expectation that their use will improve cure rates (Müller et al., 2023). Indeed, it can be argued that DCAT is only justified if there are marked improvements in cure rates, as SDCT has been promoted as an alternative to whole-herd DCAT even when SDCT was associated with a ‘marginal decline in udder health’ (Tho Seeth et al., 2017). IMIs are often categorized into major pathogens (the bulk of which worldwide are defined as Staphylococcus aureus, Streptococcus uberis, Streptococcus dysgalactiae and Escherichia coli) and minor pathogens, a grouping that reflects the pathogens’ virulence and propensity to negatively affect udder health (Keane, 2019). However, except for Staph. aureus, there are limited published data that can be used to determine, based on bacteriology, whether the use of DCAT is effective and, therefore, justified (Royster and Wagner, 2015; McMullen et al., 2021). The data are particularly limited for Staphylococcaceae other than Staph. aureus (usually referred to as non-aureus staphylococci or NAS and defined as minor pathogens), with most peer-reviewed studies that evaluated the cure rate of NAS-related IMI after DCAT being small and showing no or minor benefit in terms of cure rate (Harmon et al., 1986; Cummins and McCaskey, 1987; Schukken et al., 1993; Hogan et al., 1994).
The most recent and largest study of the cure rate of NAS-related IMI after DCAT (Müller et al., 2023) produced an inconclusive result. This study of 1094 cases found that the estimated cure rate for NAS treated with antibiotics was 78.1%, while that for cases not treated with antibiotics was 70.3% (odds ratio for cure 1.59; 95% CI 0.96–2.64). Published data thus suggest that the benefit of DCAT on the cure rate of NAS-related IMI may be moderate at best and could be biologically unimportant. The same may also apply to other minor pathogens such as Corynebacterium bovis. As most pathogens present in the udder at dry-off in New Zealand are considered minor pathogens, not routinely treating such pathogens with DCAT could significantly reduce the quantity of antibiotics used (Lipkens et al., 2019; Rowe et al., 2020a; McDougall et al., 2021). However, data are needed to confirm that this potential reduction in antibiotic use has no impact on key mastitis outcomes, in particular SCC and CM risk.
Therefore, the aim of this study was to evaluate the impact on SCC and CM risk of a selective dry cow protocol in which cows were selected for DCAT based on the identification of IMI caused by major pathogens, rather than by the current New Zealand industry standard (i.e. selecting cows for DCAT based on SCC and CM). Alongside this, we wanted to evaluate how effective a novel quartered combination agar plate (designed specifically for culture-guided SDCT) was at identifying major pathogens. There were thus two objectives: 1) define the predictive ability of the novel culture plate to identify IMI due to major pathogens and compare this to the current industry standard of using SCC-based algorithms and 2) evaluate the effect of DCAT selection protocol (culture and treatment of major pathogens only vs standard SCC-based) on the incidence of CM over the dry period and the first 30 days of the subsequent lactation, and on individual cow SCC at the first herd test after calving. Our hypothesis was that the culture-based protocol would result in less antibiotic use without compromising udder health.
Materials and methods
All animal manipulations were approved by AgResearch Animal Ethics Committee (AEC number 15608). The study was a randomized clinical study with both non-inferiority (CM outcomes) and conventional superiority objectives (comparison of major IMI and SCC outcomes).
Sample size calculations
These were based on a non-inferiority study design comparing cow-level CM risk (dry period to 30 days in milk), as this would require the largest sample size across study outcomes. A cow-level CM risk of 10% (McDougall, 1999) was assumed for both treatment groups, with a non-inferiority cow-level CM margin of 4%. For 80% power and a one-sided alpha of 2.5%, 696 animals were required per group. Inflating this by 15% to account for losses and missing data gave a sample size of 766 animals per group.
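The normal-approximation arithmetic behind a non-inferiority sample size of this kind can be sketched as follows. This is an illustrative Python sketch only, not the software used in the study; exact figures depend on the formula and program chosen, so it will not necessarily reproduce the 696 reported above.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(p: float, margin: float, power: float = 0.80,
                alpha: float = 0.025) -> int:
    """Approximate per-group sample size for non-inferiority of two
    proportions (normal approximation), assuming both groups share the
    same expected risk p and the true difference is zero."""
    z_alpha = NormalDist().inv_cdf(1 - alpha)  # one-sided alpha
    z_beta = NormalDist().inv_cdf(power)       # power
    variance = 2 * p * (1 - p)                 # sum of the two binomial variances
    return ceil((z_alpha + z_beta) ** 2 * variance / margin ** 2)

n = n_per_group(p=0.10, margin=0.04)  # 10% CM risk, 4% margin
n_inflated = ceil(n * 1.15)           # inflate for losses and missing data
```

Narrowing the margin or lowering alpha increases the required group size, which is why the CM outcome (rare, with a tight margin) drove the sample size rather than the SCC outcomes.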
Farm enrolment
In April 2022, three farms were enrolled in this study. These were selected from farms under the care of the veterinary practice undertaking the on-farm aspects of the study. This practice was based in the Canterbury region of the South Island of New Zealand. Farms were selected based on: i) at least 500 available, eligible cows, ii) bulk milk SCC seasonal average <250,000 cells/ml, iii) suitable facilities for sampling, treatments and observations, iv) reliable visual unique identification of cows, v) individual cow SCC data from <80 days prior to dry off, vi) herd test within 30 days of planned start of calving in the 2022/2023 season and vii) accurate/complete animal treatment records.
Animal enrolment
The study population consisted of all multiparous pregnant lactating cattle on the three farms. Cows were not eligible for enrolment if they were observed by a veterinarian to be dull or depressed at the sampling or enrolment visits, or if they had received antibiotics for any reason during the 14 days preceding enrolment. In addition, on the day of sampling, cows had to have four functional quarters (i.e. all quarters being milked), have a body condition score of 3.5 or more (1–10 scale; Roche et al., 2004), not be visually lame, and have neither rough/very rough teat-end hyperkeratosis (Ohnstad, 2003) nor CM (i.e. heat in one or more quarters and/or clots present in milk after stripping). Furthermore, enrolled cows needed to have been recorded as pregnant and to have electronic records of SCC or milk volume within 80 days of Study Day 0. Animals were enrolled over multiple days on each farm until the required sample size of valid milk sample culture results was obtained.
Following enrolment, animals that died or were culled due to conditions unrelated to diagnosed CM were excluded from the clinical and subclinical mastitis outcome analyses.
Milk sample collection (Day −14 to −10)
Between 10 and 14 days prior to Study Day 0 (dry-off), a composite four-quarter sterile milk sample was collected prior to milking by trained veterinary technicians into a 28 ml container (SC Yellow/1000 – P28, Techno Plas, Adelaide, Australia). Prior to milk sampling, all teat ends were disinfected, beginning with the front left and proceeding clockwise. Excess organic debris was removed from the teat using teat wipes, followed by disinfection of the teat end with teat wipes infused with 70% isopropyl alcohol. After disinfection, teats were sampled in an anticlockwise fashion, starting with the back left quarter, with the first three to five squirts discarded and then one to two good squirts of milk (∼5–10 ml) collected from each teat into the single container. Samples were placed in a portable cooler with ice packs and, once all samples were collected, packaged into an insulated box with ice packs and sent via courier to Gribbles Veterinary Pathology (Palmerston North, New Zealand). Dry-off date was defined by the farmers as per standard farm practice. This involved a combination of expected calving date, milk yield and body condition score, amongst other factors, at the discretion of the farmer. Sample and dry-off visits were conducted in mobs of ∼200–250 animals per day. Two of the farms had sampling conducted on two separate days, while the third farm had three sampling days. The same veterinary technicians were present at all visits, responsible for both milk sampling and intramammary administration.
Laboratory procedures (Day −12 to −2)
On arrival at the laboratory, each sample was split into two aliquots. Both aliquots were then frozen at −20°C until testing. After thawing, one aliquot was cultured using the laboratory’s standard culture methods and the other using the novel culture method.
For the standard culture method, once samples were thawed, they were inverted ∼10–15 times to mix, and 10 µl of milk was spread onto one quarter of a 5% sheep blood agar plate containing 0.1% aesculin (Fort Richard Laboratories, Auckland, NZ). Plates were incubated for up to 48 h at 37°C before examination, with initial reads carried out after 24 (±2) h. Initial identification was based on colony morphology, patterns of haemolysis, Gram stain and aesculin reaction. Further biochemical testing for speciation was undertaken for presumptive major pathogens; this included, as required, the coagulase test (Staphyloslide; Becton Dickinson, New Jersey, USA), the CAMP test, subculture onto MacConkey agar (Fort Richard Laboratories) and further testing using triple sugar iron reactions. A cow was defined as infected with a pathogen when >1 colony-forming unit (cfu) of a single bacterial species or yeast was found, except for Staph. aureus, for which ≥1 cfu was defined as a positive culture (McDougall et al., 2021). A sample was defined as mixed when two different colony types were isolated, and as contaminated if there were >2 distinct colony types. Samples with no visible bacterial growth after 48 h (or with no more than one cfu of any colony type) were recorded as ‘no growth’. For analysis, major pathogens were defined as Staph. aureus, Strep. dysgalactiae, Strep. uberis, Strep. spp. (i.e. streptococci other than Strep. uberis or Strep. dysgalactiae), E. coli or Klebsiella spp. (McDougall et al., 2021). All bacterial species not included in this list were defined as minor pathogens.
The second sample underwent culture using the novel culture method. This consisted of a custom-made quartered plate (Fort Richard Laboratories) with four different agars: i) aesculin blood agar; ii) chromogenic Staph. agar; iii) chromogenic E. coli agar; and iv) chromogenic gram-negative agar (see Supplementary Figure S1). Each quarter of the plate was inoculated with 10 µl of milk and then incubated at 37°C for 24 h before assessment, with colony morphology, colour and size being used to identify bacterial species alongside haemolysis and aesculin reaction. Contaminated cultures and major pathogens were defined as for standard cultures.
Final cow enrolment and randomization (Day −7 to −1)
Once standard culture results had been finalized, they were provided to the veterinarian overseeing the study, who was responsible for the allocation of cows to DCAT protocols. The overseeing veterinarian did not know the novel culture results until after the cows had been allocated to their selective DCAT protocol. None of the culture results were communicated to any study farm staff.
Cows whose standard culture results were contaminated, or which had no bacteriology reported, were excluded from the study. Within each farm, cows were blocked based on standard culture results, i.e. major pathogen, minor pathogen or no growth. Animals with a mixed culture were assigned to the major pathogen group if at least one of the isolates was a major pathogen. Cows with a major pathogen were then further blocked based on species of major pathogen (e.g. all Staph. aureus-positive cows were in one block). Within each block, cows were ordered from smallest to largest ear tag number, and a random number generator (1 or 2; Excel 16; Microsoft Corp., Redmond, USA) was used to decide the DCAT protocol of the cow with the lowest ear tag. If ‘1’ was selected, the animal was enrolled into the protocol based on the results of the novel culture method (cult-SDCT); if ‘2’ was selected, the animal was enrolled into the protocol based on the SCC and CM algorithm (alg-SDCT). The next animal on the list was then enrolled into the other diagnostic protocol, with the sequence repeated for all animals within a block.
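The blocked, alternating allocation described above can be sketched as follows. This is an illustrative Python sketch only (the study used a random number generator in Excel), and the ear tag numbers are hypothetical.

```python
import random

PROTOCOLS = ("cult-SDCT", "alg-SDCT")

def allocate_block(ear_tags, rng=random):
    """Allocate the cows of one block (e.g. all Staph. aureus-positive
    cows on a farm) to the two protocols: order by ear tag, draw 1 or 2
    at random for the lowest tag, then alternate down the list."""
    first = rng.randint(1, 2) - 1  # 0 -> cult-SDCT, 1 -> alg-SDCT
    return {tag: PROTOCOLS[(first + i) % 2]
            for i, tag in enumerate(sorted(ear_tags))}

# hypothetical ear tags for one block, with a seeded generator
allocation = allocate_block([412, 87, 309, 155], rng=random.Random(1))
```

Because only the first assignment in each block is random and the rest alternate, each block (and therefore each pathogen species) is split as evenly as possible between the two protocols, which is why the randomization achieved an equal distribution of major pathogens.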
Diagnostic-protocol-based identification of cows requiring treatment with dry cow antibiotics
For cows allocated to cult-SDCT, the results of the novel culture method were used to assign DCAT. Cows that were culture positive with a major pathogen or that returned a contaminated result were assigned to DCAT. Cows with no growth or that were culture positive for a minor pathogen only were identified as not requiring DCAT.
For cows allocated to alg-SDCT, those with an SCC > 150,000 cells/ml at the last herd test within 80 days of Study Day 0 and/or with an electronic record of CM in the current lactation were assigned to DCAT (Gohary and McDougall, 2018; Anon, 2020). All other cows in this group were identified as not requiring DCAT.
On all three farms during the afternoon milking on the day prior to DCAT administration, the primary veterinarian sprayed the top right rump of all cows assigned to DCAT with blue spray paint. All cows were sprayed with the same colour irrespective of whether they had been allocated to cult-SDCT or alg-SDCT.
Dry cow therapy treatment (Day 0)
Irrespective of which diagnostic protocol was used, cows that were assigned to DCAT received 500 mg cloxacillin plus 250 mg ampicillin per quarter (Dryclox DC, Elanco New Zealand Limited, Auckland, NZ) followed by an internal teat sealant containing 2.6 g bismuth subnitrate (Teatseal, Zoetis New Zealand Ltd, Auckland, NZ). All cows identified as not requiring DCAT received an internal teat sealant only.
Both products were administered after the final milking by trained veterinary technicians and veterinarians wearing clean disposable gloves. Before treatment, teat ends were cleaned with cotton wool balls moistened in 70% methylated spirits for at least 5 s. For cows given both DCAT and teat sealant, the cleaning process was repeated between the insertion of the DCAT and the insertion of the sealant. A partial insertion technique was used for both products, with the DCAT massaged into the udder prior to administration of the teat sealant.
Immediately after treatment, cows were sprayed with marking paint unique to the person who applied their treatment, so that any association between dry period mastitis and the individual applicator of treatment could be determined. All cows’ teats were then sprayed with the farm's usual teat disinfectant before cows were returned to pasture.
Blinding
All farm staff and personnel administering the DCAT and sealant were blinded as to which protocol had been used to decide on a cow's treatment. However, farm staff were aware of which cows received DCAT. The biometrician completing the statistical analysis was blinded until the analysis was completed and the results reported.
Outcomes of interest
There were three outcomes of interest: i) the predictive ability of the two diagnostic approaches to identify cows with IMI due to major pathogens (using status determined by the standard laboratory culture method as the reference method); ii) the effect of the diagnosis and treatment protocol on CM risk from dry-off to 30 days post-calving and iii) the effect of the diagnostic approach on subclinical infection risk (SCC).
Outcome 1. Predictive ability of the two diagnostic protocols to identify IMI due to major pathogens.
For cows allocated to dry cow treatment using the cult-SDCT protocol, this was analysed in two ways.
• Analysis 1: Contaminated samples after novel culture classified as predicting that the sample was negative for major pathogens.
• Analysis 2: Contaminated samples after novel culture classified as predicting that the sample was positive for major pathogens (reflecting how DCAT was assigned in the study).
For alg-SDCT, it was assumed that an SCC > 150,000 cells/ml and/or a recorded case of CM during lactation indicated that an IMI due to a major pathogen was present.
Outcome 2. Effect of diagnostic protocol used to determine DCAT on CM risk from dry-off to 30 days post-calving.
Cows were monitored for signs of CM by farm staff daily for seven days following dry-off, then weekly until two weeks before their expected calving date (when daily observations restarted). Clinical cases occurring post-calving were recorded for the first 30 days in milk (DIM); only the first case was analysed if >1 case was recorded for an enrolled animal. Cases of CM in enrolled cows were recorded in the herd's management system (Minda Live, LIC, Hamilton, New Zealand) and treated following the individual farm treatment protocols.
Outcome 3. Effect of diagnostic protocol used to determine allocation to DCAT on subsequent subclinical infection risk (SCC).
Two herd tests were conducted on each of the farms: at the end of August (∼25 days from herd start of calving) and at the beginning of October. For each cow, the result of the sample closest to 30 days post-calving was used. This outcome was treated as a count variable rather than being dichotomized into a cruder binary indicator of infection status (infected or not).
Statistical analysis
All three primary outcome variables were analysed at the cow level, with all analyses carried out in R Version 4.3.1 (R Core Team, 2023).
Outcome 1. Predictive ability of the two diagnostic protocols to identify IMI due to major pathogens.
From the total enrolled study population, the sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of cult-SDCT and alg-SDCT were calculated with respect to identifying IMI due to major pathogens (as determined by standard laboratory culture). For each of the metrics, the point estimate and 95% confidence interval, using the exact method, were calculated.
Outcome 2. Effect of diagnostic protocol used to determine allocation to DCAT on CM risk from dry-off to 30 days post-calving.
The number and proportion of animals diagnosed with CM under each diagnostic protocol were reported. Logistic regression was used to model the binary CM outcome. The diagnostic protocol was included a priori as the fixed effect of interest, with farm, age and SCC at dry-off included initially as covariates in the multivariable regression model, as well as a farm * diagnostic protocol interaction. Each covariate (including the interaction term) was removed from the model in turn, and the Akaike information criterion (AIC) was compared between the model with the covariate removed and the full model. If the AIC of the nested model with a particular covariate removed was less than that of the full model, then that covariate was eliminated from the multivariable model. This continued until all included covariates resulted in a better model fit (lower AIC). Each covariate eliminated during this initial process was then added back into the multivariable model and the AIC compared, with the covariate remaining if the AIC was lower. Throughout this process, if a covariate’s removal altered the coefficient or standard errors of the treatment group of interest by >20%, then it was considered a confounder and remained in the model irrespective of AIC. Model outputs were converted to report the absolute risk difference and relative risk, with non-inferiority assessed based on whether the upper limit of the 95% confidence interval of the absolute risk difference for animals in the cult-SDCT group compared with the alg-SDCT group was >4%. Collinearity was assessed using variance inflation factors, with any covariate with a variance inflation factor >4 considered collinear. Outlier and influential data points were investigated.
The presence of heteroskedasticity was visually assessed with boxplots and scatterplots of the standardized residuals for each variable, and the linearity of the continuous predictor, SCC at dry-off, was assessed by plotting the log-odds across the range of SCC values.
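The covariate-selection loop described above can be sketched generically as follows. This is a sketch only: the abstract `fit` callable, returning the model AIC and the treatment coefficient, stands in for the logistic regression fitting used in the study, and for brevity the confounder rule is applied to the coefficient only (the study also checked standard errors).

```python
def aic_select(fit, candidates):
    """Backward elimination on AIC with re-addition and a confounder
    guard. fit(covariates) returns (aic, treatment_coef) for a model
    containing the treatment term plus the listed covariates."""
    kept = list(candidates)
    locked = set()  # confounders retained regardless of AIC
    best_aic, best_coef = fit(kept)
    improved = True
    while improved:
        improved = False
        for cov in [c for c in kept if c not in locked]:
            reduced = [c for c in kept if c != cov]
            aic, coef = fit(reduced)
            if abs(coef - best_coef) > 0.2 * abs(best_coef):
                locked.add(cov)  # >20% change in the treatment coefficient
                continue
            if aic < best_aic:   # removing this covariate improves fit
                kept, best_aic, best_coef = reduced, aic, coef
                improved = True
                break
    for cov in candidates:       # try re-adding eliminated covariates
        if cov not in kept:
            aic, coef = fit(kept + [cov])
            if aic < best_aic:
                kept, best_aic, best_coef = kept + [cov], aic, coef
    return kept
```

For example, a covariate whose removal lowers the AIC but shifts the treatment coefficient by more than 20% is locked into the model as a confounder rather than eliminated.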
Outcome 3. Effect of diagnostic protocol used to determine allocation to DCAT on subsequent subclinical infection risk (SCC).
The raw median and interquartile range (IQR) of SCC for both diagnostic groups were reported, as was the distribution of days in milk at the time of the first herd test. The SCC data were modelled as count data at the cow level, using a negative binomial distribution (to account for overdispersion). The diagnostic protocol was included a priori as the fixed effect of interest. Days in milk, farm, age and SCC at dry-off were included initially as covariates in the multivariable regression model, as well as the farm * diagnostic protocol interaction. Model building and diagnostics were as described for Outcome 2 above. Outputs were reported as estimated marginal mean SCCs back-transformed from the negative binomial model, with corresponding 95% confidence intervals, along with incidence rate ratios (i.e. the ratio of the number of cells per ml).
In addition, descriptive SCC and CM statistics were reported for six subsets within each of the diagnostic protocol groups based on the results of their pre-drying off standard laboratory culture and allocation to DCAT (see Table 5).
Table 5. Pathogen group (as identified by standard milk laboratory culture, with all cows whose sample returned a contaminated result excluded), diagnostic protocol and whether cattle received dry cow antibiotic therapy (DCAT), with SCC and CM outcomes, from a randomized clinical interventional study comparing two diagnostic methods for identifying major pathogen IMI and associated DCAT

Results
Demographic information from the three farms is presented in Supplementary Table S1. Of the 1997 eligible cows across the three farms, a total of 456 were excluded (Fig. 1), leaving 1541 enrolled cattle. A total of 1739 animals had composite milk samples collected and cultured using standard laboratory culture. Of these composite samples, contamination (>2 distinct colony types) was identified in 173 (10.0%). Microbiology results for enrolled animals are presented in Table 1. A major pathogen was identified by standard culture in 223 (14.5%) samples. Of these major pathogens, 174 (78.0%) were Staph. aureus (11.3% of all samples).

Figure 1. CONSORT diagram for a randomized clinical interventional study investigating the predictive ability of two selective dry cow antibiotic (SDCT) therapy protocols for major pathogen status at dry-off and udder health in dairy cows.
Table 1. Intramammary pathogen identification with standard laboratory culture or novel culture in a randomized clinical interventional trial investigating two methods of SDCT allocation. Animals with contaminated samples from the reference laboratory were excluded from the study population

a n (%); *, bacterial species defined as major pathogens.
Outcome 1. Predictive ability of the two diagnostic protocols to identify IMI due to major pathogens
The 2-by-2 tables for predicting the presence of a major pathogen IMI (based on standard laboratory culture – excluding all contaminated standard cultures) are presented for alg-SDCT protocol (Table 2a), for the cult-SDCT protocol with contaminated novel-culture method samples classified as negative (Table 2b) and for the cult-SDCT protocol with contaminated novel-culture method samples classified as positive (Table 2c).
Table 2. Contingency tables for the presence or absence of a major pathogen from conventional laboratory culture methods and whether an animal was predicted as having a major pathogen. For a), the alg-SDCT, where positive = SCC ≥ 150,000 at the dry-off herd test and/or at least one case of CM; negative = SCC < 150,000 at the dry-off herd test and no CM. For b), the cult-SDCT, where positive = presence of a major pathogen from the cult-SDCT analysis; negative = no major pathogen or any sample reported as contaminated from the cult-SDCT analysis. For c), the cult-SDCT, where positive = presence of a major pathogen or any sample reported as contaminated from the cult-SDCT analysis; negative = no major pathogen from the cult-SDCT analysis. Data from a randomized clinical interventional trial investigating two methods of SDCT allocation (n = 1541)

The test characteristics for alg-SDCT and the two cult-SDCT analyses as predictors of major pathogen IMI are presented in Table 3. The sensitivity, specificity, PPV and NPV were all greater for both of the cult-SDCT analyses than for alg-SDCT. Classifying a contaminated sample as positive marginally, but not significantly, increased sensitivity (estimated difference 0.018; 95% CI −0.055 to 0.09) but decreased both specificity (estimated difference 0.036; 95% CI 0.012–0.059) and PPV (estimated difference 0.077; 95% CI 0.001–0.154).
Table 3. Comparison of alg-SDCT and cult-SDCT as predictors of major pathogen IMI (criteria for alg-SDCT from Table 2a, cult-SDCT with contaminated (contam) samples classified as negative from Table 2b and cult-SDCT with contaminated samples classified as positive from Table 2c). Data from 1541 cows from three farms enrolled into a randomized clinical interventional trial investigating two methods of SDCT allocation

Outcome 2. Effect of the diagnostic protocol used to determine the allocation to DCAT on CM risk from dry-off to 30 days post-calving
Seven hundred and seventy-six cows were enrolled into cult-SDCT and 765 enrolled into alg-SDCT. The pathogens isolated by standard laboratory culture, split by diagnostic protocol, are presented in Supplementary Table S2. An equal distribution of all major pathogens was achieved during the randomization process. Of the 776 cows allocated to the cult-SDCT protocol group, 181 (23.3%; 95% CI 20.5–26.4) received DCAT, which included 25 (3.2%) cows with a novel culture result classified as contaminated. Cows with a contaminated novel culture result accounted for 13.8% of the total DCAT administered, yet only 1/25 (4%) of these contaminated novel culture samples had a major pathogen reported after standard laboratory culture. In comparison, in the 765 cows allocated to the alg-SDCT protocol group, 190 (24.8%; 95% CI 21.9–28.0) received DCAT.
A total of 29/1541 (1.9%) animals were recorded as having CM between dry-off and 30 days post-calving. None of these 29 cows were recorded as having >1 case of mastitis during the study period. There were 15/776 (1.9%) and 14/765 (1.8%) cows that developed CM in the cult-SDCT and alg-SDCT groups, respectively. As this outcome was rare (<5%), odds ratios can accurately approximate the relative risk (Grimes and Schulz, 2008). After accounting for age and SCC at dry-off, the relative risk of CM in animals in the cult-SDCT group was 1.07 (95% CI 0.47–2.37) times that in animals in the alg-SDCT group. The absolute risk difference in CM for animals in the cult-SDCT group was 0.1% (95% CI −0.97 to 2.6%) compared with animals in the alg-SDCT group.
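To illustrate why the odds ratio approximates the relative risk for a rare outcome, the crude (unadjusted) values can be computed directly from the counts above. Note that the adjusted estimate reported in the text additionally accounts for age and SCC at dry-off, so it differs slightly from these crude figures.

```python
def risk_ratio(cases_a, total_a, cases_b, total_b):
    """Crude relative risk of the outcome in group A vs group B."""
    return (cases_a / total_a) / (cases_b / total_b)

def odds_ratio(cases_a, total_a, cases_b, total_b):
    """Crude odds ratio of the outcome in group A vs group B."""
    odds_a = cases_a / (total_a - cases_a)
    odds_b = cases_b / (total_b - cases_b)
    return odds_a / odds_b

# CM counts from the study: 15/776 (cult-SDCT) vs 14/765 (alg-SDCT)
rr = risk_ratio(15, 776, 14, 765)   # crude RR, ~1.06
or_ = odds_ratio(15, 776, 14, 765)  # crude OR, nearly identical for a rare outcome
```

With a CM risk below 2%, the denominators of risk (all cows) and odds (non-cases only) are nearly the same, so the two measures agree to within about 0.001 here.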
Outcome 3. Effect of the diagnostic protocol used to determine the allocation to DCAT on subsequent subclinical infection risk (SCC)
This analysis used data from 1453 cows (88 enrolled cows were excluded from the SCC analysis, as presented in Fig. 1). All 54 animals recorded as missing or as having invalid herd test results were confirmed as alive in the herd after the initial herd test from other farm records (e.g. pregnancy test information or subsequent herd tests).
The SCC distribution was similar in both diagnostic groups. Median SCC for animals in the cult-SDCT group was 29,000 cells/ml (IQR 16,000–61,000), compared with 28,000 cells/ml (IQR 16,000–68,000) for alg-SDCT. There was no difference in the days-in-milk for animals in the cult-SDCT and alg-SDCT groups at their post-calving herd test. The median DIM was 27 (IQR 17–46) days for cult-SDCT compared with 28 (IQR 16–45) days for alg-SDCT.
After accounting for farm, age and dry-off SCC, cows in the alg-SDCT group had an SCC at the post-calving herd test that was 1.14 times (95% CI 0.99–1.32; p = 0.067) that of cows in the cult-SDCT group (Table 4). The predicted marginal mean SCC at this herd test was 129,000 cells/ml (95% CI 116,000–143,000) for cows in the alg-SDCT group and 113,000 cells/ml for cows in the cult-SDCT group.
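As a quick arithmetic check (a sketch of the back-calculation, not the model itself), the ratio of the two predicted marginal means reported above should reproduce the modelled incidence rate ratio:

```python
# Predicted marginal mean SCC (cells/ml) at the post-calving herd test,
# as reported from the negative binomial model
mean_alg_sdct = 129_000   # alg-SDCT group
mean_cult_sdct = 113_000  # cult-SDCT group

# The ratio of the marginal means recovers the reported IRR of 1.14
irr = mean_alg_sdct / mean_cult_sdct
print(f"IRR = {irr:.2f}")
```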
Table 4. Multivariable negative binomial regression model output with SCC post-calving as the outcome from 1453 cows enrolled into either a cult-SDCT or alg-SDCT diagnostic group from a randomized clinical interventional trial investigating two methods of SDCT allocation

a IRR, incidence rate ratio; CI, confidence interval.
b For every 100 cells/ml increase in SCC at the dry-off herd test, the SCC post-calving increased by 13%.
The descriptive analysis of the subgroups within each diagnostic protocol is presented in Table 5. Density curves of log SCC post-calving illustrate the differences in SCC between the subgroup combinations (Supplementary Figure 2).
Discussion
The effectiveness of SDCT depends on correctly identifying the cows that will benefit from antibiotics, so that cows which do not need antibiotics are not treated and those which do need treatment receive it. However, as yet, there is no ‘gold standard’ method of identification that will correctly identify all cows. As SDCT becomes more common, it is important to test new methods of selecting cows for DCAT and to compare them with the industry standard. As far as the authors are aware, this is the first study of a culture-based SDCT protocol to use bacteriology to determine allocation to DCAT beyond simply treating all cows with a positive culture (Rowe et al., Reference Rowe, Godden, Nydam, Gorden, Lago, Vasquez, Royster, Timmerman and Thomas2020b). Nonetheless, it is not a test of whether minor pathogen-associated IMI should be treated with DCAT; it is a test of the whole selection protocol (cult-SDCT) against the industry standard SCC-based alg-SDCT.
Effectiveness of the alg-SDCT protocol in predicting an IMI with a major pathogen
Our threshold of 150,000 cells/ml and at least one recorded CM case in the current lactation has been standard practice in New Zealand for selecting cows for DCAT (Laven and Lawrence, Reference Laven and Lawrence2008). This choice of SCC threshold hinders direct comparison with other studies, but the sensitivity and specificity of alg-SDCT in detecting major pathogens in our study were similar to those of McDougall et al. (Reference McDougall, Williamson, Gohary and Lacy-Hulbert2021) and Clabby et al. (Reference Clabby, Valldecabres, Dillon, McParland, Arkins, O'Sullivan, Flynn, Murphy and Boloña2023). Nevertheless, 33% of animals with a major IMI identified by standard laboratory culture were not identified by alg-SDCT. And whilst the PPV reported in the current study was greater than that reported by McDougall et al. (Reference McDougall, Williamson, Gohary and Lacy-Hulbert2021), it was still only 38%, despite the prevalence of major IMI, as defined by standard culture, being high. This study provides further evidence that, under alg-SDCT protocols, the majority of animals that receive DCAT do not have major IMI at, or near, dry-off.
Effectiveness of the cult-SDCT protocol in predicting an IMI with a major pathogen
Irrespective of how we treated contaminated results, cult-SDCT more accurately predicted major pathogen IMI status than alg-SDCT, with higher sensitivity and specificity, consistent with Rowe et al. (Reference Rowe, Godden, Nydam, Gorden, Lago, Vasquez, Royster, Timmerman and Thomas2020a). Unlike for SCC, there are no studies in pasture-based cows on the accuracy of alternative culture methods in determining IMI status at drying off. Rowe et al. (Reference Rowe, Godden, Nydam, Gorden, Lago, Vasquez, Royster, Timmerman and Thomas2020a) evaluated a rapid on-farm culture (Minnesota Easy 4 Cast Plate) in housed cows. The specificity of our cult-SDCT protocol for identifying major pathogens (88–91%) was higher than that reported by Rowe et al. (Reference Rowe, Godden, Nydam, Gorden, Lago, Vasquez, Royster, Timmerman and Thomas2020a) (44–54%), as was our sensitivity (80–82% vs 72–75%). The cult-SDCT protocol used in our study was more similar to conventional culture than the system used by Rowe et al. (Reference Rowe, Godden, Nydam, Gorden, Lago, Vasquez, Royster, Timmerman and Thomas2020a), so these results are unsurprising. Nevertheless, neither sensitivity nor specificity was close to 100%. In part, this may be due to differences between the novel culture method and the standard protocol, particularly the use, in the novel culture, of a quarter plate and of chromogenic agars (both of which, particularly the latter, can reduce sensitivity and specificity; Garcia et al., Reference Garcia, Fidelis, Freu, Granja and Dos Santos2021). But some of the differences may be due to standard culture being far from a gold standard test (Dohoo et al., Reference Dohoo, Smith, Andersen, Kelton and Godden2011).
Nevertheless, we chose standard laboratory culture as the reference test because it is a widely accepted method (Dohoo et al., Reference Dohoo, Smith, Andersen, Kelton and Godden2011; Rowe et al., Reference Rowe, Godden, Nydam, Gorden, Lago, Vasquez, Royster, Timmerman and Thomas2020a) and using it as the reference test simplified the comparison between the two protocols used in this study.
Direct comparisons with standard-culture DCAT usage are not possible with this study design, because contaminated milk samples at enrolment were an exclusion criterion before randomization. However, if DCAT protocols had been based on all 1541 enrolled animals, there would have been a significant, but modest, reduction in DCAT usage in the cult-SDCT group compared with the alg-SDCT group (relative risk of recommending DCAT using alg-SDCT rather than cult-SDCT: 1.14; 95% CI 1.01–1.29).
The modest reduction in the use of DCAT with cult-SDCT compared with alg-SDCT was partly due to the ‘safety first’ policy of recommending DCAT for cows with contaminated cult-SDCT results. Across our study population, this resulted in 51 extra cows being recommended DCAT, of which only four had a major pathogen identified by standard culture. Nevertheless, even if contaminated samples had not been recommended DCAT, across the whole population cult-SDCT would still have identified 72 more cows as requiring DCAT than standard culture (298 vs 226, respectively; see Table 4). Further research is required to establish the cause of this difference.
Impact on CM risk
The overall risk of CM was <2% across all enrolled cows, markedly less than the anticipated 10% (although consistent with previous data from New Zealand, with McDougall (Reference McDougall1999) reporting herd-level prevalences ranging from 0.9% to 21.4%). Low prevalences increase confidence in our estimate of the absolute difference in CM between the two protocols but mean that relative differences are estimated less precisely (Mauri and D'Agostino, Reference Mauri and D'Agostino2017). As such, although our absolute risk calculation showed that the non-inferiority margin was not breached, the small number of clinical cases means that we cannot exclude a relative risk higher than ideal (unadjusted relative risk for cult-SDCT vs alg-SDCT: 1.05; 95% CI 0.51–2.2). More data are needed to better identify the impact on CM risk of using cult-SDCT rather than alg-SDCT to determine DCAT.
Impact on SCC
Once farm, age and dry-off SCC were accounted for, the difference in post-calving SCC between the two protocol groups was small (IRR = 1.14; 95% CI 0.99–1.32 for alg-SDCT vs cult-SDCT). If the true IRR were 0.99 (i.e. marginally lower in alg-SDCT than in cult-SDCT), we would expect to observe an IRR as large as ours only rarely (2.5% of the time). Thus, a biologically important increase in SCC in cult-SDCT cows compared with alg-SDCT cows is not compatible with our data. The data from this study therefore support the proposition that, compared with an SCC-based protocol, the culture-based protocol evaluated in this study (i.e. using a novel culture method to identify IMI and only using DCAT when major pathogens are identified) can reduce antibiotic use without increasing SCC or CM rates.
However, there are several caveats to that conclusion. Most importantly, this was a small-scale study with only three farms, which may not be representative of New Zealand dairy farms, especially in the proportion of IMI due to Staph. aureus and NAS, both of which were higher than previously reported by McDougall et al. (Reference McDougall, Williamson, Gohary and Lacy-Hulbert2021). Compared with the standard SCC-based protocol, our culture-based protocol assigned cows to DCAT that did not have elevated cell counts but did have major pathogen-associated IMI, and did not assign cows to DCAT that had elevated cell counts but no associated bacteria or only minor pathogens. As the comparative effectiveness of the culture-based protocol probably depends, at least in part, on the proportion of cows in these three groups, our results need to be replicated on more farms with a wider range of pathogens and pathogen proportions.
The second caveat concerns NAS. In this study, as in most studies that have evaluated NAS (De Buck et al., Reference De Buck, Ha, Naushad, Nobrega, Luby, Middleton, De Vliegher and Barkema2021), they were combined as a single category. However, it is increasingly clear that NAS vary widely in relation to important factors, such as the source of infection and pathogenicity (De Buck et al., Reference De Buck, Ha, Naushad, Nobrega, Luby, Middleton, De Vliegher and Barkema2021). Thus, our conclusions may only apply to farms with the same NAS species as were present on these study farms, where ∼25% (103/406) of cows with NAS had elevated SCC.
The third caveat is that our bacterial culture results were based on a single sample taken near drying off. The sensitivity of a single culture for identifying IMI is limited (Dohoo et al., Reference Dohoo, Smith, Andersen, Kelton and Godden2011); multiple cultures would undoubtedly provide a more accurate picture of the bacteria causing IMI on a farm. However, this would significantly add to costs, and, at least on the study farms, single-sample bacteriology performed as well as the standard alg-SDCT in relation to SCC and CM risk.
The fourth and final caveat relates to the definition of a minor pathogen. Alongside NAS, we categorized Serratia, Enterobacter and Enterococcus as minor pathogens, as they were not included by McDougall et al. (Reference McDougall, Williamson, Gohary and Lacy-Hulbert2021) in their list of major pathogens. All three genera could be considered major pathogens, although the evidence that DCAT is effective at treating IMI due to these bacteria is limited (McMullen et al., Reference McMullen, Sargeant, Kelton, O'Connor, Reedman, Hu, Glanville, Wood and Winder2021). However, if we are to use a protocol designed to avoid the treatment of minor pathogen IMI, we need an agreed list of what constitutes a minor pathogen.
Those limitations notwithstanding, our results do support our hypothesis that, with information about the bacteria present prior to drying off, we may be able to further reduce antibiotic use without markedly affecting CM risk and post-calving SCC. However, the development of the protocol highlighted the lack of information we have on the value of DCAT in three key groups of cows at drying off: 1) cows with minor pathogen-associated IMI and elevated SCC; 2) cows with no identified pathogen but elevated SCC; and 3) cows with major pathogen-associated IMI but no SCC elevation. Of the 1541 cows in this study, these groups accounted for 110, 105 and 80 cows, respectively, so they are not uncommon (together accounting for 19% of all cows). Yet we lack robust information regarding whether the use of antibiotics at drying off in such cows is justified. If we are to further develop culture-based selective dry cow protocols, we urgently need such information, so that we can make prudent choices that limit antibiotic use without markedly increasing the risk of mastitis.
Conclusion
We used a novel culture system to identify and treat cows with major pathogen-associated IMI at drying off. Compared with a standard algorithm-based protocol using SCC and CM, our culture-based protocol treated a higher proportion of cows with major pathogens identified by traditional culture and a lower proportion of cows with minor pathogens. The two protocols resulted in similar post-calving SCC and CM incidence. Thus, on these farms, our culture-based protocol reduced antibiotic use without affecting clinical or subclinical mastitis risk.
Supplementary material
The supplementary material for this article can be found at https://doi.org/10.1017/S0022029925101180
Acknowledgements
The authors would like to thank the veterinarians and technicians involved in sampling and study processes. We also would like to thank the farm managers and staff of the three study farms, who helped this study run smoothly on large commercial dairy farms. This study was made possible with funding by DairySmart, and the authors are thankful for their assistance and technical support during the study.
Author contributions
The authors confirm that they all contributed to the study and manuscript production. EC and RN contributed more heavily to the protocol and to carrying out the study; RL contributed more heavily to manuscript production, with WM involved equally in all sections.
Competing interests
The study was funded by DairySmart, which did not contribute to the manuscript or the study protocol. We can confirm that none of the authors are affiliated with DairySmart.
Interpretive summary
Bacterial culture may give a more accurate measure of IMI at dry-off than an SCC-guided algorithm, improving the responsible use of antibiotics at drying off. This study compared a novel rapid culture-based protocol, in which only cows identified as having an IMI due to major pathogens were treated, with the current industry standard of using SCC and CM history to identify cows needing antibiotic treatment at drying off. Compared with the industry standard approach, the novel culture system detected a higher proportion of the major pathogens identified by conventional culture, and using it to treat only major pathogen IMI did not increase post-calving SCC.