Modifiable healthcare factors affecting 28-day survival in bloodstream infection: a prospective cohort study

Background Bloodstream infection is common in the UK and carries significant mortality, depending on the pathogen involved, the site of infection and other patient factors. Healthcare staffing and ward activity may also affect outcomes in a range of conditions; however, there are few specific National Health Service (NHS) data on their impact in patients with bloodstream infection. Bloodstream Infections – Focus on Outcomes (BSI-FOO) is a multicentre cohort study with the primary aim of identifying modifiable risk factors for 28-day mortality in patients with bloodstream infection due to one of six key pathogens.

Methods Adults under the care of five NHS Trusts in England and Wales between November 2010 and May 2012 were included. Multivariable Cox regression was used to quantify the association between modifiable risk factors, including staffing levels and timing of appropriate therapy, and 28-day mortality, after adjusting for non-modifiable risk factors such as patient demographics and long-term comorbidities.

Results A total of 1676 patients were included in the analysis population. Overall, 348/1676 (20.8%) died within 28 days. Modifiable factors associated with 28-day mortality were ward speciality, ward activity (admissions and discharges), movement within ward speciality, movement from critical care, and time to receipt of appropriate antimicrobial therapy in the first 7 days. For each additional admission or discharge per 10 beds, the hazard increased by 4% (95% CI 1 to 6%) in medical wards and 11% (95% CI 4 to 19%) in critical care. Patients who had moved wards within speciality or who had moved out of a critical care ward had a reduced hazard of mortality. In the first 7 days, the hazard of death increased with increasing time to receipt of appropriate antimicrobial therapy.
Conclusion This study underlines the importance of appropriate antimicrobials within the first 7 days, and the potential for ward activity and ward movements to impact on survival in bloodstream infection.


Key points
Modifiable risk factors associated with 28-day mortality after BSI are ward speciality, ward activity, ward movement within speciality, movement from critical care and time to receipt of appropriate antibiotics in the first 7 days.

Background
At least 100,000 patients have an episode of bloodstream infection (BSI) each year in England, Wales and Northern Ireland [1]. Depending on the pathogen involved, underlying patient characteristics, the severity of infection and the treatment provided, the death rate from these infections can reach 15-25% at 30 days and 50% at 3 years [2-4]. Poor outcomes are known to be related to several non-modifiable patient factors including age, comorbidities, severe sepsis, source of infection, neutropenia, type of infection, and intensive care unit (ICU) admission [2, 5-10]. In contrast, timely appropriate antimicrobial chemotherapy and removal of infected prosthetic materials have been found to be beneficial, but data are typically from single-centre studies, and information on the estimated size of these effects is limited. Staffing levels are known to affect a range of care quality measures, including patient mortality [11-13]. In addition, nursing skill mix and the use of non-permanent staff may increase rates of hospital-acquired infection [14]. However, there is less information on the impact of staffing on infection outcomes in general, and none on its impact on BSI outcomes in particular [14, 15].

Study design
BSI-FOO (Bloodstream Infections – Focus on Outcomes) is a prospective multicentre cohort study conducted across five NHS acute hospital trusts in England and Wales. The study period was from November 2010 to May 2012, but for administrative reasons the study opened and closed on different dates in the five centres. Data were collected from routine care, investigations or tests and recorded according to usual clinical practice. The National Information Governance Board approved the use of such routinely collected data without individual patient consent under section 251 of the NHS Act 2006. Patients were followed up for 28 days from the date the diagnostic blood sample was taken.

Study population
Adults (≥18 years old) receiving in-patient NHS hospital care with a clinically significant BSI caused by an organism in one or more of the six key pathogen groups were included. Non-ESBL-producing E. coli is one of the most prevalent pathogens causing BSI in the UK, so to ensure all pathogen groups were adequately represented, a random sample of one-third of episodes caused by non-ESBL-producing E. coli and all episodes caused by the other five key pathogens were included. The sample size was determined by the number of cases at each of the hospitals during the study period.
Patients were excluded if they had HIV-positive serology or cystic fibrosis; were on an end-of-life care pathway when the blood sample was taken; were in the custody of HM Prison Service of England or Wales; were not receiving NHS care; were not an in-patient when the blood sample was taken and not admitted shortly afterwards, or were discharged on the day the sample was taken; had notes that were irretrievably missing; or had a generalised refusal to take part in research noted in their medical records. Duplicate blood cultures, i.e. samples from the same infection episode, were also excluded.

Outcome measures
The primary outcome was time to death within 28 days of taking the diagnostic positive blood sample. Death within 7 days was a secondary outcome. Other outcomes, to be reported elsewhere, were time to resolution of fever and length of hospital stay.

Risk factors
An episode of infection began when the first positive blood sample confirming BSI was taken (defined as day/time 0). Any factors present before time 0 were considered non-modifiable (see Supplemental Table 1, Additional file 1). Modifiable risk factors considered were aspects of hospital care received from time 0 onwards, which included staffing levels, ward activity (number of admissions/discharges), movements between wards, timing of appropriate therapy and continuing presence of intravenous lines and catheters (see Supplemental Table 2, Additional file 1). Within each centre, ward-level information including ward speciality, staffing and activity was collected from day 0 to 7 for the ward where the patient spent most of their day. Overall staffing levels, including healthcare assistants, trust-employed nurses and agency nurses, were averaged across three shifts (early, late and night) and expressed as a staff:bed ratio (number of staff per bed). Ward activity per bed was defined similarly. The presence or absence of central lines, peripheral lines and urinary catheters was observed on day 0, and their presence on days 1 to 28 was determined using the date of removal.
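As a concrete illustration of the staffing measure described above (a sketch, not the study's analysis code; the function name and example figures are ours), the staff:bed ratio averages the per-shift staff count over the three shifts and divides by the number of beds:

```python
# Illustrative sketch (not the study's analysis code): the staffing
# measure described above. Per-shift counts already sum across roles
# (healthcare assistants, trust-employed and agency nurses).

def staff_per_bed(shift_counts, beds):
    """Average staff over the three shifts (early, late, night),
    then divide by the number of beds on the ward."""
    if beds <= 0:
        raise ValueError("beds must be positive")
    return (sum(shift_counts) / len(shift_counts)) / beds

# e.g. a 24-bed ward staffed by 4, 4 and 2 staff across the shifts;
# the results tables express this quantity per 10 beds:
per_10_beds = staff_per_bed([4, 4, 2], beds=24) * 10
```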
Antimicrobial therapy was defined as 'appropriate' if the organism was susceptible to the antimicrobial prescribed and therapy continued for at least 36 h to allow a therapeutic effect. Susceptibility was judged first by test results for the antimicrobial prescribed (if available), then by inference from test results for related antimicrobials. In the absence of any relevant test results, susceptibility was deduced from the inherent susceptibility/resistance profile of the species concerned [16]. If treatment was changed from one appropriate antimicrobial to another, this was treated as a single period of appropriate therapy provided that the next therapy began within 24 h of the last dose of the previous therapy.
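The appropriateness rules above can be sketched as two small predicates (illustrative only; the 36-h and 24-h thresholds come from the text, while the function and argument names are our own):

```python
# Hedged sketch of the appropriateness rules described in the text.
# Thresholds are from the study definition; names are ours.

def is_appropriate(susceptible, hours_of_therapy, min_hours=36):
    """Appropriate = organism susceptible to the prescribed agent AND
    therapy continued long enough to allow a therapeutic effect."""
    return susceptible and hours_of_therapy >= min_hours

def single_period(gap_hours, max_gap=24):
    """A switch between two appropriate antimicrobials counts as one
    continuous period if the next therapy began within `max_gap` hours
    of the last dose of the previous one."""
    return gap_hours <= max_gap
```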

Statistical analyses
The analysis population consisted of all eligible participants entered into the study. Repeat episodes (distinct infection episodes within the same patient) and patients with polymicrobial episodes (infections involving more than one microbial species) were excluded. The statistical model was built in two stages:

Stage one model
Non-modifiable risk factors were first assessed univariately, and factors associated with mortality were identified using Cox regression at a 20% significance level. These factors were then considered for inclusion in a multivariable model, with the final set chosen by backwards selection. The estimated ln(hazard ratios) for the factors included in the model were then used to derive a "risk score" for each patient. The proportional hazards assumption was assessed, and if it was not met the model was stratified by the variable(s) causing non-proportional hazards. Multivariable fractional polynomial models were used to select the best-fitting functional form for continuous variables.
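The risk score described here is the Cox model's linear predictor. A minimal sketch, with invented coefficients purely for illustration:

```python
import math

# Illustrative only: a Cox "risk score" is the linear predictor, i.e.
# the sum of covariate values weighted by their estimated ln(hazard
# ratios). The coefficients below are invented for the example.

def risk_score(covariates, ln_hazard_ratios):
    return sum(x * b for x, b in zip(covariates, ln_hazard_ratios))

# two binary risk factors with hazard ratios of 2.0 and 1.5:
betas = [math.log(2.0), math.log(1.5)]
score = risk_score([1, 1], betas)      # both factors present
relative_hazard = math.exp(score)      # hazard relative to baseline
```

Because hazard ratios multiply, a patient carrying both factors has a hazard 2.0 × 1.5 = 3.0 times baseline.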

Stage two model
All modifiable risk factors were included in the model regardless of statistical significance, with organism and the risk score derived in stage one included as covariates. Episodes were split at daily intervals from day 0 to 28, with ward speciality, central line, peripheral line, urinary catheter, ward movements, staffing levels, ward activity, and antimicrobial therapy variable values updated at each interval (see Supplemental Table 2, Additional file 1). Ward data were only collected up to day 7, so for patients who survived and were discharged after day 7, ward speciality, staffing, ward activity and ward movements were assumed to be constant for the remaining days (up to death/discharge or day 28). For risk factors which were bounded by survival time, e.g. time to receipt of appropriate therapy and number of ward movements, time-dependent variables were used within the data framework. The value of the time-dependent exposure was calculated as a cumulative count of ward movements up until that day: for each day at risk, the ward movement count increased by one if the patient moved wards and remained the same otherwise. Similarly, a cumulative count of days before first receipt of appropriate antimicrobial therapy was used. That is, on day 0 time to receipt of appropriate therapy was 0 for all patients, remaining at 0 each day for patients who received appropriate therapy on day 0, and otherwise increasing by one for each additional day until receipt of first appropriate therapy. This ensured that on day 3, for example, the maximum time to appropriate therapy was 3 days, for both survivors and those who had died.
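The daily episode-splitting and cumulative-count logic can be sketched as follows (an illustrative reconstruction, not the study's Stata code; names are ours). Each patient-day becomes one row; the ward-movement count accumulates, and the days-to-appropriate-therapy counter increases each day until first receipt, then stays fixed:

```python
# Illustrative reconstruction of the daily episode splitting: one row
# per patient-day, with a cumulative ward-movement count and a
# days-before-appropriate-therapy counter.

def split_episode(n_days, move_days, appropriate_day):
    """move_days: set of days on which the patient changed wards.
    appropriate_day: day of first appropriate therapy (None = never)."""
    rows, moves, days_to_rx = [], 0, 0
    received = appropriate_day == 0
    for day in range(n_days):
        if day in move_days:
            moves += 1
        if day > 0 and not received:
            days_to_rx += 1
            if day == appropriate_day:
                received = True
        rows.append({"day": day, "ward_moves": moves,
                     "days_to_appropriate": days_to_rx})
    return rows
```

So a patient first given appropriate therapy on day 2 carries a value of 2 from day 2 onwards, while a patient still untreated on day 3 carries the maximum of 3, matching the bound described above.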
Interactions between ward speciality and ward activity, and between ward speciality and staffing levels, were included in the model regardless of statistical significance. The following interaction terms were then considered for potential inclusion: organism with each of risk score, central line, peripheral line, urinary catheter and time to appropriate antimicrobial therapy; and ward speciality with within-ward-speciality movements. A forward stepwise approach was taken to select interactions for the final model, using likelihood ratio tests to compare nested models at a 10% significance level. This level was chosen to increase statistical power whilst ensuring that the number of events per variable remained at least 10 [17].
Model fit was assessed using standard methods, and calibration was assessed by comparing the observed event rate with the mean predicted event rate within each decile of predicted risk. Collinearity was examined using the variance inflation factor, with values < 5 considered acceptable. The proportional hazards assumption was assessed as for the stage one model. If non-proportional hazards were indicated, then time was categorised into periods where proportional hazards appeared valid and an interaction between this categorised time and the variable with non-proportional hazards was added to the model.
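A decile calibration check of the kind described can be sketched as follows (illustrative; not the study's code):

```python
# Sketch of a decile calibration check: sort patients by predicted
# risk, cut into ten groups, and compare the mean predicted risk with
# the observed event rate in each group.

def calibration_by_decile(predicted, observed):
    pairs = sorted(zip(predicted, observed))
    n, out = len(pairs), []
    for d in range(10):
        chunk = pairs[d * n // 10:(d + 1) * n // 10]
        if chunk:
            preds = [p for p, _ in chunk]
            events = [o for _, o in chunk]
            out.append((sum(preds) / len(preds),
                        sum(events) / len(events)))
    return out
```

A well-calibrated model yields pairs whose mean predicted risk tracks the observed event rate across deciles.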
The risk score derived from the stage one model was also used for the analysis for the secondary outcome, so only stage two was repeated for 7-day mortality.
Missing data were assumed to be missing at random; missing values were imputed using multiple imputation (45 imputations), and results were combined using Rubin's rules. Non-normally distributed variables were transformed prior to imputation. If a suitable transformation could not be found, or the imputation procedure imputed values outside valid ranges, predictive mean matching was used for that variable. All variables in the primary analysis model, potential auxiliary variables (Supplemental Table 1, Additional file 1), an indicator for death and the log of survival time were included in the imputation procedure. We intended to allow for any interaction terms in the main analysis model by imputing separately within each category of one of the variables involved in each interaction; however, the imputation procedure would not converge, so the interactions could not be included in the imputation. The model selection process was performed on a single imputed dataset so that log-likelihood statistics could be calculated and compared. All analyses were performed in Stata version 14.0 (StataCorp LP, College Station, Texas, USA).
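The pooling step under Rubin's rules combines the point estimates and variances from the m imputed datasets; a minimal sketch with illustrative numbers:

```python
import statistics as st

# Rubin's rules for pooling across m imputed datasets: the pooled
# estimate is the mean of the m estimates, and the total variance
# combines the within-imputation variance W with the
# between-imputation variance B as T = W + (1 + 1/m) * B.
# The numbers below are illustrative only.

def rubins_rules(estimates, variances):
    m = len(estimates)
    q_bar = st.mean(estimates)    # pooled point estimate
    w = st.mean(variances)        # within-imputation variance
    b = st.variance(estimates)    # between-imputation variance
    return q_bar, w + (1 + 1 / m) * b

q, t = rubins_rules([0.9, 1.1, 1.0], [0.04, 0.05, 0.06])
```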

Sensitivity analysis
In the main analysis, an antimicrobial treatment was considered 'appropriate' if the organism was susceptible to the antimicrobial prescribed and the therapy continued for at least 36 h; it was not considered appropriate if the patient died within 36 h of starting treatment. However, this may inflate effect estimates, as such a death could be viewed as a consequence of not receiving the therapy. A sensitivity analysis with the "36-h rule" removed was performed to assess this possibility, and 12-h and 24-h rules were also investigated. A complete case analysis was also performed to assess the impact of the multiple imputation.

Results
In total, 1828 patients (1903 eligible blood samples) were recruited; 227 repeat and/or polymicrobial episodes were excluded, leaving an analysis population of 1676 patients: 116 with Candida, 168 with ESBL-producing Enterobacteriales, 542 with E. coli, 237 with P. aeruginosa, 513 with MSSA and 100 with MRSA (Supplemental Figure 1, Additional file 1; Fig. 1). Non-modifiable factors including demography and patient comorbidities are summarised by 28-day survival status in Table 1 and Supplemental Table 3, Additional file 1. Approximately 55% of the patients were male, with a median age of 68.5 years (interquartile range (IQR) 53.0 to 80.0).
Modifiable risk factors and the presence of lines are summarised by 28-day survival status in Table 2 and Supplemental Table 4, Additional file 1. Approximately two-thirds of patients had a line present at time 0, and both central and peripheral lines at time 0 were more common in patients who died (30.5% vs 22.6%, and 62.5% vs 46.8%, respectively). The average number of nurses per 10 beds on day 0 was slightly higher for patients who died than for those who survived (median 1.7 nurses [IQR 1.2 to 7.0] vs 1.4 [IQR 1.1 to 2.1]). Ward activity on day 0 was similar, with a median of 3.7 admissions/discharges per 10 beds (IQR 1.9 to 7.3) for patients who survived and 3.5 admissions/discharges per 10 beds (IQR 1.8 to 5.5) for patients who died. Overall, 84.5% of patients (1416/1676) received appropriate antimicrobial therapy, with a median time to receipt of 7 h (IQR 1-40) (Supplemental Table 5, Additional file 1).
There was some suggestion of slightly reduced numbers of nurses at weekends compared to weekdays, but the average number of health care assistants (HCA) and agency staff did not appear to change throughout the course of a week (Supplemental Figure 2, Additional file 1). Ward activity followed a similar trend to that of nursing levels, the quietest days being Saturday and Sunday with a median of 1.8 (IQR 1.5 to 4.5) and 1.5 (IQR 0.7 to 2.9) admissions/discharges per 10 beds, respectively, compared to a median above 3 on all other days (Supplemental Figure 3, Additional file 1).
All patients were followed up for the full 28-day follow-up period and were therefore not censored at hospital discharge. The model used to derive the risk score from non-modifiable risk factors for mortality is shown in Supplemental Table 6, Additional file 1. The model for modifiable risk factors, after adjusting for risk score and organism is detailed in Fig. 2 and Supplemental Table 7, Additional file 1. Interaction terms that were included in the model (p < 0.1) were i) risk score by ward speciality, ii) time to receipt of appropriate antimicrobial therapy by organism. Model checks suggested that the hazards for time to receipt of appropriate antimicrobial therapy were not proportional. Therefore, follow-up time was categorised into three intervals: days 0-6, days 7-13, and day 14 onwards, and the effect of time to appropriate antimicrobial therapy was estimated separately for each interval.
The final model indicated that ward speciality (ICU), increased ward activity, and time to receipt of appropriate antimicrobial therapy in the first 7 days were associated with increased hazard of death and ward speciality (surgical), movement within ward speciality and movement from a critical care ward were associated with decreased hazard of death within 28 days, after adjustment for risk score and all other factors included in the model (Fig. 2).
The effect of risk score on mortality was greatest for patients in surgical (hazard ratio (HR) 2.89; 95% CI 2.13 to 3.90) and medical wards (HR 2.77; 95% CI 2.35 to 3.27). For patients in critical care, the effect was still highly statistically significant, but with a smaller effect size (HR 1.84; 95% CI 1.54 to 2.19). The presence of central lines, peripheral lines and urinary catheters was not significantly associated with 28-day mortality.
Patients in critical care wards had a 111% increase (95% CI 30 to 241%) in hazard of mortality within 28 days and patients in surgical wards a 37% decrease (95% CI 2% increase to 61% decrease), compared to patients in a medical ward. These values are estimated at the median staffing, ward activity and risk score values, due to interactions between ward speciality and these terms.
The average staff per 10 beds was not significantly associated with 28-day mortality, although the estimated effect was greater for surgical wards (HR 0.95; 95% CI 0.63 to 1.44) than for medicine (HR 1.00; 95% CI 0.86 to 1.16) and critical care (HR 0.99; 95% CI 0.96 to 1.02). In terms of ward activity, each additional admission or discharge per 10 beds in a medical ward was associated with a 4% (95% CI +1% to +6%) increase in hazard of death within 28 days. This increase was slightly higher in critical care wards (12%; 95% CI +6% to +19%) and negligible in surgical wards (3%; 95% CI −9% to +16%).
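For readers converting between the hazard ratios and the percentage changes quoted in this section: a percentage change in hazard is simply (HR − 1) × 100, so an HR of 2.11 corresponds to the 111% increase reported above and an HR of 0.63 to a 37% decrease. A trivial helper, purely for illustration:

```python
# Hazard ratios re-expressed as percentage changes in hazard:
# (HR - 1) * 100, e.g. HR 2.11 -> +111%, HR 0.63 -> -37%.

def hr_to_percent_change(hr):
    return (hr - 1) * 100
```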
Patients who had moved wards within a speciality had a 33% reduction (95% CI −7% to −52%) in hazard of death within 28 days. There was also a 48% reduction (95% CI −76% to +11%) for patients who had moved out of a critical care ward. There was no evidence to suggest that movement to a critical care ward, movement from surgery to medicine or movement from medicine to surgery affected 28-day mortality.
The effect of time to receipt of appropriate antimicrobial therapy varied depending on organism and time. There was a highly significant effect for all organisms for each day of delay during the first week. The effect was greatest for MSSA, with a 102% increase (95% CI +71% to +138%) in hazard of mortality associated with each day of delay until the receipt of first appropriate antimicrobial therapy, and lowest for MRSA, with a corresponding 39% increase (95% CI +3% to +88%). Among patients who survived to day 7, time to receipt of first appropriate therapy was no longer significantly associated with 28-day mortality. Predicted and observed risks are given by deciles of predicted risk for 28-day mortality in Supplemental Table 8, Additional file 1.
The sensitivity analyses assessing the impact of changing the "36-h rule" in defining appropriate antimicrobial therapy are given in Supplemental Figures 4 and 5, Additional file 1.

Of the 348 deaths within 28 days, just over half occurred within the first 7 days (Supplemental Table 9, Additional file 1). Across the different organisms, 7-day mortality followed a similar pattern to 28-day mortality, with patients with P. aeruginosa having the highest death rate (20.7%) and non-ESBL-producing E. coli the lowest (5.2%). The analyses indicated that ward speciality, ward activity, movements within ward specialities, and time to receipt of appropriate antimicrobial therapy in the first 5 days were all risk factors associated with mortality within 7 days, after adjustment for other factors (Fig. 3 and Supplemental Table 10, Additional file 1).

Discussion
This report describes modifiable risk factors for 28-day mortality in patients with bloodstream infection due to one of six key pathogens. Our main findings were that, for all the pathogen groups studied, timely appropriate antimicrobial therapy was associated with improved 28-day survival, with the effect of each day of delay most marked in the first 7 days, and that increased ward activity (admissions and discharges) was associated with an increased hazard of death in medical wards and especially in critical care wards.
The analysis of risk factors for mortality was split into those classified as non-modifiable factors and those that could be modified by changes in organisation or patient management. For the pathogen groups studied, ward speciality, ward activity, ward movement within speciality, movement from critical care and time to receipt of appropriate antibiotics were all independently associated with mortality.
The reduced hazard of death within 28 days for patients who moved wards within ward specialty or had moved out of a critical care ward could be a result of healthier patients being more likely to be moved or a result of improving condition. Similarly, the association between ward specialty and mortality is likely to be related to the severity and complexity of patients admitted to the ward.
There is a large number of publications relating organisational and management factors, including workforce and workload, to infection control performance in acute hospitals [18]. However, there are no previous studies relating ward staffing and activity to infection outcomes. In our descriptive analysis of modifiable factors, patients who died were on wards where the average number of nurses per 10 beds on day 0 was slightly higher than for those who survived, but this may reflect the higher mortality in intensive care units, where nurse staffing was much higher. After adjustment for other factors there was no evidence of a statistically significant effect of staffing levels on 28-day mortality. Interestingly, the number of NHS-employed nurses, healthcare assistants or agency staff working did not vary greatly by day of the week, although nurse numbers were slightly lower at the weekend. However, ward activity was markedly lower at weekends, which is perhaps not surprising despite the current drive in the NHS towards a 7-day working week [19]. Ward activity was highest on the day blood samples were taken and diminished over the following week, possibly as patients are moved from high-activity settings such as emergency departments or admissions units to wards with more stable populations of patients undergoing recovery. Increasing exposure to shifts with a high turnover of patients has been shown to be associated with an increased risk of death, but there is less information on the impact of workload on infection outcomes in particular [20]. In an adjusted model where ward activity was updated daily to reflect the ward where the patient spent most of the day, increased ward activity was associated with an increased hazard of death within 28 days.
Appropriate antimicrobial therapy has been shown to reduce mortality in a large number of publications over the last 20 years [21]. Several more recent systematic reviews and meta-analyses indicate that appropriate antimicrobial therapy has a survival benefit in both BSI [22] and severe sepsis [23, 24]. Our data show that delays in the administration of appropriate antimicrobials affect outcome in BSI over days 0-6.

Strengths and limitations
This is one of the largest observational cohort studies of patients with BSI in the NHS. A key strength of the study is that it did not require individual patient consent, reducing the risk of selection bias; for example, less acutely ill patients may be more likely to be approached and to consent, which could undermine the scientific integrity and public value of the research. The National Information Governance Board approved the use of such routinely collected data without individual patient consent under section 251 of the NHS Act 2006. A large number of data items were collected, which enabled us to control adequately for potential confounding in the analysis and also allowed us to include variables predictive of missing data in a covariate of interest in the multiple imputation procedure, making the missing at random assumption plausible. There are some limitations to the study. Firstly, focussing on six key pathogens may limit the generalisability of the results. This study focused on patients with clinically significant pathogens that produce large numbers of infections and may have multidrug resistance. We specifically focused on pathogens that were highly unlikely to be contaminants but were also common causes of BSI, linked with significant mortality, and remain a significant problem across the NHS. Results may not be generalisable to other bloodstream infections, and therefore conclusions drawn should be limited to the six key pathogens studied.

Fig. 2 Adjusted Cox model of modifiable risk factors on 28-day mortality. a Effect of organism is given for the time period 0 to 6 days, when time to appropriate therapy is 1 day. b Effect of ward speciality is given for the median number of staff per 10 beds, median ward activity and median risk score. Abbreviations: ESBL = Extended-spectrum beta-lactamase, MRSA = Methicillin-resistant S. aureus, MSSA = Methicillin-susceptible S. aureus, CI = Confidence interval
Finally, this study focussed on NHS-employed nurses, healthcare assistants and agency nurses, but did not explore the impact of other staffing levels, such as consultants, which may merit analysis in future research.

Conclusions
Our overall conclusion is that, for all the pathogen groups studied, timely appropriate antimicrobial therapy was associated with improved clinical outcome as measured by mortality over 28 days, and the effect of each day of delay was most marked in the first 7 days. In addition, increased ward activity (admissions and discharges) was associated with an increased hazard of death in medical wards and especially in critical care wards.

Fig. 3 Adjusted Cox model of modifiable risk factors on 7-day mortality. a Effect of organism is given for the time period 0 to 5 days, when time to appropriate therapy is 1 day. b Effect of ward speciality is given for the median number of staff per 10 beds, median ward activity and median risk score. Abbreviations: ESBL = Extended-spectrum beta-lactamase, MRSA = Methicillin-resistant S. aureus, MSSA = Methicillin-susceptible S. aureus, CI = Confidence interval