Equity-centered adaptive sampling in sub-sewershed wastewater surveillance using census data
Sub-city, or sub-sewershed, wastewater monitoring for infectious diseases offers a data-driven strategy to inform local public health response and complements city-wide data from centralized wastewater treatment plants. Developing strategies for equitable representation of diverse populations in sub-city wastewater sampling frameworks is complicated by misalignment between demographic data and sampling zones. We address this challenge by: (1) developing a geospatial analysis tool that probabilistically assigns demographic data for subgroups aggregated by race and age from census blocks to sub-city sampling zones; (2) evaluating representativeness of subgroup populations for COVID-19 wastewater-based disease surveillance in Davis, California; and (3) demonstrating scenario planning that prioritizes vulnerable populations. We monitored SARS-CoV-2 in wastewater as a proxy for COVID-19 incidence in Davis (November 2021-September 2022). Daily city-wide sampling and thrice-weekly sub-city sampling from 16 maintenance holes covered nearly the entire city population. Sub-city wastewater data, aggregated as a population-weighted mean, correlated strongly with centralized treatment plant data (Spearman's correlation 0.909). Probabilistic assignment of demographic data can inform decisions when adapting sampling locations to prioritize vulnerable groups. We considered four scenarios that reduced the number of sampling zones from baseline by 25% and 50%, chosen randomly or to prioritize coverage of >65-year-old populations. Prioritizing representation increased coverage of >65-year-olds from 51.1% to 67.2% when removing half the zones, while increasing coverage of Black or African American populations from 67.5% to 76.7%. Downscaling had little effect on correlations between sub-city and centralized data (Spearman's correlations ranged from 0.875 to 0.917), with strongest correlations observed when prioritizing coverage of >65-year-old populations.
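The population-weighted aggregation and rank correlation described above can be sketched as follows. This is a hedged illustration, not the study's code: zone populations, per-zone signals, and the plant signal are hypothetical values, and Spearman's rho is implemented directly as the Pearson correlation of ranks.

```python
# Hedged sketch (not the study's code): aggregate per-zone wastewater
# signals into a population-weighted city-wide mean, then compare it to
# the centralized plant signal with Spearman's rank correlation.
# Zone populations and concentrations below are hypothetical.

def weighted_mean(values, weights):
    """Population-weighted mean of per-zone concentrations."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def _ranks(xs):
    """1-based ranks with ties averaged."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation computed on ranks."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

zone_pops = [12000, 8000, 5000]            # people per sampling zone
zone_series = [[10, 20, 30, 25],           # N gene signal per zone,
               [12, 18, 33, 27],           # one value per sampling day
               [9, 22, 28, 30]]
sub_city = [weighted_mean([z[t] for z in zone_series], zone_pops)
            for t in range(4)]
plant = [11, 19, 31, 26]                   # centralized plant signal
rho = spearman(sub_city, plant)
```

With these toy inputs the weighted sub-city series tracks the plant series monotonically, so rho comes out at 1.0; the study's reported 0.909 reflects real-world noise between the two measurement routes.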
Identifying predictors of Escherichia coli in rural household water in sub-Saharan Africa using elimination regression
Exposure to fecally contaminated drinking water contributes to the global disease burden, especially in sub-Saharan Africa (SSA). We used cross-sectional data and elimination regression analysis to examine factors influencing Escherichia coli (E. coli) contamination in household drinking water samples from 4,499 rural households in nine countries in SSA (Malawi, Mozambique, and Zambia in Southern Africa; Ghana, Mali, and Niger in Western Africa; and Kenya, Rwanda, and Tanzania in Eastern Africa). The proportion of household water samples containing E. coli was 71%, ranging from 45% (Malawi) to 89% (Tanzania). Pooled and multi-country predictive logistic regression models showed that using an unimproved-type water source, the absence of a community water committee, and domestic animal ownership were significantly associated with household drinking water contamination. Household water treatment and storage practices, sanitation and hygiene practices, and payment for drinking water were not significantly associated with contamination in any model. Season was a significant predictor of E. coli in the pooled model; samples collected in the rainy season were 2.3 [2.0, 2.7] times as likely to be contaminated with E. coli. Practitioners and policymakers should prioritize implementing piped on-plot water services, establishing effective local water source management structures, and incorporating animal husbandry practices into water, sanitation, and hygiene interventions.
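The "2.3 [2.0, 2.7] times as likely" estimate is the kind of odds ratio obtained by exponentiating a logistic-regression coefficient. A hedged sketch of that conversion follows; the coefficient and standard error are hypothetical values chosen to land near the reported range, not the study's fitted model.

```python
# Hedged sketch: converting a fitted logistic-regression coefficient
# into an odds ratio with a Wald 95% confidence interval.
# beta and se below are hypothetical, chosen to land near 2.3 [2.0, 2.7].
import math

def odds_ratio_with_ci(beta, se, z=1.96):
    """Odds ratio and Wald 95% CI from a logistic coefficient."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

or_rainy, ci_lo, ci_hi = odds_ratio_with_ci(beta=0.833, se=0.071)
```

The same exponentiation applies to every predictor retained after backward elimination, which is why results in such models are naturally reported as multiplicative changes in odds.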
Impact of orthophosphate on the solubility and properties of lead orthophosphate nanoparticles
Orthophosphate (PO₄) is a commonly used corrosion control treatment to reduce lead (Pb) concentrations in drinking water. PO₄ reduces Pb concentrations by forming relatively insoluble lead phosphate (Pb-PO₄) minerals. In some cases, however, Pb-PO₄ minerals have been observed to form nanoparticles, and if suspended in water, these nanoparticles can be mobile and reach consumer taps. Although recent research on Pb-PO₄ particles has been performed, there remains a need to improve our understanding of the nature of Pb-PO₄ nanoparticles. For that reason, Pb precipitation experiments were conducted in bench-scale studies to generate Pb-PO₄ nanoparticles for analysis. The study objective was to observe how pH, dissolved inorganic carbon (DIC), and PO₄ impacted the properties of Pb-PO₄ particles; specifically, particle size, surface charge, mineralogy, and solubility were analysed. Hydrocerussite was precipitated when no PO₄ was present, hydroxypyromorphite (Pb₅(PO₄)₃OH) nanoparticles (<100 nm diameter) were precipitated when PO₄ was present in excess of the amount needed to completely precipitate the Pb as this mineral, and a mixture of the two minerals was precipitated when an insufficient amount of PO₄ was present. Hydroxypyromorphite particles were less soluble than hydrocerussite by up to two orders of magnitude, and the solubility constant estimated in this work closely aligned with previous estimates. Hydroxypyromorphite particles would not settle in water, likely due to their small size and high negative surface charge. The mobility and size of these particles indicate that such particulate Pb could remain suspended in water and thus be present in tap water.
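A back-of-envelope Stokes' law estimate illustrates why ~100 nm particles would not settle by gravity alone. The particle density below is an assumed value for a pyromorphite-family mineral, not a figure from the study, and the calculation ignores Brownian motion and charge stabilization, which only make settling less likely.

```python
# Hedged back-of-envelope check (assumed values, not from the study) of
# why ~100 nm lead phosphate particles resist gravitational settling.
def stokes_velocity(d_m, rho_p, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Stokes settling velocity (m/s) of a small sphere in water.

    rho_p: particle density (kg/m3); ~7000 assumed below for a
    pyromorphite-family mineral. rho_f, mu: water near 20 degrees C."""
    r = d_m / 2.0
    return 2.0 * (rho_p - rho_f) * g * r * r / (9.0 * mu)

v = stokes_velocity(100e-9, 7000.0)   # 100 nm diameter particle
mm_per_day = v * 86400.0 * 1000.0     # settling distance per day, mm
```

Even with a high assumed mineral density, the settling velocity is on the order of millimetres per day, consistent with the observation that these nanoparticles stay suspended.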
Water Quality Trade-offs for Risk Management Interventions in a Green Building
Premise plumbing water quality degradation has led to negative health impacts from pathogen outbreaks (e.g., Legionella and non-tuberculous mycobacteria), as well as chronic effects from exposure to heavy metals or disinfection by-products (DBPs). Common water quality management interventions include flushing, heat shock (thermal disinfection), supplemental disinfection (shock or super chlorination), and water heater temperature setpoint changes. In this study, a Legionella-colonized Leadership in Energy and Environmental Design (LEED) certified building was monitored to study health-relevant water quality changes before and after three controlled management interventions: (1) flushing at several points throughout the building; (2) changing the water heater set point; and (3) a combination of interventions (1) and (2) by flushing during a period of elevated water heater set point (incompletely performed due to operational issues). Microbial (culturable Legionella, the Legionella gene marker, and cATP) and physico-chemical (pH, temperature, conductivity, disinfectant residual, DBPs (total trihalomethanes, TTHM), and heavy metals) water quality parameters were monitored alongside building occupancy as approximated using Wi-Fi logins. Flushing alone resulted in a significant decrease in cATP and Legionella concentrations (p = 0.018 and 0.019, respectively) and a significant increase in chlorine concentrations (p = 0.002) as well as iron and DBP levels (p = 0.002). Copper concentrations increased during the water heater temperature setpoint increase alone to 140 °F during December 2022 (p = 0.01). During the flushing and elevated temperature in parts of the building in February 2023, there was a significant increase in chlorine (p = 0.002) and iron (p = 0.002) concentrations but no significant decrease in Legionella concentrations in the drinking water samples (p = 0.27).
This study demonstrated the potential impacts of short-term or incompletely implemented interventions, which in this case were not sufficient to holistically improve water quality. As implementing interventions is logistically demanding and time-intensive, more effective and holistic approaches are needed to inform preventative and corrective actions that benefit multiple water quality and sustainability goals.
Reactions of hypobromous acid with dimethyl selenide, dimethyl diselenide and other organic selenium compounds: kinetics and product formation
Selenium (Se) is an essential micronutrient for many living organisms, particularly due to its unique redox properties. We recently found that dimethyl sulfide (DMS), the sulfur (S) analog of dimethyl selenide (DMSe), reacts rapidly with the marine oxidant hypobromous acid (HOBr), which likely serves as a sink of marine DMS. Here we investigated the reactivity of HOBr with dimethyl selenide and dimethyl diselenide (DMDSe), the main volatile Se compounds biogenically produced in marine waters. In addition, the reactivity of HOBr with further organic Se compounds was tested: selenomethionine (SeMet, as N-acetylated SeMet) and selenocystine (SeCys, as N-acetylated SeCys), as well as the phenyl analogs of DMSe and DMDSe, diphenyl selenide (DPSe) and diphenyl diselenide (DPDSe), respectively. Apparent second-order rate constants at pH 8 for the reactions of HOBr with the studied Se compounds were (7.1 ± 0.7) × 10 M⁻¹ s⁻¹ for DMSe, (4.3 ± 0.4) × 10 M⁻¹ s⁻¹ for DMDSe, (2.8 ± 0.3) × 10 M⁻¹ s⁻¹ for SeMet, (3.8 ± 0.2) × 10 M⁻¹ s⁻¹ for SeCys, (3.5 ± 0.1) × 10 M⁻¹ s⁻¹ for DPSe, and (8.0 ± 0.4) × 10 M⁻¹ s⁻¹ for DPDSe, indicating a very high reactivity of all selected Se compounds with HOBr. The reactivity of HOBr with DMSe is lower than with DMS, and this reaction is therefore likely not relevant for marine DMSe abatement. However, the high reactivity of SeMet with HOBr suggests that SeMet may act as a relevant quencher of HOBr.
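Second-order rate constants like these are typically interpreted through a pseudo-first-order half-life when the oxidant is in excess. The sketch below uses an illustrative rate constant and HOBr concentration, both assumptions for demonstration rather than values from this work.

```python
# Hedged sketch of how a second-order rate constant is used: with the
# oxidant in large excess, the substrate decays pseudo-first-order with
# half-life t1/2 = ln(2) / (k2 * [HOBr]). k2 and [HOBr] below are
# illustrative assumptions, not measurements from the study.
import math

def half_life_s(k2_M_s, oxidant_M):
    """Pseudo-first-order half-life (s) of a substrate under excess oxidant."""
    return math.log(2) / (k2_M_s * oxidant_M)

t_half = half_life_s(k2_M_s=1.0e8, oxidant_M=1.0e-9)  # assumed k2, 1 nM HOBr
```

Under these assumed conditions the substrate half-life is only seconds, which is the sense in which a "very high reactivity" makes a compound a plausible in situ quencher of HOBr.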
A combined experimental and computational approach to unravel degradation mechanisms in electrochemical wastewater treatment
Electrochemical wastewater treatment is a promising technique to remove recalcitrant pollutants from wastewater. However, the complexity of elucidating the underlying degradation mechanisms hinders its optimisation, not only from a techno-economic perspective, as it is desirable to maximise removal efficiencies at low energy and chemical requirements, but also in environmental terms, as the generation of toxic by-products is an ongoing challenge. In this work, we propose a novel combined experimental and computational approach to (i) estimate the contribution of radical and non-radical mechanisms as well as their synergistic effects during electrochemical oxidation and (ii) identify the optimal conditions that promote specific degradation pathways. As a case study, the distribution of the degradation mechanisms involved in the removal of benzoic acid (BA) on boron-doped diamond (BDD) anodes was elucidated and analysed as a function of several operating parameters, i.e., the initial sulfate and nitrate content of the wastewater and the applied current. Subsequently, a multivariate optimisation study was conducted, in which the influence of the electrode nature was investigated for two commercial BDD electrodes and a customised silver-decorated BDD electrode. Optimal conditions were identified for each degradation mechanism as well as for the overall BA degradation rate constant. BDD selection was found to be the most influential factor favouring any mechanism (i.e., 52-85% contribution), given that properties such as boron doping and the presence of electrodeposited silver could dramatically affect the reactions taking place. In particular, decorating the BDD surface with silver microparticles significantly enhanced BA degradation via sulfate radicals, whereas direct oxidation, reactive oxygen species and radical synergistic effects were promoted when using a commercial BDD material with higher boron content and on a silicon substrate.
Consequently, by simplifying the identification and quantification of underlying mechanisms, our approach facilitates the elucidation of the most suitable degradation route for a given electrochemical wastewater treatment together with its optimal operating conditions.
Characterizing Bacillus globigii as a surrogate for wastewater treatment studies and bioaerosol emissions
This study characterized Bacillus globigii (BG) as a Bacillus anthracis Sterne (BAS) surrogate for wastewater treatment-related studies of UV inactivation, adsorption onto powdered activated carbon (PAC), and bioaerosol emission. The UV inactivation of BG was faster than that of BAS in DI water (pseudo first-order rate constants of 0.065 and 0.016 min⁻¹, respectively) and in PBS solution (0.030 and 0.005 min⁻¹, respectively). BG was also removed more quickly than BAS by PAC adsorption in DI water (0.07 and 0.05 min⁻¹, respectively) and in PBS (0.09 and 0.04 min⁻¹, respectively). In DI water, BG aggregated more (p < 0.05) than BAS when the pH was 7 or greater, but there were no statistically significant differences in NaCl solution. Spore aggregation was also studied with extended Derjaguin-Landau-Verwey-Overbeek (XDLVO) models. Less than 1% of all spores were released as bioaerosols, and there was no significant difference (p > 0.05) in emission between BG and BAS. To the authors' knowledge, this study is the first to demonstrate that BG is a suitable surrogate for BAS for bioaerosol emissions, but a poor surrogate for both UV inactivation and PAC adsorption. These results can be used to understand the ability of BAS to act as a surrogate for B. anthracis Ames, given its genetic and morphological similarities with BAS.
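The practical meaning of the two UV rate constants can be seen by converting them into the time needed for a 1-log (90%) reduction under pseudo first-order kinetics. The rate constants below are the DI-water values quoted in the abstract; the calculation itself is a standard kinetics identity.

```python
# Sketch: time for a given log10 reduction under pseudo first-order
# kinetics, N/N0 = exp(-k t), using the DI-water UV rate constants
# from the abstract (BG 0.065 min-1, BAS 0.016 min-1).
import math

def time_for_log10_reduction(k_per_min, logs=1.0):
    """Minutes to achieve `logs` log10 reduction at rate constant k."""
    return logs * math.log(10) / k_per_min

t_bg = time_for_log10_reduction(0.065)    # surrogate (BG)
t_bas = time_for_log10_reduction(0.016)   # target (BAS)
```

The surrogate reaches 1-log inactivation roughly four times faster than the target, which quantifies why BG is a poor (non-conservative) surrogate for UV design purposes.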
Hexavalent chromium waste removal via bioelectrochemical systems - a life cycle assessment perspective
Bioelectrochemical systems (BESs) such as microbial fuel cells (MFCs) present numerous benefits for the removal and recovery of heavy metals from industrial and municipal wastewater. This study evaluated the life cycle environmental impact of simultaneous hexavalent chromium (Cr(VI)) removal and bioelectricity generation in a dual-chamber MFC. Results indicate a global warming potential (GWP) of -0.44 kg carbon dioxide (CO₂)-eq. per kg of chromium recovered, representing a total saving of up to 97% in comparison with existing technologies for the treatment of Cr(VI)-laden wastewater. The observed savings in GWP (kg CO₂-eq.) reduced to 61.8% with the removal of the allocated credits from the MFC system's life cycle. Of all the various sub-systems considered within the chromium waste treatment plant, the MFC unit and the chromium metal recovery unit had the largest impact in terms of GWP (kg CO₂-eq.), non-renewable energy use (NREU) (MJ primary), and mineral extraction (MJ surplus). A statistical analysis of the results showed that an increase in chemical oxygen demand (COD) was associated with a reduction in GWP (kg CO₂-eq.), NREU (MJ primary), and terrestrial ecotoxicity (kg triethylene glycol equivalents into soil (TEG soil)-eq.). The life cycle assessment (LCA) output showed a high sensitivity to changes in the materials and construction processes of MFC reactors, indicating the need for further research into sustainable materials for MFC reactor construction. The observed interaction effects of process variables also suggest the need for combined optimization of these variables. Analysis with other types of metals is also important to further demonstrate the practical viability of metal removal through MFCs.
Selective elimination of enterovirus genotypes by activated sludge and chlorination
Enteroviruses, which are commonly circulating viruses shed in the stool, are released into the sewage system and only partially removed or inactivated, resulting in the discharge of infectious enteroviruses into the environment. Activated sludge and chlorination remove or inactivate enterovirus genotypes to different extents, and thus have the potential to shape the population that will be discharged. The goal of this study was to evaluate how activated sludge and chlorination treatment shape an enterovirus population at the genotype level, using a population of eight genotypes commonly found in sewage: CVA9, CVB1, CVB2, CVB3, CVB4, CVB5, E25, and E30. Our results show that the extent of inactivation varied among genotypes, but also across sludge samples. We found that the effluent of activated sludge systems will be depleted in CVA9, CVB1 and CVB2, while E25, together with CVB3, CVB4 and CVB5, will be prevalent. Furthermore, we found that microbial inactivation was the main mechanism of infectivity loss in the activated sludge, while adsorption to the sludge flocs was not significant. During effluent chlorination, we also observed that CVB5, CVB3 and, to a lesser extent, E25 were less susceptible to chlorination, while E30 was readily inactivated; activated sludge-derived EPS provided further protection against chlorination. This study contributes to a better understanding of the variability of sewage treatment efficacy against different enteroviruses.
Evaluation of intra- and inter-lab variability in quantifying SARS-CoV-2 in a state-wide wastewater monitoring network
In December 2019, SARS-CoV-2, the virus that causes coronavirus disease 2019, was first reported and subsequently triggered a global pandemic. Wastewater monitoring, a strategy for quantifying viral gene concentrations in wastewater influents within a community, has served as an early warning and management tool for the spread of SARS-CoV-2 in a community. Ohio built a collaborative statewide wastewater monitoring network that is supported by eight labs (university, government, and commercial laboratories) with unique sample processing workflows. Consequently, we sought to characterize the variability in wastewater monitoring results for network labs. Across seven trials between October 2020 and November 2021, eight participating labs successfully quantified two SARS-CoV-2 RNA targets and human fecal indicator virus targets in wastewater sample aliquots with reproducible results, although recovery efficiencies of spiked surrogates ranged from 3 to 75%. When SARS-CoV-2 gene fragment concentrations were adjusted for recovery efficiency and flow, the proportion of variance between laboratories was minimized, serving as the best model to account for between-lab variance. Fecal marker normalization, alone and in different combinations with the above factors, was also considered as an adjustment to account for sample and measurement variability. Genetic quantification variability can be attributed to many factors, including the methods, individual samples, and water quality parameters. In addition, statistically significant correlations were observed between SARS-CoV-2 RNA and COVID-19 case numbers, supporting the notion that wastewater surveillance continues to serve as an effective monitoring tool. This study serves as a real-time example of multi-laboratory collaboration for public health preparedness for infectious diseases.
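The recovery-and-flow adjustment described above amounts to a simple arithmetic correction. The sketch below is illustrative with hypothetical numbers, not the network's processing pipeline.

```python
# Hedged sketch of recovery- and flow-adjustment: dividing by surrogate
# recovery corrects for processing losses; multiplying by plant flow
# converts a concentration into a daily load comparable across labs and
# sites. All numbers below are hypothetical.
def adjusted_daily_load(conc_gc_per_L, recovery_frac, flow_L_per_day):
    """Recovery-corrected SARS-CoV-2 load (gene copies per day)."""
    return conc_gc_per_L / recovery_frac * flow_L_per_day

load = adjusted_daily_load(conc_gc_per_L=5000.0,
                           recovery_frac=0.25,   # 25% spiked-surrogate recovery
                           flow_L_per_day=4.0e7)
```

Because recoveries in the study spanned 3 to 75%, two labs measuring the same sample could report raw concentrations differing by more than an order of magnitude; dividing by each lab's own recovery is what pulls the results back together.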
Wastewater research and surveillance: an ethical exploration
The current COVID-19 pandemic has given wastewater research a huge impetus. While wastewater research has some promising applications, there are as yet no well-developed ethical guidelines on how and under what conditions to use wastewater research. The current perspective paper aims to explore the different ethical questions pertaining to wastewater research and surveillance and to provide some tentative guidelines on the desirability of different types of applications. This paper shows that wastewater research offers interesting possibilities, but that legal regulation and ethical guidelines are still lacking, while there are ethical risks involved. The perspective indicates that it is important to look beyond the regulation of data collection and to shift the focus to the question of how the analysis and use of wastewater data can be supervised.
Modeling Risk Dynamics of Contaminants of Emerging Concern in a Temperate-region Wastewater Effluent-dominated Stream
Wastewater effluent-dominated streams are becoming increasingly common worldwide, including in temperate regions, with potential impacts on ecological systems and drinking water sources. We recently quantified the occurrence and spatiotemporal dynamics of pharmaceutical mixtures in a representative temperate-region wastewater effluent-dominated stream (Muddy Creek, Iowa) under baseflow conditions and characterized relevant fate processes. Herein, we quantified the ecological risk quotients (RQs) of 19 effluent-derived contaminants of emerging concern (CECs; including 14 pharmaceuticals, 2 industrial chemicals, and 3 neonicotinoid insecticides) and 1 runoff-derived compound (atrazine) in the stream under baseflow conditions, and estimated the probabilistic risks of effluent-derived CECs under all-flow conditions (i.e., including runoff events) using stochastic risk modeling. We determined that 11 out of 20 CECs pose medium-to-high risks to local ecological systems (i.e., algae, invertebrates, fish) based on literature-derived acute effects under measured baseflow conditions. Stochastic risk modeling indicated decreased, but still problematic, risks of effluent-derived CECs (i.e., RQ ≥ 0.1) under all-flow conditions when runoff events were included. Dilution of effluent-derived chemicals by storm flows thus only minimally decreased risk to aquatic biota in the effluent-dominated stream. We also modeled in-stream transport. Thirteen out of 14 pharmaceuticals persisted along the stream reach (median attenuation rate constant k < 0.1 h⁻¹) and entered the Iowa River at elevated concentrations. Predicted and measured concentrations in the drinking water treatment plant were below human health benchmarks.
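The risk-quotient screening above follows the standard RQ = MEC/PNEC construction, with the medium band starting at RQ ≥ 0.1 as in the abstract. The sketch below uses hypothetical concentrations purely for illustration.

```python
# Hedged sketch of risk-quotient screening: RQ is the measured
# environmental concentration (MEC) over the predicted no-effect
# concentration (PNEC), with common bands RQ >= 1 (high),
# 0.1 <= RQ < 1 (medium), RQ < 0.1 (low). Values are hypothetical.
def risk_quotient(mec_ng_L, pnec_ng_L):
    """RQ = MEC / PNEC (same units for both)."""
    return mec_ng_L / pnec_ng_L

def risk_band(rq):
    if rq >= 1.0:
        return "high"
    if rq >= 0.1:
        return "medium"
    return "low"

rq = risk_quotient(120.0, 400.0)   # hypothetical CEC at 120 ng/L
```

In the stochastic version of this assessment, the MEC (and sometimes the PNEC) becomes a distribution over flow conditions, and the reported quantity is the probability that RQ exceeds a band threshold rather than a single ratio.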
This study demonstrates the application of probabilistic risk assessments for effluent-derived CECs in a representative effluent-dominated stream under variable flow conditions (when measurements are less practical) and provides an enhanced prediction tool transferable to other effluent-dominated systems.
Regrowth of Escherichia coli in environmental waters after chlorine disinfection: shifts in viability and culturability
Bacterial regrowth after water/wastewater disinfection poses severe risks to public health. However, regrowth studies under realistic water conditions that might critically affect bacterial regrowth are scarce. This study aimed to assess, for the first time, the regrowth of Escherichia coli (E. coli) in terms of its viability and culturability in environmental waters after chlorine disinfection, the most widely used disinfection method. Post-chlorination regrowth tests were conducted in 1) standard 0.85% NaCl solution, 2) river water receiving domestic wastewater effluents, and 3) river water that is fully recharged by domestic wastewater effluents. Plate counts and a fluorescence-based viability test were combined to quantify culturable and viable E. coli and monitor the regrowth process. The results confirmed that chlorine treatment (0.2, 0.5 and 1.0 mg L⁻¹ initial free chlorine) induced more than 99.95% of E. coli to enter a viable but non-culturable (VBNC) state, and reactivation of VBNC E. coli is presumably the major process of regrowth. A second-order regrowth model described well the temporal shift of the survival ratio of culturable E. coli after chlorination (R²: 0.73-1.00). The model application also revealed that increases in initial chlorine concentration and chlorine dose limited the maximum regrowth rate and the maximum survival ratio, and that the regrowth rate and percentage also changed with the water type. This study gives a better understanding of potential E. coli regrowth after chlorine disinfection and highlights the need to investigate the detailed relation of regrowth to environmental conditions such as major components of water matrices.
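One common "second-order" regrowth formulation is a logistic-type rate law in which growth is proportional both to the current culturable fraction and to the remaining capacity; the study's exact model may differ, and all parameters below are illustrative assumptions.

```python
# Hedged sketch of a second-order (logistic-type) regrowth curve; the
# study's exact formulation may differ. S is the culturable survival
# ratio, growing as dS/dt = k * S * (S_max - S). Parameters are
# illustrative: 0.05% culturable after chlorination, 80% maximum.
def regrowth_curve(s0, s_max, k, dt, steps):
    """Explicit-Euler integration of dS/dt = k*S*(S_max - S)."""
    s, out = s0, [s0]
    for _ in range(steps):
        s += k * s * (s_max - s) * dt
        out.append(s)
    return out

curve = regrowth_curve(s0=0.0005, s_max=0.8, k=0.5, dt=0.1, steps=2000)
```

This form reproduces the qualitative behavior reported above: a smaller post-chlorination survival ratio (stronger dose) delays the curve and a lower plateau caps the maximum survival ratio, while the sigmoidal shape gives the high R² fits to time-series data.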
SARS-CoV-2 RNA is enriched by orders of magnitude in primary settled solids relative to liquid wastewater at publicly owned treatment works
Wastewater-based epidemiology has gained attention throughout the world for detection of SARS-CoV-2 RNA in wastewater to supplement clinical testing. Raw wastewater consists of small particles, or solids, suspended in liquid. Methods have been developed to measure SARS-CoV-2 RNA in the liquid and the solid fraction of wastewater, with some studies reporting higher concentrations in the solid fraction. To investigate this relationship further, six laboratories collaborated to conduct a study across five publicly owned treatment works (POTWs) where both primary settled solids obtained from primary clarifiers and raw wastewater influent samples were collected and quantified for SARS-CoV-2 RNA. Settled solids and influent samples were processed by participating laboratories using their respective methods and retrospectively paired based on date of collection. SARS-CoV-2 RNA concentrations, on a mass equivalent basis, were higher in settled solids than in influent by approximately three orders of magnitude. Concentrations in matched settled solids and influent were positively and significantly correlated at all five POTWs. RNA concentrations in both settled solids and influent were correlated to COVID-19 incidence rates in the sewersheds and thus representative of disease occurrence; the settled solids methods appeared to produce a comparable relationship between SARS-CoV-2 RNA concentration measurements and incidence rates across all POTWs. Settled solids and influent methods showed comparable sensitivity, N gene detection frequency, and calculated empirical incidence rate lower limits. Analysis of settled solids for SARS-CoV-2 RNA has the advantage of using less sample volume to achieve similar sensitivity to influent methods.
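The "three orders of magnitude" comparison is an enrichment factor computed on paired, mass-equivalent measurements. The sketch below uses hypothetical pairs to show the log10-ratio calculation.

```python
# Hedged sketch of the solids-vs-influent comparison: enrichment of
# SARS-CoV-2 RNA in settled solids, expressed in orders of magnitude
# (log10 ratio) on a mass-equivalent basis. Paired values are
# hypothetical, not the study's measurements.
import math

def log10_enrichment(solids_gc_per_g, influent_gc_per_g):
    """log10 of the solids/influent concentration ratio."""
    return math.log10(solids_gc_per_g / influent_gc_per_g)

pairs = [(2.0e6, 1.5e3), (5.0e5, 8.0e2), (1.2e6, 1.0e3)]  # (solids, influent)
enrichments = [log10_enrichment(s, i) for s, i in pairs]
median_enrichment = sorted(enrichments)[len(enrichments) // 2]
```

Because pairing is retrospective (matched by collection date), reporting a median or distribution of per-pair enrichments is more robust than a single pooled ratio when the two matrices are processed by different labs.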
Effects of residual disinfectants on the redox speciation of lead(ii)/(iv) minerals in drinking water distribution systems
This study investigated the reaction kinetics on the oxidative transformation of lead(ii) minerals by free chlorine (HOCl) and free bromine (HOBr) in drinking water distribution systems. According to chemical equilibrium predictions, lead(ii) carbonate minerals, cerussite PbCO and hydrocerussite Pb(CO)(OH), and lead(ii) phosphate mineral, chloropyromorphite Pb(PO)Cl are formed in drinking water distribution systems in the absence and presence of phosphate, respectively. X-ray absorption near edge spectroscopy (XANES) data showed that at pH 7 and a 10 mM alkalinity, the majority of cerussite and hydrocerussite was oxidized to lead(iv) mineral PbO within 120 minutes of reaction with chlorine (3 : 1 Cl : Pb(ii) molar ratio). In contrast, very little oxidation of chloropyromorphite occurred. Under similar conditions, oxidation of lead(ii) carbonate and phosphate minerals by HOBr exhibited a reaction kinetics that was orders of magnitude faster than by HOCl. Their end oxidation products were identified as mainly plattnerite β-PbO and trace amounts of scrutinyite α-PbO based on X-ray diffraction (XRD) and extended X-ray absorption fine structure (EXAFS) spectroscopic analysis. A kinetic model was established based on the solid-phase experimental data. The model predicted that in real drinking water distribution systems, it takes 0.6-1.2 years to completely oxidize Pb(ii) minerals in the surface layer of corrosion scales to PbO by HOCl without phosphate, but only 0.1-0.2 years in the presence of bromide (Br) due the catalytic effects of HOBr generation. The model also predicts that the addition of phosphate will significantly inhibit Pb(ii) mineral oxidation by HOCl, but only be modestly effective in the presence of Br. This study provides insightful understanding on the effect of residual disinfectant on the oxidation of lead corrosion scales and strategies to prevent lead release from drinking water distribution systems.
Reproducibility and sensitivity of 36 methods to quantify the SARS-CoV-2 genetic signal in raw wastewater: findings from an interlaboratory methods evaluation in the U.S.
In response to COVID-19, the international water community rapidly developed methods to quantify the SARS-CoV-2 genetic signal in untreated wastewater. Wastewater surveillance using such methods has the potential to complement clinical testing in assessing community health. This interlaboratory assessment evaluated the reproducibility and sensitivity of 36 standard operating procedures (SOPs), divided into eight method groups based on the sample concentration approach and whether solids were removed. Two raw wastewater samples were collected in August 2020, amended with a matrix spike (betacoronavirus OC43), and distributed to 32 laboratories across the U.S. Replicate samples analyzed in accordance with the project's quality assurance plan showed high reproducibility across the 36 SOPs: 80% of the recovery-corrected results fell within a band of ±1.15 log genome copies per L, with higher reproducibility observed within a single SOP (standard deviation of 0.13 log). The inclusion of a solids removal step and the selection of a concentration method did not show a clear, systematic impact on the recovery-corrected results. Other methodological variations (e.g., pasteurization, primer set selection, and use of RT-qPCR or RT-dPCR platforms) generally resulted in small differences compared to other sources of variability. These findings suggest that a variety of methods are capable of producing reproducible results, though the same SOP or laboratory should be selected to track SARS-CoV-2 trends at a given facility. The methods showed a 7 log range in recovery efficiency and limit of detection, highlighting the importance of recovery correction and the need to consider method sensitivity when selecting methods for wastewater surveillance.
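The "80% within ±1.15 log" statement is a band-coverage statistic on recovery-corrected log-scale results. A hedged sketch with hypothetical SOP results follows.

```python
# Hedged sketch of the reproducibility band: the share of
# recovery-corrected log10 results falling within +/-1.15 log of a
# central value. The SOP results below are hypothetical.
def fraction_within_band(log_values, center, half_width=1.15):
    """Fraction of log10 results within +/- half_width of center."""
    inside = sum(1 for v in log_values if abs(v - center) <= half_width)
    return inside / len(log_values)

results = [4.1, 4.5, 4.9, 5.0, 5.1, 5.2, 5.3, 5.9, 6.3, 7.0]  # log10 gc/L
center = sorted(results)[len(results) // 2]                    # central value
frac = fraction_within_band(results, center)
```

Working in log10 space is what makes a single half-width meaningful across SOPs whose raw concentrations span orders of magnitude.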
The removal of ammonia, arsenic, iron and manganese by biological treatment from a small Iowa drinking water system
Although not regulated in United States drinking water, ammonia has the potential to increase chlorine consumption and cause nitrification problems in the distribution system. Many groundwaters with elevated ammonia are also contaminated with other inorganic analytes such as arsenic, iron, and manganese, all of which have primary or secondary maximum contaminant levels (MCLs). The objective of this work was to demonstrate the effectiveness of an innovative biological treatment process to simultaneously remove ammonia (2.9 mg N per L), arsenic (23 μg L⁻¹), iron (2.9 mg L⁻¹) and manganese (80 μg L⁻¹) from a groundwater source in Iowa. The biological treatment system consisted of an "aeration contactor" followed by a conventional granular media filter. Orthophosphate was also added, as a biological nutrient, at 0.3 mg PO₄ per L. Ammonia, manganese, and iron were consistently reduced through the pilot system by 98 to 99%. Complete oxidation of ammonia to nitrate was observed (i.e., no nitrite was released), and arsenic was consistently removed to below the 10 μg L⁻¹ MCL. Ammonia was oxidized by ammonia- and nitrite-oxidizing bacteria, and arsenic by bacteria that converted As(III) in the source water to the more readily removable As(V). Iron was presumably oxidized by oxygen during aeration, although some biologically assisted oxidation could not be ruled out. As(V)-bound iron particles were removed in the filter, resulting in effective arsenic (and iron) reduction. A surprising treatment benefit was the effective manganese reduction, the mechanism of which was not entirely clear but was attributed to biologically assisted oxidation of Mn(II). While some system acclimation time was necessary to achieve the desired ammonia and manganese reductions, acceptable arsenic and iron reductions were observed shortly after start-up.
Factors associated with elevated levels of antibiotic resistance genes in sewer sediments and wastewater
The sewer environment is a potential hotspot for the proliferation of antibiotic resistance genes (ARGs) and other hazardous microbial agents. Understanding the potential for ARG proliferation, retardation, and/or accumulation in sewer sediments is of interest for protecting the health of sewage workers and the broader community in the event of sewer overflows, as well as for interpreting sewage epidemiology data. To better understand this understudied environment for antibiotic resistance, a field survey was conducted to identify the factors that may control ARGs in sewer sediments and sewage. qPCR was performed for select ARGs and amplicon sequencing was performed for paired samples from combined and separate sanitary sewer systems. Metagenomic sequencing was performed on combined sewer sediments. The relative abundances of 1, (O), (W), F, and A were higher in wastewater compared to sewer sediments, while NDM-1 was greater in sewer sediment and F was similar between the two matrices. NDM-1 was observed in sewer sediment but rarely above detection in wastewater in this study. This may indicate that larger or more frequent wastewater samples are needed for detection, and/or that retardation and/or accumulation in sewer sediment may need to be considered when interpreting wastewater-based epidemiology data for ARGs. Random forest analyses indicated that season and conductivity, and to a lesser extent pH, TSS, heavy metals, and sewer type, were important variables for explaining the variance of the ARGs. These variables explained 19-61% of the variance of 1, (O), (G), and (W) quantified in wastewater. These variables performed less well for explaining the variance in sewer sediments (0.2-24%). Sewer sediment and wastewater had distinct microbial community structures, and biomarkers for each are described. Metagenomics indicated that a high diversity of ARGs, including several of medical importance, were observed in the combined sewer sediment.
This work provides insight into the complex sewer microbiome and the potential hazard posed by different sewer matrices.
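The "percent variance explained" reported for the random forest models above is an R²-type statistic comparing predicted and observed ARG abundances. A minimal sketch of that metric, assuming the standard 1 − SS_res/SS_tot definition (the helper name and data are illustrative, not the authors' pipeline):

```python
def variance_explained(y_obs, y_pred):
    """Percent variance explained: 100 * (1 - SS_res / SS_tot).

    y_obs:  observed values (e.g., log10 ARG relative abundances)
    y_pred: model predictions (e.g., random forest out-of-bag predictions)
    """
    mean = sum(y_obs) / len(y_obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(y_obs, y_pred))  # residual sum of squares
    ss_tot = sum((o - mean) ** 2 for o in y_obs)               # total sum of squares
    return 100.0 * (1.0 - ss_res / ss_tot)

# Perfect predictions explain 100% of the variance
print(variance_explained([1.0, 2.0, 3.0, 4.0], [1.0, 2.0, 3.0, 4.0]))  # → 100.0
```

A model that does no better than predicting the mean scores 0%, which is one way the weak sewer-sediment results (0.2-24%) can be read against the wastewater results (19-61%).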
The effect of mixing and free-floating carrier media on bioaerosol release from wastewater: a multiscale investigation with
Aeration tanks in wastewater treatment plants (WWTPs) are significant sources of bioaerosols, which contain microbial contaminants and can travel miles from the site of origin, risking the health of operators and the general public. One potential mitigation strategy is to apply free-floating carrier media (FFCM) to suppress bioaerosol emission. This article presents a multiscale study on the effects of mixing and FFCM on bioaerosol release using spores in well-defined liquid media. Bioaerosol release, defined as the percentage of spores aerosolized during a 30-minute sampling period, ranged from 6.09 × 10% to 0.057%, depending upon the mixing mode and intensity. Bioaerosol release increased with the intensity of aeration (rotating speed in mechanical agitation and aeration rate in diffused aeration). A surface layer of polystyrene beads reduced bioaerosol release by >92% in the bench-scale studies and >74% in the pilot-scale study. This study discovered strong correlations (R² > 0.82) between bioaerosol release and superficial gas velocity, Froude number, and volumetric gas flow per unit liquid volume per minute. The Reynolds number was found to be poorly correlated with bioaerosol release (R² < 0.5). This study is a significant step toward the development of predictive models for full-scale systems.
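The dimensionless and scaling quantities named above can be computed directly from reactor parameters. A sketch assuming common definitions (Froude number for an agitated tank as Fr = N²D/g, superficial gas velocity as Q/A, and vvm as gas flow per liquid volume per minute), with illustrative values rather than the study's data, plus a simple rank-correlation helper of the kind such an analysis would use:

```python
import math

# Hypothetical reactor parameters (illustrative only, not the study's values)
N = 2.0      # impeller rotation speed, rev/s
D = 0.15     # impeller diameter, m
Q = 2.0e-4   # gas flow rate, m^3/s
A = 0.05     # tank cross-sectional area, m^2
V = 0.02     # liquid volume, m^3
g = 9.81     # gravitational acceleration, m/s^2

froude = N ** 2 * D / g   # Froude number for mechanical agitation
u_gs = Q / A              # superficial gas velocity, m/s
vvm = Q * 60.0 / V        # volumetric gas flow per unit liquid volume per minute

def spearman(x, y):
    """Spearman rank correlation (no tie handling): Pearson correlation of ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)
```

Correlating measured release percentages against such predictors across mixing conditions is the kind of screening that distinguishes useful scaling groups (here, Froude number and superficial gas velocity) from weak ones (Reynolds number).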
Standardizing data reporting in the research community to enhance the utility of open data for SARS-CoV-2 wastewater surveillance
SARS-CoV-2 RNA detection in wastewater is being rapidly developed and adopted as a public health monitoring tool worldwide. With wastewater surveillance programs being implemented across many different scales and by many different stakeholders, it is critical that the data collected and shared are accompanied by an appropriate minimal amount of meta-information to enable meaningful interpretation and use of this new information source and intercomparison across datasets. While some databases are being developed for specific surveillance programs locally, regionally, nationally, and internationally, common globally adopted data standards have not yet been established within the research community. Establishing such standards will require national and international consensus on what meta-information should accompany SARS-CoV-2 wastewater measurements. To establish a recommendation on the minimum information to accompany reporting of SARS-CoV-2 occurrence in wastewater for the research community, the United States National Science Foundation (NSF) Research Coordination Network on Wastewater Surveillance for SARS-CoV-2 hosted a workshop in February 2021 with participants from academia, government agencies, private companies, wastewater utilities, public health laboratories, and research institutes. This report presents the two primary outcomes of the workshop: (i) a recommendation on the set of minimum meta-information needed to confidently interpret wastewater SARS-CoV-2 data, and (ii) insights from workshop discussions on how to improve standardization of data reporting.
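A structured record is one natural way to enforce a minimal meta-information set at the point of data entry. The sketch below is illustrative only: the field names are examples of the kinds of meta-information discussed (sample identity, time, sample type, target, units, method, detection flags), not the workshop's actual recommended schema:

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class WastewaterMeasurement:
    """Illustrative metadata record for one SARS-CoV-2 wastewater measurement.

    Field names are hypothetical examples, not the workshop's recommended list.
    """
    sample_id: str
    site_name: str            # sampling location, e.g. treatment plant influent
    collection_datetime: str  # ISO 8601 timestamp with time zone
    sample_type: str          # e.g. "24-h composite" or "grab"
    target: str               # assay target, e.g. "N1"
    concentration: float      # reported value
    units: str                # e.g. "gene copies/L"
    method: str               # concentration/extraction/quantification summary
    below_lod: bool = False   # flag non-detects instead of reporting zeros
    recovery_control: Optional[str] = None  # e.g. spiked surrogate virus

record = WastewaterMeasurement(
    sample_id="WW-2021-001",
    site_name="City WWTP influent",
    collection_datetime="2021-02-01T08:00:00-08:00",
    sample_type="24-h composite",
    target="N1",
    concentration=1.2e5,
    units="gene copies/L",
    method="PEG precipitation + RT-qPCR",
)
```

Serializing such records (e.g., via `asdict`) yields rows in which every measurement carries its interpretive context, which is the practical point of a minimum reporting standard.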
Emerging investigator series: microplastic-based leachate formation under UV irradiation: the extent, characteristics, and mechanisms
Microplastics in aquatic systems are among the many inevitable consequences of plastic pollution, which has cascading environmental and public health impacts. Our study aimed to analyze the surface interactions and leachate production of six microplastics under ultraviolet (UV) irradiation. Leachate production was analyzed for dissolved organic content (DOC), UV absorbance, and fluorescence via excitation-emission matrices (EEM) to determine the kinetics and mechanisms involved in the release of organic matter by UV irradiation. The results suggested a clear trend of organic matter release from the surfaces of the six microplastics under UV irradiation, based on DOC, UV absorbance, and EEM intensity increasing with time. Polystyrene had the greatest and fastest increase in DOC concentrations, followed by the resin-coated polystyrene. Experiments conducted at different temperatures indicated the endothermic nature of these leaching mechanisms. The differences in leachate formation among polymers were attributed to their chemical makeup and their propensity to interact with UV. The aged microplastic samples were analyzed by Fourier-transform infrared spectroscopy (FT-IR), Raman spectroscopy, and X-ray photoelectron spectroscopy (XPS) to determine the surface changes with respect to leachate formation. Results indicated that all microplastics had increasing carbonyl indices when aged by UV, with polystyrene showing the greatest increase. These findings affirm that leachate formation is an interfacial interaction and could be a significant source of organic compound influx to natural waters, given the extremely abundant occurrence of microplastics and their large surface areas.
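Surface-driven release kinetics of the kind described above are often summarized with a first-order rise-to-plateau model. A sketch with hypothetical parameters (not fitted to this study's data; the plateau and rate constant are placeholders):

```python
import math

def doc_release(t_hours, doc_max, k):
    """First-order rise-to-plateau model for leachate DOC:

        DOC(t) = DOC_max * (1 - exp(-k * t))

    doc_max and k here are hypothetical, chosen only to illustrate the shape
    of a kinetic curve that would be fitted to measured DOC time series.
    """
    return doc_max * (1.0 - math.exp(-k * t_hours))

# Illustrative curve: plateau of 5.0 mg/L DOC, rate constant 0.05 per hour
times = [0, 12, 24, 48, 96]
curve = [doc_release(t, doc_max=5.0, k=0.05) for t in times]
```

Fitting such a model per polymer would make the "greatest and fastest increase" comparison quantitative: polystyrene would show both a larger plateau (DOC_max) and a larger rate constant (k) than slower-leaching polymers.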