Method To Remove Atrazine From Aqueous Stream By Coupling Microporous Mineral Adsorption With Microwave-induced Degradation

Posted on: 2016-11-02
Degree: Doctor
Type: Dissertation
Country: China
Candidate: E D Hu
GTID: 1221330461980745
Subject: Environmental Science

Abstract/Summary:
Contamination of surface water and groundwater by atrazine, one of the most effective and affordable herbicides, is widespread in many countries because of its heavy past and/or current agricultural use. Owing to its toxicity and potential carcinogenicity, atrazine is regulated as a priority pollutant in the drinking water standards established by many countries and organizations, including the United States, the European Union, China, and the World Health Organization. At the same time, removal of atrazine from the aqueous phase presents a significant challenge: it is recalcitrant to biodegradation, adsorbs only weakly on activated carbon, and is not easily destroyed by chemical oxidation. As a result, there has been a continued effort to improve the atrazine removal performance of conventional drinking water treatment processes and to develop alternative technologies. My research aims to develop a novel technology for removing and destroying atrazine in aqueous streams: atrazine is first adsorbed from solution onto microporous mineral sorbents and the sorbed atrazine is then destroyed by microwave-induced degradation.

First, the adsorption of atrazine from water onto zeolites CBV-720 and 4A, quartz sand, and diatomite, and its microwave-induced degradation when sorbed on these minerals, were investigated. The dealuminated HY zeolite CBV-720 exhibited a distinctly higher atrazine sorption capacity than the other mineral sorbents because of its high micropore volume, suitable pore sizes, and surface hydrophobicity. Atrazine sorbed on the minerals degraded under microwave irradiation owing to interfacial selective heating, whereas atrazine in aqueous solution or associated with PTFE powder was not affected. Atrazine degraded rapidly in the micropores of CBV-720 under microwave irradiation, and its degradation intermediates also decomposed with further irradiation, suggesting that atrazine could be fully mineralized. Two new degradation intermediates of atrazine, 3,5-diamino-1,2,4-triazole and guanidine, were identified for the first time in this study. The evolution of the degradation intermediates and the changes in the infrared spectra of CBV-720 after microwave irradiation consistently indicate the creation of steady-state hot spots in the micropores and degradation of atrazine through a pyrolysis mechanism.

My work further examined the sorption and microwave-induced degradation of atrazine in the micropores of nine Y zeolites with different densities (0.16-2.62 sites/nm2) and types (Mg2+, Ca2+, H+, Na+, and NH4+) of surface cations, as well as the influence of the co-sorbed water content in the mineral micropores on the atrazine degradation rate. The results indicate that a surface cation density of around 0.23 sites/nm2 on the pore walls was optimal for atrazine degradation: too few cations form an insufficient number of "hot spots", while too many hydrated cations compete excessively for microwave energy. Atrazine also degraded faster in the presence of cations with lower hydration free energies, which could be attributed to less microwave energy being consumed to desorb the bound water molecules.
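For context on the surface cation densities quoted above, the following is a minimal illustrative sketch of how such a density is commonly estimated from a zeolite's cation exchange capacity (CEC) and BET surface area; the numerical values (CEC = 0.3 mmol/g, S_BET = 700 m^2/g) are assumed for illustration only and are not taken from this work.

% CEC and S_BET values below are hypothetical, chosen only to illustrate the unit conversion.
\[
\sigma \;=\; \frac{\mathrm{CEC}\times N_A}{S_{\mathrm{BET}}}
\qquad \text{(exchange sites per unit pore-wall area)}
\]
\[
\sigma \;=\; \frac{(3\times10^{-4}\ \mathrm{mol/g})\,(6.022\times10^{23}\ \mathrm{mol^{-1}})}{7\times10^{20}\ \mathrm{nm^{2}/g}}
\;\approx\; 0.26\ \mathrm{sites/nm^{2}}
\]

A value obtained this way falls within the 0.16-2.62 sites/nm2 range of the zeolites studied here; the actual densities in this work were characterized for each zeolite rather than assumed.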
Reducing the content of co-adsorbed water in the micropores also increased the atrazine degradation rate because of reduced competition for microwave energy from water.

To design an optimal microporous mineral-based sorbent for taking up atrazine and catalyzing its microwave-induced degradation, we harnessed the catalytic properties of transition metal ions by exchanging Cu2+ and Fe3+ into the cages and channels of dealuminated Y zeolites. Because Cu2+ can form complexes with atrazine, its introduction greatly increased atrazine sorption on the Y zeolites. Atrazine sorption on the Fe3+-exchanged zeolites was also significantly enhanced, which was attributed to hydrolysis of Fe3+ in the mineral micropores and subsequent protonation of atrazine molecules. As expected, introducing the transition metal ions Cu2+ and Fe3+ into the micropores of the Y zeolites drastically increased the degradation rates of the sorbed atrazine (and its degradation intermediates) under microwave irradiation. The exchanged transition metal species in the mineral micropores are hypothesized to be thermally activated under microwave irradiation and to subsequently form highly reactive sites that catalyze the degradation of atrazine and its intermediates.

Finally, we evaluated the practical application of transition metal-exchanged zeolites for sorbing atrazine from aqueous solutions and destroying the sorbed atrazine under microwave irradiation. With efficient regeneration by microwave irradiation, both the copper- and iron-exchanged Y zeolites could be reused multiple times, and the catalytic activity of the latter was better retained because of the much greater stability of the Fe3+ species in the micropores. The presence of common cations, anions, and humic acid had little impact on sorption by, or microwave-induced degradation on, the transition metal-exchanged zeolites. In the treatment of atrazine spiked into natural surface water and groundwater samples, sorptive removal of atrazine was affected by the level of dissolved organic carbon, probably through competition for hydrophobic micropore space and pore blocking, while the water matrix had no strong effect on the microwave-induced degradation of the sorbed atrazine. Overall, the promising laboratory results suggest that iron-exchanged zeolites can be used in large-scale applications to treat atrazine pollution through sorptive removal coupled with microwave-induced degradation.
Keywords/Search Tags: atrazine, mineral micropores, sorption, microwave, degradation