Biogeochemical processes controlling nitrate attenuation in aquifers are critically reviewed. An understanding of the fate of nitrate in groundwater is vital for managing risks associated with nitrate pollution and for safeguarding groundwater supplies and groundwater-dependent surface waters. The review focuses on denitrification as the dominant nitrate attenuation process in groundwater. As denitrifying bacteria are essentially ubiquitous in the subsurface, the critical limiting factors are the concentration and availability of oxygen and electron donors. Variability in other environmental conditions, such as nitrate concentration, nutrient availability, pH, temperature, presence of toxins and microbial acclimation, appears to be less important, exerting only secondary influences on denitrification rates. Other nitrate depletion mechanisms, such as dissimilatory nitrate reduction to ammonium and assimilation of nitrate into microbial biomass, are unlikely to be important in most subsurface settings relative to denitrification. Further research is recommended to improve current understanding of the influence of organic carbon, sulphur and iron electron donors; physical restrictions on microbial activity in dual-porosity aquifers; environmental conditions (e.g. pH in poorly buffered environments and salinity in coastal or salinized soil settings); co-contaminant influences (particularly the contrasting inhibitory and electron-donor influences of pesticides); and improved quantification of denitrification rates in the laboratory and field.
This paper presents the results of a review of studies employing interpretative phenomenological analysis (IPA) obtained from three of the major databases: Web of Science, Medline and PsycINFO. Between 1996 and 2008, 293 papers presenting empirical IPA studies were published. Trends over time are presented, followed by a categorisation of the content area of that corpus. The biggest specific area of research within IPA is illness experience, forming the subject of nearly a quarter of the corpus. The paper then describes a guide for evaluating IPA research, which is used to assess the illness-experience papers. Detailed summaries are provided of the papers rated as good; these describe the substantive findings as well as the markers of high quality. The paper finishes with a summary of core features of high-quality IPA work.
Cardiovascular disease (CVD) is a serious comorbidity in nonalcoholic fatty liver disease (NAFLD). Since plasma ceramides are increased in NAFLD and sphingomyelin, a ceramide metabolite, is an independent risk factor for CVD, the role of ceramides in dyslipidemia was assessed using LDLR(-/-) mice, a diet-induced model of NAFLD and atherosclerosis. Mice were fed a standard or Western diet (WD), with or without myriocin, an inhibitor of ceramide synthesis. Hepatic and plasma ceramides were profiled and lipid and lipoprotein kinetics were quantified. Hepatic and intestinal expression of genes and proteins involved in insulin, lipid and lipoprotein metabolism were also determined. WD caused hepatic oxidative stress, inflammation, apoptosis, increased hepatic long-chain ceramides associated with apoptosis (C16 and C18) and decreased very-long-chain ceramide C24 involved in insulin signaling. The plasma ratio of ApoB/ApoA1 (proteins of VLDL/LDL and HDL) was increased 2-fold due to increased ApoB production. Myriocin reduced hepatic and plasma ceramides and sphingomyelin, and decreased atherosclerosis, hepatic steatosis, fibrosis, and apoptosis without any effect on oxidative stress. These changes were associated with decreased lipogenesis, ApoB production and increased HDL turnover. Thus, modulation of ceramide synthesis may lead to the development of novel strategies for the treatment of both NAFLD and its associated atherosclerosis.
In this article, I present a rebuttal of Max Van Manen’s critique of interpretative phenomenological analysis (IPA). Unfortunately, Van Manen’s piece contains a series of misrepresentations of IPA and its history. Here, I answer these misrepresentations and present IPA as subscribing, and contributing, to a broad and holistic phenomenology concerned with both prereflective and reflective domains of lived experience. I contend that IPA has much to offer to our understanding of the experience of health and illness, where participants are spontaneously and actively engaged in making sense of the significant and unexpected things that happen to them.
The recent deep learning revolution has created enormous opportunities for accelerating compute capabilities in the context of physics-based simulations. In this article, we propose EikoNet, a deep learning approach to solving the Eikonal equation, which characterizes the first-arrival-time field in heterogeneous 3-D velocity structures. Our grid-free approach allows for rapid determination of the travel time between any two points within a continuous 3-D domain. These travel time solutions are allowed to violate the differential equation, which casts the problem as one of optimization, with the goal of finding network parameters that minimize the degree to which the equation is violated. In doing so, the method exploits the differentiability of neural networks to calculate the spatial gradients analytically, meaning that the network can be trained on its own without ever needing solutions from a finite-difference algorithm. EikoNet is rigorously tested on several velocity models and sampling methods to demonstrate robustness and versatility. Training and inference are highly parallelized, making the approach well-suited for GPUs. EikoNet has low memory overhead and further avoids the need for travel-time lookup tables. The developed approach has important applications to earthquake hypocenter inversion, ray multipathing, and tomographic modeling, as well as to other fields beyond seismology where ray tracing is essential.
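The training idea described in this abstract, penalizing the degree to which the network's travel-time field violates the Eikonal equation |∇T| = 1/v while automatic differentiation supplies the spatial gradients, can be sketched in a few lines of PyTorch. This is an illustrative minimal example, not the authors' implementation: the network size, the factored travel-time form, and the toy constant-velocity medium are all assumptions made here for brevity.

```python
# Minimal physics-informed sketch for the Eikonal equation
# |grad_r T(x_s, x_r)| = 1/v(x_r), trained without any finite-difference
# reference solutions. Illustrative only; not the EikoNet codebase.
import torch

torch.manual_seed(0)

class TravelTimeNet(torch.nn.Module):
    """MLP mapping a (source, receiver) pair to a travel time T."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(6, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, 1),
        )

    def forward(self, xs, xr):
        # Factored form T = |xr - xs| * tau(xs, xr) guarantees T -> 0
        # as the receiver approaches the source.
        dist = torch.norm(xr - xs, dim=-1, keepdim=True)
        tau = self.net(torch.cat([xs, xr], dim=-1))
        return dist * tau

def eikonal_loss(model, xs, xr, velocity):
    """Mean-squared violation of |grad_r T| = 1/v at sampled points."""
    xr = xr.requires_grad_(True)
    T = model(xs, xr)
    # Autodiff gives grad_r T analytically -- no finite differences needed.
    gradT = torch.autograd.grad(T.sum(), xr, create_graph=True)[0]
    slowness_pred = gradT.norm(dim=-1)
    slowness_true = 1.0 / velocity(xr)
    return ((slowness_pred - slowness_true) ** 2).mean()

# Toy homogeneous medium, v = 2 km/s, so the exact answer is T = |xr - xs| / 2.
velocity = lambda x: torch.full((x.shape[0],), 2.0)
model = TravelTimeNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

losses = []
for step in range(2000):
    xs = torch.rand(256, 3)           # random source points in a unit cube
    xr = torch.rand(256, 3)           # random receiver points
    opt.zero_grad()
    loss = eikonal_loss(model, xs, xr, velocity)
    loss.backward()
    opt.step()
    losses.append(loss.item())
```

In a heterogeneous medium one would replace `velocity` with the actual 3-D model evaluated at the receiver coordinates; because training points are sampled freely, the approach stays grid-free, as the abstract emphasizes.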
In this article I offer a theoretical account of interpretative phenomenological analysis's (IPA's) position in relation to meaning-making by participant and researcher. In doing this, I draw on a range of theoretical writing on meaning. I then apply these ideas to a series of empirical studies on pain which I have been involved in. The intention, therefore, is for the article to contribute a theoretically informed and empirically grounded extension to the literature on IPA.
Interpretative phenomenological analysis (IPA) is a qualitative approach which aims to provide detailed examinations of personal lived experience. It produces an account of lived experience in its own terms rather than one prescribed by pre-existing theoretical preconceptions, and it recognises that this is an interpretative endeavour as humans are sense-making organisms. It is explicitly idiographic in its commitment to examining the detailed experience of each case in turn, prior to the move to more general claims. IPA is a particularly useful methodology for examining topics which are complex, ambiguous and emotionally laden. Pain is a prime exemplar of such a phenomenon: elusive, involving complex psycho-somatic interactions and difficult to articulate. In addition to the 1998 article, published in this Special Issue, two further papers are suggested that the interested reader might wish to look out for.
Seismic swarms show the structure
Faults responsible for earthquakes are idealized into two dimensions, despite fault zones being complicated, three-dimensional structures. Ross et al. used machine learning to find 22,000 seismic events near Cahuilla, California, during a seismic swarm. They used the locations and sizes of these events to show how the complex structure of the fault interacted with natural fluid injections from below. The authors' methods highlight the complexities of one fault and suggest a way to characterize other faults around the world.
Science, this issue p. 1357
Locating 22,000 events from a seismic swarm shows the complex interplay between earthquakes, fluids, and fault geometry.
The vibrant evolutionary patterns made by earthquake swarms are incompatible with standard, effectively two-dimensional (2D) models for general fault architecture. We leverage advances in earthquake monitoring with a deep-learning algorithm to image a fault zone hosting a 4-year-long swarm in southern California. We infer that fluids are naturally injected into the fault zone from below and diffuse through strike-parallel channels while triggering earthquakes. A permeability barrier initially limits up-dip swarm migration but ultimately is circumvented. This enables fluid migration within a shallower section of the fault with fundamentally different mechanical properties. Our observations provide high-resolution constraints on the processes by which swarms initiate, grow, and arrest. These findings illustrate how swarm evolution is strongly controlled by 3D variations in fault architecture.
Although high high-density lipoprotein (HDL)-cholesterol levels are associated with decreased cardiovascular risk in epidemiological studies, recent genetic and pharmacological findings have raised doubts about the beneficial effects of HDL. Raising HDL levels in animal models by infusion or overexpression of apolipoprotein A-I has shown clear vascular improvements, such as delayed atherosclerotic lesion progression and accelerated lesion regression, along with increased reverse cholesterol transport. Inflammation and other factors, such as myeloperoxidase-mediated oxidation, can impair HDL production and HDL function, with regard to its reverse cholesterol transport, antioxidant, and anti-inflammatory activities. Thus, tests of HDL function, which have not yet been developed as routine diagnostic assays, may prove useful and be a better predictor of cardiovascular risk than HDL-cholesterol levels.