The immune response in halo nevi
Zeff, Richard A.; Freitag, Anna; Grin, Caron M. ...
Journal of the American Academy of Dermatology, 10/1997, Volume 37, Issue 4
Journal Article, Peer reviewed
The mechanism(s) responsible for halo nevi present a provocative link with the immune response to melanoma. Although no direct demonstration of melanocyte killing by the immune effector cells found within the halo has been observed, the abundance of antigen-presenting cells in the regressing nevus and the presence of T lymphocytes at the site of depigmentation suggest that these cells participate in the halo phenomenon. Within the latter population of cells, evidence points to the involvement of CD8+ T cells as potential effectors in the destruction of nevomelanocytes. The break in tolerance that triggers the migration and presumed activation of these and other lymphocytes in the nevus, in the apparent absence of disease, remains unexplained. This brief overview summarizes the evidence for the participation of the immune response in the genesis of the halo nevus. (J Am Acad Dermatol 1997;37:620-4.)
The melanoma cell line FO-1 does not express HLA class I antigens and does not acquire them on the cell surface after incubation with IFN-gamma. Immunochemical studies showed that FO-1 cells synthesize HLA class I heavy chain, but do not synthesize beta 2-microglobulin (beta 2-mu). The latter abnormality is associated with a lack of beta 2-mu mRNA, which remains undetectable in FO-1 cells incubated with IFN-gamma. The defect was identified as a genetic lesion in the B2m gene, since DNA hybridization analysis detected a deletion of the first exon, of the 5'-flanking region, and of a segment of the first intron of the B2m gene. HLA class I antigen expression was reconstituted on FO-1 melanoma cells after transfection with the wild-type mouse B2m gene, thereby confirming the abnormality of the endogenous B2m gene. The defect identified in FO-1 cells is distinct from that underlying the lack of HLA class I antigen expression by Daudi lymphoblastoid cells, but is remarkably similar to that causing the lack of H-2 class I antigen expression by the mouse lymphoblastoid cell line R1 (TL-). These results suggest that genetic recombination in the 5' region of the B2m gene is a recurrent mechanism in B2m gene defects. In addition to contributing to our understanding of molecular abnormalities in HLA class I antigen expression by melanoma cells, FO-1 cells represent a useful model for analyzing the role of HLA class I antigens in the biology of melanoma cells and in their interaction with cells of the immune system.
Functional neuroimaging is a vital element of neuroscience and cognitive research and, increasingly, is an important clinical tool. Diffuse optical imaging is an emerging, noninvasive technique with unique portability and hemodynamic contrast capabilities for mapping brain function in young subjects and subjects in enriched or clinical environments. We have developed a high-performance, high-density diffuse optical tomography (DOT) system that overcomes previous limitations and enables superior image quality. We show herein the utility of the DOT system by presenting functional hemodynamic maps of the adult human visual cortex. The functional brain images have a high contrast-to-noise ratio, allowing visualization of individual activations and highly repeatable mapping within and across subjects. With the improved spatial resolution and localization, we were able to image functional responses of 1.7 cm in extent and shifts of <1 cm. Cortical maps of angle and eccentricity in the visual field are consistent with retinotopic studies using functional MRI and positron-emission tomography. These results demonstrate that high-density DOT is a practical and powerful tool for mapping function in the human cortex.
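The abstract above does not spell out the image-reconstruction step; a common approach in DOT is to linearize the measurement model and solve a regularized least-squares problem. The sketch below illustrates that generic idea with a synthetic sensitivity matrix and Tikhonov regularization; the problem size, noise level, and regularization weight are placeholders, not the authors' actual pipeline or parameters.

```python
# Minimal sketch of linearized DOT reconstruction (generic technique, not the
# authors' pipeline). Assumes the linearized model y = A x, where
#   y : measured changes in detected light (one entry per source-detector pair)
#   A : sensitivity (Jacobian) matrix mapping voxel absorption changes to measurements
#   x : voxel-wise absorption change to be imaged
import numpy as np

rng = np.random.default_rng(0)

n_meas, n_vox = 200, 500                      # placeholder problem size
A = rng.normal(size=(n_meas, n_vox))          # stand-in for a real photon-propagation Jacobian

x_true = np.zeros(n_vox)
x_true[240:260] = 1.0                         # a small "activation" to recover
y = A @ x_true + 0.01 * rng.normal(size=n_meas)   # noisy measurements

# Tikhonov-regularized least squares: x_hat = (A^T A + lam I)^-1 A^T y
lam = 1e-1 * np.trace(A.T @ A) / n_vox        # heuristic regularization weight
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_vox), A.T @ y)

print("index of peak recovered absorption change:", int(np.argmax(x_hat)))
```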
The development of diffuse optical tomography (DOT) instrumentation for neuroimaging of humans is challenging due to the large size and geometry of the head and the desire to distinguish signals at different depths. One approach to this problem is to use dense imaging arrays that incorporate measurements at different source-detector distances. We previously developed a high-density DOT system that is able to obtain retinotopic measurements in agreement with functional magnetic resonance imaging and positron emission tomography. Further extension of high-density DOT neuroimaging necessitates a thorough study of the measurement and imaging sensitivity that incorporates the complex geometry of the head, including the head curvature and layered tissue structure. We present numerical simulations using a finite element model of the adult head to study the sensitivity of the measured signal as a function of the imaging array and data sampling strategy. Specifically, we quantify the imaging sensitivity available within the brain (including depths beyond superficial cortical gyri) as a function of increasing the maximum source-detector separation included in the data. Through the use of depth-related sensitivity analysis, it is shown that for a rectangular grid with 1.3 cm first nearest neighbor (NN) spacing, second NN measurements are sufficient to record absorption changes along the surface of the brain's cortical gyri (brain tissue depth <5 mm). The use of fourth and fifth NN measurements would permit imaging down into the cortical sulci (brain tissue depth >15 mm).
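To make the nearest-neighbor (NN) terminology concrete, the short sketch below enumerates source-detector separations on an interleaved rectangular lattice with the 1.3 cm first-NN spacing quoted above and groups them by NN order. The checkerboard source/detector layout and lattice size are assumptions for illustration only and may differ from the paper's actual array.

```python
# Sketch: group source-detector pairs of a high-density DOT grid by nearest-neighbor
# (NN) order. The checkerboard layout below (sources and detectors alternating on a
# 1.3 cm lattice) is an assumed geometry for illustration, not necessarily the paper's.
import itertools
import numpy as np

pitch = 1.3  # cm, first-nearest-neighbor spacing quoted in the abstract
n = 6        # lattice size (placeholder)

sources, detectors = [], []
for i, j in itertools.product(range(n), range(n)):
    (sources if (i + j) % 2 == 0 else detectors).append((i * pitch, j * pitch))

# All distinct source-detector separations, rounded to suppress floating-point noise
seps = sorted({round(np.hypot(sx - dx, sy - dy), 2)
               for (sx, sy) in sources for (dx, dy) in detectors})

for order, sep in enumerate(seps[:5], start=1):
    print(f"{order}. NN separation: {sep:.2f} cm")
# Larger NN orders probe deeper tissue, which is why fourth/fifth NN measurements
# are needed to reach the cortical sulci (>15 mm) in the abstract's analysis.
```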
Rising development costs and growing concerns over environmental impacts have led many communities to explore more diversified water management strategies. These “portfolio”‐style approaches integrate existing supply infrastructure with other options such as conservation measures or water transfers. Diversified water supply portfolios have been shown to reduce the capacity and costs required to meet demand, while also providing greater adaptability to changing hydrologic conditions. However, this additional flexibility can also cause unexpected reductions in revenue (from conservation) or increased costs (from transfers). The resulting financial instability can act as a substantial disincentive to utilities seeking to implement more innovative water management techniques. This study seeks to design portfolios that employ financial tools (e.g., contingency funds and index insurance) to reduce fluctuations in revenues and costs, allowing these strategies to achieve improved performance without sacrificing financial stability. This analysis is applied to the development of coordinated regional supply portfolios in the “Research Triangle” region of North Carolina, an area comprising four rapidly growing municipalities. The actions of each independent utility become interconnected when shared infrastructure is utilized to enable interutility transfers, requiring the evaluation of regional tradeoffs in up to five performance and financial objectives. Diversified strategies introduce significant tradeoffs between achieving reliability goals and introducing burdensome variability in annual revenues and/or costs. Financial tools can buffer the impacts of this variability, allowing for an alternative suite of improved solutions. This analysis provides a general template for utilities seeking to navigate the tradeoffs associated with more flexible, portfolio‐style management approaches.
Key Points
Adaptive measures increase supply reliability but also add financial risk
Financial mitigation helps to achieve reliability and financial objectives
Index insurance can reduce the cost of mitigation when used with self‐insurance
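As a rough illustration of the financial tools named in the abstract and key points above (contingency funds and index insurance), the toy simulation below shows how a drought-triggered insurance payout plus a self-insurance fund can damp year-to-year revenue swings. All dollar amounts, drought probabilities, and the payout rule are invented placeholders, not the study's calibrated portfolio.

```python
# Toy illustration of buffering utility revenue variability with a contingency
# fund (self-insurance) plus index insurance. All values are placeholders.
import numpy as np

rng = np.random.default_rng(1)

years = 30
baseline_revenue = 100.0                    # $M per year, placeholder
drought = rng.random(years) < 0.2           # assumed 20% chance of drought each year
revenue = np.where(drought, 85.0, baseline_revenue)   # conservation cuts revenue in drought

fund = 10.0                                 # contingency-fund balance ($M), placeholder
premium, payout = 1.0, 8.0                  # index-insurance premium and drought payout ($M)
target = baseline_revenue - premium         # revenue level the utility tries to maintain

net = []
for dry, rev in zip(drought, revenue):
    cash = rev - premium + (payout if dry else 0.0)   # insurance pays when the index triggers
    shortfall = max(0.0, target - cash)
    draw = min(shortfall, fund)                        # contingency fund covers the remainder
    fund = fund - draw + 1.5                           # annual contribution refills the fund
    net.append(cash + draw)

print(f"std of annual revenue without mitigation: {revenue.std():.2f}")
print(f"std of annual net revenue with mitigation: {np.std(net):.2f}")
```

Running it prints the year-to-year variability of revenue with and without the two mitigation tools; the point is only the mechanism, not the magnitudes.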
This paper uses a sample of the regression and behavioral papers published in The Accounting Review and the Journal of Accounting Research from September 2012 through May 2013. We argue first that the results reported in empirical regression papers fail to adequately justify the time period adopted for the study. Second, we maintain that the statistical analyses used in these papers, as well as in the behavioral papers, have produced flawed results. We further maintain that their tests of statistical significance are not appropriate and, more importantly, that these studies do not -- and cannot -- properly address the economic significance of the work. In other words, significance tests are not tests of the economic meaningfulness of the results. We suggest ways to avoid some, but not all, of these problems. We also argue that replication studies, which have been essentially abandoned by accounting researchers, can contribute to our search for truth, but few will be forthcoming unless the academic reward system is modified.
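The distinction the authors draw between statistical and economic significance can be shown with a toy regression: with a very large sample, an economically trivial effect still yields a tiny p-value. The numbers below are invented purely for illustration and are not drawn from the papers the authors sampled.

```python
# Toy demonstration that statistical significance is not economic significance:
# a trivially small effect becomes "significant" once the sample is large enough.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n = 1_000_000
x = rng.normal(size=n)
y = 0.005 * x + rng.normal(size=n)   # true effect explains ~0.0025% of the variance

slope, intercept, r, p, se = stats.linregress(x, y)
print(f"slope = {slope:.4f}, R^2 = {r**2:.6f}, p-value = {p:.1e}")
# The p-value clears conventional thresholds, yet the effect size is economically
# negligible -- the judgment the abstract says significance tests cannot make.
```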
Tissue factor (TF) exists in a cryptic form, i.e., without procoagulant activity (PCA), in peripheral blood monocytes and quiescent tissue macrophages but is expressed constitutively in most human tumor cells. Induction and cell surface expression of TF in these cells in vivo is associated with activation of intravascular and extravascular coagulation in patients with a variety of inflammatory or malignant diseases. The regulation of TF synthesis in cells is complex, and new information from transfection studies suggests that changes in cellular glycosylation pathways impair cell surface expression of functional TF. Such dysregulation may also characterize the lineage-unfaithful expression of TF in leukemic cells and perhaps explain some of the thrombohemorrhagic complications in patients with acute progranulocytic leukemia. The importance of carbohydrate modification of TF is reviewed.
Exploratory simulation allows analysts to discover scenarios in which existing or planned water supplies may fail to meet stakeholder objectives. These robustness assessments rely heavily on the choice of plausible future scenarios, which, in the case of drought management, requires sampling or generating a streamflow ensemble that extends beyond the historical record. This study develops a method to modify synthetic streamflow generators by increasing the frequency and severity of droughts for the purpose of exploratory modeling. To support management decisions, these synthetic droughts can be related to recent observed droughts of consequence for regional stakeholders. The method approximately preserves the spatial and temporal correlation of historical streamflow in drought-adjusted scenarios. The approach is demonstrated in a bottom-up planning context using an urban water portfolio design problem in North Carolina, a region whose water supply faces both climate and population pressures. Synthetic scenarios are used to simulate the implications for reliability and cost if events with similar severity to the recent 2007–2008 drought become more frequent under climate change, and, in general, the system-level consequences of increasingly frequent and/or severe droughts. Finally, synthetically generated drought extremes are compared with runoff projections derived from downscaled climate model output, serving to support bottom-up robustness methods in water systems planning.
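The abstract describes modifying a synthetic streamflow generator so that droughts become more frequent and severe while cross-site correlation is approximately preserved. The sketch below is one simple way to illustrate that idea: draw correlated lognormal flows from an assumed historical correlation matrix, then scale flows down in randomly chosen drought years. The correlation matrix, drought probability, and scaling factor are placeholders, and this is not the study's actual generator (which also preserves temporal correlation).

```python
# Minimal sketch of generating spatially correlated synthetic streamflows and then
# imposing more frequent/severe droughts. All parameters are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(7)

# Assumed historical cross-site correlation of log-flows at three gauges
corr = np.array([[1.0, 0.8, 0.6],
                 [0.8, 1.0, 0.7],
                 [0.6, 0.7, 1.0]])
L = np.linalg.cholesky(corr)

n_years = 1000
z = rng.normal(size=(n_years, 3)) @ L.T      # correlated standard-normal log-flows
flows = np.exp(z)                            # synthetic flows (lognormal, unit median)

# Drought adjustment: raise drought frequency and severity relative to baseline
drought_years = rng.random(n_years) < 0.25   # e.g., droughts in 25% of years
flows_adj = flows.copy()
flows_adj[drought_years] *= 0.6              # uniform 40% flow reduction in drought years

print("baseline cross-site correlation:\n", np.round(np.corrcoef(flows.T), 2))
print("drought-adjusted correlation:\n", np.round(np.corrcoef(flows_adj.T), 2))
# Scaling all sites by the same factor in a drought year approximately preserves
# the cross-site correlation while shifting the low-flow tail downward.
```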
This paper begins with a description of the accounting research environment prior to, and shortly following, the appearance of Abacus in 1965. During this period, the approach to accounting was predominantly normative in focus, but also reflected historical approaches, as researchers grappled with the accounting issues faced by practising accountants and bodies that established accounting principles. The 1960s witnessed the beginning of a major change in the interests and approach of accounting researchers. Articles increasingly reflected a decline in reliance on the normative approach, accompanied by an increase in empirical analyses. The new focus introduced the ideas and concepts of several sister disciplines, including the social sciences, notably cognitive psychology, and mathematics, particularly statistics, into accounting research. This era, which is still with us today, stressed theory, mathematical modelling and, importantly, statistical testing. Simultaneously, the new directions gradually abandoned the contributions of normative approaches and diminished the interest in history, both of which had enlightened the problems of practice that previously held centre stage. We examine a broad sample of research articles to inform our discussion and analysis, and then comment on some of the limitations of the new data‐driven approaches embedded in current research efforts. We conclude with ten recommendations for accounting researchers to consider as they tackle the complex issue of increasing the relevance of our efforts in the future. We hope that these recommendations, if adopted, will increase the relevance of academic research to the problems facing decision makers beyond the academic community.
Historically, intravenous acetylcysteine has been delivered at a fixed dose and duration of 300 mg/kg over 20 to 21 hours to nearly every patient deemed to be at any risk for hepatotoxicity following acetaminophen overdose. We investigated a 12‐hour treatment regimen for selected low‐risk patients. This was a multicenter, open‐label, cluster‐controlled trial at six metropolitan emergency departments. We enrolled subjects following single or staggered acetaminophen overdose with normal serum alanine transaminase (ALT) and creatinine on presentation and at 12 hours, and less than 20 mg/L acetaminophen at 12 hours. Patients were allocated to intervention (250 mg/kg over 12 hours) or control (300 mg/kg over 20 hours) regimens by site. The primary outcome was incidence of “hepatic injury” 20 hours following initiation of acetylcysteine treatment, defined as ALT doubling and peak ALT greater than 100 IU/L, indicating the need for further antidotal treatment. Secondary outcomes included incidence of hepatotoxicity (ALT > 1,000 IU/L), peak international normalized ratio (INR), and adverse drug reactions. Of the 449 acetaminophen overdoses receiving acetylcysteine, 100 were recruited to the study. Time to acetylcysteine (median 7 hours, interquartile range [IQR] 6 to 12, versus 7 hours, IQR 6 to 10) and initial acetaminophen concentration (124 mg/L, IQR 58 to 171, versus 146 mg/L, IQR 66 to 204) were similar between the intervention and control groups. There was no difference in ALT (18 IU/L, IQR 13 to 22, versus 16 IU/L, IQR 13 to 21) or INR (1.2 versus 1.2) 20 hours after starting acetylcysteine. No patients developed hepatic injury or hepatotoxicity in either group (odds ratio 1.0, 95% confidence interval 0.02 to 50). No patients re-presented with liver injury, none died, and 96 of 96 were well at 14‐day telephone follow-up. Conclusion: Discontinuing acetylcysteine based on laboratory testing after 12 hours of treatment is feasible and likely safe in selected patients at very low risk of liver injury from acetaminophen overdose.
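For readers who want the two regimens in absolute terms, the small helper below converts the per-kilogram figures quoted in the abstract (control: 300 mg/kg over 20 hours; intervention: 250 mg/kg over 12 hours) into a total dose and average infusion rate for a given body weight. Only the aggregate figures from the abstract are used; the bag-by-bag breakdown of either regimen is not stated there and is not assumed here.

```python
# Total dose and average infusion rate for the two acetylcysteine regimens quoted
# in the abstract (control: 300 mg/kg over 20 h; intervention: 250 mg/kg over 12 h).

def regimen_summary(weight_kg: float, dose_mg_per_kg: float, duration_h: float) -> str:
    total_mg = weight_kg * dose_mg_per_kg
    return (f"{dose_mg_per_kg:.0f} mg/kg over {duration_h:.0f} h -> "
            f"total {total_mg / 1000:.1f} g, average {total_mg / duration_h:.0f} mg/h")

for label, dose, hours in [("control     ", 300, 20), ("intervention", 250, 12)]:
    print(label, regimen_summary(weight_kg=70, dose_mg_per_kg=dose, duration_h=hours))
# For an example 70 kg patient: control gives 21.0 g at ~1050 mg/h; the 12-hour
# intervention gives 17.5 g at ~1458 mg/h.
```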