The Sino-Soviet Split, by Lorenz M. Lüthi
2010 (c2008), Volume 109
eBook
A decade after the Soviet Union and the People's Republic of China established their formidable alliance in 1950, escalating public disagreements between them broke the international communist movement apart. In The Sino-Soviet Split, Lorenz Lüthi tells the story of this rupture, which became one of the defining events of the Cold War. Identifying the primary role of disputes over Marxist-Leninist ideology, Lüthi traces their devastating impact in sowing conflict between the two nations in the areas of economic development, party relations, and foreign policy. The source of this estrangement was Mao Zedong's ideological radicalization at a time when Soviet leaders, mainly Nikita Khrushchev, became committed to more pragmatic domestic and foreign policies.
Using a wide array of archival and documentary sources from three continents, Lüthi presents a richly detailed account of Sino-Soviet political relations in the 1950s and 1960s. He explores how Sino-Soviet relations were linked to Chinese domestic politics and to Mao's struggles with internal political rivals. Furthermore, Lüthi argues, the Sino-Soviet split had far-reaching consequences for the socialist camp and its connections to the nonaligned movement, the global Cold War, and the Vietnam War.
The Sino-Soviet Split provides a meticulous and cogent analysis of a major political fallout between two global powers, opening new areas of research for anyone interested in the history of international relations in the socialist world.
Information and communication technology (ICT) is often considered a technology for reducing environmental emissions by increasing energy and resource efficiencies of processes. However, due to other effects of ICT, such as rebound and induction effects, the net benefits of ICT in terms of environmental impact are by no means assured. Even though the relevance of indirect or higher order effects has become a well-known issue in recent years, their environmental assessment remains controversial. Life cycle assessment (LCA) is one of the most established environmental assessment methods for modelling the environmental effects of goods and services throughout their life cycle. Although LCA is traditionally rather product-focused, there are also LCA-based approaches to assess higher order effects of technology replacement and optimization.
This paper examines whether and how LCA case studies on environmental effects of ICT already take into account related higher order effects. A systematic review of scientific literature published since 2005 has been conducted and 25 case studies were analyzed in detail. The following research questions were addressed: (i) Which products are assessed? (ii) Which higher order effects of ICT are considered? And (iii) how is the integration of higher order effects methodologically realized? The results show that few case studies were concerned with the environmental effects of the introduction of ICT services in commerce, telework, and monitoring and control. Most studies investigated the substitution of certain media with electronic devices or digital services. It was found that technology-based higher order effects, such as optimization and substitution, are usually included in the assessment by choosing comparative study designs, while user-related higher order effects, such as rebound effects and induction effects, are less often considered. For the latter effects, methodological integration was mainly provided by scenario modelling and sensitivity analysis. Overall, most studies chose an attributional LCA approach. It can be concluded from the results that, in particular, user-related effects such as rebound effects have not yet been frequently included in the environmental assessment of ICT. The identified research gaps include in particular interdisciplinary approaches on how changing use patterns can be more strongly observed in LCA.
•The framework of environmental impacts of ICT is extended to include behavioral effects.
•There are approaches to integrate certain higher order ICT effects into LCA studies.
•Technology-based higher order effects are usually included in the assessment.
•User-related higher order effects are less often considered in the assessment.
•User-related effects of technological change must be taken more into account in LCA.
Thermoelectric devices that are flexible and optically transparent hold unique promise for future electronics. However, development of invisible thermoelectric elements is hindered by the lack of p-type transparent thermoelectric materials. Here we present the superior room-temperature thermoelectric performance of p-type transparent copper iodide (CuI) thin films. Large Seebeck coefficients and power factors of the obtained CuI thin films are analysed based on a single-band model. The low thermal conductivity of the CuI films is attributed to a combined effect of the heavy element iodine and strong phonon scattering. Accordingly, we achieve a large thermoelectric figure of merit of ZT=0.21 at 300 K for the CuI films, which is three orders of magnitude higher compared with state-of-the-art p-type transparent materials. A transparent and flexible CuI-based thermoelectric element is demonstrated. Our findings open a path for multifunctional technologies combining transparent electronics, flexible electronics and thermoelectricity.
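The figure of merit quoted in the abstract follows the standard definition ZT = S²σT/κ, where S is the Seebeck coefficient, σ the electrical conductivity, T the absolute temperature, and κ the thermal conductivity. A minimal sketch of the calculation, using hypothetical illustrative values that are not measurements from the paper:

```python
def figure_of_merit(seebeck_v_per_k, conductivity_s_per_m,
                    thermal_cond_w_per_mk, temperature_k):
    """Dimensionless thermoelectric figure of merit ZT = S^2 * sigma * T / kappa."""
    return ((seebeck_v_per_k ** 2) * conductivity_s_per_m * temperature_k
            / thermal_cond_w_per_mk)

# Hypothetical illustrative values, NOT data from the CuI study:
zt = figure_of_merit(seebeck_v_per_k=300e-6,      # 300 uV/K
                     conductivity_s_per_m=3.7e3,  # 37 S/cm
                     thermal_cond_w_per_mk=0.5,   # low kappa, as for heavy-element films
                     temperature_k=300.0)         # room temperature
```

Note that because S enters squared, a large Seebeck coefficient can compensate for moderate electrical conductivity, which is why transparent semiconductors with low κ remain competitive candidates.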
High-throughput screening (HTS) is a well-established process for lead discovery in Pharma and Biotech companies and is now also being used for basic and applied research in academia. It comprises the screening of large chemical libraries for activity against biological targets via the use of automation, miniaturized assays and large-scale data analysis. Since its first advent in the early to mid 1990s, the field of HTS has seen not only a continuous change in technology and processes, but also an adaptation to various needs in lead discovery. HTS has now evolved into a mature discipline that is a crucial source of chemical starting points for drug discovery. Whereas in previous years much emphasis has been put on a steady increase in screening capacity (‘quantitative increase’) via automation and miniaturization, the past years have seen a much greater emphasis on content and quality (‘qualitative increase’). Today, many experts in the field see HTS at a crossroad with the need to decide on either higher throughput/more experimentation or a greater focus on assays of greater physiological relevance, both of which may lead to higher productivity in pharmaceutical R&D. In this paper, we describe the development of HTS over the past decade and point out our own ideas for future directions of HTS in biomedical research. We predict that the trend toward further miniaturization will slow down with the balanced implementation of 384 well, 1536 well, and 384 low volume well plates. Furthermore, we envisage that there will be much more emphasis on rigorous assay and chemical characterization, particularly considering that novel and more difficult target classes will be pursued. In recent years we have witnessed a clear trend in the drug discovery community toward rigorous hit validation by the use of orthogonal readout technologies, label free and biophysical methodologies.
We also see a trend toward a more flexible use of the various screening approaches in lead discovery, that is, both full-deck compound screening and focused and iterative screening approaches. Moreover, we expect greater usage of target identification strategies downstream of phenotypic screening and the more effective implementation of affinity selection technologies as a result of advances in chemical diversity methodologies. We predict that, ultimately, each hit finding strategy will be much more project-related, tailor-made, and better integrated into the broader drug discovery efforts.
Dynamic material flow analysis (MFA) is a frequently used method to assess past, present, and future stocks and flows of metals in the anthroposphere. Over the past fifteen years, dynamic MFA has contributed to increased knowledge about the quantities, qualities, and locations of metal-containing goods. This article presents a literature review of the methodologies applied in 60 dynamic MFAs of metals. The review is based on a standardized model description format, the ODD (overview, design concepts, details) protocol. We focus on giving a comprehensive overview of modeling approaches and structure them according to essential aspects, such as their treatment of material dissipation, spatial dimension of flows, or data uncertainty. The reviewed literature features similar basic modeling principles but very diverse extrapolation methods. Basic principles include the calculation of outflows of the in-use stock based on inflow or stock data and a lifetime distribution function. For extrapolating stocks and flows, authors apply constant, linear, exponential, and logistic models or approaches based on socioeconomic variables, such as regression models or the intensity-of-use hypothesis. The consideration and treatment of further aspects, such as dissipation, spatial distribution, and data uncertainty, vary significantly and depend strongly on the objectives of each study.
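The basic inflow-driven principle described above, computing outflow as a convolution of historical inflow with a product lifetime distribution and deriving the stock by mass balance, can be sketched as follows. This is a generic illustration of the modeling principle, not a reconstruction of any particular reviewed study; the inflow series and lifetime distribution are made up:

```python
def dynamic_stock(inflow, lifetime_pmf):
    """Inflow-driven dynamic stock model.

    outflow(t) = sum over t' <= t of inflow(t') * p(t - t'),
    where p is the discretised lifetime (failure) distribution.
    Stock follows by mass balance: stock(t) = stock(t-1) + inflow(t) - outflow(t).
    """
    n = len(inflow)
    outflow = [0.0] * n
    for t in range(n):
        for t0 in range(t + 1):
            age = t - t0
            if age < len(lifetime_pmf):
                outflow[t] += inflow[t0] * lifetime_pmf[age]
    stock, s = [], 0.0
    for t in range(n):
        s += inflow[t] - outflow[t]
        stock.append(s)
    return outflow, stock

# Hypothetical example: constant inflow of 10 units/yr, every product
# retired at exactly age 3 (a degenerate lifetime distribution).
out, st = dynamic_stock([10.0] * 6, [0.0, 0.0, 0.0, 1.0])
```

In this toy case the stock saturates at three years' worth of inflow, after which outflow equals inflow; real studies replace the degenerate distribution with Weibull, normal, or lognormal lifetimes.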
The RNA world scenario posits replication by RNA polymerases. On early Earth, a geophysical setting is required to separate hybridized strands after their replication and to localize them against diffusion. We present a pointed heat source that drives exponential, RNA-catalyzed amplification of short RNA with high efficiency in a confined chamber. While shorter strands were periodically melted by laminar convection, the temperature gradient caused aggregated polymerase molecules to accumulate, protecting them from degradation in hot regions of the chamber. These findings demonstrate a size-selective pathway for autonomous RNA-based replication in natural nonequilibrium conditions.
Cycles of glaciation impose mechanical stresses on underlying bedrock as glaciers advance, erode, and retreat. Fracture initiation and propagation constitute rock mass damage and act as preparatory factors for slope failures; however, the mechanics of paraglacial rock slope damage remain poorly characterized. Using conceptual numerical models closely based on the Aletsch Glacier region of Switzerland, we explore how in situ stress changes associated with fluctuating ice thickness can drive progressive rock mass failure preparing future slope instabilities. Our simulations reveal that glacial cycles as purely mechanical loading and unloading phenomena produce relatively limited new damage. However, ice fluctuations can increase the criticality of fractures in adjacent slopes, which may in turn increase the efficacy of fatigue processes. Bedrock erosion during glaciation promotes significant new damage during first deglaciation. An already weakened rock slope is more susceptible to damage from glacier loading and unloading and may fail completely. We find that damage kinematics are controlled by discontinuity geometry and the relative position of the glacier; ice advance and retreat both generate damage. We correlate model results with mapped landslides around the Great Aletsch Glacier. Our result that most damage occurs during first deglaciation agrees with the relative age of the majority of identified landslides. The kinematics and dimensions of a slope failure produced in our models are also in good agreement with characteristics of instabilities observed in the field. Our results extend simplified assumptions of glacial debuttressing, demonstrating in detail how cycles of ice loading, erosion, and unloading drive paraglacial rock slope damage.
Key Points
Simply adding then removing glacier ice from an alpine valley has little net effect on rock wall damage
Glacial erosion, i.e., rock debuttressing, creates significant new rock slope damage during first deglaciation
Damage kinematics vary during a glacial cycle: ice advance favors toppling, while retreat promotes sliding
•Advances in protein expression systems have enabled the production of more authentic human proteins.
•New labeling methods are furthering functional and biophysical studies of proteins.
•‘Gene optimisation’ methods can increase protein expression, although hurdles remain.
Protein production for structural and biophysical studies, functional assays, biomarkers, mechanistic studies in vitro and in vivo, but also for therapeutic applications in pharma, biotech and academia has evolved into a mature discipline in recent years. Due to the increased emphasis on biopharmaceuticals, the growing demand for proteins used for structural and biophysical studies, the impact of genomics technologies on the analysis of large sets of structurally diverse proteins, and the increasing complexity of disease targets, the interest in innovative approaches for the expression, purification and characterisation of recombinant proteins has steadily increased over the years. In this review, we summarise recent developments in the field of recombinant protein expression for research use in pharma, biotech and academia. We focus mostly on the latest developments for protein expression in the most widely used expression systems: Escherichia coli (E. coli), insect cell expression using the Baculovirus Expression Vector System (BEVS) and, finally, transient and stable expression of recombinant proteins in mammalian cells.
Summary
Thorough understanding of the complex pathophysiology of osteoarthritis (OA) is necessary in order to open new avenues for treatment. The aim of this study was to characterize the CD4+ T cell population and evaluate their activation and polarization status in OA joints. Fifty‐five patients with end‐stage knee OA (Kellgren–Lawrence grades III–IV) who underwent surgery for total knee arthroplasty (TKA) were enrolled into this study. Matched samples of synovial membrane (SM), synovial fluid (SF) and peripheral blood (PB) were analysed for CD3+CD4+CD8– T cell subsets T helper type 1 (Th1), Th2, Th17, regulatory T cells and activation status (CD25, CD69, CD45RO, CD45RA, CD62L) by flow cytometry. Subset‐specific cytokines were analysed by cytometric bead array (CBA). SM and SF samples showed a distinct infiltration pattern of CD4+ T cells. In comparison to PB, a higher proportion of joint‐derived T cells were polarized into CD3+CD4+CD8– T cell subsets, with the most significant increase for proinflammatory Th1 cells in SF. CBA analysis revealed significantly increased immunomodulating cytokines interferon (IFN)‐γ, interleukin (IL)‐2 and IL‐10 in SF compared to PB. Whereas in PB only a small proportion of CD4+ T cells were activated, the majority of joint‐derived CD4+ T cells can be characterized as activated effector memory cells (CD69+CD45RO+CD62L–). End‐stage OA knees are characterized by an increased CD4+ T cell polarization towards activated Th1 cells and cytokine secretion compared to PB. This local inflammation may contribute to disease aggravation and eventually perpetuate the disease process.
The main focus of this study was to characterize the CD4+ T cell population and evaluate their activation and polarization status in osteoarthritis (OA) joints by analysis of synovial membrane (SM), synovial fluid (SF) and peripheral blood (PB). SF of end‐stage OA knees is characterized by a significant increase of activated proinflammatory T helper type 1 (Th1) cells compared to PB. This shift towards inflammatory T cell subsets is thought to contribute to OA pathophysiology and could affect OA symptoms and progression of cartilage loss.
Photoredox decarboxylative cross-coupling via iridium–nickel dual catalysis has emerged as a valuable method for C(sp2)–C(sp3) bond formation. Herein we describe the application of a segmented flow (“microslug”) reactor equipped with a newly designed photochemistry module for material-efficient reaction screening and optimization. Through the deployment of a self-optimizing algorithm, optimal flow conditions for the model reaction were rapidly developed, simultaneously accounting for the effects of continuous variables (temperature and time) and discrete variables (base and catalyst). Temperature was found to be a critical parameter with regard to reaction rates and hence productivity in subsequent scale-up in flow. The optimized conditions identified at microscale were found to directly transfer to a Vapourtec UV-150 continuous flow photoreactor, enabling predictable scale-up operation at a scale of hundreds of milligrams per hour. This optimization approach was then expanded to other halide coupling partners that were low-yielding in batch reactions, highlighting the practical application of this optimization platform in the development of conditions for photochemical synthesis in continuous flow.
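The core idea of optimizing over mixed continuous variables (temperature, residence time) and discrete variables (base, catalyst) can be sketched in miniature. The abstract does not specify the self-optimizing algorithm, so the sketch below stands in with a simple exhaustive evaluation over a discretised condition space; `run_reaction` is a made-up surrogate yield function, and every reagent name and parameter value is hypothetical, not the authors' chemistry:

```python
from itertools import product

def run_reaction(temp_c, time_min, base, catalyst):
    """Made-up surrogate for the flow platform's measured yield (0-100)."""
    score = 100.0 - abs(temp_c - 60) - 2 * abs(time_min - 10)  # continuous penalties
    score -= 0 if base == "K3PO4" else 15          # hypothetical discrete penalties
    score -= 0 if catalyst == "NiCl2/dtbbpy" else 20
    return max(score, 0.0)

def optimise():
    temps = [40, 50, 60, 70, 80]            # temperature grid (deg C)
    times = [5, 10, 15, 20]                 # residence time grid (min)
    bases = ["K3PO4", "Cs2CO3", "DBU"]      # discrete variable: base
    catalysts = ["NiCl2/dtbbpy", "NiBr2/bpy"]  # discrete variable: catalyst
    # Evaluate every combination and keep the highest-yielding conditions.
    return max(product(temps, times, bases, catalysts),
               key=lambda cond: run_reaction(*cond))

best = optimise()
```

A real self-optimizing platform would replace the exhaustive loop with a sample-efficient search (e.g. Bayesian or simplex-type optimization) so that each "evaluation" corresponds to a single microslug experiment, which is exactly what makes the approach material-efficient.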