T-cell receptor–engineered T-cell therapy and chimeric antigen receptor T-cell therapy are 2 types of adoptive T-cell therapy that genetically modify natural T cells to treat cancers. Although chimeric antigen receptor T-cell therapy has yielded remarkable efficacy for hematological malignancies of the B-cell lineages, most solid tumors fail to respond significantly to chimeric antigen receptor T cells. T-cell receptor–engineered T-cell therapy, on the other hand, has shown unprecedented promise in treating solid tumors and has attracted growing interest. To create an unbiased, comprehensive, and scientific report for this fast-moving field, we carefully analyzed all 84 clinical trials of T-cell receptor–engineered T-cell therapy registered on ClinicalTrials.gov as of June 11, 2018. Informative features and trends were observed in these clinical trials. The number of trials initiated each year is increasing as expected, though with an interesting pattern. NY-ESO-1 is the most frequently targeted antigen, addressed by 31 clinical trials; melanoma is the most frequently targeted cancer, addressed by 33 clinical trials. Novel antigens and underrepresented cancers remain to be targeted in future studies and clinical trials. Unlike chimeric antigen receptor T-cell therapy, only about 16% of the 84 clinical trials target hematological malignancies, consistent with the high potential of T-cell receptor–engineered T-cell therapy for solid tumors. Six pharma/biotech companies with novel T-cell receptor–engineered T-cell ideas and products were examined in this review. These companies have used multiple approaches to increase the T-cell receptor's affinity and efficiency and to minimize cross-reactivity. We also discuss the major challenges that the tumor microenvironment poses for the development of T-cell receptor–engineered T-cell therapy.
High-strength aluminum alloys are important for lightweighting vehicles and are extensively used in aircraft and, increasingly, in automobiles. The highest-strength aluminum alloys require a series of high-temperature "bakes" (120° to 200°C) to form a high number density of nanoparticles by solid-state precipitation. We found that a controlled, room-temperature cyclic deformation is sufficient to continuously inject vacancies into the material and to mediate the dynamic precipitation of a very fine (1- to 2-nanometer) distribution of solute clusters. This results in better material strength and elongation properties relative to traditional thermal treatments, despite a much shorter processing time. The microstructures formed are much more uniform than those characteristic of traditional thermal treatments and do not exhibit precipitate-free zones. These alloys are therefore likely to be more resistant to damage.
The capability of selectively sharing encrypted data with different users via public cloud storage may greatly ease security concerns over inadvertent data leaks in the cloud. A key challenge to designing such encryption schemes lies in the efficient management of encryption keys. The desired flexibility of sharing any group of selected documents with any group of users demands that different encryption keys be used for different documents. However, this also implies the necessity of securely distributing to users a large number of keys for both encryption and search; those users must then securely store the received keys and submit an equally large number of keyword trapdoors to the cloud in order to search over the shared data. The implied costs of secure communication, storage, and computation clearly render the approach impractical. In this paper, we address this practical problem, which is largely neglected in the literature, by proposing the novel concept of key-aggregate searchable encryption (KASE) and instantiating the concept through a concrete KASE scheme, in which a data owner only needs to distribute a single key to a user for sharing a large number of documents, and the user only needs to submit a single trapdoor to the cloud for querying the shared documents. The security analysis and performance evaluation both confirm that our proposed schemes are provably secure and practically efficient.
Summary
The ability to interpret daily and seasonal fluctuations, as well as latitudinal and vegetation-canopy variations, in light and temperature signals is essential for plant survival. However, the precise molecular mechanisms that transduce perceived light and temperature signals into plant growth and adaptation remain elusive. We show that far-red light induces accumulation of PHYTOCHROME-INTERACTING FACTOR 4 (SlPIF4) under low-temperature conditions via phytochrome A in Solanum lycopersicum (tomato). Reverse genetic approaches revealed that knocking out SlPIF4 increases cold susceptibility, while overexpressing SlPIF4 enhances cold tolerance in tomato plants. SlPIF4 not only binds directly to the promoters of the C-REPEAT BINDING FACTOR (SlCBF) genes and activates their expression but also regulates plant hormone biosynthesis and signaling, including abscisic acid, jasmonate and gibberellin (GA), in response to low temperature. Moreover, SlPIF4 directly activates the SlDELLA gene (GA-INSENSITIVE 4, SlGAI4) under cold stress, and SlGAI4 positively regulates cold tolerance. Additionally, SlGAI4 represses accumulation of the SlPIF4 protein, thus forming multiple coherent feed-forward loops. Our results reveal that plants integrate light and temperature signals through shared hormone pathways and transcriptional regulators to better adapt to cold stress, which may provide a comprehensive understanding of plant growth and survival in a changing environment.
Diversity has long been regarded as a security mechanism for improving the resilience of software and networks against various attacks. More recently, diversity has found new applications in cloud computing security, moving target defense, and improving the robustness of network routing. However, most existing efforts rely on intuitive and imprecise notions of diversity, and the few existing models of diversity are mostly designed for a single system running diverse software replicas or variants. At a higher abstraction level, as a global property of the entire network, diversity and its effect on security have received limited attention. In this paper, we take the first step toward formally modeling network diversity as a security metric by designing and evaluating a series of diversity metrics. In particular, we first devise a biodiversity-inspired metric based on the effective number of distinct resources. We then propose two complementary diversity metrics, based on the least and the average attacking efforts, respectively. We provide guidelines for instantiating the proposed metrics and present a case study on estimating software diversity. Finally, we evaluate the proposed metrics through simulation.
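The biodiversity-inspired metric mentioned above, the effective number of distinct resources, can be illustrated with a short sketch. The host inventory below is invented, and using the exponential of Shannon entropy as the effective number is a standard biodiversity convention assumed here, not necessarily the paper's exact formulation:

```python
import math
from collections import Counter

# Toy network inventory: the software running on each host.
# Product names are invented for illustration.
hosts = ['apache', 'apache', 'nginx', 'apache', 'iis', 'nginx']

# Effective number of distinct resources: exp(H), where H is the
# Shannon entropy of the resource distribution. Equals the raw count
# of products only when usage is perfectly even.
counts = Counter(hosts)
total = sum(counts.values())
H = -sum((c / total) * math.log(c / total) for c in counts.values())
effective_number = math.exp(H)

# 6 hosts and 3 products, but skewed usage, so the effective
# number falls below 3.
print(round(effective_number, 3))  # ≈ 2.749
```

A network monoculture (all hosts running one product) would score 1 regardless of host count, matching the intuition that such a network offers an attacker essentially one target.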
Follow-up observations at high angular resolution of bright submillimeter galaxies selected from deep extragalactic surveys have shown that the single-dish sources comprise blends of several galaxies. Consequently, number counts derived from low- and high-angular-resolution observations are in tension. This demonstrates the importance of resolution effects at these wavelengths and the need for realistic simulations to explore them. We built a new 2 deg² simulation of the extragalactic sky from the far-infrared to the submillimeter. It is based on an updated version of the 2SFM (two star-formation modes) galaxy evolution model. Using global galaxy properties generated by this model, we used an abundance-matching technique to populate a dark-matter lightcone and thus simulate the clustering. We produced maps from this simulation and extracted the sources, and we show that the limited angular resolution of single-dish instruments has a strong impact on (sub)millimeter continuum observations. Taking these resolution effects into account, we reproduce a large set of observables, such as number counts, their evolution with redshift, and cosmic infrared background power spectra. Our simulation consistently describes the number counts from single-dish telescopes and interferometers. In particular, at 350 and 500 μm, we find that the number counts measured by Herschel between 5 and 50 mJy are biased toward high values by a factor of ~2, and that the redshift distributions are biased toward low redshifts. We also show that clustering has an important impact on the Herschel pixel histogram used to derive number counts from P(D) analysis. We find that the brightest galaxy in the beam of a 500 μm Herschel source contributes on average only ~60% of the Herschel flux density, but that this fraction will rise to ~95% for future millimeter surveys on 30 m-class telescopes (e.g., NIKA2 at IRAM).
Finally, we show that the large number density of red Herschel sources found in observations but not in models might be an observational artifact caused by the combination of noise, resolution effects, and the steepness of color- and flux density distributions. Our simulation, called Simulated Infrared Dusty Extragalactic Sky (SIDES), is publicly available.
The smart grid frequently collects consumers' fine-grained power usage data through smart meters to facilitate various applications, such as billing, load monitoring, regional statistics, and demand response. However, the smart meter reading streams may also pose severe privacy threats to the consumers by leaking their appliances' ON/OFF status. In this paper, we first quantitatively measure the information leakage about specific appliances' status from any reading stream, and define a novel privacy notion to bound such leakage. In addition, we propose a privacy-preserving streaming algorithm with different options to effectively convert readings and promptly stream safe readings in different fashions. The output time-series readings satisfy our privacy notion while guaranteeing excellent utility, such as extremely low aggregation errors and billing errors. Finally, we experimentally validate the effectiveness and efficiency of our approach using real data sets.
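One common way to quantify the kind of leakage described above is the empirical mutual information between an appliance's ON/OFF status and the observed meter readings. This is a hedged sketch with invented toy data, not the paper's actual leakage measure or privacy notion:

```python
import math
from collections import Counter

# Invented toy trace: appliance status (0 = OFF, 1 = ON) and the
# corresponding aggregate meter reading (watts) per interval.
status = [0, 0, 1, 1, 0, 1, 0, 1]
readings = [200, 210, 1400, 1390, 190, 1410, 205, 1395]

# Coarsen readings into "low"/"high" bins before measuring leakage.
bins = ['high' if r > 800 else 'low' for r in readings]

# Empirical mutual information I(status; bin) in bits.
n = len(status)
p_s = Counter(status)
p_b = Counter(bins)
p_sb = Counter(zip(status, bins))
mi = sum((c / n) * math.log2((c / n) / ((p_s[s] / n) * (p_b[b] / n)))
         for (s, b), c in p_sb.items())

# Here the bins track status perfectly, so the stream leaks the
# full 1 bit of ON/OFF information.
print(round(mi, 3))  # 1.0
```

A stream sanitized so that readings look the same whether the appliance is ON or OFF would drive this quantity toward 0, which is the intuition a leakage bound formalizes.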
In defending one’s network against cyber attack, certain vulnerabilities may seem acceptable risks when considered in isolation. But an intruder can often infiltrate a seemingly well-guarded network through a multi-step intrusion, in which each step prepares for the next.
Attack graphs can reveal the threat by enumerating possible sequences of exploits that can be followed to compromise given critical resources. However, attack graphs do not directly provide a solution to remove the threat. Finding a solution by hand is error-prone and tedious, particularly for larger and less secure networks whose attack graphs are overly complicated. In this paper, we propose a solution to automate the task of hardening a network against multi-step intrusions. Unlike existing approaches whose solutions require removing exploits, our solution consists of initially satisfied conditions only. Our solution is thus more enforceable, because the initial conditions can be independently disabled, whereas exploits are usually consequences of other exploits and hence cannot be disabled without removing their causes. More specifically, we first represent the given critical resources as a logic proposition over initial conditions. We then simplify the proposition to make hardening options explicit. Among these options, we finally choose solutions with the minimum cost. The key improvements over the preliminary version of this paper include a formal framework for the minimum network hardening problem and an improved one-pass algorithm for deriving the logic proposition while avoiding logic loops.
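The pipeline described above (express the critical resource as a proposition over initial conditions, simplify it to expose hardening options, then pick a cheap option) can be sketched in a few lines. The condition names, costs, and the greedy per-disjunct cost choice are illustrative assumptions, not the paper's exact algorithm:

```python
# Toy result of attack-graph traversal: the critical resource is
# reachable iff every condition in some disjunct is available.
# Condition names and disabling costs are invented for illustration.
disjuncts = [{'ftp_svc', 'rsh_svc'},
             {'ftp_svc', 'ssh_svc', 'rsh_svc'}]
cost = {'ftp_svc': 5, 'rsh_svc': 2, 'ssh_svc': 8}

# Simplify by absorption: a disjunct that strictly contains another is
# redundant. What remains makes the distinct hardening options explicit.
minimal = [d for d in disjuncts
           if not any(other < d for other in disjuncts)]

# Harden: every surviving disjunct must lose at least one condition.
# Greedy sketch (cheapest condition per disjunct; not guaranteed
# globally optimal when disjuncts share conditions).
to_disable = {min(d, key=cost.get) for d in minimal}
print(sorted(to_disable))  # ['rsh_svc']
```

Note that the second disjunct is absorbed by the first, so disabling the single cheap condition rsh_svc falsifies the whole proposition and blocks every enumerated attack path in this toy example.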
Abstract
Owing to its wide coverage, continuous remote monitoring, and high measurement accuracy, spaceborne multi-channel surveillance radar has been widely used for airborne moving-target detection and tracking. In this paper, the definition of minimum detection velocity (MDV) for evaluating aerial-target detection performance in a spaceborne multi-channel radar system is analysed. Three kinds of MDV definitions, based on an empirical formula, an output signal-to-clutter-plus-noise ratio (SCNR) loss criterion, and an output SCNR criterion, are discussed in detail through theoretical analysis and simulation verification. The analysis results provide a valuable reference for the design of practical spaceborne radar systems with an air-target detection mode.
Abstract
Observations have found black holes spanning 10 orders of magnitude in mass across most of cosmic history. The Kerr black hole solution is, however, provisional as its behavior at infinity is incompatible with an expanding universe. Black hole models with realistic behavior at infinity predict that the gravitating mass of a black hole can increase with the expansion of the universe independently of accretion or mergers, in a manner that depends on the black hole’s interior solution. We test this prediction by considering the growth of supermassive black holes in elliptical galaxies over 0 < z ≲ 2.5. We find evidence for cosmologically coupled mass growth among these black holes, with zero cosmological coupling excluded at 99.98% confidence. The redshift dependence of the mass growth implies that, at z ≲ 7, black holes contribute an effectively constant cosmological energy density to Friedmann’s equations. The continuity equation then requires that black holes contribute cosmologically as vacuum energy. We further show that black hole production from the cosmic star formation history gives the value of Ω_Λ measured by Planck while being consistent with constraints from massive compact halo objects. We thus propose that stellar remnant black holes are the astrophysical origin of dark energy, explaining the onset of accelerating expansion at z ∼ 0.7.
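The step from "effectively constant energy density" to "vacuum-like" can be made explicit with a brief sketch. The power-law form and the coupling strength k below come from the cosmological-coupling literature rather than from this summary itself:

```latex
% m(a): black hole mass at scale factor a; k: coupling strength.
% A cosmologically coupled black hole grows as
m(a) = m(a_i)\left(\frac{a}{a_i}\right)^{k}.
% With the comoving number of black holes conserved, the physical
% number density dilutes as n(a) \propto a^{-3}, so the population's
% energy density scales as
\rho_{\mathrm{BH}}(a) \propto n(a)\, m(a) \propto a^{-3}\, a^{k} = a^{k-3},
% which is constant in time, i.e. behaves as vacuum energy under the
% continuity equation, exactly when k = 3.
```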