Summary
Damage and failure of rubberized self‐compacting concrete (RSCC) under uniaxial tension are investigated by acoustic emission (AE) and digital image correlation (DIC) techniques. Four RSCC mixtures containing fine rubber particles at 0%, 5%, 10%, and 15% volume fractions are tested. The effects of rubber content on the macroscopic mechanical behavior, the AE parameters, the strain fields, and the damage development are analyzed. It is demonstrated that the combined use of AE parameters and DIC strain maps provides an accurate estimation of the different stages of damage evolution and that the crack propagation measured by DIC correlates strongly with all AE parameters. Cracking modes are determined by analyzing average frequency (AF) against the rise time to amplitude ratio (RA), an additional feature of AE that can help explain differences in behavior under different loading conditions. It is shown that substituting fine aggregates with fine rubber particles reduces stiffness, strength, and fracture toughness under uniaxial tension, which has to be considered in the design of structures with RSCC.
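The AF versus RA analysis mentioned above follows a common AE convention: hits with high average frequency and low RA are associated with tensile cracking, while low-AF, high-RA hits indicate shear. A minimal sketch of that convention (not the authors' exact procedure; the discriminating slope `k` and the example hit values are assumed for illustration):

```python
# Hypothetical AF-RA crack-mode classification sketch; the threshold slope k
# is an assumed, application-specific calibration value.

def classify_crack_mode(counts, duration_us, rise_time_us, amplitude_v, k=1.0):
    """Classify an AE hit as 'tensile' or 'shear' from AF and RA.

    AF (average frequency) = counts / duration   [kHz when duration is in ms]
    RA = rise time / amplitude                   [ms/V]
    A hit above the line AF = k * RA is conventionally labeled tensile.
    """
    af = counts / (duration_us / 1000.0)          # average frequency, kHz
    ra = (rise_time_us / 1000.0) / amplitude_v    # rise time / amplitude, ms/V
    return "tensile" if af > k * ra else "shear"

# Example hit: many counts over a short duration with a fast rise time
print(classify_crack_mode(counts=50, duration_us=200,
                          rise_time_us=10, amplitude_v=0.5))  # tensile
```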
This article describes version 2.0 of the extreme value analysis (EVA) R package extRemes, which has been completely redesigned from previous versions. The functions primarily provide utilities for implementing univariate EVA, with a focus on weather and climate applications, including the incorporation of covariates, as well as some functionality for assessing bivariate tail dependence.
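extRemes itself is an R package, but the core univariate task it supports, fitting a GEV distribution to block maxima and computing return levels, can be sketched in Python with SciPy (synthetic data; note that SciPy's shape parameter `c` equals minus the shape parameter ξ in the usual EVA convention):

```python
# Block-maxima GEV fit on synthetic annual maxima; an illustrative Python
# analogue of a basic univariate EVA workflow, not a port of the extRemes API.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Gumbel-distributed maxima correspond to a GEV with shape ~ 0
annual_maxima = rng.gumbel(loc=30.0, scale=5.0, size=100)

c, loc, scale = genextreme.fit(annual_maxima)         # maximum-likelihood fit
rl_100 = genextreme.ppf(1 - 1 / 100, c, loc, scale)   # 100-year return level
print(loc, scale, rl_100)
```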
CO2 conversion covers a wide range of possible application areas, from fuels to bulk and commodity chemicals and even to specialty products with biological activity such as pharmaceuticals. In the present review, we discuss selected examples in these areas in a combined analysis of the state of the art of synthetic methodologies and processes together with their life cycle assessment. We thereby attempt to assess the potential to reduce the environmental footprint in these application fields relative to the current petrochemical value chain. This analysis and discussion differ significantly from a view of CO2 utilization as a measure for global CO2 mitigation. Whereas the latter focuses on reducing the end-of-pipe problem of “CO2 emissions” from today's industries, the approach taken here seeks to identify opportunities for exploiting a novel feedstock that avoids the use of fossil resources in the transition toward more sustainable future production. Thus, the motivation to develop CO2-based chemistry does not depend primarily on the absolute amount of CO2 emissions that can be remediated by a single technology. Rather, CO2-based chemistry is stimulated by the significance of the relative improvement in the carbon balance and other critical factors defining the environmental impact of chemical production in all relevant sectors, in accord with the principles of green chemistry.
Understanding the impact of variation in lesion topography on the expression of functional impairments following stroke is important, as it may pave the way to modeling structure–function relations in statistical terms while pointing to constraints for adaptive remapping and functional recovery. Multi‐perturbation Shapley‐value analysis (MSA) is a relatively novel game‐theoretical approach to multivariate lesion‐symptom mapping. In this methodological paper, we provide a comprehensive explanation of MSA. We use synthetic data to assess the method's accuracy and to optimize its parameters. We then demonstrate its application in a cohort of 107 first‐event subacute stroke patients assessed for upper limb (UL) motor impairment (Fugl‐Meyer Assessment scale). Under the conditions tested, MSA correctly detected simulated ground‐truth lesion‐symptom relationships with a sensitivity of 75% and a specificity of ~90%. For real behavioral data, MSA disclosed a strong hemispheric effect in the relative contribution of specific regions of interest (ROIs): poststroke UL motor function was explained mostly by damage to ROIs associated with movement planning (supplementary motor cortex and superior frontal gyrus) following left‐hemispheric damage (LHD), and by ROIs associated with movement execution (primary motor and somatosensory cortices and the ventral brainstem) following right‐hemispheric damage (RHD). Residual UL motor ability following LHD was found to depend on a wider array of brain structures than the residual motor ability of RHD patients. The results demonstrate that MSA can provide unique insight into the relative importance of different hubs in neural networks, which is difficult to obtain with standard univariate methods.
In this methodological paper, we described in detail a newly revised multivariate approach to lesion‐symptom mapping based on the game‐theoretical principles of multi‐perturbation Shapley‐value analysis. Using a data set of 107 stroke patients as a test case, we showed that this revised approach can correctly detect ground‐truth brain–behavior relationships using moderately sized cohorts (50–60 patients), with acceptable type‐I and type‐II error rates. The results obtained with this method can provide useful information about the underlying brain network supporting the behavior of interest.
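The Shapley-value idea underlying MSA can be illustrated with a toy example: each region is either intact or perturbed, and a region's Shapley value is its average marginal contribution to a performance score over all orderings in which regions are restored. The exact computation below runs over three hypothetical regions with an assumed additive score; it is a conceptual sketch, not the authors' MSA implementation, which estimates values over lesion patterns in many ROIs:

```python
# Exact Shapley values over binary "lesion" perturbations: each region is
# intact or lesioned, and phi[r] is region r's average marginal contribution
# to a hypothetical performance function across all restoration orderings.
from itertools import permutations
from math import factorial

def shapley_values(regions, performance):
    """Exact Shapley values; feasible only for small numbers of regions."""
    phi = {r: 0.0 for r in regions}
    for order in permutations(regions):
        intact = set()
        for r in order:
            before = performance(intact)
            intact.add(r)
            phi[r] += performance(intact) - before  # marginal contribution
    n_orders = factorial(len(regions))
    return {r: v / n_orders for r, v in phi.items()}

# Hypothetical additive score: M1 matters most, SMA less, V1 not at all
def motor_score(intact):
    return 0.6 * ('M1' in intact) + 0.3 * ('SMA' in intact) + 0.1

phi = shapley_values(['M1', 'SMA', 'V1'], motor_score)
print({r: round(v, 2) for r, v in phi.items()})
# For an additive score the Shapley values recover each region's coefficient.
```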
Abstract
A design that circumvents a patent may reduce the value of the original patent. To avoid this possibility, the development status of the patented technology must be analyzed while the circumvention design is carried out. Combining value analysis with patent circumvention methods, a patent circumvention design method based on value analysis is proposed. First, related patents are retrieved by identifying multi-level keywords to build the target patent database; the maturity of the patents in the database is then predicted and analyzed to determine the core patent group. Next, a structure/function model of the core patent is established and subjected to value analysis to identify low-value components as the target technology of the circumvention design; the contradictions in the target patent are then resolved with TRIZ theoretical tools to perform the circumvention design, and the resulting scheme is checked for infringement. Finally, the method was verified on an automatic connecting device; the result showed that the improved automatic connecting device has the advantages of a simple structure and convenient operation.
Progress in coopetition research is impeded by two problems in the literature: (a) superficial conceptualization of simultaneity and outcomes and (b) lack of theorizing about core properties of coopetition and how they influence outcomes. This paper addresses these interrelated problems and charts a path towards a theory of coopetition. We systematically analyze competition and cooperation and illuminate how the interplay between specific aspects of competition and cooperation manifests through unique coopetition mechanisms. We explicate a range of possible outcomes from coopetition—joint value creation for all firms, value creation for individual firms, and value destruction—and suggest that coopetition mechanisms help explain how and why coopetition may lead to varying outcomes. Furthermore, we explain how effective navigation of simultaneity and value creation intent, two fundamental elements of coopetition, may be instrumental in deriving beneficial outcomes. Navigating simultaneity involves balancing competition and cooperation and maintaining both at moderately strong levels, and navigating value creation consists of managing the trade-off between joint value creation and firm value creation without compromising overall value creation. By explaining how coopetition manifests, what its unique underlying properties are, and how such properties influence outcomes, our paper provides a deeper understanding of the phenomenon and progresses the literature towards a theory of coopetition.
• Shear strength predictions from existing models differ from experimental results.
• Predictions by MCFT (R2k) and CCC bear the closest comparison to experimental results.
• Design values of current shear design provisions compare well, except for Eurocode 2.
• The Eurocode 2 shear design formulation may be uneconomical at low levels of stirrups.
An accurate method for predicting the shear strength of reinforced concrete beams is paramount, since shear failure is catastrophic and can lead to grave consequences. Different standards and guidelines use different models for shear resistance predictions. Their stipulations on shear design differ considerably from one another, resulting in different design procedures and safety performance. This contribution assessed the mean and design value predictions of shear models for beams with shear reinforcement in current codes and the published technical literature. The models investigated include the EC2 VSIM shear model, the ACI 318 (2011) shear model, the Fib Model Code 2010 (MC-10 (III)) shear model, the best-estimate prediction by the Modified Compression Field Theory (MCFT)-based analysis program Response 2000 (R2k), and the Compression Chord Capacity (CCC) model. The mean and design value predictions from the various methods are compared to one another and to experimental results over the parametric range of shear reinforcement, concrete strength, and beam depth. The assessment revealed that the mean value predictions of the different models differ considerably from one another and from experimental observations. The mean value predictions from MCFT (R2k) and CCC bear the closest comparison to the experimental observations over the range of shear reinforcement ρwfywm investigated. The mean value predictions of the EC2 VSIM were shown to significantly underpredict capacity for lightly shear-reinforced concrete beams; this method produced the most conservative mean value predictions of all the methods investigated at shear reinforcement ρwfywm ≤ 1 MPa. The design value analysis revealed that the design values of the various shear design methods compare well, except for EC2 VSIM at low levels of design shear reinforcement (ρwfywd ≤ 1 MPa).
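For orientation, the sectional format shared by several of these code models expresses nominal shear resistance as a concrete term plus a stirrup term. A sketch using the simplified ACI 318 SI expressions, Vc = 0.17·λ·√f'c·bw·d and Vs = Av·fyt·d/s; the beam dimensions and material values below are illustrative and are not taken from the paper:

```python
# Simplified ACI 318 (SI units) sectional shear capacity: Vn = Vc + Vs.
# Inputs: fc' in MPa, dimensions in mm, stirrup area Av in mm^2, fyt in MPa,
# stirrup spacing s in mm; lam is the lightweight-concrete factor (1.0 here).
from math import sqrt

def aci318_shear_capacity(fc_mpa, bw_mm, d_mm, av_mm2, fyt_mpa, s_mm, lam=1.0):
    vc = 0.17 * lam * sqrt(fc_mpa) * bw_mm * d_mm   # concrete contribution, N
    vs = av_mm2 * fyt_mpa * d_mm / s_mm             # stirrup contribution, N
    return (vc + vs) / 1000.0                       # nominal capacity, kN

# Illustrative 300 mm wide beam, d = 450 mm, fc' = 30 MPa,
# two-leg 8 mm stirrups (2 x 50.3 mm^2) at 200 mm spacing, fyt = 500 MPa
print(round(aci318_shear_capacity(30, 300, 450, 2 * 50.3, 500, 200), 1))
```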
Nonstationary extreme value analysis (NEVA) can improve the statistical representation of observed flood peak distributions compared to stationary (ST) analysis, but management of flood risk relies on predictions of out‐of‐sample distributions for which NEVA has not been comprehensively evaluated. In this study, we apply split‐sample testing to 1250 annual maximum discharge records in the United States and compare the predictive capabilities of NEVA relative to ST extreme value analysis using a log‐Pearson Type III (LPIII) distribution. The parameters of the LPIII distribution in the ST and nonstationary (NS) models are estimated from the first half of each record using Bayesian inference. The second half of each record is reserved to evaluate the predictions under the ST and NS models. The NS model is applied for prediction by (1) extrapolating the trend of the NS model parameters throughout the evaluation period and (2) using the NS model parameter values at the end of the fitting period to predict with an updated ST model (uST). Our analysis shows that the ST predictions are preferred, overall. NS model parameter extrapolation is rarely preferred. However, if fitting period discharges are influenced by physical changes in the watershed, for example from anthropogenic activity, the uST model is strongly preferred relative to ST and NS predictions. The uST model is therefore recommended for evaluation of current flood risk in watersheds that have undergone physical changes. Supporting information includes a MATLAB® program that estimates the (ST/NS/uST) LPIII parameters from annual peak discharge data through Bayesian inference.
Key Points
Stationary predictions of flood peak distributions are preferred, overall
Extrapolation of the nonstationary model parameter trend rarely improves the stationary prediction, even if an observed trend continues
Using the most recent nonstationary parameters to predict with an updated stationary model is preferred for physically changing watersheds
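The split-sample procedure described above can be sketched with a stationary LPIII fit in Python. SciPy's `pearson3` is fit here by maximum likelihood rather than the paper's Bayesian inference, and the peak-discharge record is synthetic, so this illustrates only the fit/evaluate split, not the ST/NS/uST comparison:

```python
# Stationary log-Pearson Type III (LPIII) split-sample sketch: fit the
# distribution of log10(annual peak discharge) on the first half of a record,
# then count held-out exceedances of the fitted 100-year level.
import numpy as np
from scipy.stats import pearson3

rng = np.random.default_rng(1)
peaks = rng.lognormal(mean=6.0, sigma=0.5, size=80)   # synthetic annual peaks
logq = np.log10(peaks)
fit_half, eval_half = logq[:40], logq[40:]            # fit / evaluation split

skew, loc, scale = pearson3.fit(fit_half)             # stationary LPIII params
rl100_log = pearson3.ppf(1 - 1 / 100, skew, loc, scale)  # 100-yr level (log10)
exceedances = int((eval_half > rl100_log).sum())      # held-out exceedances
print(10 ** rl100_log, exceedances)
```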