The scholarly work on China's environmental regulation in the context of "central–local" relations is dominated by a preference for a centralized approach. This article examines a centrally imposed and executed verification programme for locally reported pollution data, a rare and sustained central effort to enforce an environmental policy, namely the national pollution reduction target system. The programme was established in 2007 to curtail perceived widespread data falsification and to enhance the quality of emission data, the basis for assessing local compliance with targets. Based on an analysis of official documents and interviews with environmental officials and industry representatives, this article finds that the verification programme appears to have reduced the overreporting of emission data, enhanced local monitoring and enforcement capacity, and, owing to the increased frequency of national and local inspections, deterred violations to a certain degree. Nevertheless, significant challenges remain: verification is highly resource intensive, it has involved little external oversight and public participation, the central authority has exerted significant yet unchecked discretionary powers, and poor data quality has remained an issue. Over time, the verification programme appears to have turned into essentially a "numbers game." These challenges indicate that a centralized enforcement approach is arguably ineffective in addressing China's long-standing problem of weak environmental policy implementation. This study also sheds light on the classical "principal–agent" theory in the study of public bureaucracy: not only does the principal distrust the agent, which is the main concern of the theory, but the agent also distrusts the principal.
We construct a measure of the pairwise relatedness of firms' human capital to examine whether human capital relatedness is a key factor in mergers and acquisitions. We find that mergers are more likely, and merger returns and post-merger performance are higher, when firms have related human capital. These relations are stronger, or only present, in acquisitions where the merging firms do not operate in the same industries or product markets. Reductions in employment and wages following mergers with high human capital relatedness suggest that the merged firm has a greater ability to lay off low-quality and/or duplicate employees and reduce labor costs. In a falsification test, we further show that human capital relatedness has no effect on acquiring-firm returns in asset sales where little or no labor is transferred, which helps validate our measure of human capital relatedness.
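The abstract does not spell out how the pairwise relatedness measure is constructed. A minimal sketch, assuming (purely for illustration, not as the paper's actual construction) that each firm is represented by a vector of occupational employment shares and that relatedness is the cosine similarity of those vectors:

```python
import numpy as np

def relatedness(firm_a: np.ndarray, firm_b: np.ndarray) -> float:
    """Cosine similarity between two firms' occupational-composition
    vectors (share of employees in each occupation code).  This is an
    illustrative stand-in for the paper's relatedness measure."""
    den = float(np.linalg.norm(firm_a) * np.linalg.norm(firm_b))
    return float(firm_a @ firm_b) / den if den else 0.0

# Toy example: employment shares over four occupation codes.
acquirer         = np.array([0.50, 0.30, 0.15, 0.05])
target_related   = np.array([0.45, 0.35, 0.10, 0.10])  # similar workforce mix
target_unrelated = np.array([0.05, 0.05, 0.10, 0.80])  # very different mix

print(relatedness(acquirer, target_related))    # high, near 1
print(relatedness(acquirer, target_unrelated))  # low
```

With nonnegative share vectors the measure lies in [0, 1], so it can be used directly as a pairwise relatedness score in a merger-likelihood regression.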
The purpose of this study is to identify the important principles and benefits of publishing scientific research results, and to systematically review the constraints on scientific publication and their solutions. The discussion focuses on scientific misconduct in the form of falsification and fabrication; the identification of violations involving the distortion of scientific data covers their character, causes, and prevention through a holistic approach. The research method is qualitative library research using the keywords scientific publication, manuscript, falsification, and fabrication. The study involved a systematic literature search of the online publishing databases ScienceDirect, Web of Science, Scopus, and Google Scholar for relevant articles published from 2000 to 2020; the search was conducted in September 2020. The electronic search returned 896 relevant scientific publications, of which 42 manuscripts are included in the bibliography. The synthesized data were drawn from primary journals, which make up 88% of all references, each of which was assessed for the validity of its data. A prospective design was used to maximize the homogeneity of the information extracted from the selected studies, as part of the research instrument. The study found that the obstacles to scientific publication include the quality and novelty of research results, limited English skills, and minimal writing skills. Recurring scientific misconduct includes falsification and fabrication, errors that lead to the distortion of research data. Individual falsification and fabrication are caused by unethical behavior and violations of applicable scientific norms and ethics.
•Fast and efficient discrimination method for different brands of whisky.•Correct classification of whisky brands by UV spectroscopy and PLS-DA.•Effective analysis of counterfeit samples with a high correct classification rate.•Validation with samples belonging to known and unknown classes/brands.
The discrimination of whisky brands and the identification of counterfeits were performed by UV–Vis spectroscopy combined with partial least squares discriminant analysis (PLS-DA). In the proposed method, all spectra were obtained with no sample preparation. The discrimination models were built using seven whisky brands: Red Label, Black Label, White Horse, Chivas Regal (12 years), Ballantine's Finest, Old Parr and Natu Nobilis. The method was validated with an independent test set of authentic samples belonging to the seven selected brands and to another eleven brands not included in the training samples. Furthermore, seventy-three counterfeit samples were also used to validate the method. Results showed correct classification rates for genuine and counterfeit samples of over 98.6% and 93.1%, respectively, indicating that the method can be helpful for the forensic analysis of whisky samples.
This paper quantifies how past transnational terrorist attacks against a potential donor's assets result in enhanced foreign aid flows to a country hosting the responsible terrorist group. Given the reverse causality between foreign aid and terrorism, our empirical analysis puts forward an instrumental variable. Both conflict and governance assistance are shown to stem from transnational terrorist incidents involving recipient–donor dyads during 1974–2013 for a global sample. For recipient-related terrorism, lagged transnational terrorist events against a donor's assets display a robust positive impact on conflict and governance aid. Placebo or falsification tests support the exogeneity of the instrument.
Mendelian randomization may give biased causal estimates if the instrument affects the outcome not solely via the exposure of interest (violating the exclusion restriction assumption). We demonstrate the use of a global randomization test as a falsification test for the exclusion restriction assumption. Using simulations, we explored the statistical power of the randomization test to detect an association between a genetic instrument and a covariate set due to (a) selection bias or (b) horizontal pleiotropy, compared to three approaches examining associations with individual covariates: (i) Bonferroni correction for the number of covariates, (ii) correction for the effective number of independent covariates, and (iii) an r permutation-based approach. We conducted proof-of-principle analyses in UK Biobank, using CRP as the exposure and coronary heart disease (CHD) as the outcome. In simulations, the power of the randomization test was higher than that of the other approaches for detecting selection bias when the correlation between the covariates was low (r < 0.1), and at least as high across all simulated horizontal pleiotropy scenarios. In our applied example, we found strong evidence of selection bias using all approaches (e.g., global randomization test p < 0.002). We identified 51 of the 58 CRP genetic variants as horizontally pleiotropic, and estimated effects of CRP on CHD attenuated somewhat to the null when excluding these from the genetic risk score (OR = 0.96, 95% CI: 0.92, 1.00 versus OR = 0.97, 95% CI: 0.90, 1.05 per 1-unit higher log CRP). The global randomization test can be a useful addition to the MR researcher's toolkit.
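A global randomization test of the kind described can be sketched as a permutation test: compute a joint statistic of association between the instrument and the entire covariate set, then compare it with its distribution under random permutations of the instrument. The statistic used below (sum of squared instrument–covariate correlations) and all data are illustrative assumptions, not the published test:

```python
import numpy as np

def global_randomization_test(g, covariates, n_perm=1000, seed=0):
    """Permutation p-value for the joint association between a genetic
    instrument g (shape (n,)) and a covariate set (shape (n, k)).
    Sketch only; the published test may use a different statistic."""
    rng = np.random.default_rng(seed)

    def stat(instrument):
        z = (instrument - instrument.mean()) / instrument.std()
        c = (covariates - covariates.mean(0)) / covariates.std(0)
        r = z @ c / len(z)           # correlation with each covariate
        return float((r ** 2).sum()) # joint association statistic

    observed = stat(g)
    null = [stat(rng.permutation(g)) for _ in range(n_perm)]
    # Permutation p-value, adding 1 to numerator and denominator.
    return (1 + sum(s >= observed for s in null)) / (1 + n_perm)

rng = np.random.default_rng(1)
n = 500
cov = rng.normal(size=(n, 10))
g_null = rng.normal(size=n)               # no association with covariates
g_assoc = cov[:, 0] + rng.normal(size=n)  # associated with one covariate

p_null = global_randomization_test(g_null, cov)    # large p expected
p_assoc = global_randomization_test(g_assoc, cov)  # small p expected
print(p_null, p_assoc)
```

Because the whole covariate set enters one statistic, no multiple-testing correction is needed, which is the advantage over per-covariate tests that the abstract's simulations examine.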
In this survey, we present a comprehensive list of major known security threats within a cognitive radio network (CRN) framework. We classify attack techniques based on the type of attacker, namely exogenous (external) attackers, intruding malicious nodes and greedy cognitive radios (CRs). We further discuss threats related to infrastructure-based CRNs as well as infrastructure-less networks. Besides the short-term effects of attacks on CRN performance, we also discuss the often-ignored longer-term behavioral changes that such attacks enforce via the learning capability of the CRN. After elaborating on various attack strategies, we discuss potential solutions to combat those attacks. An overview of robust CR communications is also presented. We finally elaborate on future research directions pertinent to CRN security. We hope this survey can provide insight and a roadmap for future research efforts in the emerging field of CRN security.
Do individuals reveal their true preferences when asked for their support for an ongoing war? This research note presents the results of a list experiment implemented in the midst of the Russian invasion of Ukraine. Our experiment allows us to estimate the extent of preference falsification with regard to support for the war by comparing the experimental results with a direct question. Our data come from an online sample of 3000 Russians. Results show high levels of support for the war and significant levels of preference falsification: when asked directly, 71% of respondents support the war, while this share drops to 61% when using the list experiment. Preference falsification is particularly pronounced among individuals using TV as a main source of news. Our results imply that war leaders can pursue peace without fearing a large popular backlash, but also show that high levels of support for war can be sustained even once the brutality of the war has become clear.
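The list-experiment comparison rests on a difference-in-means estimator: the treatment group's list adds the sensitive item, so the difference in mean item counts between groups estimates the item's prevalence without any respondent having to answer it directly. A minimal sketch with simulated (not the study's) data:

```python
import numpy as np

def list_experiment_estimate(control_counts, treatment_counts):
    """Difference-in-means estimator for a list experiment: the
    treatment list contains the sensitive item, so the mean difference
    in reported item counts estimates the item's prevalence."""
    return float(np.mean(treatment_counts) - np.mean(control_counts))

# Toy data: control list has 4 innocuous items; the treatment list adds
# the sensitive item.  The 0.61 prevalence mirrors the abstract's list-
# experiment figure, but the data themselves are simulated.
rng = np.random.default_rng(0)
control = rng.binomial(4, 0.5, size=1500)         # counts of innocuous items
holds_view = rng.random(1500) < 0.61              # latent true support
treatment = rng.binomial(4, 0.5, size=1500) + holds_view

est = list_experiment_estimate(control, treatment)
print(round(est, 3))  # should be near 0.61, up to sampling error
```

The gap between this estimate and the direct-question share (71% in the study) is the estimated extent of preference falsification.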
•Simultaneous determination of chiral and achiral impurities of dapoxetine.•Polysaccharide chiral columns were screened for enantio- and chemoselectivity.•Reversed-phase method was developed with gradient elution and flow programming.•Validated method was applied to the analysis of pharmacy- and internet-acquired products.•Internet-acquired products contained racemic dapoxetine and/or a high concentration of the R-dapoxetine impurity.
A reversed-phase high performance liquid chromatographic method was developed and validated for the simultaneous determination of the related substances of S-dapoxetine, including R-dapoxetine, (3S)-3-(dimethylamino)-3-phenyl-1-propanol, S-3-amino-3-phenyl-1-propanol, 1-naphthol, 4-phenyl-2H,3H,4H-naphtho[1,2-b]pyran and 1-[(2E)-cinnamyloxy]naphthalene. During the screening experiments, seven different polysaccharide-type chiral stationary phases (amylose-based Lux-Amylose-1, Lux-i-Amylose-1 and Lux-Amylose-2, as well as cellulose-based Lux-Cellulose-1, Lux-Cellulose-2, Lux-Cellulose-3 and Lux-Cellulose-4) were tested in polar organic mode using a mobile phase consisting of 0.1% diethylamine in methanol, ethanol, 2-propanol and acetonitrile at a 0.5 mL min−1 flow rate and 20 °C. The best results were obtained on the Lux Cellulose-3 column with the ethanol-based mobile phase. To increase the retention factors of the two early-eluting impurities, water was added to the mobile phase. To counterbalance the increased total analysis time, a higher column temperature (40 °C) and gradient elution combined with flow programming were applied. Under the optimized conditions, baseline separations were achieved for all compounds within 30 min. The method was validated according to the International Council for Harmonisation guideline Q2(R1) and applied to the analysis of an approved tablet formulation and of dapoxetine-containing products sold on the internet. As expected, in the case of the pharmacy-acquired product, all of the monitored impurities were below 0.1%. However, interesting results were obtained when internet-acquired samples were analyzed: these tablets contained racemic dapoxetine and/or a high concentration of the R-dapoxetine impurity. Based on this work, polysaccharide-based chiral stationary phases can be successfully applied for the simultaneous determination of achiral and chiral impurities in reversed-phase mode using gradient elution and flow-rate programs.
The study further underlines the importance of not only achiral but also enantiomeric quality control whenever counterfeiting of a single-enantiomer agent is suspected.
Poor‐quality medicines present a serious public health problem, particularly in emerging economies and developing countries, and may have a significant impact on the national clinical and economic ...burden. Attention has largely focused on the increasing availability of deliberately falsified drugs, but substandard medicines are also reaching patients because of poor manufacturing and quality‐control practices in the production of genuine drugs (either branded or generic). Substandard medicines are widespread and represent a threat to health because they can inadvertently lead to healthcare failures, such as antibiotic resistance and the spread of disease within a community, as well as death or additional illness in individuals. This article reviews the different aspects of substandard drug formulation that can occur (for example, pharmacological variability between drug batches or between generic and originator drugs, incorrect drug quantity and presence of impurities). The possible means of addressing substandard manufacturing practices are also discussed. A concerted effort is required on the part of governments, drug manufacturers, charities and healthcare providers to ensure that only drugs of acceptable quality reach the patient.