Crowdsourcing ideas from consumers can enrich idea input in new product development. After a decade of initiatives (e.g., Starbucks’ MyStarbucksIdea, Dell's IdeaStorm), the implications of crowdsourcing for idea generation are well understood, but challenges remain in dealing with the large volume of rapidly generated ideas produced in crowdsourcing communities. This study proposes a model that can assist managers in efficiently processing crowdsourced ideas by identifying the aspects of ideas that are most predictive of future implementation, and it identifies three sources of information available for an idea: its content, the contributor proposing it, and the crowd's feedback on the idea (the “3Cs”). These information sources differ in their time of availability (content and contributor information is available immediately; crowd feedback accumulates over time) and in the extent to which they comprise structured or unstructured data. This study draws from prior research to operationalize variables corresponding to the 3Cs and develops a new measure to quantify an idea's distinctiveness. Applying automated information retrieval methods (latent semantic indexing) and testing several linear methods (linear discriminant analysis, regularized logistic regression) and nonlinear machine‐learning algorithms (stochastic adaptive boosting, random forests), this article identifies the variables that are most useful for predicting idea implementation in a crowdsourcing community for an IT product (Mendeley). Our results indicate that considering content and contributor information improves ranking performance by 22.6% to 26.0% over random idea selection, and that adding crowd‐related information further improves performance by up to 48.1%. Crowd feedback is the best predictor of idea implementation, followed by idea content and distinctiveness, and the contributor's past idea‐generation experience.
Firms are advised to implement two idea selection support systems: one that ranks new ideas in real time based on content and contributor experience, and another that integrates the crowd's idea evaluation after the crowd has had sufficient time to provide feedback.
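The first of the two recommended systems can be pictured as a simple scoring pipeline. The sketch below is illustrative only, not the study's exact implementation: it combines latent semantic indexing (TF-IDF followed by truncated SVD) over idea text with a crowd-feedback feature (vote counts) and a random forest, two of the method families the article compares. All idea texts, labels, and vote counts are synthetic.

```python
# Illustrative sketch (not the paper's pipeline): rank crowdsourced ideas
# by predicted implementation probability using LSI features plus crowd
# feedback. All data below is synthetic.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline

ideas = [
    "add dark mode to the reader",       # idea content (text)
    "sync annotations across devices",
    "free coffee for all users",
    "export citations to BibTeX",
]
implemented = [1, 1, 0, 1]               # historical implementation labels
votes = np.array([[120], [95], [3], [60]])  # crowd feedback (vote counts)

# LSI: project TF-IDF vectors onto a low-rank semantic space.
lsi = make_pipeline(TfidfVectorizer(), TruncatedSVD(n_components=2, random_state=0))
X_content = lsi.fit_transform(ideas)
X = np.hstack([X_content, votes])        # content + crowd features

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, implemented)
scores = clf.predict_proba(X)[:, 1]      # rank ideas by this score
ranking = np.argsort(-scores)            # indices from most to least promising
```

In practice the model would be trained on past ideas and applied to new ones; a real-time variant would simply drop the `votes` column until crowd feedback has accumulated.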
The present investigation reports a simple and responsive strategy for identifying atrazine under several conditions using an electrochemical method. CdS quantum dots were synthesized via a greener approach, and their formation was confirmed by numerous characterization techniques, including FTIR, SEM, Raman, UV-Vis, and XRD. Owing to their excellent electrocatalytic behavior, the green CdS quantum dots (QDs), with a crystallite size of ∼2 nm, were chosen as the sensor material and incorporated onto the surface of a bare carbon paste electrode. The developed sensor demonstrated impressive performance for atrazine sensing, with superior selectivity and sensitivity. A lower limit of detection (LLOD) of 0.53 μM was attained over a linear concentration range of 10–100 μM. Furthermore, the practical applicability of the developed sensor was examined in distilled water, wastewater, and fresh liquid milk, yielding excellent recovery of atrazine (91.33–99.8%).
Widespread microplastic pollution is raising growing concerns as to its detrimental effects upon living organisms. A realistic risk assessment must stand on representative data on the abundance, size distribution and chemical composition of microplastics. Raman microscopy is an indispensable tool for the analysis of very small microplastics (<20 μm). Still, its use is far from widespread, in part due to drawbacks such as long measurement time and proneness to spectral distortion induced by fluorescence. This review discusses each drawback followed by a showcase of interesting and easily available solutions that contribute to faster and better identification of microplastics using Raman spectroscopy. Among discussed topics are: enhanced signal quality with better detectors and spectrum processing; automated particle selection for faster Raman mapping; comprehensive reference libraries for successful spectral matching. A last section introduces non-conventional Raman techniques (non-linear Raman, hyperspectral imaging, standoff Raman) which permit more advanced applications such as real-time Raman detection and imaging of microplastics.
•Raman is the method of choice for identifying small microplastics (<20 μm).
•Automated mapping routines and library matching allow fast microplastic detection.
•Nonlinear Raman techniques enable real-time monitoring of microplastics.
•Ignition delay times of ammonia/hydrogen blends were measured in a shock tube.
•Seven available kinetic models were validated against the experimental data.
•Hydrogen addition can reduce the ignition delay time of ammonia nonlinearly.
•NH3 + H <=> NH2 + H2 proceeds in the reverse direction after 5% hydrogen addition.
•Hydrogen addition promotes the accumulation of H, O, and OH radicals.
To understand the effect of hydrogen addition on the auto-ignition of ammonia at high temperatures, ignition delay times of stoichiometric ammonia/hydrogen blends were measured in a shock tube at temperatures from 1020 to 1945 K, pressures of 1.2 and 10 atm, and hydrogen fractions from 0% to 70%. The measured ignition delay times were compared with seven available kinetic models. Chemical kinetic analyses were performed using both the Glarborg Model and Otomo Model to interpret the interactions between ammonia and hydrogen during high-temperature auto-ignition. Experimental results show that ammonia ignites slower than hydrogen, and hydrogen addition can reduce the ignition delay time nonlinearly. The numerical analysis identifies that 5% hydrogen addition does not significantly affect the reaction flux of ammonia at 20% fuel consumption; however, it makes the fuel H-atom abstraction reaction NH3 + H <=> NH2 + H2 proceed in the reverse direction at the initial stage of ignition, thereby generating active H radicals for further chain branching and promoting the reactivity.
Political polarization on the digital sphere poses a real challenge to many democracies around the world. Although the issue has received some scholarly attention, there is a need to improve the conceptual precision in the increasingly blurry debate. The use of computational communication science approaches allows us to track political conversations in a fine-grained manner within their natural settings - the realm of interactive social media. The present study combines different algorithmic approaches to studying social media data in order to capture both the interactional structure and content of dynamic political talk online. We conducted an analysis of political polarization across social media platforms (analyzing Facebook, Twitter, and WhatsApp) over 16 months, with close to a quarter million online contributions regarding a political controversy in Israel. Our comprehensive measurement of interactive political talk enables us to address three key aspects of political polarization: (1) interactional polarization - homophilic versus heterophilic user interactions; (2) positional polarization - the positions expressed, and (3) affective polarization - the emotions and attitudes expressed. Our findings indicate that political polarization on social media cannot be conceptualized as a unified phenomenon, as there are significant cross-platform differences. While interactions on Twitter largely conform to established expectations (homophilic interaction patterns, aggravating positional polarization, pronounced inter-group hostility), on WhatsApp, de-polarization occurred over time. Surprisingly, Facebook was found to be the least homophilic platform in terms of interactions, positions, and emotions expressed. Our analysis points to key conceptual distinctions and raises important questions about the drivers and dynamics of political polarization online.
Because microstructure plays an important role in the mechanical properties of structural materials, developing the capability to quantify microstructures rapidly is important to enabling high‐throughput screening of structural materials. Electron backscatter diffraction (EBSD) is a common method for studying microstructures and extracting information such as grain size distributions (GSDs), but is not particularly fast and thus could be a bottleneck in high‐throughput systems. One approach to accelerating EBSD is to reduce the number of points that must be scanned. In this work, we describe an iterative method for reducing the number of scan points needed to measure GSDs using incremental low‐discrepancy sampling, including on‐the‐fly grain size calculations and a convergence test for the resulting GSD based on the Kolmogorov–Smirnov test. We demonstrate this method on five real EBSD maps collected from magnesium AZ31B specimens and compare the effectiveness of sampling according to two different low‐discrepancy sequences, the Sobol and R2 sequences, and random sampling. We find that R2 sampling is able to produce GSDs that are statistically very similar to the GSDs of the full‐density grids using, on average, only 52% of the total scan points. For EBSD maps that contained monodisperse GSDs and over 1000 grains, R2 sampling only required an average of 39% of the total EBSD points.
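The core ingredients of the approach described above - sampling scan points along the R2 low-discrepancy sequence and testing GSD convergence with a Kolmogorov-Smirnov statistic - can be sketched as follows. This is a toy illustration under stated assumptions: a synthetic grain-ID grid stands in for a real EBSD map, grain "size" is counted in pixels, and `scipy.stats.ks_2samp` stands in for the paper's convergence test; none of the names below come from the original work.

```python
# Toy sketch: R2 low-discrepancy sampling of a synthetic "EBSD map"
# and a KS comparison of the sampled vs. full grain size distribution.
import numpy as np
from scipy.stats import ks_2samp

def r2_points(n):
    """First n points of the R2 low-discrepancy sequence in [0, 1)^2."""
    g = 1.32471795724474602596          # plastic constant, root of x^3 = x + 1
    alpha = np.array([1.0 / g, 1.0 / (g * g)])
    i = np.arange(1, n + 1).reshape(-1, 1)
    return (i * alpha) % 1.0

# Synthetic 100x100 map: each pixel stores a grain ID (50 grains).
rng = np.random.default_rng(0)
grain_ids = rng.integers(0, 50, size=(100, 100))

# Full-density grain sizes (pixel counts per grain).
full_sizes = np.bincount(grain_ids.ravel(), minlength=50)

# Sample 52% of the points along the R2 sequence, estimate sizes from
# sample counts, and rescale by the sampling fraction.
pts = (r2_points(5200) * 100).astype(int)
sampled = grain_ids[pts[:, 0], pts[:, 1]]
sampled_sizes = np.bincount(sampled, minlength=50) / 0.52

# Convergence check: large p-value means the sampled GSD is
# statistically indistinguishable from the full-density GSD.
stat, p = ks_2samp(full_sizes, sampled_sizes)
```

In the incremental method described in the abstract, points would be added batch by batch along the sequence, with the KS test repeated until the GSD stops changing.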
Evolutionary algorithms (EAs) have found many successful real-world applications, where the optimization problems are often subject to a wide range of uncertainties. To understand the practical behaviors of EAs theoretically, there are a series of efforts devoted to analyzing the running time of EAs for optimization under uncertainties. Existing studies mainly focus on noisy and dynamic optimization, while another common type of uncertain optimization, i.e., robust optimization, has been rarely touched. In this paper, we analyze the expected running time of the (1+1)-EA solving robust linear optimization problems (i.e., linear problems under robust scenarios) with a cardinality constraint k. Two common robust scenarios, i.e., deletion-robust and worst-case, are considered. Particularly, we derive tight ranges of the robust parameter d or budget k allowing the (1+1)-EA to find an optimal solution in polynomial running time, which disclose the potential of EAs for robust optimization.
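For readers unfamiliar with the algorithm being analyzed, a minimal (1+1)-EA on a cardinality-constrained linear function looks like the sketch below. This is a generic illustration, not the paper's robust scenarios or runtime bounds: it uses standard bit-flip mutation, elitist acceptance, and simple rejection of infeasible offspring, all on made-up weights.

```python
# Minimal (1+1)-EA sketch: maximize a linear function sum(w_i * x_i)
# subject to a cardinality constraint |x| <= k. Illustrative only.
import random

def one_plus_one_ea(weights, k, max_iters=100_000, seed=0):
    rng = random.Random(seed)
    n = len(weights)
    x = [0] * n                                  # start from the empty set (feasible)
    f = lambda s: sum(w for w, b in zip(weights, s) if b)
    fx = f(x)
    for _ in range(max_iters):
        # Standard bit-flip mutation: flip each bit independently w.p. 1/n.
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        if sum(y) <= k:                          # reject infeasible offspring
            fy = f(y)
            if fy >= fx:                         # elitist acceptance
                x, fx = y, fy
    return x, fx

weights = [5, 4, 3, 2, 1]
x, fx = one_plus_one_ea(weights, k=2)            # optimum picks weights 5 and 4
```

The running-time analyses in the paper bound how many such mutation steps are needed, in expectation, before the optimum is reached under the deletion-robust and worst-case objective variants.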
The Time-Sensitive Networking (TSN) set of standards introduces in IEEE 802.1 switches and end stations novel features to meet the requirements of a broad spectrum of applications that are characterized by time-sensitive and mission-critical traffic flows. In particular, the IEEE 802.1Qbv-2015 amendment introduces enhancements that provide temporal isolation for scheduled traffic, i.e., a traffic class that requires transmission based on a known timescale, while the IEEE 802.1Qbu-2016 amendment introduces preemption as a mechanism to allow time-critical messages to interrupt ongoing non-time-critical transmissions. Both amendments, which are now incorporated into the IEEE 802.1Q-2018 standard, are very important for industrial networks, where scheduled traffic and low-latency real-time flows have to coexist, on the same network, with best-effort transmissions.
In this context, this work presents a response time analysis of TSN networks that encompasses the enhancements for scheduled traffic and preemption, in various combinations. The paper presents the proposed analysis and a performance comparison between the response times calculated by the analysis and the response times obtained through OMNeT++ simulations in three different scenarios.
•Present a schedulability analysis of TSN networks with gate mechanism and preemption.
•Consider various combinations of TSN features in the schedulability analysis.
•Develop a simulation tool in OMNeT++ to evaluate TSN networks with various features.
•Compare the analysis and simulation results in three realistic industrial use cases.
•Discuss improvement points for the proposed analysis and directions for future work.
Measuring the average time that a process takes from start to finish, using observation time windows (OTWs) of different length, is required for numerous operational monitoring and control processes. We refer to this measure as the Mean Lead‐Time (MLT). This study is based on the fact that computing the MLT in an operational context is often misleading for two reasons: (1) the computed value directly depends on the length of the used OTW; and (2) some jobs are usually still running at the end of the OTW, and thus their final lead‐time is not known yet. To overcome these issues, we revisit the definition of the MLT as well as the way to measure it. We develop a method to take these two aspects into account and apply this method to four real‐life cases in different business contexts. Using these practical cases, we show that the proposed methodology makes it possible to compute a standardized measure of the MLT, allowing for a meaningful comparison when different OTW lengths are considered. These new results open the door for an efficient use of the MLT as a commensurable performance indicator and allow near real‐time monitoring and comparison across different process steps, departments, and factories.
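The two sources of bias named above - OTW-length dependence and censored (still-running) jobs - are easy to demonstrate numerically. The toy example below uses synthetic job data and a deliberately naive estimator, not the paper's corrected methodology: jobs that have not finished by the end of the window are simply dropped, so a short window systematically favors short jobs.

```python
# Toy illustration of OTW-dependent bias in a naive Mean Lead-Time (MLT).
from statistics import mean

# (start_time, duration) per job; the true MLT is 25.
jobs = [(0, 10), (5, 40), (10, 10), (20, 40)]
true_mlt = mean(d for _, d in jobs)

def naive_mlt(jobs, otw_end):
    """Average lead-time over jobs that *finished* inside the OTW [0, otw_end].
    Jobs still running at otw_end are censored and silently dropped."""
    done = [d for s, d in jobs if s + d <= otw_end]
    return mean(done) if done else None

short_window = naive_mlt(jobs, otw_end=25)   # only the two 10-unit jobs finished
long_window = naive_mlt(jobs, otw_end=60)    # all four jobs finished
```

With the short window the estimate collapses to 10 (only short jobs completed), while the long window recovers the true value of 25 - exactly the distortion the proposed standardized measure is designed to remove.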
Environmental safety has become a significant issue for the safety of living species, humans, and the ecosystem as a consequence of the harmful and detrimental effects of various pollutants, such as pesticides, heavy metals, and dyes, emitted into the surroundings. Various efforts, legal acts, and scientific and technological perspectives have been embraced to resolve this issue, but it remains a global concern. Furthermore, real-time analysis of these environmental contaminants has been limited by the non-portability, complex detection procedures, and unsuitability for on-site recognition of sophisticated laboratory tools. As a result of innovative nano-bioconjugation and nanofabrication techniques, nanotechnology has enabled enhanced nanomaterial (NM)-based (bio)sensors that demonstrate ultra-sensitivity and short detection times in real-time analysis, as well as superior reliability and selectivity. Several researchers have demonstrated potent detection of pollutants such as the Hg2+ ion using AgNP-MD in electronic and optoelectronic methods, with a detection limit of 5–45 μM, which is quite significant. In light of such research, the authors highlight 21st-century strategies for NM-based biosensor technology for pollutant detection, including nanobiosensors, enzyme-based biosensors, electrochemical biosensors, carbon-based biosensors, and optical biosensors for on-site identification and detection of target analytes. This article provides a brief overview of the significance of utilizing NM-based biosensors for the detection of a diverse array of hazardous pollutants and a thorough understanding of their detection processes, as well as the limit of quantification (LOQ) and limit of detection (LOD) values, enabling researchers to focus on the world's need for a sustainable earth.
•A recent analysis on the rapid detection of pollutants has been presented.
•Nanomaterials-based biosensors play a crucial role in the detection of pollutants.
•Biosensors offer a viable alternative for establishing effective analytical methods.
•Nanomaterials-based biosensors exhibit enhanced sensitivity and reliability.
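The LOD and LOQ figures of merit discussed in the review are conventionally estimated from the blank noise and the calibration slope via the standard ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S. The numbers below are purely illustrative, not values from any sensor in the review.

```python
# Standard ICH estimates of detection limits from a calibration curve.
sigma = 0.02   # std. dev. of the blank (or low-concentration) response
slope = 0.5    # calibration-curve slope S (signal per unit concentration)

lod = 3.3 * sigma / slope    # limit of detection
loq = 10.0 * sigma / slope   # limit of quantification
```

Here LOD and LOQ come out in the same concentration units used to build the calibration curve (e.g., μM if the slope is signal per μM).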