This paper describes a low-cost noninvasive brain-computer interface (BCI) hybridized with eye tracking and discusses its feasibility through a Fitts' law-based quantitative evaluation method. Noninvasive BCI has recently received considerable attention. To bring BCI applications into real life, user-friendly and easily portable devices need to be provided. In this work, as an approach to realizing a real-world BCI, an electroencephalography (EEG)-based BCI combined with eye tracking is investigated. The two interfaces can complement each other to attain improved performance. To account for public availability in particular, a low-cost interface device is intentionally used for testing: a low-cost commercial EEG recording device is integrated with an inexpensive custom-built eye tracker. The developed hybrid interface is evaluated through target pointing and selection experiments. Eye movement is interpreted as cursor movement, and the noninvasive BCI selects a cursor point under two selection-confirmation schemes. Using Fitts' law, the proposed interface scheme is compared with other interface schemes such as a mouse, eye tracking with dwell time, and eye tracking with a keyboard. In addition, the proposed hybrid BCI system is discussed with respect to its practicality as an interface scheme. Although further advancement is required, the proposed hybrid BCI system has the potential to be practically useful in a natural and intuitive manner.
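Since the evaluation above rests on Fitts' law, the standard Shannon formulation of its metrics can be sketched as follows. This is a minimal illustration only; the target distance, width, and movement time below are hypothetical values, not data from the paper.

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput in bits/s: index of difficulty over observed movement time."""
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical trial: 256-px target distance, 32-px target width, 1.5 s to select.
id_bits = index_of_difficulty(256, 32)  # log2(256/32 + 1) = log2(9) ≈ 3.17 bits
tp = throughput(256, 32, 1.5)           # ≈ 2.11 bits/s
```

Comparing mean throughput across interface schemes (mouse, dwell-time eye tracking, hybrid BCI, etc.) over many such trials is the usual way a Fitts' law comparison like the one described is quantified.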
The “like” feature on Facebook has emerged as a commonly used paralinguistic tool for communicating, and its importance as an indication of positive feelings toward the posts of others is likely to increase. Comprehensive research is needed into why and how users are motivated toward ‘liking’ behavior, and whether this behavior generates an intention to continue using Facebook over time. This study combines uses and gratifications theory with a subjective-norm perspective to create an integrated model that predicts liking behavior and usage intentions on Facebook. The research model is tested with data collected from online users of Facebook, and the proposed model is supported by a measurement and structural model analysis based on empirical data collected from 267 Facebook users. The findings indicate that the most salient motivations for users' liking behavior are enjoyment, information seeking, social interaction, and subjective norms, and that these subsequently reinforce users' intention to continue using Facebook. The results also reveal that subjective norms contribute strongly to the prediction of liking behavior and continuous usage intention. The proposed research model contributes to global marketing research and information-technology service management by integrating personal and social motivators to understand the acceptance of social networking technologies by users in Asia. In particular, the outcomes stand to enhance the current state of knowledge of social networking site developers, managers, and organizations seeking to improve acceptance of their services or products, customer support, advertising, and/or product development. The present results lay the foundation for an integrated uses-and-gratifications and subjective-norms model that has important theoretical and practical implications and may guide future research efforts in this context.
Abnormal posterior vitreous detachment (PVD) is speculated to be an important mechanism in the development of the epiretinal membrane (ERM). However, there is only limited information about the molecular mechanism. Sphingosine-1-phosphate (S1P) is a mediator of the mechanosensitive response in several cell types that may play a role in the pathogenesis of ERM during abnormal PVD. Therefore, we evaluated the expression of S1P in human ERM and the role of S1P in cultured human Müller glial cells. Among 24 ERM specimens, seven (29.2%) exhibited S1P expression. Patients with secondary ERM or ellipsoid zone (EZ) defects, which suggest abnormal PVD, presented a significantly higher S1P+ cell density (secondary ERM: 128.20 ± 135.61 vs 9.68 ± 36.01 cells, p = 0.002; EZ defects: 87.56 ± 117.79 vs 2.80 ± 8.85 cells, p = 0.036). The addition of S1P increased the migratory ability and the expression of N-cadherin and alpha-SMA in human Müller glial cells, suggesting that S1P is a potential causative molecule in the development of ERM during abnormal PVD.
This study used geographic object-based image analysis (GEOBIA) with very high spatial resolution (VHR) aerial imagery (0.3 m spatial resolution) to classify vegetation, channel, and bare mud classes in a salt marsh. Three classification issues were investigated in the context of segmentation scale: (1) a comparison of single- and multi-scale GEOBIA using spectral bands, (2) the relative benefit of incorporating texture derived from the grey-level co-occurrence matrix (GLCM) in classifying the salt marsh features in single- and multi-scale GEOBIA, and (3) the effect of the quantization level of GLCM texture in the context of multi-scale GEOBIA. The single-scale GEOBIA experiments indicated that the optimal segmentation was both class and scale dependent. Therefore, the single-scale approach produced only a moderately accurate classification across all marsh classes. A multi-scale approach, however, facilitated the use of multiple scales, allowing the delineation of individual classes with increased between-class and reduced within-class spectral variation. With only spectral bands used, the multi-scale approach outperformed the single-scale GEOBIA with an overall accuracy of 82% vs. 76% (Kappa of 0.71 vs. 0.62). The study demonstrates the potential importance of ancillary data, GLCM texture, in compensating for limited between-class spectral discrimination. For example, gains in classification accuracy ranged from 3% to 12% when the GLCM mean texture was included in the multi-scale GEOBIA. The multi-scale classification overall accuracy varied with the quantization level of the GLCM texture matrix. A quantization level of 2 reduced misclassifications of channel and bare mud and generated a statistically higher classification accuracy than higher quantization levels. Overall, the multi-scale GEOBIA produced the highest classification accuracy.
The multi-scale GEOBIA is expected to be a useful methodology for creating a seamless spatial database of marsh landscape features to be used for further geographic information system (GIS) analyses.
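To illustrate the GLCM texture and quantization-level ideas discussed above, here is a minimal NumPy sketch that quantizes an image to a chosen number of grey levels, builds a symmetric normalized co-occurrence matrix for one pixel offset, and computes the GLCM mean. The example image and offset are hypothetical; the study's actual processing chain (eCognition-style segmentation, multiple offsets, etc.) is not reproduced here.

```python
import numpy as np

def quantize(image, levels):
    """Rescale an image into integer grey levels 0..levels-1."""
    lo, hi = float(image.min()), float(image.max())
    if hi == lo:  # constant image: everything maps to level 0
        return np.zeros(image.shape, dtype=int)
    return np.clip(((image - lo) / (hi - lo) * levels).astype(int), 0, levels - 1)

def glcm(image, levels, dx=1, dy=0):
    """Symmetric, normalized grey-level co-occurrence matrix for one offset."""
    q = quantize(image, levels)
    m = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            i, j = q[y, x], q[y + dy, x + dx]
            m[i, j] += 1
            m[j, i] += 1  # count both directions so the matrix is symmetric
    return m / m.sum()

def glcm_mean(p):
    """GLCM mean texture: expected grey level under the row marginal of p."""
    i = np.arange(p.shape[0])
    return float((i * p.sum(axis=1)).sum())
```

Rerunning `glcm(image, levels)` with `levels=2` vs. higher values is the quantization-level experiment in miniature: a coarser quantization pools spectrally similar pixels, which is consistent with the reported reduction in channel/bare-mud confusion at level 2.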
We focus on open‐domain question‐answering tasks that involve a chain of reasoning, which are primarily implemented using large language models. With an emphasis on cost‐effectiveness, we designed EffiChainQA, an architecture centered on the use of small language models. We employed a retrieval‐based language model to address the limitations of large language models, such as the hallucination issue and the lack of updated knowledge. To enhance reasoning capabilities, we introduced a question decomposer that leverages a generative language model and serves as a key component in the chain‐of‐reasoning process. To generate training data for our question decomposer, we leveraged ChatGPT, which is known for its data augmentation ability. Comprehensive experiments were conducted using the HotpotQA dataset. Our method outperformed several established approaches, including the Chain‐of‐Thoughts approach, which is based on large language models. Moreover, our results are on par with those of state‐of‐the‐art Retrieve‐then‐Read methods that utilize large language models.
The emergence of large language models (LLMs) has led to the development of improved question‐answering models. However, LLMs suffer from challenges such as hallucinations and outdated information. In response, researchers have developed a new open‐domain question‐answering model named “EffiChainQA.” This architecture uses a novel chain‐of‐reasoning pipeline relying on small language models, with an emphasis on cost‐effectiveness. The innovative algorithm can pave the way toward efficient, reliable, and transparent question‐answering models.
► A fast pre-processing step (quasi-interpolation and min/max cell construction) using OpenCL computing kernels.
► A fast, accurate, and stable evaluation of a spline field and its gradient.
► An efficient empty-space skipping.
► A novel indexing scheme that allows an FCC dataset to be stored compactly as a four-channel (RGBA) 3D texture.
This paper presents an efficient and accurate isosurface rendering algorithm for the natural C1 splines on the face-centered cubic (FCC) lattice. Leveraging fast and accurate evaluation of a spline field and its gradient, accompanied by efficient empty-space skipping, the approach generates high-quality isosurfaces of FCC datasets at interactive speed (20–70 fps). The pre-processing computation (quasi-interpolation and min/max cell construction) is accelerated 20–30-fold by OpenCL kernels. In addition, a novel indexing scheme is proposed that allows an FCC dataset to be stored as a four-channel (RGBA) 3D texture. Compared with other reconstruction schemes on the Cartesian and BCC (body-centered cubic) lattices, this method can be considered a practical reconstruction scheme that offers both quality and performance. The OpenCL and GLSL (OpenGL Shading Language) source code is provided as a reference.
• Sampling by Codex guidelines detects Cronobacter in a recalled PIF batch profile.
• Sampling would not reliably detect Cronobacter in a non-recalled batch profile.
• Sampling PIF with stratification is potentially more powerful than random sampling.
• Taking more samples, even if smaller, increases the power to detect contamination.
Cronobacter is a hazard in powdered infant formula (PIF) products that is hard to detect due to localized and low-level contamination. We adapted a previously published sampling simulation to PIF sampling and benchmarked industry-relevant sampling plans across different numbers of grabs, total sample masses, and sampling patterns. We evaluated the performance of these plans in detecting published Cronobacter contamination profiles for a recalled PIF batch (42% prevalence, −1.8 ± 0.7 log(CFU/g)) and a reference, non-recalled PIF batch (1% prevalence, −2.4 ± 0.8 log(CFU/g)). Simulating a range of numbers of grabs, n = 1–22,000 (the latter representing testing every finished package), with 300 g total composite mass showed that taking 30 or more grabs detected contamination reliably (<1% median probability of accepting the recalled batch). Benchmarking representative sampling plans (n = 30, m = 10 g; n = 30, m = 25 g; n = 60, m = 25 g; n = 180, m = 25 g, where m is the mass per grab) showed that all plans would reject the recalled batch (<1% median probability of acceptance) but would rarely reject the reference batch (>50% median probability of acceptance for all plans). Overall, (i) systematic or stratified random sampling patterns are equal to or more powerful than random sampling of the same sample size and total sampled mass, and (ii) taking more samples, even if smaller, can increase the power to detect contamination.
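The grab-sampling simulation described above can be sketched as a toy Monte Carlo in a few lines. The batch model (a 1-D array of cells), the cell count, and the Poisson detection rule below are simplifying assumptions chosen for illustration; they are not the authors' published simulation, and no acceptance probabilities should be read off from this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_batch(n_cells=10_000, prevalence=0.42, mean_log=-1.8, sd_log=0.7):
    """Batch as a 1-D array of cells; contaminated cells get lognormal CFU/g.

    Defaults echo the recalled-batch profile quoted in the abstract, but the
    spatial structure here (independent cells) is a simplification.
    """
    conc = np.zeros(n_cells)
    hot = rng.random(n_cells) < prevalence
    conc[hot] = 10.0 ** rng.normal(mean_log, sd_log, hot.sum())
    return conc

def detect(batch, n_grabs, grab_mass, pattern="random"):
    """True if any grab yields >= 1 CFU under a Poisson sampling model."""
    if pattern == "systematic":  # evenly spaced grabs across the batch
        idx = np.linspace(0, len(batch) - 1, n_grabs).astype(int)
    else:                        # simple random grabs
        idx = rng.integers(0, len(batch), n_grabs)
    cfu = rng.poisson(batch[idx] * grab_mass)
    return cfu.sum() > 0

def p_accept(n_grabs, grab_mass, pattern="random", reps=200):
    """Estimated probability that the plan ACCEPTS (fails to detect) a batch."""
    misses = sum(not detect(simulate_batch(), n_grabs, grab_mass, pattern)
                 for _ in range(reps))
    return misses / reps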
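placeholder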
Meteorological satellite images provide crucial information on solar irradiation and weather conditions at spatial and temporal resolutions which are ideal for short-term photovoltaic (PV) power ...forecasts. Following the introduction of next-generation meteorological satellites, investigating their application on PV forecasts has become imminent. In this study, Communications, Oceans, and Meteorological Satellite (COMS) and Himawari-8 (H8) satellite images were inputted in a deep neural network (DNN) model for 2 hour (h)- and 1 h-ahead PV forecasts. A one-year PV power dataset acquired from two solar power test sites in Korea was used to directly forecast PV power. H8 was used as a proxy for GEO-KOMPSAT-2A (GK2A), the next-generation satellite after COMS, considering their similar resolutions, overlapping geographic coverage, and data availability. In addition, two different data sampling setups were designed to implement the input dataset. The first setup sampled chronologically ordered data using a relatively more inclusive time frame (6 a.m. to 8 p.m. in local time) to create a two-month test dataset, whereas the second setup randomly sampled 25% of data from each month from the one-year input dataset. Regardless of the setup, the DNN model generated superior forecast performance, as indicated by the lowest normalized mean absolute error (NMAE) and normalized root mean squared error (NRMSE) results in comparison to that of the support vector machine (SVM) and artificial neural network (ANN) models. The first setup results revealed that the visible (VIS) band yielded lower NMAE and NRMSE values, while COMS was found to be more influential for 1 h-ahead forecasts. For the second setup, however, the difference in NMAE results between COMS and H8 was not significant enough to distinguish a clear edge in performance. 
Nevertheless, this marginal difference and similarity of the results suggest that both satellite datasets can be used effectively for direct short-term PV forecasts. Ultimately, the comparative study between satellite datasets as well as spectral bands, time frames, forecast horizons, and forecast models confirms the superiority of the DNN and offers insights on the potential of transitioning to applying GK2A for future PV forecasts.
Connected vehicles are at risk of exposing their location history when using potentially untrusted location-based services (LBSs) in the driving process. We propose a method called mutually ...obfuscating paths (MOP) that enables vehicles to provide highly accurate realtime location updates to LBS while preventing the LBS from tracking vehicles. The instrument is to leverage connected vehicles' two network interfaces: in-car LTE Internet (for accessing LBS) and car-to-car Dedicated short-range communications (DSRC)-communication (for obfuscating their paths). The main idea of MOP is that vehicles, when appropriate, generate made-up but plausible location updates for each other, making their paths continuously branching off from the LBS' viewpoint. We evaluations show that MOP provides strong privacy protection even under continuous and highly accurate location updates.