The metabolome of an organism depends on environmental factors and intracellular regulation and provides information about its physiological condition. Metabolomics helps to understand disease progression in clinical settings or to estimate metabolite overproduction for metabolic engineering. The most popular analytical platform in metabolomics is mass spectrometry (MS). However, MS metabolome data analysis is complicated, since metabolites interact nonlinearly and the data structures themselves are complex. Machine learning methods have become immensely popular for statistical analysis because of their inherently nonlinear data representation and their ability to process large and heterogeneous data rapidly. In this review, we address recent developments in using machine learning for processing MS spectra and show how machine learning generates new biological insights. In particular, supervised machine learning has great potential in metabolomics research because of its ability to supply quantitative predictions. We review commonly used tools, such as random forests, support vector machines, artificial neural networks, and genetic algorithms. During processing, supervised machine learning methods assist with peak picking, normalization, and missing-data imputation. For knowledge-driven analysis, machine learning contributes to biomarker detection, classification and regression, biochemical pathway identification, and carbon flux determination. Of particular relevance is the combination of different omics data to identify the contributions of the various regulatory levels. Our overview of recent publications also highlights that data quality determines analysis quality, which adds to the challenge of choosing the right model for the data. Machine learning methods applied to MS-based metabolomics ease data analysis and can support clinical decisions, guide metabolic engineering, and stimulate fundamental biological discoveries.
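To make the missing-data imputation step above concrete, here is a minimal sketch of a k-nearest-neighbour imputer for a small peak-intensity matrix (samples × features, `None` marking missing peaks). The function name, matrix layout, and distance measure are illustrative choices, not taken from any specific tool reviewed here.

```python
import math

def knn_impute(matrix, k=2):
    """Fill missing peak intensities (None) with the mean value of the
    k most similar samples, compared over features observed in both."""
    imputed = [row[:] for row in matrix]
    for i, row in enumerate(matrix):
        for j, val in enumerate(row):
            if val is not None:
                continue
            candidates = []
            for o, other in enumerate(matrix):
                if o == i or other[j] is None:
                    continue
                shared = [(a, b) for a, b in zip(row, other)
                          if a is not None and b is not None]
                if shared:
                    dist = math.sqrt(sum((a - b) ** 2 for a, b in shared))
                    candidates.append((dist, other[j]))
            candidates.sort(key=lambda c: c[0])
            neighbours = [v for _, v in candidates[:k]]
            if neighbours:
                imputed[i][j] = sum(neighbours) / len(neighbours)
    return imputed
```

A production pipeline would normalize intensities first and rely on an established implementation rather than this sketch.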
The cause of the Jones–Ray effect has been controversially debated for years. Ultrafine gas bubbles were employed to lessen the surface excess of surface-active impurities adsorbing to the air/water interface of the salt solutions, which would lead to a direct shift in surface tension observable by the Wilhelmy plate method. This study concluded that once the surface excess of the inevitable impurities in the salts is lessened by the introduction of ultrafine gas bubbles, which possess a large air/water interfacial area, the Jones–Ray effect becomes unobservable. Our findings therefore suggest that the Jones–Ray effect might not originate from the salts themselves.
The application of ultrafine bubbles as drug carriers in drug delivery is still at a developmental stage; it is important to obtain a thorough understanding of the factors affecting the formation and stability of drug-carrier matrices. In this study, polyethylene glycol (PEG)-conjugated human serum albumin (HSA)-based ultrafine bubbles, prepared at the physiological electrolyte concentration of human blood (154 mM), were investigated for quercetin delivery. Optical absorbance measurements, surface tension measurements, and fluorescence laser imaging were employed to assess the suitability of polymer-conjugated, albumin-stabilized ultrafine bubbles for drug loading and release. The incorporation of PEG/HSA into the system significantly enhanced the matrix's stability, as confirmed by surface tension measurements, with drug-loading efficiency reaching approximately 90%. In addition, in vitro drug release was performed by the application of high-frequency ultrasound: more than 40% of the loaded quercetin was liberated within 5 minutes of exposure.
Ultrafine bubbles stabilized by polyethylene glycol-conjugated human serum albumin improve both the stability of the complex and the drug payload. Polyethylene glycol plays a crucial role in releasing the drug under acoustic stimulation.
Among the flooding-based time synchronization protocols in wireless sensor networks, rapid-flooding outperforms slow-flooding in terms of synchronization speed. However, it is sometimes unrealistic to apply rapid-flooding in dense or low-duty-cycle networks because of its operational constraint that timing messages must be forwarded immediately after being received. Slow-flooding protocols are free from this constraint, but they have a fundamental limitation: their convergence time increases with the size of the network. In this letter, we prove that this limitation can be overcome by sharing the compensated skew and timing information with the next-hop nodes so that they can all be synchronized in the same round. The evaluation results demonstrate that the slow-flooding-based mechanism achieves a convergence speed comparable to that of rapid-flooding protocols and provides better accuracy for network-wide and per-hop synchronization.
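The idea of sharing compensated skew and timing with the next hop can be illustrated with a toy chain simulation. All names here, the known per-link delay, and the clock model `local = (1 + skew) * true_time` are assumptions made for this sketch, not the letter's actual protocol.

```python
class Node:
    """A sensor node with a skewed hardware clock (simulation only)."""

    def __init__(self, skew):
        self.skew = skew            # estimated relative clock-rate error
        self.sync_local = 0.0       # local clock value at the last sync
        self.sync_global = 0.0      # global-time estimate at the last sync

    def local_clock(self, true_time):
        return (1.0 + self.skew) * true_time

    def on_sync(self, global_estimate, true_time):
        self.sync_local = self.local_clock(true_time)
        self.sync_global = global_estimate

    def logical_clock(self, true_time):
        # compensate drift since the last sync using the estimated skew
        elapsed_local = self.local_clock(true_time) - self.sync_local
        return self.sync_global + elapsed_local / (1.0 + self.skew)


def flood_one_round(nodes, ref_time, link_delay):
    """Every hop forwards the delay-corrected global estimate at once,
    so the whole chain is synchronized within a single flooding round."""
    estimate, t = ref_time, ref_time
    for node in nodes:
        t += link_delay            # message travels to the next hop
        estimate += link_delay     # receiver corrects for the known delay
        node.on_sync(estimate, t)
```

After one round, every node's skew-compensated logical clock tracks the reference regardless of its hop distance.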
In the absence of rapid on‐site evaluation (ROSE), it is not clear which method of tissue preparation is best to process tissue obtained from EUS guidance. Cytological smearing (CS), cell block (CB), and direct histology (DH) are the available techniques.
Aim
To compare the diagnostic yield of three techniques of tissue preparation for EUS‐guided tissue acquisition without ROSE.
Methods
Patients who were referred for EUS‐FNA of peri‐gastrointestinal masses were recruited. Without ROSE, each lesion was biopsied with three needle passes, and the order in which tissue was prepared was randomized to (i) CS + CB, (ii) CB only, or (iii) DH only. The prepared specimens were reviewed.
Results
A total of 243 specimens were taken from 81 patients. Tissue diagnosis was achieved in 78/81 (96.3%) of patients, including 63 neoplasms (PDAC, n = 45; pancreatic neuroendocrine tumors [PNET], n = 4; cholangiocarcinoma, n = 5; metastatic disease, n = 4; lymphoma, n = 1; linitis plastica, n = 2; leiomyoma, n = 2) and 15 benign pathologies (chronic pancreatitis, n = 8; reactive nodes, n = 5; inflammatory biliary stricture, n = 1; pancreatic rest, n = 1). The three non‐diagnostic cases were found to be PDAC (n = 2) and PNET (n = 1). Sensitivity and diagnostic accuracy were highest with DH (94% and 95%), significantly better than with CS + CB (43% and 54%; P = 0.0001) and CB‐only preparations (32% and 48.6%; P < 0.0001). There was no significant difference between the CS + CB and CB‐only arms (P > 0.22).
Conclusion
Without ROSE, our findings suggest that with just a single pass, DH should be the tissue preparation method of choice given its significantly higher diagnostic accuracy compared with CS and/or CB techniques.
Fog computing is a promising paradigm aimed at reducing latency and network traffic between end devices and cloud servers. In fog computing systems, a fog node can be overloaded by an increased number of requests from end devices. In this situation, an efficient offloading mechanism among fog nodes is necessary to meet the requirements of fog computing. Software-Defined Networking (SDN) has become the de facto standard for future network infrastructures, including fog computing systems. In this paper, we propose a dynamic offloading service among fog nodes for an SDN-based fog computing system that selects an optimal offloading node and establishes the offloading path with an end-to-end bandwidth guarantee based on SDN technology. The proposed solution outperforms traditional approaches without SDN technology in terms of both throughput and request response time by selecting the appropriate offloading node using current computational and network resource information and by providing a reliable end-to-end path.
•A dynamic fog-to-fog offloading mechanism in SDN-based fog computing systems is proposed.
•Overloaded nodes are detected by monitoring fog node status in real time.
•Optimal offloading nodes and end-to-end routing paths between overloaded and offloading nodes are chosen.
•The proposed solution outperforms traditional approaches without SDN in both throughput and request response time.
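A minimal sketch of the node-selection idea: rank candidate fog nodes by spare CPU and available end-to-end bandwidth. The field names, the 0.8 overload threshold, the equal weights, and the 100 Mbps normalizer are assumptions for illustration, not parameters from the paper.

```python
def select_offload_node(candidates, cpu_limit=0.8):
    """Return the name of the best offloading target, or None if every
    candidate is itself overloaded."""
    eligible = [n for n in candidates if n["cpu_load"] < cpu_limit]
    if not eligible:
        return None

    def score(n):
        spare_cpu = 1.0 - n["cpu_load"]
        bandwidth = min(n["bandwidth_mbps"] / 100.0, 1.0)
        return 0.5 * spare_cpu + 0.5 * bandwidth

    return max(eligible, key=score)["name"]
```

In the proposed system, such status information would come from SDN controller monitoring rather than a static list.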
The smart home is one of the most promising applications of the Internet of Things. Although this technology has been studied in recent years, the adoption rate of smart homes remains low. One of the largest barriers is technological fragmentation within the smart home ecosystem. Currently, many protocols are used in a connected home, increasing consumer confusion when choosing a product. One possible solution to this fragmentation is a gateway that handles the diverse protocols as a central hub in the home. However, this solution brings about another issue for manufacturers: compatibility. Because of the variety of smart devices on the market, supporting all possible devices in one gateway is an enormous challenge. In this paper, we propose a software architecture for a gateway in a smart home system that solves the compatibility problem. By creating a mechanism to dynamically download and update a device profile from a server, the gateway can easily handle new devices. Moreover, the proposed gateway supports unified control over heterogeneous networks. We implemented a prototype to prove the feasibility of the proposed gateway architecture and evaluated its performance in terms of message execution time over heterogeneous networks, the latency of device profile downloads and updates, and the overhead of handling unknown commands.
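The download-on-demand mechanism can be sketched as a small cache that fetches a device profile the first time an unknown device type appears. The class, the profile layout, and the injected fetch function are hypothetical, not the paper's actual interfaces.

```python
class GatewayProfileCache:
    """Resolve device commands via profiles fetched on demand."""

    def __init__(self, fetch_profile):
        self._fetch = fetch_profile     # in a real gateway: an HTTPS call
        self._profiles = {}

    def handle_command(self, device_type, command):
        if device_type not in self._profiles:
            # unknown device: download and cache its profile
            self._profiles[device_type] = self._fetch(device_type)
        profile = self._profiles[device_type]
        if command not in profile["commands"]:
            raise ValueError(f"unsupported command: {command}")
        return profile["commands"][command]   # protocol-level payload
```

Keeping the fetcher injectable lets the same cache front different protocol back-ends on heterogeneous networks.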
An autonomous robot with a limited vision range must find a path to a goal in an unknown 2D environment while avoiding polygonal obstacles. In the process of discovering the environmental map, the robot has to return to positions marked previously; the regions the robot traverses to reach such a position are defined as sequences of bundles of line segments. This paper presents a novel algorithm for finding approximately shortest paths along sequences of bundles of line segments based on the method of multiple shooting. Three factors of the approach are presented: bundle partition, the collinear condition, and the update of shooting points. We then show that if the collinear condition holds, the exact shortest path of the problem is determined; otherwise, the sequence of path lengths obtained by updating the shooting points converges. The algorithm is implemented in Python, and numerical examples show that path-planning for autonomous robots using our method runs faster than the rubber-band technique of Li and Klette in Euclidean Shortest Paths, Springer, 53–89 (2011).
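Two of the ingredients above, the collinear condition and the shooting-point update, can be sketched in a few lines of 2D geometry. The function names and the clamped-intersection update rule are illustrative simplifications, not the paper's exact algorithm.

```python
def cross(o, p, q):
    """2D cross product of vectors o->p and o->q."""
    return (p[0] - o[0]) * (q[1] - o[1]) - (p[1] - o[1]) * (q[0] - o[0])

def is_collinear(a, p, b, eps=1e-9):
    # collinear condition: a, p, b on one line, so the path a-p-b is straight
    return abs(cross(a, p, b)) < eps

def update_shooting_point(a, b, s, e):
    """Move the shooting point on segment s-e toward the crossing point
    of the straight line a-b, clamped so it stays on the segment."""
    d1 = (b[0] - a[0], b[1] - a[1])
    d2 = (e[0] - s[0], e[1] - s[1])
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        return s                          # parallel: keep current endpoint
    w = (s[0] - a[0], s[1] - a[1])
    t = (w[0] * d1[1] - w[1] * d1[0]) / denom
    t = max(0.0, min(1.0, t))
    return (s[0] + t * d2[0], s[1] + t * d2[1])
```

When the update leaves every shooting point fixed, the collinear condition holds at each of them and the concatenated path is straight wherever the geometry allows.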
Kubernetes (K8s) is expected to be a key container orchestration tool for edge computing infrastructures owing to its various features for supporting container deployment and dynamic resource management. For example, its horizontal pod autoscaling feature provides service availability and scalability by increasing the number of replicas, and kube-proxy provides load-balancing between replicas by distributing client requests equally to all pods (replicas) of an application in a K8s cluster. However, this approach can result in long delays when requests are forwarded to remote workers, especially in edge computing environments where worker nodes are geographically dispersed. Moreover, if the receiving worker is overloaded, the request-processing delay can increase significantly. To overcome these limitations, this paper proposes an enhanced load balancer called resource adaptive proxy (RAP). RAP periodically monitors the resource status of each pod and the network status among worker nodes to aid load-balancing decisions. Furthermore, it preferentially handles requests locally to the maximum extent possible. If the local worker node is overloaded, RAP forwards its requests to the best node in the cluster while considering resource availability. Our experimental results demonstrate that RAP significantly improves throughput and reduces request latency compared with the default load-balancing mechanism of K8s.
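RAP's local-first policy can be sketched as follows; the dictionary fields, the 0.8 CPU threshold, and the latency weighting are assumptions for illustration, not RAP's actual heuristics.

```python
def rap_route(local, peers, cpu_threshold=0.8):
    """Serve the request on the local worker unless it is overloaded;
    otherwise pick the peer with the best load/latency trade-off."""
    if local["cpu"] < cpu_threshold:
        return local["name"]            # local-first: no forwarding needed
    eligible = [p for p in peers if p["cpu"] < cpu_threshold]
    if not eligible:
        return local["name"]            # nowhere better to forward
    # lower CPU load and lower round-trip time are both preferred
    return min(eligible, key=lambda p: p["cpu"] + p["rtt_ms"] / 100.0)["name"]
```

In a real cluster, the inputs would come from the periodic pod and network monitoring the paper describes.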
•Optimization of building operating costs & facilitation of wind energy from the smart grid.
•Optimal battery schedule determined for a building with an integrated microgrid.
•Trade-off analysis between building operating costs and wind energy facilitation.
•Small increases to building costs led to large increases in wind energy facilitation.
The aim of this paper was to investigate the trade-offs that can be achieved between optimizing the electricity costs of a building-integrated microgrid and simultaneously facilitating high levels of wind penetration in a smart grid. This study applied multi-objective optimization to obtain a daily charge and discharge schedule for a battery bank, which stored electricity from the microgrid and smart grid and could also supply electricity to the building and smart grid. Multi-objective optimization was employed because of the two independent objectives: minimizing building operating cost and maximizing the facilitation of wind energy from the smart grid. The trade-offs between the two objectives were simulated, evaluated, and analyzed. A priority weighting factor (α) was applied to each objective to vary the importance of each objective relative to the other in an inversely proportional manner. This enabled the algorithm to optimize the battery operating schedule for the economic performance of the microgrid, for the facilitation of wind generation from the smart grid, or for trade-offs in between. The results present a comprehensive evaluation of 96 scenarios with varying daily weather conditions, building electricity demand, electricity pricing, microgrid output, and wind penetration from the smart grid. The multi-objective optimization approach was then applied to each of the 96 scenarios with 11 α values to determine optimal trade-offs. Generally, for the 96 scenarios analyzed, when the α value was 20% or higher, the amount of extra wind generation facilitation obtained was negligible while microgrid operating costs continued to increase.
The results showed that when changing from an α value of 0% to an α value of 20%, there was a large increase in wind generation facilitation compared to the corresponding increase in cost, with wind generation facilitation increasing from its minimum value to within 89% of its maximum value (10.7% to 14.3% of facilitated wind generation). The corresponding building cost increased from its minimum value to within 13% of its maximum value (€1.14/day to €1.37/day). This produced a cost of approximately €0.06 for every 1% increase in wind generation facilitation. In comparison to this, changing from an α value of 20% to an α value of 100% implied a cost of approximately €3.64 for every 1% increase in wind generation facilitation. These results indicated that smart grids with large percentages of wind penetration may be substantially aided by utilizing the storage capacity of building integrated microgrids for a relatively low monetary cost.
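The α weighting described above can be sketched as a scalarized objective over candidate battery schedules. A real implementation would normalize cost and wind facilitation to a common scale, which this illustration omits, and every name here is hypothetical.

```python
def weighted_objective(alpha, cost, wind_facilitated):
    """alpha = 0 minimizes cost only; alpha = 1 maximizes wind
    facilitation only; values in between trade the two off."""
    return (1.0 - alpha) * cost - alpha * wind_facilitated

def best_schedule(schedules, evaluate, alpha):
    # evaluate(schedule) -> (daily cost, wind facilitation); pick the
    # schedule minimizing the scalarized trade-off for this alpha
    return min(schedules, key=lambda s: weighted_objective(alpha, *evaluate(s)))
```

Sweeping alpha over a grid (the paper used 11 values) traces out the trade-off curve between the two objectives.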