Glaucoma is a silent disease that leads to vision loss or irreversible blindness. Current deep learning methods can extend glaucoma screening to larger populations using retinal images. Low-cost lenses attached to mobile devices can increase the frequency of screening and alert patients earlier for a more thorough evaluation. This work explored and compared the performance of classification and segmentation methods for glaucoma screening with retinal images acquired by both retinography and mobile devices. The goal was to verify the results of these methods and assess whether similar results could be achieved with images captured by mobile devices. The classification methods used were the Xception, ResNet152 V2, and Inception ResNet V2 models. The models' activation maps were produced and analysed to support the glaucoma classifiers' predictions. In clinical practice, glaucoma assessment is commonly based on the cup-to-disc ratio (CDR) criterion, a frequent indicator used by specialists. For this reason, the U-Net architecture was additionally used, with the Inception ResNet V2 and Inception V3 models as backbones, to segment the optic cup and disc and estimate the CDR. For both tasks, the models' performance came close to that of state-of-the-art methods, and the classification results on a low-quality private dataset illustrate the advantage of using cheaper lenses.
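Given segmentation masks for the optic cup and disc, the CDR described above can be estimated as the ratio of their vertical diameters. A minimal sketch, assuming binary NumPy masks and the vertical-diameter definition of CDR (function names and the toy masks are illustrative):

```python
import numpy as np

def vertical_diameter(mask: np.ndarray) -> int:
    """Vertical extent (in pixels) of a binary mask."""
    rows = np.any(mask, axis=1)
    if not rows.any():
        return 0
    idx = np.where(rows)[0]
    return int(idx[-1] - idx[0] + 1)

def cup_to_disc_ratio(cup_mask: np.ndarray, disc_mask: np.ndarray) -> float:
    """Vertical cup-to-disc ratio from binary cup and disc segmentation masks."""
    disc = vertical_diameter(disc_mask)
    if disc == 0:
        raise ValueError("empty disc mask")
    return vertical_diameter(cup_mask) / disc

# Toy example: disc spans rows 10..49 (40 px), cup spans rows 20..39 (20 px)
disc = np.zeros((64, 64), dtype=bool); disc[10:50, 20:44] = True
cup = np.zeros((64, 64), dtype=bool); cup[20:40, 26:38] = True
print(cup_to_disc_ratio(cup, disc))  # 0.5
```

In practice the masks would come from the U-Net outputs after thresholding; higher CDR values are the ones flagged as suspicious for glaucoma.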
Change and unpredictability characterize today's business environment. Organizational teams must cope effectively with this reality and ensure that high levels of performance are not compromised. By refining team adaptation through the integration of team improvisation, this study tests a team adaptation temporal framework comprising two processes: team improvised adaptation and team preemptive adaptation. We also investigate the relationships between these constructs and shared temporal cognitions, team learning behaviors, and team performance. We conducted four studies with three different samples, and the results suggest that the two framework constructs are distinct. The results also indicate that team improvised adaptation behaviors mediate the relationship between shared temporal cognitions and team performance, and that team learning behaviors moderate this mediation.
• Team preemptive adaptation and team improvised adaptation are the two facets of the team adaptation temporal framework.
• The impact of team preemptive adaptation and team improvised adaptation on team performance is unequal.
• Team improvised adaptation mediates the relationship between shared temporal cognitions and team performance.
• Team learning behaviors moderate the relationship between shared temporal cognitions and team improvised adaptation.
Lung cancer is considered one of the deadliest diseases in the world. Early and accurate diagnosis depends on the detection and characterization of pulmonary nodules, which is of vital importance to increasing patients' survival rates. This characterization is achieved through a segmentation process that faces several challenges due to the diversity in nodular shape, size, and texture, as well as the presence of adjacent structures. This paper tackles pulmonary nodule segmentation in computed tomography scans, proposing three distinct methodologies. The first is a conventional approach that applies the Sliding Band Filter (SBF) to estimate the filter's support points, matching the border coordinates. The remaining approaches are deep learning based, using the U-Net and a novel network called SegU-Net to achieve the same goal. Their performance is compared, as this work aims to identify the most promising tool to improve nodule characterization. All methodologies used 2653 nodules from the LIDC database, achieving Dice scores of 0.663, 0.830, and 0.823 for the SBF, U-Net, and SegU-Net, respectively. The U-Net based models thus yield results closer to the ground-truth reference annotated by specialists, making them the more reliable approach for the proposed task. The novel network achieved scores similar to the U-Net while reducing computational cost and improving memory efficiency. Consequently, this study may contribute to the implementation of this model in a decision support system, assisting physicians in establishing a reliable diagnosis of lung pathologies based on this segmentation task.
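The Dice scores reported above measure the overlap between a predicted nodule mask and the specialists' annotation. A minimal sketch of the metric, assuming binary NumPy masks (the toy masks below are illustrative):

```python
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice coefficient: 2|A ∩ B| / (|A| + |B|), with eps to avoid 0/0."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Two 4x4 squares offset by one pixel: 16 px each, 3x3 = 9 px overlap
a = np.zeros((8, 8), dtype=bool); a[2:6, 2:6] = True
b = np.zeros((8, 8), dtype=bool); b[3:7, 3:7] = True
print(round(dice_score(a, b), 4))  # 2*9 / (16+16) = 0.5625
```

A score of 1.0 means perfect agreement with the annotation, which puts the reported 0.830 for the U-Net in context against the 0.663 of the conventional SBF approach.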
Purpose: The literature on communication in change processes, although fundamental, remains fragmented. The purpose of this study is to provide an explanatory, integrated framework for the communication process in organizational change.
Design/methodology/approach: The authors conducted 22 semi-structured interviews with employees from 21 companies and 13 different sectors in Germany. The four-step Gioia inductive coding approach was adopted as the methodological approach of the study.
Findings: The final research model reveals that the organizational change communication (OCC) process is marked by specific change-restraining forces associated with fear of the unknown, habits, and convenience. Results also suggest the importance of communication timing and of the factors that shape the OCC process, namely the scope, contents, and channels of communication. Finally, the research highlights contextual variables of the OCC process, such as credibility and the level of honesty.
Originality/value: The understanding of the scope, the contents of the message, and the channels of communication adopted in the change management process are important variables in the complexity of change. The paper illustrates the intricacy of communication in change and reinforces the internal and external variables that help shape the OCC process, with implications for change agents and scholars.
We propose iW-Net, a deep learning model that allows both automatic and interactive segmentation of lung nodules in computed tomography images. iW-Net is composed of two blocks: the first provides an automatic segmentation, and the second corrects it by analyzing two points introduced by the user on the nodule's boundary. For this purpose, we propose a physics-inspired weight map that takes the user input into account and is used both as a feature map and in the system's loss function. Our approach is extensively evaluated on the public LIDC-IDRI dataset, where we achieve a state-of-the-art intersection over union of 0.55, compared with an inter-observer agreement of 0.59. We also show that iW-Net can correct the segmentation of small nodules, essential for proper patient referral decisions, and improve the segmentation of challenging non-solid nodules; it may therefore be an important tool for improving the early diagnosis of lung cancer.
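To illustrate the general idea of a weight map built from user-introduced boundary points, here is a minimal sketch assuming a Gaussian falloff around each point; the exact iW-Net formulation differs, and all parameter names and values are illustrative:

```python
import numpy as np

def point_weight_map(shape, points, sigma=3.0, base=1.0, peak=5.0):
    """Weight map emphasizing user-provided boundary points.

    Each point contributes a Gaussian bump on top of a flat `base`, so a
    loss weighted by this map focuses the correction on the user-indicated
    boundary. Illustrative only -- not the published iW-Net formulation.
    """
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    w = np.full(shape, base, dtype=float)
    for (py, px) in points:
        w += (peak - base) * np.exp(
            -((yy - py) ** 2 + (xx - px) ** 2) / (2 * sigma ** 2)
        )
    return w

# Two user points on the nodule boundary of a 32x32 patch
w = point_weight_map((32, 32), [(8, 8), (24, 24)])
print(w.shape)  # (32, 32); weight peaks near the two points, ~base elsewhere
```

Such a map can be concatenated to the network input as an extra channel and multiplied into a per-pixel loss, which matches the abstract's description of using it both as a feature map and in the loss function.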
Purpose: The purpose of this paper is to explore how a number of processes joined to create the microlevel strategies and procedures that resulted in the most lethal and tragic forest fire in Portugal's history, recalled as the EN236-1 road tragedy in the fire of Pedrógão Grande.
Design/methodology/approach: Using an inductive theory development approach, the authors consider how the urgency and scale of perceived danger, coupled with failures of system-wide communication, led fire teams to improvise repeatedly.
Findings: The paper shows how structure collapse led teams to use only local information, prompting acts of improvisational myopia, in the particular shape of corrosive myopia, and how a form of incidental improvisation led to catastrophic results.
Practical implications: The research offers insights into the dangers of improvisation arising from corrosive myopia, identifying ways to minimize them through the development of improvisation practices that allow for the creation of new patterns of action. The implications for managing surprise through improvisation extend to risk contexts beyond wildfires.
Originality/value: The paper stands out for showing the impact of improvisational myopia, especially in its corrosive form, which stands in stark contrast to the central role of attention to the local context highlighted in previous research on improvisation. At the same time, by exploring the effects of incidental improvisation, it also departs from the agentic conception of improvisation widely discussed in the improvisation literature.
Adjusting the molecular size, the valency, and the pharmacokinetics of drug conjugates are all levers to improve their therapeutic window, notably by affecting tumor penetration, renal clearance, and short systemic exposure. In that regard, small tumor-targeting ligands are gaining attention. In this study, we demonstrate the benefits of the small Nanofitin alternative scaffold (7 kDa) as a selective tumor-targeting module for the generation of drug conjugates, focusing on Nanofitins B10 and D8 directed against the EGFR. Owing to their small size and monovalent format, the two Nanofitins displayed fast and deep tumor penetration in EGFR-positive A431 xenografts in BALB/c nude mice after intravenous administration, targeting 67.9% ± 14.1 and 98.9% ± 0.7 of the tumor cells, respectively, as demonstrated by IHC. Conjugation with the monomethyl auristatin E toxin provided homogeneous Nanofitin-drug conjugates, with an overall yield of ≥97%, for in vivo assessment in a curative xenograft model using bioluminescent, EGFR-positive A431 cells in BALB/c nude mice. Internalization was found to be critical for efficient release of the toxin. Hence, intravenous administration of the D8-based construct showed a significant antitumor effect in vivo, as determined by monitoring tumor volumes and bioluminescence levels over two months.
The application of optimization techniques to improve the performance of polymer processing technologies is of great practical consequence, since it may result in significant savings of material and energy resources, assist recycling schemes, and generate products with better properties. The present review aims to identify and discuss the most important characteristics of polymer processing optimization problems in terms of the nature of the objective function, the optimization algorithm, and the process modelling approach used to evaluate the solutions and the parameters to optimize. Taking into account the research efforts developed so far, it is shown that several optimization methodologies can be applied to polymer processing with good results and without demanding computational requirements. Furthermore, within the field of artificial intelligence, several approaches can achieve significant success. The first part of this review demonstrated the advantages of the optimization approach in polymer processing, discussed some concepts of multi-objective optimization, and reported the application of optimization methodologies to single and twin-screw extruders, extrusion dies, and calibrators. This second part focuses on injection molding, blow molding, and thermoforming technologies.
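In multi-objective settings such as these, where process settings trade one goal against another (e.g. scrap rate against energy use per part), candidate solutions are typically compared by Pareto dominance rather than a single score. A minimal sketch for minimization problems, with hypothetical objective values for illustration:

```python
def dominates(a, b):
    """a dominates b (minimization): no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of objective vectors."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# Hypothetical (scrap_rate, energy_per_part) pairs for four process settings
sols = [(0.02, 5.0), (0.05, 3.0), (0.04, 6.0), (0.03, 4.0)]
print(pareto_front(sols))  # [(0.02, 5.0), (0.05, 3.0), (0.03, 4.0)]
```

Here (0.04, 6.0) is dominated by (0.03, 4.0) and drops out; the remaining solutions form the trade-off front from which a practitioner picks a preferred compromise.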
Given the global economic and societal importance of the polymer industry, the continuous search for improvements in the various processing techniques is of primordial practical importance. This review evaluates the application of optimization methodologies to the main polymer processing operations. The most important characteristics of the use of optimization techniques, such as the nature of the objective function, the type of optimization algorithm, the modelling approach used to evaluate the solutions, and the parameters to optimize, are discussed. The aim is to identify the most important features of an optimization system for polymer processing problems and to define the best procedure for each particular practical situation. For this purpose, the state of the art of the optimization methodologies usually employed is presented first, followed by an extensive review of the literature dealing with the major processing techniques; the discussion is completed by considering both the characteristics identified and the available optimization methodologies. This first part of the review focuses on extrusion, namely single and twin-screw extruders, extrusion dies, and calibrators. It is concluded that there is a set of methodologies that can be confidently applied in polymer processing, with very good performance and without demanding computational requirements.