BACKGROUND: Plerixafor is used for patients at risk of stem cell mobilization failure based on clinical factors or a low peripheral blood CD34 count. It is also added upfront to mobilization regimens irrespective of risk factors, but the cost-effectiveness of this approach is debated. Data on plerixafor in different settings of autologous hematopoietic stem cell (HSC) collection from India are scant. We report the mobilization failure/success rates and other important variables (CD34+ dose, failed collections) for the plerixafor and granulocyte colony-stimulating factor-alone groups among autologous hematopoietic stem cell transplantation (aHSCT) patients at our institute.
METHODS: This was a record-based single-center study on patients who underwent aHSCT from January 2013 to June 2019 at a tertiary care hospital. Descriptive statistics were used for baseline characteristics, transplant-related factors, and peritransplant outcomes. All statistical analyses were performed at the 5% significance level.
RESULTS: During the study period, a total of 96 patients underwent autologous hematopoietic stem cell collection (aHSCC), all by peripheral blood stem cell harvest, requiring 131 apheresis collections. Of these 131 collections in 96 patients, plerixafor was used in 63 apheresis collections (48% of all apheresis sessions) in 40 patients. Among the 40 patients who received plerixafor to augment collection, 34 received it upfront. We did not observe any significant adverse event related to plerixafor use.
CONCLUSION: Rational utilization of plerixafor can facilitate the process and logistics of aHSCC.
Primary thyroid lymphoma – An uncommon entity. Patnayak, Rashmi; Choudhuri, Rohini; Samal, Debashis ...
Journal of Dr. NTR University of Health Sciences, 04/2017, Volume 6, Issue 2. Journal Article.
The molecular landscape of tumors has traditionally been established using biopsy or resection specimens. These modalities introduce a sampling bias, offering only a single snapshot of tumor heterogeneity. Over the last decade, intensive research toward alleviating this bias and obtaining an integral yet accurate portrait of tumors has evolved into established molecular and genetic analysis of blood and several other body fluids, such as urine, saliva, and pleural effusions, as liquid biopsies. Genomic profiling of circulating markers, including circulating cell-free tumor DNA (ctDNA), circulating tumor cells (CTCs), and even the RNA, proteins, and lipids constituting exosomes, has facilitated diligent monitoring of response to treatment, allowed the emergence of drug resistance to be followed, and enabled quantification of minimal residual disease. The prevalence of tumor-educated platelets (TEPs) and our understanding of how tumor cells influence platelets are beginning to establish TEPs as a potentially dynamic component of liquid biopsies. Here, we review the biology, methodology, approaches, and clinical applications of biomarkers used to assess liquid biopsies. We address recent technological advances and the different forms of liquid biopsy, along with upcoming challenges and how these approaches can be integrated to obtain the best possible tumor-derived genetic information, which can be leveraged for more precise therapies as liquid biopsies become increasingly routine in clinical practice.
Abstract
Background
In autologous stem cell transplant (ASCT) for lymphomas, no standard conditioning regimen has been defined so far. Thus, the choice is guided by the center's familiarity and experience with a particular regimen.
Objective
To determine the response, toxicity, and survival outcomes in lymphoma patients who underwent ASCT with CBV (cyclophosphamide, carmustine, and etoposide) conditioning regimen.
Materials and Methods
Between January 2013 and May 2019, 45 consecutive lymphoma patients who had ASCT with the CBV conditioning regimen were included in this retrospective study. CBV consisted of cyclophosphamide (1.5 g/m²/day × 4 days), carmustine (300 mg/m² × 1 day), and etoposide (125 mg/m² twice daily × 3 days). Baseline characteristics, pre-transplant response, apheresis, post-transplant toxicities, post-transplant response, and survival outcomes were collected. Endpoints were toxicity, response, event-free survival (EFS), and overall survival (OS).
Results
The median age was 30 (range: 6–64) years. Diagnosis was Hodgkin lymphoma (HL) in 26 (58%) and non-Hodgkin lymphoma (NHL) in 19 (42%). Forty-three patients (95%) had chemosensitive disease: 22 (49%) in CR and 21 (46%) in PR. The median CD34 was 2.95 × 10⁶/kg (range: 0.9–9.56). The median time to engraftment was 11 (range: 9–23) days for neutrophils and 13 (range: 8–36) days for platelets. All patients had febrile neutropenia; clinically and/or microbiologically documented infection was seen in 75% of patients. The most common grade 3/4 toxicities were mucositis (n = 4, 9%), diarrhea (n = 4, 9%), and nausea/vomiting (n = 2, 4%). The average duration of hospitalization was 18 days (range: 10–37). Day 100 mortality was 6.6% (n = 3). The median follow-up was 44.8 months. The median EFS for the entire cohort was 23.8 months; for HL, the median EFS was not reached, and for NHL, it was 7.97 months (95% confidence interval [CI]: 1.57–14.37). The median OS for the entire cohort and for HL was not reached; for NHL, it was 24.3 months (95% CI: 0.56–48.11).
Conclusion
The CBV conditioning regimen was well tolerated, with low rates of grade 3/4 toxicity and efficacy comparable to published data.
This work discusses the performance of a fog-assisted, IoT-enabled health monitoring system. The system supports local processing near the client device for faster service and removal of redundant data. The architecture is suitable for heavy-traffic health data monitoring systems such as elderly, child, or fitness monitoring. After processing, only filtered data are forwarded to the cloud. Transferring the huge volume of health monitoring data from the IoT layer to the Data Management and Processing Layer (DMPL) consumes substantial energy. To minimize the energy requirement in the fog-assisted DMPL layer, we adopt sleep-mode operation and batch data packet transfer. This paper studies the effect of sleep mode on the mean delay, the number of data packets in the buffer, and the blocking probability. The results show that the proposed approach saves energy and provides an effective framework for fog-assisted health monitoring.
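The sleep-and-batch idea described in this abstract can be illustrated with a toy discrete-event simulation of an N-policy finite-buffer queue: the radio sleeps until a threshold number of packets has accumulated, transmits until the buffer empties, then sleeps again. The rates, buffer size, and threshold below are illustrative assumptions, not the paper's actual model or parameters.

```python
import random

def simulate(wake_threshold=1, buffer_cap=30, lam=5.0, mu=8.0,
             horizon=5000.0, seed=1):
    """N-policy sketch of an M/M/1/K queue: sleep until `wake_threshold`
    packets are buffered, transmit until the buffer empties, sleep again.
    Returns (mean sojourn delay, blocking probability, wake-up count)."""
    rng = random.Random(seed)
    next_arr = rng.expovariate(lam)   # time of next packet arrival
    next_dep = float("inf")           # next transmission completion
    queue = []                        # arrival times of buffered packets
    awake = False
    wakeups = 0
    delays = []
    arrivals = blocked = 0
    t = 0.0
    while t < horizon:
        t = min(next_arr, next_dep)
        if next_arr <= next_dep:      # a packet arrives
            arrivals += 1
            if len(queue) >= buffer_cap:
                blocked += 1          # buffer full: packet is dropped
            else:
                queue.append(t)
                if not awake and len(queue) >= wake_threshold:
                    awake = True      # batch threshold reached: wake up
                    wakeups += 1
                    next_dep = t + rng.expovariate(mu)
            next_arr = t + rng.expovariate(lam)
        else:                         # a transmission finishes
            delays.append(t - queue.pop(0))
            if queue:
                next_dep = t + rng.expovariate(mu)
            else:
                awake = False         # buffer empty: back to sleep
                next_dep = float("inf")
    return sum(delays) / len(delays), blocked / arrivals, wakeups
```

Because the node ultimately transmits the same workload either way, its busy fraction is roughly unchanged; the energy saving in this sketch comes from far fewer sleep-to-wake transitions, traded against a higher mean packet delay, which mirrors the delay/energy trade-off the abstract studies.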
A newborn requires constant vigilance, rapid recognition of events, and swift intervention during anaesthesia. The anaesthetic considerations in neonatal surgical emergencies are based on the physiological immaturity of various body systems, poor tolerance of anaesthetic drugs, associated congenital disorders, and considerations regarding the use of high concentrations of oxygen. The main goal is to titrate anaesthetics to the desired effect while carefully monitoring cardiorespiratory status. The use of regional anaesthesia has been shown to be safe and effective. Advances in neonatology have improved the survival of premature and critically ill newborns. Most disorders previously considered neonatal surgical emergencies no longer require immediate surgery, owing to new technology and new methods of treating sick neonates. This article describes the common neonatal surgical emergencies and focuses on factors that affect the anaesthetic management of patients with these disorders.
In recent years there has been an exponential surge in healthcare IoT devices, which has led to the generation of massive amounts of data. IoT devices send these complex and voluminous medical data to the cloud for analysis and storage. Most organizations do not prefer this because of latency, privacy, and security issues. To overcome the limitations of cloud-based systems, a novel paradigm called fog computing has been created. Although fog nodes have several advantages, they require a high amount of energy to function. Software-Defined Networking (SDN) is a cutting-edge technology that enables intelligent, centralized network management and "programming" through software applications. In this paper we present an energy-efficient SDN-enabled fog computing architecture for healthcare data that controls the service rate. In this model, based on the buffer load, the node decides whether to upload data in batch mode at a higher processing speed or to process data in the listening interval at a low processing speed. The proposed SDN-based architecture performs effectively and saves energy compared to the existing model, balances the load dynamically, and handles real-time data traffic concurrently.
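The buffer-load decision described in this abstract (fast batch mode versus slow listening mode) can be sketched as a simple hysteresis controller. The class name, thresholds, and rates below are illustrative assumptions, not the paper's actual design:

```python
class ServiceRateController:
    """Hysteresis sketch of a buffer-load mode switch: jump to fast
    batch mode when occupancy crosses `high`, fall back to low-power
    listening mode once the buffer drains below `low`."""

    def __init__(self, capacity=100, low=0.2, high=0.7,
                 listen_rate=10, batch_rate=50):
        self.capacity = capacity
        self.low, self.high = low, high
        self.listen_rate, self.batch_rate = listen_rate, batch_rate
        self.mode = "listen"          # start in the low-power state

    def rate(self, queued):
        """Return the service rate to use for the current buffer load."""
        load = queued / self.capacity
        if self.mode == "listen" and load >= self.high:
            self.mode = "batch"       # buffer filling up: speed up
        elif self.mode == "batch" and load <= self.low:
            self.mode = "listen"      # buffer drained: save energy
        return self.batch_rate if self.mode == "batch" else self.listen_rate
```

The two distinct thresholds form a hysteresis band, so the node does not oscillate between modes when the load hovers near a single cut-off value.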
In the current digital age, the Internet of Things (IoT) plays a critical role in real-time data perception and computation, enabling systems to be managed in an automated manner. In this work, we present the edge computing concept of the IoT architecture, known as fog computing, which increases the efficiency of complex application processing. The rapid expansion of computing resources improves real-time data capabilities such as detection, capture, collection, and processing across billions of linked devices, and supports a range of applications such as smart wearable devices, smart meters, and smart homes. Advances in big data technologies make it simpler to process and analyze massive volumes of IoT data. Smart devices still face a number of problems in terms of computational power, memory storage, battery life, and frequency bandwidth, all of which degrade their Quality of Service (QoS) and user experience. Fog-embedded cloud computing is viewed as a computing paradigm that alleviates the fixed resource load on smart equipment by allowing end-users to develop programs with flexible resources at the lowest possible cost in terms of infrastructure, software, and platforms.
When it comes to new and developing applications that require high computation and low latency, fog computing is an attractive option to consider for implementation. Peripheral network devices can share idle resources and work together to complete fog computing tasks. Task publishers must decide how to distribute computing jobs while taking transmission-link quality and energy utilization into account, and effective incentive mechanisms are needed to encourage these devices to engage in computational offloading. We investigate three heterogeneous servers with varying service rates using a queue with a finite buffer. Jobs arrive according to a Poisson process, and service times are exponentially distributed with distinct means. Only a limited number of jobs can be held in the queue while the servers are busy; jobs that arrive after the buffer has reached capacity are considered lost. If there is no customer in line, an arriving customer is assumed to be served by the first server. In the case of routers, this corresponds to their different speeds when serving packets on the network.
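The queueing model described here (Poisson arrivals, three exponential servers with distinct rates tried in fixed order, and loss when the finite buffer is full) can be sketched with a small discrete-event simulation. The arrival rate, service rates, and buffer size are illustrative assumptions, not the paper's values:

```python
import random

def simulate_het(rates=(8.0, 5.0, 2.0), lam=10.0, buffer_cap=5,
                 horizon=5000.0, seed=7):
    """Heterogeneous-server loss queue sketch: arrivals take the first
    idle server in fixed order, wait in a finite buffer if all three are
    busy, and are lost if the buffer is full.
    Returns (loss probability, number of jobs served)."""
    rng = random.Random(seed)
    INF = float("inf")
    next_arr = rng.expovariate(lam)
    dep = [INF, INF, INF]     # departure time per server (INF = idle)
    queued = 0                # jobs waiting in the finite buffer
    arrivals = lost = served = 0
    t = 0.0
    while t < horizon:
        t = min(next_arr, min(dep))
        if t == next_arr:                 # a job arrives
            arrivals += 1
            idle = [i for i, d in enumerate(dep) if d == INF]
            if idle:                      # first idle server in fixed order
                i = idle[0]
                dep[i] = t + rng.expovariate(rates[i])
            elif queued < buffer_cap:
                queued += 1               # all servers busy: wait
            else:
                lost += 1                 # buffer full: job is lost
            next_arr = t + rng.expovariate(lam)
        else:                             # a server finishes a job
            i = dep.index(t)
            served += 1
            if queued:                    # pull the next waiting job
                queued -= 1
                dep[i] = t + rng.expovariate(rates[i])
            else:
                dep[i] = INF              # server goes idle
    return lost / arrivals, served
```

Shrinking the buffer raises the loss probability, which is the basic trade-off such a model quantifies; the fixed server order plays the role of preferring the fastest router for an arriving packet.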