Cobalamin deficiency is common in patients with Crohn's disease (CD). Intramuscular cobalamin continues to be the standard therapy for the deficiency and maintenance treatment in these patients, although the oral route has been demonstrated to be effective in other conditions with impaired absorption. Our aims were to evaluate the efficacy of oral therapy in the treatment of cobalamin deficiency and in long-term maintenance in patients with Crohn's disease. We performed a multicenter retrospective cohort study that included 94 patients with Crohn's disease and cobalamin deficiency. Seventy-six patients had B12 deficiency, and 94.7% of them normalized their cobalamin levels with oral treatment. The most frequently used dose was 1 mg/day, but there were no significant differences in treatment effectiveness depending on the dose used (≥1 mg/24 h vs. <1 mg/24 h). Eighty-two patients had previous documented B12 deficiency and were treated with oral B12 to maintain correct cobalamin levels. After a mean follow-up of 3 years, the oral route was effective as maintenance treatment in 81.7% of patients. A lack of treatment adherence was admitted by 46.6% of the patients in whom the oral route failed. In conclusion, our study shows that oral cyanocobalamin provides effective acute and maintenance treatment for vitamin B12 deficiency caused by CD, with or without ileal resection.
Liver resections are a significant source of primary human hepatocytes, used mainly in artificial liver devices and in pharmacological and biomedical studies. However, it is not well known how patient-donor and surgery-dependent factors influence the yield, viability, and function of isolated hepatocytes. Hence, we aimed to analyze the impact of all these elements on the outcome of human hepatocyte isolation.
Hepatocytes were isolated from liver tissue from patients undergoing partial hepatectomy using a two-step collagenase method. Hepatocyte viability, cell yield, adhesion, and functionality were measured. In addition, clinical and analytical patient variables were collected, along with the use or absence of vascular clamping, its type (continuous or intermittent), and the ischemia times during surgery.
Malignant disease, previous chemotherapy, and male gender were associated with lower hepatocyte viability and isolation cell yields. The previous increase in transaminases was also associated with lower yields on isolation and lower albumin production. Furthermore, ischemia secondary to vascular clamping during surgery was inversely correlated with the isolated hepatocyte viability. An ischemia time higher than 15 min was related to adverse effects on viability.
Several factors correlated with the patient and the surgery directly influence the success of human hepatocyte isolation from patients undergoing liver resection.
Background and objective
Different factors may influence colonoscopy performance measures. We aimed to analyze procedure‐ and endoscopist‐related factors associated with the detection of colorectal lesions, and whether these factors have a similar influence in the context of different colonoscopy indications: positive fecal immunochemical test (+FIT) and post‐polypectomy surveillance colonoscopies.
Methods
This multicenter cross‐sectional study included adults aged 40–80 years. Endoscopists (N = 96) who had performed ≥50 examinations were assessed for physician‐related factors. Adenoma detection rate (ADR), adenomas per colonoscopy rate (APCR), advanced ADR, serrated polyp detection rate (SDR), and serrated polyps per colonoscopy rate (SPPCR) were calculated.
Results
We included 12,932 procedures, with 4810 carried out after a positive FIT and 1967 for surveillance. Of the 96 endoscopists evaluated, 43.8% were women, and the mean age was 41.9 years. The ADR, advanced ADR, and SDR were 39.7%, 17.7%, and 12.8%, respectively. The ADR was higher in colonoscopies after a +FIT (50.3%), with a more than doubled advanced ADR compared to non‐FIT procedures (27.6% vs. 13.0%) and similar results for serrated lesions (14.7% vs. 13.5%). Among all the detection indicators analyzed, withdrawal time was the only factor independently related to improvement (p < 0.001). For both FIT‐positive and surveillance procedures, withdrawal time was likewise the only factor associated with a higher detection of adenomas and serrated polyps (p < 0.001). Endoscopist‐related factors (i.e., weekly hours dedicated to endoscopy, annual colonoscopy volume, and lifetime number of colonoscopies performed) also had an impact on lesion detection (APCR, advanced ADR, and SPPCR).
Conclusions
Withdrawal time was the factor most commonly associated with improved detection of colonic lesions, both globally and in endoscopies for +FIT and post‐polypectomy surveillance. Physician‐related factors may help inform strategies to support training and service provision. Our results can be used to establish future benchmarks and quality improvement in different colonoscopy indications.
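The quality indicators reported above (ADR, APCR, SDR, SPPCR) are defined in the usual way: per-patient detection rates count procedures with at least one lesion, while per-colonoscopy rates average lesion counts over all procedures. A minimal sketch of these calculations, using hypothetical per-procedure records (field names are illustrative, not from the study's dataset):

```python
# Illustrative computation of colonoscopy quality indicators from
# per-procedure lesion counts. The records below are made-up examples.
procedures = [
    {"adenomas": 2, "serrated": 0},
    {"adenomas": 0, "serrated": 1},
    {"adenomas": 1, "serrated": 0},
    {"adenomas": 0, "serrated": 0},
]

n = len(procedures)
# ADR: fraction of procedures with at least one adenoma detected
adr = sum(p["adenomas"] > 0 for p in procedures) / n
# APCR: mean number of adenomas detected per colonoscopy
apcr = sum(p["adenomas"] for p in procedures) / n
# SDR / SPPCR: same two definitions applied to serrated polyps
sdr = sum(p["serrated"] > 0 for p in procedures) / n
sppcr = sum(p["serrated"] for p in procedures) / n

print(adr, apcr, sdr, sppcr)  # 0.5 0.75 0.25 0.25
```

The distinction matters for benchmarking: an endoscopist who finds many adenomas in a few patients can have a high APCR but an unchanged ADR, which is why the study reports both families of indicators.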
ABSTRACT Introduction: several indicators exist for assessing liver graft survival (the American DRI and the European ET-DRI, among others), but there are important differences between the transplant programs of different countries, and these indicators may not be valid in our setting. Objectives: the aim of this study was to describe a new national liver graft risk indicator based on the results of the Spanish Liver Transplant Registry (RETH), and to validate the DRI and the ET-DRI. Methods: the RETH includes a Cox analysis of the factors related to graft survival. Based on its results, the graft risk index (GRI) was defined. The variables it considers depend on the donation process (age, cause of death, blood-group compatibility, and cold ischemia time) and on the recipient (age, underlying disease, hepatitis C virus, transplant number, UNOS status, and surgical technique). The logistic regression curve was obtained, and graft survival curves were calculated by stratification. Accuracy was assessed using the area under the ROC curve. Results: a GRI of 1 corresponds to a graft-loss probability of 23.25%; each one-point increase in the GRI multiplies that probability by 1.33. The GRI showed the best discrimination by stratification. The ROC area was 0.54 (95% CI, 0.50-0.59) for the DRI and 0.56 (95% CI, 0.51-0.61) for the ET-DRI, versus 0.70 (95% CI, 0.65-0.73) for the GRI (p < 0.0001). Conclusions: the DRI and the ET-DRI do not appear useful in our setting, and a country-specific indicator would be needed. The GRI requires a national study to further refine the indicator and perform a broader validation.
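The reported relationship between the GRI score and graft-loss probability (23.25% at GRI = 1, multiplied by 1.33 per additional point) can be written as a simple geometric extrapolation. This is only an illustrative sketch of the stated summary figures, not the registry's actual Cox/logistic model; the function name and the cap at 100% are assumptions:

```python
def graft_loss_probability(gri: float) -> float:
    """Estimated graft-loss probability from the GRI score.

    Illustrative extrapolation of the reported figures: a GRI of 1
    corresponds to 23.25%, and each one-point increase multiplies the
    probability by 1.33. The underlying model in the RETH analysis is a
    regression fit, so this formula is only a reading of the summary.
    """
    p = 0.2325 * (1.33 ** (gri - 1.0))
    return min(p, 1.0)  # probabilities cannot exceed 100%
```

For example, `graft_loss_probability(2)` yields 0.2325 × 1.33 ≈ 0.309, i.e. roughly a 31% estimated probability of graft loss at a GRI of 2 under this reading.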