Despite improvements in multidisciplinary management, patients with biliary tract cancer have a poor outcome. Only 20% of patients are eligible for surgical resection with curative intent, and 5-year overall survival is less than 10% for all patients. To our knowledge, no studies have described a benefit of adjuvant therapy. We aimed to determine whether adjuvant capecitabine improved overall survival compared with observation following surgery for biliary tract cancer.
This randomised, controlled, multicentre, phase 3 study was done across 44 specialist hepatopancreatobiliary centres in the UK. Eligible patients were aged 18 years or older, had histologically confirmed cholangiocarcinoma or muscle-invasive gallbladder cancer, had undergone a macroscopically complete resection with curative intent (which includes liver resection, pancreatic resection, or, less commonly, both), and had an Eastern Cooperative Oncology Group performance status of less than 2. Patients who had not completely recovered from previous surgery, or who had received previous chemotherapy or radiotherapy for biliary tract cancer, were excluded. Patients were randomly assigned (1:1) to receive oral capecitabine (1250 mg/m2 twice daily on days 1–14 of a 21-day cycle, for eight cycles) or observation, commencing within 16 weeks of surgery. Treatment was not masked; allocation concealment was achieved with a computerised minimisation algorithm that stratified patients by surgical centre, site of disease, resection status, and performance status. The primary outcome was overall survival. As prespecified, analyses were done by intention to treat and per protocol. This study is registered with EudraCT, number 2005-003318-13.
Between March 15, 2006, and Dec 4, 2014, 447 patients were enrolled; 223 patients with biliary tract cancer resected with curative intent were randomly assigned to the capecitabine group and 224 to the observation group. The data cutoff for this analysis was March 6, 2017. The median follow-up for all patients was 60 months (IQR 37–60). In the intention-to-treat analysis, median overall survival was 51·1 months (95% CI 34·6–59·1) in the capecitabine group compared with 36·4 months (29·7–44·5) in the observation group (adjusted hazard ratio [HR] 0·81, 95% CI 0·63–1·04; p=0·097). In a protocol-specified sensitivity analysis adjusting for minimisation factors plus nodal status, grade, and gender, the overall survival HR was 0·71 (95% CI 0·55–0·92; p=0·010). In the prespecified per-protocol analysis (210 patients in the capecitabine group and 220 in the observation group), median overall survival was 53 months (95% CI 40 to not reached) in the capecitabine group and 36 months (30–44) in the observation group (adjusted HR 0·75, 95% CI 0·58–0·97; p=0·028). In the intention-to-treat analysis, median recurrence-free survival was 24·4 months (95% CI 18·6–35·9) in the capecitabine group and 17·5 months (12·0–23·8) in the observation group. In the per-protocol analysis, median recurrence-free survival was 25·9 months (95% CI 19·8–46·3) in the capecitabine group and 17·4 months (12·0–23·7) in the observation group. Adverse events were assessed in the capecitabine group only; of the 213 patients who received at least one cycle, 94 (44%) had at least one grade 3 toxicity, the most frequent of which were hand-foot syndrome (43 [20%] patients), diarrhoea (16 [8%] patients), and fatigue (16 [8%] patients). One (<1%) patient had grade 4 cardiac ischaemia or infarction. Serious adverse events were observed in 47 (21%) of 223 patients in the capecitabine group and 22 (10%) of 224 patients in the observation group. No deaths were deemed to be treatment related.
Although this study did not meet its primary endpoint of improving overall survival in the intention-to-treat population, the prespecified sensitivity and per-protocol analyses suggest that capecitabine can improve overall survival in patients with resected biliary tract cancer when used as adjuvant chemotherapy following surgery and could be considered as standard of care. Furthermore, the safety profile is manageable, supporting the use of capecitabine in this setting.
Cancer Research UK and Roche.
Patterns of skull shape in Carnivora provide examples of parallel and convergent evolution for similar ecomorphological adaptations. However, although most researchers report on skull homoplasies among hypercarnivorous taxa, evolutionary trends towards herbivory remain largely unexplored. In this study, we analyse the skulls of the living herbivorous carnivorans to evaluate the importance of natural selection and phylogenetic legacy in shaping the skulls of these peculiar species. We quantitatively estimated shape variability using geometric morphometrics. A principal components analysis of skull shape incorporating all families of arctoid carnivorans recognized several common adaptations towards herbivory. Ancestral state reconstructions of skull shape and the reconstructed phylogenetic history of morphospace occupation more explicitly reveal the true patterns of homoplasy among the herbivorous carnivorans. Our results indicate that historical constraints and adaptation have interacted in the evolution of the carnivoran skull towards herbivory, which has resulted in repeated patterns of biomechanical homoplasy.
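The morphometric workflow summarised above — landmark alignment followed by a principal components analysis of shape — can be sketched in a few lines. The data below are synthetic, and scipy's pairwise `procrustes()` stands in for the full generalized Procrustes alignment used in studies of this kind; this is an illustrative sketch, not the authors' pipeline.

```python
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(0)

# Hypothetical landmark data: 20 specimens x 10 skull landmarks in 2-D.
# Real analyses digitize homologous landmarks on each skull.
reference = rng.normal(size=(10, 2))
specimens = [reference + 0.05 * rng.normal(size=(10, 2)) for _ in range(20)]

# Align every specimen to the reference, removing position, scale, and
# rotation so that only shape differences remain.
aligned = np.stack([procrustes(reference, s)[1] for s in specimens])

# PCA on the flattened, centered coordinates gives the principal
# components of shape variation ("morphospace" axes).
X = aligned.reshape(len(specimens), -1)
X -= X.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S                       # specimen scores in shape space
explained = S**2 / np.sum(S**2)      # proportion of shape variance per PC
```

Plotting `scores[:, 0]` against `scores[:, 1]` and colouring specimens by diet is the standard way such analyses visualise convergence: herbivorous taxa from different families clustering in the same region of morphospace is the signature of homoplasy.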
Allometric patterns of skull-shape variation can have significant impacts on cranial mechanics and feeding performance, but have received little attention in previous studies. Here, we examine the impacts of allometric skull-shape variation on feeding capabilities in the cat family (Felidae) with linear morphometrics and finite element analysis. Our results reveal that relative bite force diminishes slightly with increasing skull size, and that the skulls of the smallest species undergo the least strain during biting. However, larger felids are able to produce greater gapes for a given angle of jaw opening, and they have overall stronger skulls. The two large felids in this study achieved increased cranial strength by increasing skull bone volume relative to surface area. Allometry of skull geometry in large felids reflects a trade-off between the need to increase gape to access larger prey while maintaining the ability to resist unpredictable loading when taking large, struggling prey.
The widespread availability of three-dimensional imaging and computational power has fostered a rapid increase in the number of biologists using finite element analysis (FEA) to investigate the mechanical function of living and extinct organisms. The inevitable rise of studies that compare finite element models brings to the fore two critical questions about how such comparative analyses can and should be conducted: (1) what metrics are appropriate for assessing the performance of biological structures using finite element modeling? and (2) how can performance be compared such that the effects of size and shape are disentangled? With respect to performance, we argue that energy efficiency is a reasonable optimality criterion for biological structures, and we show that the total strain energy (a measure of work expended deforming a structure) is a robust metric for comparing the mechanical efficiency of structures modeled with finite elements. Results of finite element analyses can be interpreted with confidence when model input parameters (muscle forces, detailed material properties) and/or output parameters (reaction forces, strains) are well-documented by studies of living animals.
However, many researchers wish to compare species for which these input and validation data are difficult or impossible to acquire. In these cases, researchers can still compare the performance of structures that differ in shape if variation in size is controlled. We offer a theoretical framework and empirical data demonstrating that scaling finite element models to equal force:surface area ratios removes the effects of model size and provides a comparison of stress-strength performance based solely on shape.
Further, models scaled to have equal applied force:volume ratios provide the basis for strain energy comparison. Thus, although finite element analyses of biological structures should be validated experimentally whenever possible, this study demonstrates that the relative performance of un-validated models can be compared so long as they are scaled properly.
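The two scaling rules described above follow from simple dimensional reasoning for geometrically similar models; a sketch consistent with the stated framework (the scale factor \(k\) is notation introduced here, not from the source):

```latex
\begin{align*}
&\text{For isometric models with linear scale factor } k = L_2/L_1:
  \quad A_2 = k^{2} A_1, \qquad V_2 = k^{3} V_1.\\[4pt]
&\text{Equal force:area ratio (stress comparison): }
  \frac{F_2}{A_2} = \frac{F_1}{A_1}
  \;\Rightarrow\; F_2 = k^{2} F_1,\\
&\text{so } \sigma \sim F/A \text{ is held fixed and any residual stress
  differences between models reflect shape alone.}\\[4pt]
&\text{Equal force:volume ratio (strain-energy comparison): }
  \frac{F_2}{V_2} = \frac{F_1}{V_1}
  \;\Rightarrow\; F_2 = k^{3} F_1.
\end{align*}
```

In practice this means that before comparing two models, the applied muscle or bite forces are rescaled by the appropriate power of the size ratio, after which differences in stress (or strain energy) can be attributed to shape rather than to size.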
Objectives: The aim of this randomized controlled study was to cephalometrically assess possible changes in craniofacial morphology associated with long-term use of an adjustable oral appliance compared with continuous positive airway pressure (CPAP) in patients with the obstructive sleep apnea/hypopnea syndrome (OSAHS). In addition, we wanted to study the relationship between these possible changes and the degree of mandibular protrusion associated with oral-appliance therapy. Methods: Fifty-one patients were randomized to oral-appliance therapy and 52 patients to CPAP therapy. At baseline and after follow-up (2.3 ± 0.2 years), a lateral cephalogram of each patient was made in maximum intercuspation to determine relevant cephalometric variables. Both baseline and follow-up cephalograms were traced digitally, whereupon cephalometric variables were compared. Changes in craniofacial morphology between the oral-appliance and CPAP groups were evaluated with a linear regression analysis. Results: Compared with CPAP, long-term use of an oral appliance resulted in small but significant (dental) changes. Overbite and overjet decreased by 1.0 (±1.5) mm and 1.7 (±1.6) mm, respectively. Furthermore, we found a retroclination (−2.0 (±2.8)°) of the upper incisors and a proclination (3.7 (±5.4)°) of the lower incisors. Moreover, the lower and total anterior facial heights increased significantly, by 0.8 (±1.5) mm and 0.9 (±1.4) mm, respectively. No changes in skeletal variables were found. Linear regression analysis revealed that the decrease in overbite was associated with the mean mandibular protrusion during follow-up (B = −0.029, SE = 0.014, p < 0.05). Conclusions: Oral-appliance therapy should be considered a lifelong treatment, and there is a risk that craniofacial changes will occur. Patients treated with an oral appliance therefore need thorough follow-up by a dentist or dental specialist experienced in the field of dental sleep medicine.
Introduction: Extraction of broken femoral nails in peri-implant fractures is an increasingly common problem faced by orthopaedic surgeons. Different closed and open techniques for removal of broken nails have been described previously, but owing to variations in equipment and fracture configurations these methods are not always easily reproducible. We describe an open surgical technique using simple equipment that can be utilised when other methods of extraction have failed. Case Presentation and Surgical Technique: We present a case of a peri-implant fracture secondary to non-union involving a short cephalomedullary nail, in which the broken distal segment of the nail lay well distal to the femoral fracture site. After multiple failed attempts at extraction with previously described closed techniques, a rectangular cortical window was created 2 cm distal to the tip of the broken nail using a saw. An antegrade guide wire was passed through the nail and pulled out of the bony window. A flexible intramedullary reamer was subsequently passed in retrograde fashion over the guide wire, and a simple push-out technique was used to push both segments of the broken nail through the original insertion site. An exchange nailing was performed and the cortical window was reattached using a cable. Discussion: This is a simple technique that requires no specialist equipment and does not disturb the fracture site. The use of a flexible reamer as a push-out device is ideal, as the multiple size options allow the surgeon to match the reamer to the size of the medullary canal. Furthermore, the flexibility of the reamer allows easy access through a lateral bone window. Conclusion: Broken femoral nail extraction can be technically challenging; when other closed methods have failed, we believe our technique offers a simple alternative that can be added to the armamentarium of solutions.
The shape of the cranium varies widely among members of the order Carnivora, but the factors that drive the evolution of differences in shape remain unclear. Selection for increased bite force, bite speed or skull strength may all affect cranial morphology. We investigated the relationship between cranial form and function in the trophically diverse dog family, Canidae, using linear morphometrics and finite element (FE) analyses that simulated the internal and external forces that act on the skull during the act of prey capture and killing. In contrast to previous FE-based studies, we compared models using a newly developed method that removes the effects of size and highlights the relationship between shape and performance. Cranial shape varies among canids based on diet, and different selective forces presumably drove evolution of these phenotypes. The long, narrow jaws of small prey specialists appear to reflect selection for fast jaw closure at the expense of bite force. Generalists have intermediate jaw dimensions and produce moderate bite forces, but their crania are comparable in strength to those of small prey specialists. Canids that take large prey have short, broad jaws, produce the largest bite forces and possess very strong crania. Our FE simulations suggest that the remarkable strength of the skulls of large prey specialists reflects the additional ability to resist extrinsic loads that may be encountered while struggling with large prey items.
Context: Cardiogenic shock (CS) is the leading cause of death for patients hospitalized with acute myocardial infarction (AMI).
Objective: To assess the effect of early revascularization (ERV) on 1-year survival for patients with AMI complicated by CS.
Design: The SHOCK (Should We Emergently Revascularize Occluded Coronaries for Cardiogenic Shock) Trial, an unblinded, randomized controlled trial conducted from April 1993 through November 1998.
Setting: Thirty-six referral centers with angioplasty and cardiac surgery facilities.
Patients: Three hundred two patients with AMI and CS due to predominant left ventricular failure who met specified clinical and hemodynamic criteria.
Interventions: Patients were randomly assigned to an initial medical stabilization (IMS; n = 150) group, which included thrombolysis (63% of patients), intra-aortic balloon counterpulsation (86%), and subsequent revascularization (25%), or to an ERV group (n = 152), in which revascularization was mandated within 6 hours of randomization and included angioplasty (55%) and coronary artery bypass graft surgery (38%).
Main Outcome Measures: All-cause mortality and functional status at 1 year, compared between the ERV and IMS groups.
Results: One-year survival was 46.7% for patients in the ERV group compared with 33.6% in the IMS group (absolute difference in survival, 13.2%; 95% confidence interval [CI], 2.2%-24.1%; P<.03; relative risk for death, 0.72; 95% CI, 0.54-0.95). Of the 10 prespecified subgroup analyses, only age (<75 vs ≥75 years) interacted significantly (P<.03) with treatment, in that treatment benefit was apparent only for patients younger than 75 years (51.6% survival in the ERV group vs 33.3% in the IMS group). Eighty-three percent of 1-year survivors (85% of the ERV group and 80% of the IMS group) were in New York Heart Association class I or II.
Conclusions: For patients with AMI complicated by CS, ERV resulted in improved 1-year survival. We recommend rapid transfer of patients with AMI complicated by CS, particularly those younger than 75 years, to medical centers capable of providing early angiography and revascularization procedures.