Plasma enhanced chemical vapour deposition (PECVD) growth of carbon nanotubes and silicon nanowires has been studied in an Oxford Instruments Plasma Technology reactor. Typical growth regimes involve a catalyst pre-treatment step and a growth step in which a precursor gas is decomposed to form the desired nanostructure. For both catalyst pre-treatment and nanostructure growth, using a plasma offers advantages over other growth methods. During catalyst pre-treatment, a plasma step can promote the formation of nanoparticles from a thin metal film, while also increasing the catalytic activity compared with thermal pre-treatment. In the case of carbon nanotube growth, PECVD can produce vertically aligned nanotubes where thermal CVD yields randomly oriented structures. Furthermore, the gas composition is seen to strongly affect the morphology and dimensions of the nanotubes grown. For Si nanowire growth, PECVD can reduce the growth temperature and enable the use of catalysts more compatible with the fabrication of Si-based devices. In both cases catalyst particles are observed at the tips of the grown nanostructures, indicating a tip-growth mechanism.
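As a purely illustrative aid, the two-step regime described above might be represented as a recipe of plasma steps, as in the following Python sketch; the gases, temperatures, powers and durations are hypothetical placeholder values, not parameters reported in the study.

from dataclasses import dataclass

@dataclass
class PlasmaStep:
    name: str
    gas_flows_sccm: dict      # gas -> flow; units assumed to be sccm
    temperature_c: float      # substrate temperature in degrees C
    rf_power_w: float         # plasma power in watts
    duration_s: float         # step length in seconds

# Hypothetical two-step PECVD recipe: plasma catalyst pre-treatment of a
# thin metal film, then decomposition of a carbon precursor for growth.
recipe = [
    PlasmaStep("catalyst_pretreatment", {"NH3": 100.0}, 650.0, 50.0, 300.0),
    PlasmaStep("nanotube_growth", {"C2H2": 50.0, "NH3": 200.0}, 700.0, 50.0, 900.0),
]

for step in recipe:
    print(f"{step.name}: gases {step.gas_flows_sccm}, {step.temperature_c} C, "
          f"{step.rf_power_w} W for {step.duration_s} s")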
CMS is a general-purpose experiment, designed to study the physics of pp collisions at 14 TeV at the Large Hadron Collider (LHC). It currently involves more than 2000 physicists from more than 150 institutes and 37 countries. The LHC will provide extraordinary opportunities for particle physics, based on its unprecedented collision energy and luminosity, when it begins operation in 2007. The principal aim of this report is to present the strategy of CMS for exploring the rich physics programme offered by the LHC. This volume demonstrates the physics capability of the CMS experiment. The prime goals of CMS are to explore physics at the TeV scale and to study the mechanism of electroweak symmetry breaking, through the discovery of the Higgs particle or otherwise. To carry out this task, CMS must be prepared to search for new particles, such as the Higgs boson or supersymmetric partners of the Standard Model particles, from the start-up of the LHC, since new physics at the TeV scale may manifest itself with modest data samples of the order of a few fb−1 or less. The analysis tools that have been developed are applied, in great detail and with the full methodology of an analysis on CMS data, to specific benchmark processes against which the performance of CMS is gauged. These processes cover several Higgs boson decay channels, the production and decay of new particles such as Z' and supersymmetric particles, Bs production, and processes in heavy-ion collisions. The simulation of these benchmark processes includes subtle effects such as possible detector miscalibration and misalignment. Beyond these benchmark processes, the physics reach of CMS is studied for a large number of signatures arising in the Standard Model, and in theories beyond it, for integrated luminosities ranging from 1 fb−1 to 30 fb−1. The Standard Model processes include QCD, B-physics, diffraction, detailed studies of top quark properties, and electroweak physics topics such as the properties of the W and Z0 bosons. The production and decay of the Higgs particle is studied for many observable decays, and the precision with which the Higgs boson properties can be derived is determined. About ten different supersymmetry benchmark points are analysed using full simulation. The CMS discovery reach is evaluated in the SUSY parameter space, covering a large variety of decay signatures. Furthermore, the discovery reach for a plethora of alternative models of new physics is explored, notably extra dimensions, new high-mass vector boson states, little Higgs models, technicolour and others. Methods to discriminate between models have been investigated. This report is organized as follows. Chapter 1, the Introduction, describes the context of this document. Chapters 2-6 describe examples of full analyses with photons, electrons, muons, jets, missing ET, B-mesons and τ's, and with quarkonia in heavy-ion collisions. Chapters 7-15 describe the physics reach for Standard Model processes, Higgs discovery and searches for new physics beyond the Standard Model.
The CMS analysis chain in a distributed environment
Barrass, T.; Bonacorsi, D.; Ciraolo, G.; et al.
Nuclear Instruments & Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 04/2006, Volume 559, Issue 1
Journal Article
Peer-reviewed
Open access
The CMS collaboration is undertaking a major effort to define the analysis model and to develop software tools for analysing several million simulated and real data events by a large number of people at many geographically distributed sites. From the computing point of view, one of the most complex issues in remote analysis is data discovery and access. Software tools were developed to move data, make them available to the full international community and validate them for subsequent analysis. Batch analysis processing is performed with purpose-built workload management tools, which are mainly responsible for job preparation and job submission. Job monitoring and output management are implemented as the last part of the analysis chain. Grid tools provided by the LCG project are evaluated for gaining access to the data and the resources, providing a user-friendly interface to the physicists submitting analysis jobs. This work presents an overview of the current implementation and of the interactions between these components of the CMS analysis system.
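As a rough, hypothetical sketch of the chain just described (data discovery, job preparation, submission, monitoring and output management), the following Python outline shows how the stages fit together; none of these functions correspond to the actual CMS or LCG tool interfaces.

# Toy sketch of the analysis-chain stages named in the abstract.
# All names are illustrative stand-ins, not real CMS/LCG APIs.

def discover_data(dataset):
    # Data discovery: a real system queries a dataset catalogue;
    # here we fabricate a file list for illustration.
    return [f"{dataset}/file_{i}.root" for i in range(4)]

def prepare_jobs(files, files_per_job=2):
    # Job preparation: split the input files into batch jobs.
    return [files[i:i + files_per_job] for i in range(0, len(files), files_per_job)]

def submit(job_files):
    # Job submission: a real system hands the job to Grid workload
    # management; we return a fake job handle, already "done".
    return {"files": job_files, "status": "done"}

def fetch_output(handle):
    # Output management: retrieve the job's result.
    return f"histograms for {len(handle['files'])} files"

if __name__ == "__main__":
    files = discover_data("/simulated/ttbar")
    handles = [submit(job) for job in prepare_jobs(files)]
    # Monitoring: check each handle's status (trivial in this toy).
    outputs = [fetch_output(h) for h in handles if h["status"] == "done"]
    print(outputs)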
The Pool of Persistent Objects for LHC (POOL) project, part of the Large Hadron Collider (LHC) Computing Grid (LCG), is now entering its third year of active development. POOL provides the baseline persistency framework for three LHC experiments. It is based on a strict component model, insulating experiment software from a variety of storage technologies. This paper gives a brief overview of the POOL architecture, its main design principles and the experience gained with integration into LHC experiment frameworks. It also presents recent developments in the POOL work areas of relational database abstraction and object storage into relational database management systems (RDBMS).
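To illustrate the component-model idea in the abstract, the sketch below hides the storage technology behind one abstract interface, so experiment code is insulated from whether objects land in a file-based store or an RDBMS; the interface is invented for illustration and is not the real POOL C++ API.

from abc import ABC, abstractmethod

class StorageBackend(ABC):
    # Abstract component interface: experiment code sees only this.
    @abstractmethod
    def store(self, key, obj): ...
    @abstractmethod
    def load(self, key): ...

class FileBackend(StorageBackend):
    def __init__(self):
        self._data = {}            # stands in for a file-based object store
    def store(self, key, obj):
        self._data[key] = obj
    def load(self, key):
        return self._data[key]

class RelationalBackend(StorageBackend):
    def __init__(self):
        self._rows = {}            # stands in for an RDBMS table
    def store(self, key, obj):
        self._rows[key] = repr(obj)   # object-to-row mapping, simplified
    def load(self, key):
        return self._rows[key]

def persist_event(backend, event_id, event):
    # Experiment code depends only on the abstract interface,
    # insulating it from the storage technology choice.
    backend.store(event_id, event)

persist_event(FileBackend(), "evt001", {"pt": 42.0})
persist_event(RelationalBackend(), "evt001", {"pt": 42.0})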
The CMS experiment is currently developing a computing system capable of serving, processing and archiving the large number of events that will be generated when the CMS detector starts taking data. During 2004 CMS undertook a large-scale data challenge to demonstrate the ability of the CMS computing system to cope with a sustained data-taking rate equivalent to 25% of the startup rate. Its goals were: to run CMS event reconstruction at CERN for a sustained period at a 25 Hz input rate; to distribute the data to several regional centers; and to enable data access at those centers for analysis. Grid middleware was utilized to help complete all aspects of the challenge. To continue to provide scalable access from anywhere in the world to the data, CMS is developing a layer of software that uses Grid tools to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting their analysis jobs. This paper describes the data challenge experience with Grid infrastructure and the current development of the CMS analysis system.
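For scale, a back-of-the-envelope estimate of the data volume implied by the sustained 25 Hz rate; the per-event size below is an assumed illustrative figure, not a number from the paper.

# Rough estimate of the volumes implied by the challenge. The event
# size is an ASSUMED illustrative value; the abstract quotes only the
# 25 Hz reconstruction input rate.
rate_hz = 25                 # sustained reconstruction input rate
event_size_mb = 1.5          # ASSUMPTION: typical raw event size
seconds_per_day = 86_400

events_per_day = rate_hz * seconds_per_day
volume_tb_per_day = events_per_day * event_size_mb / 1e6

print(f"{events_per_day:,} events/day, ~{volume_tb_per_day:.1f} TB/day")
# -> 2,160,000 events/day, ~3.2 TB/day to reconstruct and distribute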
Integrating the SRB with the GIGGLE framework
Barrass, T.A.; Maroney, O.J.E.; Metson, S.; et al.
Nuclear Instruments & Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 11/2004, Volume 534, Issue 1
Journal Article
Peer-reviewed
Distributed data transfer is currently characterised by the use of widely disparate tools, meaning that significant human effort is required to maintain the distributed system. In order to realise the possibilities represented by Grid infrastructure, the reality of a heterogeneous computing environment must be tackled by providing means by which these disparate elements can communicate.
Two such data distribution tools are the SRB and the EU DataGrid's Data Management fabric, both widely used by many large scientific projects. Both provide similar functionality: the replication and cataloguing of datasets in a globally distributed environment. Significant quantities of data are currently stored in both. Moving data from the SRB to the EUDG, however, requires significant intervention and is therefore not scalable.
This paper presents a mechanism by which the SRB can automatically interact with the GIGGLE framework as implemented by the EUDG, allowing access to SRB data using Grid tools.
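In the spirit of the mechanism described above, the following Python sketch mirrors replica entries from an SRB-style catalogue into a GIGGLE-style replica catalogue that maps logical file names to physical locations; both catalogue classes and all identifiers are invented stand-ins, not the real SRB or EUDG interfaces.

class SrbCatalogue:
    def list_replicas(self):
        # Stand-in for querying the SRB metadata catalogue.
        return {"lfn:run42/hits.dat": "srb://srb.host/zone/run42/hits.dat"}

class GiggleReplicaCatalogue:
    def __init__(self):
        self._mappings = {}
    def register(self, lfn, pfn):
        # A GIGGLE-style replica catalogue maps logical file names
        # (LFNs) to physical file locations (PFNs).
        self._mappings.setdefault(lfn, set()).add(pfn)
    def lookup(self, lfn):
        return self._mappings.get(lfn, set())

def synchronise(srb, rc):
    # The automatic step replacing the manual intervention noted above:
    # every SRB-held replica becomes visible to Grid tools via the
    # replica catalogue.
    for lfn, pfn in srb.list_replicas().items():
        rc.register(lfn, pfn)

rc = GiggleReplicaCatalogue()
synchronise(SrbCatalogue(), rc)
print(rc.lookup("lfn:run42/hits.dat"))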
In the early 1990s, researchers at the Robert Bosch facility in Stuttgart invented a novel method of etching very deep features into silicon, which led to a patent being granted in 1994. Originally intended as a method of fabricating devices for the then-emerging automotive MEMS sector, its use has since diversified to cover virtually all other MEMS markets. Most recently, the use of the Bosch process has expanded into through-silicon via (TSV) etching in 3D packaging applications. This paper describes what is possible today using the Bosch process and considers future applications and uses. The diversity of the approach is illustrated through various examples, including cavities etched at high rates, features with aspect ratios of 90:1, profile tilt to ~±0.15°, TSVs for wafer stacking, and the increasing demand for more precise control, including end-point detection down to 0.05% open area.
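The Bosch process is commonly described as alternating short passivation and etch plasma steps; the sketch below models that cycling with a placeholder end-point check. The chemistries named in the comments (SF6 etch, C4F8 passivation) are the commonly cited ones, and all rates, depths and the end-point logic are illustrative assumptions, not Oxford Instruments recipe values.

def bosch_etch(target_depth_um, etch_per_cycle_um=0.5,
               endpoint_reached=lambda depth_um: False):
    # Highly simplified model of alternating-cycle deep reactive ion etching.
    depth_um = 0.0
    cycles = 0
    while depth_um < target_depth_um and not endpoint_reached(depth_um):
        # Passivation step: deposit a conformal fluorocarbon film
        # (commonly a C4F8 plasma) on all surfaces.
        # Etch step: directional ion bombardment in an SF6 plasma clears
        # the film at the feature bottom, so etching proceeds downward
        # while the sidewalls stay protected.
        depth_um += etch_per_cycle_um    # ASSUMED per-cycle etch depth
        cycles += 1
    # In production tools, end-point detection monitors a plasma signal
    # (quoted above as sensitive down to 0.05% open area); the callback
    # here is only a placeholder for that.
    return depth_um, cycles

depth, cycles = bosch_etch(100.0)
print(f"etched ~{depth:.0f} um in {cycles} cycles")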
Understanding ship stability is critical for all maritime students or professionals who are studying for a deck or engineering certificate of competency, or seeking promotion to a higher rank within any branch of the merchant marine or navy. The sixth edition of the now classic 'Ship Stability' provides a comprehensive introduction to all aspects of ship stability and ship strength, squat, interaction and trim, and materials stresses and forces. It remains the market-leading ship stability text, widely used at sea and on shore. New content includes coverage of now-mandatory double-skin tankers and fast ferries. The book meets STCW (Standards of Training, Certification & Watchkeeping) requirements and includes self-examination material: essential reading for professionals and students alike.