Fisheries management often relies heavily on precautionary reference points estimated from complex statistical models. An alternative approach uses management strategies defined by mathematical algorithms that calculate controls, like catch quotas, directly from the observed data. We combine these two distinct paradigms into a common framework using arguments from the historical development of quantum mechanics. In fisheries, as in physics, the core of the argument lies in the technical details. We illustrate the process of designing a management algorithm similar to one actually used by the International Whaling Commission. Reference points and surplus production models play a conceptual role in defining management strategies, even if marine populations do not obey such simplistic rules. Physicists have encountered similar problems in formulating quantum theory, where mathematical objects with seemingly unrealistic properties generate results of great practical importance.
This paper describes a convenient simulation model, based on the compound binomial-gamma distribution, to assist the planning and design of groundfish trawl surveys. The analysis uses swept-area density measurements from stratified tows to give a simple nonparametric biomass estimate. A parametric simulation model requires only three input parameters for each stratum, which can be estimated initially from past surveys or commercial fishery data. Analytical results provide intuitive algorithms for estimating variances, investigating tow allocation strategies, and exploring potential survey results. Simulations make it possible to compare the estimated biomass with its true value and to assess the coverage properties of bootstrap confidence intervals. Bias correction and acceleration both improve the results, but small samples taken from populations with highly variable densities tend to produce underestimates of available biomass. The simulation framework allows easy adaptation to address broader issues, such as the design of a multispecies survey.
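The survey design described above can be illustrated with a small sketch. This is not the paper's actual model; it is a minimal illustration assuming a "delta-gamma" form of the compound binomial-gamma idea, with three hypothetical parameters per stratum (probability of a nonzero tow, mean nonzero density, and its coefficient of variation) and a simple percentile bootstrap over tows:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_stratum(n_tows, p_nonzero, mean_density, cv, rng):
    """Simulate swept-area densities for one stratum: each tow catches
    fish with probability p_nonzero; nonzero densities follow a gamma
    distribution with the given mean and coefficient of variation."""
    nonzero = rng.random(n_tows) < p_nonzero
    shape = 1.0 / cv**2              # gamma shape implied by the CV
    scale = mean_density / shape     # gamma scale implied by the mean
    d = np.zeros(n_tows)
    d[nonzero] = rng.gamma(shape, scale, nonzero.sum())
    return d

def biomass_estimate(densities, areas):
    """Stratified swept-area estimate: sum of stratum area x mean density."""
    return sum(a * d.mean() for d, a in zip(densities, areas))

# Two hypothetical strata: (tows, p_nonzero, mean density t/km^2, CV)
specs = [(30, 0.7, 5.0, 1.2), (20, 0.4, 12.0, 1.5)]
areas = [1000.0, 400.0]              # stratum areas, km^2
obs = [simulate_stratum(n, p, m, cv, rng) for n, p, m, cv in specs]
b_hat = biomass_estimate(obs, areas)

# Percentile bootstrap: resample tows with replacement within strata.
boot = [biomass_estimate([rng.choice(d, d.size) for d in obs], areas)
        for _ in range(2000)]
lo, hi = np.quantile(boot, [0.025, 0.975])
print(f"B_hat = {b_hat:.0f}, 95% CI = ({lo:.0f}, {hi:.0f})")
```

Repeating this loop with a known "true" biomass is what allows the coverage and underestimation behaviour mentioned in the abstract to be explored; the bias-corrected and accelerated (BCa) variant replaces the raw percentiles used here.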
Abstract
We report a measurement of the flux-integrated $\nu_{\mu}$ charged-current cross sections on water, hydrocarbon, and iron in the T2K on-axis neutrino beam with a mean neutrino energy of 1.5 GeV. The measured cross sections on water, hydrocarbon, and iron are $\sigma^{\rm{H_{2}O}}_{\rm{CC}} = (0.840\pm 0.010(\mathrm{stat.})^{+0.10}_{-0.08}(\mathrm{syst.}))\times10^{-38}\,\mathrm{cm}^2$/nucleon, $\sigma^{\rm{CH}}_{\rm{CC}} = (0.817\pm 0.007(\mathrm{stat.})^{+0.11}_{-0.08}(\mathrm{syst.}))\times10^{-38}\,\mathrm{cm}^2$/nucleon, and $\sigma^{\rm{Fe}}_{\rm{CC}} = (0.859\pm 0.003(\mathrm{stat.})^{+0.12}_{-0.10}(\mathrm{syst.}))\times10^{-38}\,\mathrm{cm}^2$/nucleon, respectively, for a restricted phase space of induced muons: $\theta_{\mu}<45^{\circ}$ and $p_{\mu}>0.4$ GeV/$c$ in the laboratory frame. The measured cross section ratios are ${\sigma^{\rm{H_{2}O}}_{\rm{CC}}}/{\sigma^{\rm{CH}}_{\rm{CC}}} = 1.028\pm 0.016(\mathrm{stat.})\pm 0.053(\mathrm{syst.})$, ${\sigma^{\rm{Fe}}_{\rm{CC}}}/{\sigma^{\rm{H_{2}O}}_{\rm{CC}}} = 1.023\pm 0.012(\mathrm{stat.})\pm 0.058(\mathrm{syst.})$, and ${\sigma^{\rm{Fe}}_{\rm{CC}}}/{\sigma^{\rm{CH}}_{\rm{CC}}} = 1.049\pm 0.010(\mathrm{stat.})\pm 0.043(\mathrm{syst.})$. These results, with an unprecedented precision for measurements of neutrino cross sections on water in the studied energy region, show good agreement with the current neutrino interaction models used in the T2K oscillation analyses.
Abstract
The T2K experiment presents new measurements of neutrino oscillation parameters using $19.7(16.3)\times 10^{20}$ protons on target (POT) in (anti-)neutrino mode at the far detector (FD). Compared to the previous analysis, an additional $4.7\times 10^{20}$ POT of neutrino data were collected at the FD. Significant improvements were made to the analysis methodology, with the near-detector analysis introducing new selections and using more than double the data. Additionally, this is the first T2K oscillation analysis to use NA61/SHINE data on a replica of the T2K target to tune the neutrino flux model, and the neutrino interaction model was improved to include new nuclear effects and calculations. Frequentist and Bayesian analyses are presented, including results on $\sin^2\theta_{13}$ and the impact of priors on the $\delta_{\mathrm{CP}}$ measurement. Both analyses prefer the normal mass ordering and the upper octant of $\sin^2\theta_{23}$, with a nearly maximally CP-violating phase. Assuming the normal ordering and using the constraint on $\sin^2\theta_{13}$ from reactors, $\sin^2\theta_{23}=0.561^{+0.021}_{-0.032}$ using Feldman–Cousins corrected intervals, and $\Delta m^2_{32}=2.494^{+0.041}_{-0.058}\times 10^{-3}~\mathrm{eV}^2$ using constant $\Delta\chi^{2}$ intervals. The CP-violating phase is constrained to $\delta_{\mathrm{CP}}=-1.97^{+0.97}_{-0.70}$ using Feldman–Cousins corrected intervals, and $\delta_{\mathrm{CP}}=0,\pi$ is excluded at more than the 90% confidence level. A Jarlskog invariant of zero is excluded at more than the $2\sigma$ credible level using a flat prior in $\delta_{\mathrm{CP}}$, and just below $2\sigma$ using a flat prior in $\sin\delta_{\mathrm{CP}}$.
When the external constraint on $\sin^2\theta_{13}$ is removed, $\sin^2\theta_{13}=28.0^{+2.8}_{-6.5}\times 10^{-3}$, in agreement with measurements from reactor experiments. These results are consistent with previous T2K analyses.
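For context, the Jarlskog invariant referred to above is, in the standard PMNS parameterisation, the combination

```latex
J_{CP} = \sin\theta_{13}\,\cos^{2}\theta_{13}\,
         \sin\theta_{12}\,\cos\theta_{12}\,
         \sin\theta_{23}\,\cos\theta_{23}\,
         \sin\delta_{\mathrm{CP}},
```

which vanishes when $\delta_{\mathrm{CP}}=0$ or $\pi$; this is why a flat prior in $\delta_{\mathrm{CP}}$ and a flat prior in $\sin\delta_{\mathrm{CP}}$ give slightly different exclusion levels for $J_{CP}=0$.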
Schnute, J. T. and Haigh, R. 2007. Compositional analysis of catch curve data, with an application to Sebastes maliger. – ICES Journal of Marine Science, 64: 218–233. This paper applies modern ...compositional analysis to catch curve data from a quillback rockfish (Sebastes maliger) population in British Columbia, Canada. Bubble plots and ternary diagrams portray variable age distributions and highlight distinctions between commercial and survey sample data. The models formalize important historical issues in catch curve analysis related to selectivity and recruitment variability, where a particular model corresponds to a prescribed vector of design parameters. The roles that compositional distributions (multinomial, Dirichlet, logistic-normal) can play in fishery data analysis are described, and Bayesian methods are used to examine how the distribution of a key mortality parameter depends on model choice. The framework provides a direct link between model designs and policy outcomes that depend on estimated mortalities or mortality ratios.
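The classical catch-curve idea underlying this work can be sketched in a few lines. This is not the paper's compositional model; it is the textbook version it generalizes, with hypothetical numbers: above the age of full selectivity, log numbers-at-age decline linearly with slope $-Z$, the total mortality rate:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical numbers-at-age sample: full selectivity from age 10,
# true total mortality Z = 0.08 (plausible for a long-lived rockfish).
ages = np.arange(10, 41)
Z_true = 0.08
expected = 5000 * np.exp(-Z_true * (ages - ages[0]))
counts = rng.poisson(expected)  # sampling noise in observed ages

# Classical catch-curve estimate: regress log counts on age;
# the negative of the slope estimates total mortality Z.
mask = counts > 0
slope, intercept = np.polyfit(ages[mask], np.log(counts[mask]), 1)
Z_hat = -slope
print(f"Z_hat = {Z_hat:.3f}")
```

The compositional approach in the paper replaces this log-linear regression with explicit likelihoods (multinomial, Dirichlet, logistic-normal) on the observed age proportions, which is what allows selectivity and recruitment variability to be modelled rather than assumed away.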
Rhizopus delemar lipase catalysed ester hydrolysis of the α-methoxy-β-phenylpropanoate 1 affords the (R)-(+) and (S)-(−) isomers in >84% enantiomeric excess. Absolute stereochemistry was determined by a single crystal X-ray analysis of a related synthetic analogue. The activity of these two enantiomers on glucose transport in vitro and as anti-diabetic agents in vivo is reported, and their unexpected equivalence is attributed to an enzyme-mediated stereospecific isomerisation of the (R)-(+) isomer. Binding studies using recombinant human PPARγ (peroxisome proliferator-activated receptor γ), now established as a molecular target for this compound class, indicate a 20-fold higher binding affinity for the (S) antipode relative to the (R) antipode.
To evaluate the extent to which the inter-institutional, inter-disciplinary mobilisation of data and skills in the Farr Institute contributed to establishing the emerging field of data science for health in the UK.
We evaluated evidence across six domains characterising a new field of science: defining central scientific challenges, demonstrating how the central challenges might be solved, creating novel interactions among groups of scientists, training new types of experts, re-organising universities, and demonstrating impacts in society. We carried out citation, network, and time-trend analyses of publications, and a narrative review of infrastructure, methods, and tools.
Four UK centres in London, North England, Scotland and Wales (23 university partners), 2013-2018.
1. The Farr Institute helped define a central scientific challenge, publishing a research corpus that demonstrated insights from electronic health record (EHR) and administrative data at each stage of the translational cycle, in 593 papers on PubMed with at least one Farr Institute author affiliation. 2. The Farr Institute offered some demonstrations of how these scientific challenges might be solved: it established the first four ISO 27001-certified trusted research environments in the UK, approved more than 1000 research users, and published on 102 unique EHR and administrative data sources, although there was no clear evidence of an increase in novel, sustained record linkages. The Farr Institute established open platforms for EHR phenotyping algorithms and validations (>70 diseases, CALIBER). Sample sizes showed some evidence of increase but remained less than 10% of the UK population in primary care-hospital care linked studies. 3. The Farr Institute created novel interactions among researchers: the co-author publication network expanded from 944 unique co-authors (based on 67 publications in the first 30 months) to 3839 unique co-authors (545 papers in the final 30 months). 4. Training expanded substantially, with 3 new masters courses, training >400 people at masters, short-course, and leadership level, plus 48 PhD students. 5. Universities reorganised: 4/5 centres established 27 new tenured faculty positions and 3 new university institutes. 6. Emerging evidence of impacts included >3200 citations for the 10 most cited papers, and Farr research informed eight practice-changing clinical guidelines and policies relevant to the health of millions of UK citizens.
The Farr Institute played a major role in establishing and growing the field of data science for health in the UK, with some initial evidence of benefits for health and healthcare. The Farr Institute has now expanded into Health Data Research (HDR) UK, but key challenges remain, including how to network such activities internationally.
The T2K experiment widely uses plastic scintillator as a target for neutrino interactions and as an active medium for the measurement of charged particles produced in neutrino interactions at its near detector complex. Over 10 years of operation, the light yield recorded by the scintillator-based subsystems has been observed to degrade by 0.9--2.2\% per year. Extrapolation of the degradation rate through to 2040 indicates that the recorded light yield should remain above the lower threshold used by the current reconstruction algorithms for all subsystems. This will allow the near detectors to continue contributing to important physics measurements during the T2K-II and Hyper-Kamiokande eras. Additionally, work to disentangle the degradation of the plastic scintillator and wavelength-shifting fibres shows that the reduction in light yield can be attributed to the ageing of the plastic scintillator.
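The extrapolation above amounts to compounding an annual loss rate. As a rough sketch, with hypothetical inputs (worst-case 2.2% annual loss, normalised yield of 1.0 at a notional 2010 start; the actual start years and thresholds are subsystem-specific and not given here):

```python
def light_yield(year, y0=1.0, rate=0.022, start=2010):
    """Geometric decay: recorded yield falls by `rate` per year of
    operation, relative to y0 at the start year."""
    return y0 * (1.0 - rate) ** (year - start)

# Worst-case relative yield after 30 years of operation.
y_2040 = light_yield(2040)
print(f"relative yield in 2040: {y_2040:.2f}")
```

Even at the upper end of the measured degradation range, roughly half the initial light yield survives 30 years of compounding, which is consistent with the abstract's conclusion that reconstruction thresholds remain satisfied.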