    Reducing Prawn-trawl Bycatch in Australia: An Overview and an Example from Queensland

    Prawn trawling occurs in most states of Australia in tropical, subtropical, and temperate waters. Bycatch occurs to some degree in all Australian trawl fisheries, and there is pressure to reduce the levels of trawl fishery bycatch. This paper gives a brief overview of the bycatch issues and technological solutions that have been evaluated or adopted in Australian prawn-trawl fisheries. Turtle excluder devices (TEDs) and bycatch reduction devices (BRDs) are the principal solutions to bycatch in Australian prawn-trawl fisheries. This paper focuses on a major prawn-trawl fishery of northeastern Australia, and the results of commercial use of TEDs and BRDs in the Queensland east coast trawl fishery are presented. New industry designs are described, and the status of TED and BRD adoption and regulation is summarized. The implementation of technological solutions to reduce fishery bycatch is generally assumed to assist prawn-trawl fisheries within Australia in achieving legislative requirements for minimal environmental impact and ecologically sustainable development.

    Betti number signatures of homogeneous Poisson point processes

    The Betti numbers are fundamental topological quantities that describe the k-dimensional connectivity of an object: B_0 is the number of connected components and B_k effectively counts the number of k-dimensional holes. Although they are appealing natural descriptors of shape, the higher-order Betti numbers are more difficult to compute than other measures and so have not previously been studied per se in the context of stochastic geometry or statistical physics. As a mathematically tractable model, we consider the expected Betti numbers per unit volume of Poisson-centred spheres with radius alpha. We present results from simulations and derive analytic expressions for the low-intensity, small-radius limits of the Betti numbers in one, two, and three dimensions. The algorithms and analysis depend on alpha-shapes, a construction from computational geometry that deserves to be more widely known in the physics community. Comment: Submitted to PRE. 11 pages, 10 figures.
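
    Of the quantities the abstract defines, B_0 is the one computable without the alpha-shape machinery: two balls of radius alpha overlap exactly when their centres lie within 2*alpha, so the connected components of the union follow from a union-find over that proximity graph. A minimal sketch (function name and sample points are hypothetical; the higher-order B_k genuinely require alpha-shapes or a computational-topology library):

```python
import math

def betti0_union_of_balls(points, alpha):
    """B_0 (number of connected components) of a union of balls of radius
    alpha centred on `points`: two balls overlap iff their centres are
    within 2*alpha, so union-find over that proximity graph suffices."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= 2 * alpha:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj
    return len({find(i) for i in range(len(points))})

pts = [(0.0, 0.0), (0.5, 0.0), (3.0, 0.0)]
print(betti0_union_of_balls(pts, 0.3))  # 2: the first two balls merge
```

Sweeping alpha and recording where components appear and merge is exactly the kind of signature the paper studies as a function of radius.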

    Bayesian Exponential Random Graph Models with Nodal Random Effects

    We extend the well-known and widely used Exponential Random Graph Model (ERGM) by including nodal random effects to compensate for heterogeneity in the nodes of a network. The Bayesian framework for ERGMs proposed by Caimo and Friel (2011) forms the basis of our modelling algorithm. A central question for network models is model selection; following the Bayesian paradigm, we focus on estimating Bayes factors. To do so, we develop an approximate but feasible calculation of the Bayes factor which allows one to pursue model selection. Two data examples and a small simulation study illustrate our mixed-model approach and the corresponding model selection. Comment: 23 pages, 9 figures, 3 tables.
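
    Once marginal likelihoods (evidences) are in hand, the Bayes factor comparison itself is a one-liner in log space. A generic illustration with invented numbers; the paper's actual contribution is the hard part, namely approximating the ERGM evidences in the first place:

```python
import math

def log_bayes_factor(log_evidence_m1, log_evidence_m2):
    """Log Bayes factor log BF_12 = log p(y|M1) - log p(y|M2).
    Working in log space avoids overflow for large-magnitude evidences."""
    return log_evidence_m1 - log_evidence_m2

# Hypothetical log-evidence estimates for two candidate ERGMs:
log_bf = log_bayes_factor(-1402.3, -1410.9)
print(round(log_bf, 1))             # 8.6
print(math.exp(log_bf) > 150)       # True: "very strong" on the usual scale
```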

    Cold atom gravimetry with a Bose-Einstein Condensate

    We present a cold atom gravimeter operating with a sample of Bose-condensed Rubidium-87 atoms. Using a Mach-Zehnder configuration with the two arms separated by a two-photon Bragg transition, we observe interference fringes with a visibility of 83% at T=3 ms. We exploit large momentum transfer (LMT) beam splitting to increase the enclosed space-time area of the interferometer using higher-order Bragg transitions and Bloch oscillations. We also compare fringes from condensed and thermal sources, and observe a reduced visibility of 58% for the thermal source. We suspect the loss in visibility is caused partly by wavefront aberrations, to which the thermal source is more susceptible due to its larger transverse momentum spread. Finally, we discuss briefly the potential advantages of using a coherent atomic source for LMT, and present a simple mean-field model to demonstrate that with currently available experimental parameters, interaction-induced dephasing will not limit the sensitivity of inertial measurements using freely-falling, coherent atomic sources. Comment: 6 pages, 4 figures. Final version, published PR
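
    The quoted visibilities follow the standard fringe-contrast definition. A small sketch (the fringe extrema below are hypothetical values chosen only to reproduce the quoted 83% and 58% figures):

```python
def fringe_visibility(i_max, i_min):
    """Standard fringe visibility V = (Imax - Imin) / (Imax + Imin)."""
    return (i_max - i_min) / (i_max + i_min)

# Hypothetical normalised fringe extrema matching the quoted contrasts:
print(round(fringe_visibility(0.915, 0.085), 2))  # 0.83, condensed source
print(round(fringe_visibility(0.790, 0.210), 2))  # 0.58, thermal source
```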

    Exponential Random Graph Modeling for Complex Brain Networks

    Exponential random graph models (ERGMs), also known as p* models, have been utilized extensively in the social science literature to study complex networks and how their global structure depends on underlying structural components. However, the literature on their use in biological networks (especially brain networks) has remained sparse. Descriptive models based on a specific feature of the graph (clustering coefficient, degree distribution, etc.) have dominated connectivity research in neuroscience. Corresponding generative models have been developed to reproduce one of these features. However, the complexity inherent in whole-brain network data necessitates the development and use of tools that allow the systematic exploration of several features simultaneously and how they interact to form the global network architecture. ERGMs provide a statistically principled approach to assessing how a set of interacting local brain network features gives rise to the global structure. We illustrate the utility of ERGMs for modeling, analyzing, and simulating complex whole-brain networks with network data from normal subjects. We also provide a foundation for the selection of important local features through the implementation and assessment of three selection approaches: a traditional p-value based backward selection approach, an information criterion approach (AIC), and a graphical goodness-of-fit (GOF) approach. The graphical GOF approach serves as the best method, given the scientific interest in being able to capture and reproduce the structure of fitted brain networks.
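
    The "interacting local features" an ERGM works with are sufficient statistics of the graph, e.g. edge and triangle counts in P(G) ∝ exp(θ_e·edges + θ_t·triangles). A minimal sketch of computing two such statistics (function and variable names are hypothetical; real analyses typically use a dedicated ERGM package):

```python
from itertools import combinations

def ergm_stats(nodes, edges):
    """Edge and triangle counts: sufficient statistics of a simple ERGM
    with P(G) proportional to exp(theta_e * n_edges + theta_t * n_triangles)."""
    edge_set = {frozenset(e) for e in edges}
    n_edges = len(edge_set)
    n_triangles = sum(
        1 for a, b, c in combinations(sorted(nodes), 3)
        if {frozenset((a, b)), frozenset((b, c)), frozenset((a, c))} <= edge_set
    )
    return n_edges, n_triangles

# A 4-node graph containing exactly one triangle:
print(ergm_stats([1, 2, 3, 4], [(1, 2), (2, 3), (1, 3), (3, 4)]))  # (4, 1)
```

Fitting then searches for the θ values under which graphs with these statistics are typical; the graphical GOF check compares statistics of simulated graphs against the observed ones.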

    Artificial neural networks and player recruitment in professional soccer

    The aim was to objectively identify key performance indicators in professional soccer that influence outfield players’ league status using an artificial neural network. Mean technical performance data were collected from 966 outfield players’ (mean ± SD; age: 25 ± 4 yr; height: 1.81 ± … m) 90-minute performances in the English Football League. ProZone’s MatchViewer system and online databases were used to collect data on 347 indicators assessing the total number, accuracy, and consistency of passes, tackles, possessions regained, clearances, and shots. Players were assigned to one of three categories based on where they went on to complete most of their match time in the following season: group 0 (n = 209 players) went on to play in a lower soccer league, group 1 (n = 637 players) remained in the Football League Championship, and group 2 (n = 120 players) consisted of players who moved up to the English Premier League. The models created correctly predicted between 61.5% and 78.8% of the players’ league status. The model with the highest average test performance was for group 0 v 2 (U21 international caps, international caps, median tackles, percentage of first-time passes unsuccessful upper quartile, maximum dribbles, and possessions gained minimum), which correctly predicted 78.8% of the players’ league status with a test error of 8.3%. To date, there has not been a published example of an objective method of predicting career trajectory in soccer. This is a significant development, as it highlights the potential for machine learning to be used in the scouting and recruitment process in a professional soccer environment.
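
    The underlying task is supervised classification: indicator vectors in, league-status label out. A toy sketch of that fit/predict loop using a logistic unit and two invented features (the study itself used a multilayer ANN over 347 ProZone indicators; everything below is illustrative, not the paper's model):

```python
import math

def train_logistic(samples, labels, lr=0.5, epochs=2000):
    """Fit a single logistic unit by stochastic gradient descent.
    Stands in for the study's ANN purely to show the training loop."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            z = max(min(z, 60.0), -60.0)          # clamp to avoid overflow
            p = 1.0 / (1.0 + math.exp(-z))        # predicted P(label = 1)
            g = p - y                             # gradient of log-loss wrt z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Hypothetical (pass accuracy, tackles per 90) features; label 1 = moved up
X = [(0.70, 2.0), (0.72, 2.5), (0.88, 4.0), (0.90, 3.5)]
y = [0, 0, 1, 1]
w, b = train_logistic(X, y)
print([predict(w, b, x) for x in X])  # [0, 0, 1, 1] on this separable toy data
```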

    A multibeam atom laser: coherent atom beam splitting from a single far detuned laser

    We report the experimental realisation of a multibeam atom laser. A single continuous atom laser is outcoupled from a Bose-Einstein condensate (BEC) via an optical Raman transition. The atom laser is subsequently split into up to five atomic beams with slightly different momenta, resulting in multiple, nearly co-propagating, coherent beams which could be of use in interferometric experiments. The splitting process itself is a novel realisation of Bragg diffraction, driven by each of the optical Raman laser beams independently. This presents a significantly simpler implementation of an atomic beam splitter, one of the main elements of coherent atom optics.

    Quantum projection noise limited interferometry with coherent atoms in a Ramsey type setup

    Every measurement of the population in an uncorrelated ensemble of two-level systems is limited by what is known as the quantum projection noise limit. Here, we present quantum projection noise limited performance of a Ramsey type interferometer using freely propagating coherent atoms. The experimental setup is based on an electro-optic modulator in an inherently stable Sagnac interferometer, optically coupling the two interfering atomic states via a two-photon Raman transition. Going beyond the quantum projection noise limit requires the use of reduced quantum uncertainty (squeezed) states. The experiment described demonstrates atom interferometry at the fundamental noise level and allows the observation of possible squeezing effects in an atom laser, potentially leading to improved sensitivity in atom interferometers. Comment: 8 pages, 8 figures, published in Phys. Rev.
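
    For N uncorrelated atoms each projected onto one of two states with probability p, the measured population fraction fluctuates binomially, which is the projection noise limit the abstract refers to. A one-line sketch (atom number below is illustrative, not from the experiment):

```python
import math

def projection_noise(p, n_atoms):
    """Quantum projection noise on the measured transition probability p
    for N uncorrelated two-level atoms: sigma_p = sqrt(p * (1 - p) / N)."""
    return math.sqrt(p * (1.0 - p) / n_atoms)

# At mid-fringe (p = 0.5), where phase sensitivity is best, with 10^4 atoms:
print(projection_noise(0.5, 10_000))  # ~0.005, i.e. a 0.5% population noise
```

Squeezed states redistribute this uncertainty between conjugate variables, which is how the limit can be surpassed.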

    Statistical Mechanics of Steiner trees

    The Minimum Weight Steiner Tree (MST) is an important combinatorial optimization problem over networks that has applications in a wide range of fields. Here we discuss a general technique to translate the imposed global connectivity constraint into many local constraints that can be analyzed with cavity equation techniques. This approach leads to a new optimization algorithm for the MST and allows us to analyze the statistical mechanics properties of the MST on random graphs of various types.
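
    To make the problem concrete: a Steiner tree must connect a set of terminal nodes and may optionally pass through non-terminal (Steiner) nodes when that is cheaper. A brute-force sketch for tiny instances (the cavity/message-passing algorithm the abstract describes scales far better; names and the example graph here are invented):

```python
from itertools import combinations

def mst_weight(nodes, edges):
    """Kruskal's minimum spanning tree over the induced subgraph on
    `nodes`; returns None if that subgraph is disconnected."""
    parent = {v: v for v in nodes}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    total, used = 0, 0
    for w, u, v in sorted((w, u, v) for u, v, w in edges
                          if u in parent and v in parent):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            total += w
            used += 1
    return total if used == len(nodes) - 1 else None

def steiner_tree_weight(terminals, optional, edges):
    """Exact minimum Steiner tree weight by trying every subset of
    optional (Steiner) nodes. Exponential: for tiny instances only."""
    best = None
    for k in range(len(optional) + 1):
        for extra in combinations(optional, k):
            w = mst_weight(set(terminals) | set(extra), edges)
            if w is not None and (best is None or w < best):
                best = w
    return best

# Routing a, b, c through the hub s costs 3; terminal-only routes cost 4.
edges = [("a", "s", 1), ("b", "s", 1), ("c", "s", 1),
         ("a", "b", 2), ("b", "c", 2)]
print(steiner_tree_weight(["a", "b", "c"], ["s"], edges))  # 3
```

The exponential subset enumeration is exactly the global constraint that the paper's cavity approach replaces with local messages.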

    Statistical inference of the generation probability of T-cell receptors from sequence repertoires

    Stochastic rearrangement of germline DNA by VDJ recombination is at the origin of immune system diversity. This process is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Since any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on non-productive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our distribution predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system. Comment: 20 pages, including Appendix.
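
    The core observation, that one sequence can arise from many hidden recombination scenarios, means the generation probability is a sum over scenarios. A toy sketch with a drastically simplified model (only V choice, J choice, and untemplated insertions; real VDJ recombination also involves D segments and deletions, and the paper infers the event distributions themselves by maximum likelihood; all names and numbers here are invented):

```python
from itertools import product

def generation_probability(sequence, v_segments, j_segments, p_insert):
    """Toy P(s) = sum over hidden scenarios (V, J, insertions) of
    P(V) * P(J) * prod over inserted nucleotides of P(nt).
    A scenario is consistent if V-prefix + insertions + J-suffix == s."""
    total = 0.0
    for (v_seq, p_v), (j_seq, p_j) in product(v_segments, j_segments):
        if len(v_seq) + len(j_seq) > len(sequence):
            continue  # V and J would overlap: not handled by this toy model
        if sequence.startswith(v_seq) and sequence.endswith(j_seq):
            insert = sequence[len(v_seq):len(sequence) - len(j_seq)]
            p_ins = 1.0
            for nt in insert:
                p_ins *= p_insert[nt]
            total += p_v * p_j * p_ins
    return total

v_genes = [("AC", 0.6), ("A", 0.4)]                     # hypothetical V choices
j_genes = [("T", 0.5), ("GT", 0.5)]                     # hypothetical J choices
p_nt = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}     # uniform insertions
print(generation_probability("ACGT", v_genes, j_genes, p_nt))
# sums over the four consistent (V, J, insertion) scenarios
```

The inference problem the paper solves is the reverse direction: given only observed sequences, find the event distributions (here p_v, p_j, p_insert) that maximise the likelihood of the repertoire.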