
    Whole-gear efficiency of a benthic survey trawl for flatfish

    Whole-gear efficiency (the proportion of fish passing between the otter doors of a bottom trawl that are subsequently captured) was estimated from data collected during experiments to measure the herding efficiency of bridles and doors, the capture efficiency of the net, and the length of the bridles sufficiently close to the seafloor to elicit a herding response. The experiments focused on four species of flatfish: arrowtooth flounder (Atheresthes stomias), flathead sole (Hippoglossoides elassodon), rex sole (Glyptocephalus zachirus), and Dover sole (Microstomus pacificus). Whole-gear efficiency varied with fish length and reached maximum values between 40% and 50% for arrowtooth flounder, flathead sole, and rex sole. For Dover sole, however, whole-gear efficiency declined from a maximum of 33% over the length range sampled. Such efficiency estimates can be used to determine catchability, which, in turn, can be used to improve the accuracy of stock assessment models when the time series of a survey is short.
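
    As a schematic illustration of the efficiency definition above only (not the estimation procedure used in the study), whole-gear efficiency at length can be expressed as the proportion of fish available between the doors that end up in the catch; all counts below are hypothetical.

```python
import numpy as np

# Hypothetical length-binned counts for one species: fish passing between the
# otter doors (available) and fish retained in the net (caught).
length_cm = np.array([20, 30, 40, 50, 60])
available = np.array([120, 240, 310, 180, 60])
caught    = np.array([ 18,  86, 140,  77, 20])

# Whole-gear efficiency by length bin: proportion of available fish captured.
efficiency = caught / available
for length, eff in zip(length_cm, efficiency):
    print(f"{length} cm: whole-gear efficiency = {eff:.2f}")
```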

    Quantum computation with optical coherent states

    We show that quantum computation circuits using coherent states as the logical qubits can be constructed from simple linear networks, conditional photon measurements, and "small" coherent superposition resource states.
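
    For context, the coherent-state encoding behind such schemes uses |α⟩ and |−α⟩ as the logical qubit states; these are only approximately orthogonal, with overlap ⟨−α|α⟩ = exp(−2|α|²). The minimal sketch below evaluates that overlap for a few illustrative amplitudes (the values themselves are not from the paper).

```python
import numpy as np

def coherent_overlap(alpha, beta):
    """Overlap <alpha|beta> of two coherent states."""
    return np.exp(-0.5 * (abs(alpha) ** 2 + abs(beta) ** 2) + np.conj(alpha) * beta)

# Logical states |0> ~ |alpha>, |1> ~ |-alpha>: the overlap falls off as
# exp(-2|alpha|^2), so even fairly small amplitudes are nearly orthogonal.
for alpha in [0.5, 1.0, 1.5, 2.0]:
    print(f"alpha = {alpha}: <-alpha|alpha> = {coherent_overlap(-alpha, alpha):.3e}")
```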

    Selection from read-only memory with limited workspace

    Given an unordered array of N elements drawn from a totally ordered set and an integer k in the range from 1 to N, the classic selection problem asks for the k-th smallest element in the array. We study the complexity of this problem in the space-restricted random-access model: the input array is stored on read-only memory, and the algorithm has access to a limited amount of workspace. We prove that the linear-time prune-and-search algorithm, presented in most textbooks on algorithms, can be modified to use Θ(N) bits instead of Θ(N) words of extra space. Prior to our work, the best known algorithm, due to Frederickson, could perform the task with Θ(N) bits of extra space in O(N lg* N) time. Our result separates the space-restricted random-access model from the multi-pass streaming model, since we can surpass the Ω(N lg* N) lower bound known for the latter model. We also generalize our algorithm to the case where the size of the workspace is Θ(S) bits, for lg³ N ≤ S ≤ N. The running time of our generalized algorithm is O(N lg*(N/S) + N (lg N)/lg S), slightly improving over the O(N lg*(N (lg N)/S) + N (lg N)/lg S) bound of Frederickson's algorithm. To obtain these improvements, we develop a new data structure, called the wavelet stack, that we use for repeated pruning. We expect the wavelet stack to be a useful tool in other applications as well. Comment: 16 pages, 1 figure; preliminary version appeared in COCOON-201
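
    For readers unfamiliar with the starting point, this is the classic linear-time prune-and-search (median-of-medians) selection referred to above, in its textbook word-RAM form; it copies sub-arrays freely and makes no attempt at the read-only, Θ(N)-bit-workspace version developed in the paper.

```python
def select(arr, k):
    """Return the k-th smallest element (1-indexed) of arr via median-of-medians."""
    if len(arr) <= 25:
        return sorted(arr)[k - 1]
    # Median of each group of 5, then the median of those medians as the pivot.
    medians = [sorted(arr[i:i + 5])[len(arr[i:i + 5]) // 2]
               for i in range(0, len(arr), 5)]
    pivot = select(medians, (len(medians) + 1) // 2)
    lo = [x for x in arr if x < pivot]
    eq = [x for x in arr if x == pivot]
    hi = [x for x in arr if x > pivot]
    if k <= len(lo):                      # prune: recurse only into the part
        return select(lo, k)              # that contains the k-th smallest
    if k <= len(lo) + len(eq):
        return pivot
    return select(hi, k - len(lo) - len(eq))

print(select([9, 1, 8, 2, 7, 3, 6, 4, 5, 0], 5))   # -> 4
```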

    Effect of multimode entanglement on lossy optical quantum metrology

    In optical interferometry, multimode entanglement is often assumed to be the driving force behind quantum-enhanced measurements. Recent work has shown this assumption to be false: single-mode quantum states perform just as well as their multimode entangled counterparts. We go beyond this to show that when photon losses occur, an inevitability in any realistic system, multimode entanglement is actually detrimental to obtaining quantum-enhanced measurements. We specifically apply this idea to a superposition of coherent states, demonstrating that these states show a robustness to loss that allows them to significantly outperform their competitors in realistic systems. A practically viable measurement scheme is then presented that allows measurements close to the theoretical bound even in the presence of loss. These results promote an alternative way of approaching optical quantum metrology using single-mode states, which we expect to have significant implications for the future.
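
    The loss tolerance of coherent-state superpositions can be made concrete with the standard beam-splitter model of loss: a coherent state |±α⟩ simply becomes |±√η α⟩ under transmissivity η, while the off-diagonal coherence between the two branches is suppressed by the environment overlap exp(−2(1−η)|α|²). The sketch below evaluates that suppression factor; this is generic textbook loss modelling with illustrative parameters, not the metrology analysis of the paper.

```python
import numpy as np

def cat_coherence_factor(alpha, eta):
    """Suppression of the |alpha><-alpha| coherence of a coherent-state
    superposition after a loss channel of transmissivity eta."""
    return np.exp(-2.0 * (1.0 - eta) * abs(alpha) ** 2)

# Small amplitudes retain most of their coherence at 10-20% loss;
# larger amplitudes decohere much faster.
for alpha in [1.0, 2.0, 4.0]:
    for eta in [0.9, 0.8]:
        print(f"alpha = {alpha}, eta = {eta}: "
              f"coherence factor = {cat_coherence_factor(alpha, eta):.3f}")
```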

    Finding the Median (Obliviously) with Bounded Space

    We prove that any oblivious algorithm using space S to find the median of a list of n integers from {1, ..., 2n} requires time Ω(n log log_S n). This bound also applies to the problem of determining whether the median is odd or even. It is nearly optimal, since Chan, following Munro and Raman, has shown that there is a (randomized) selection algorithm using only s registers, each of which can store an input value or an O(log n)-bit counter, that makes only O(log log_s n) passes over the input. The bound also implies a size lower bound for read-once branching programs computing the low-order bit of the median, and implies the analog of P ≠ NP ∩ coNP for oblivious branching programs of length o(n log log n).
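
    As a point of reference for the bounded-space, multi-pass model (and not the randomized O(log log_s n)-pass algorithm cited above), the k-th smallest of n integers from {1, ..., 2n} can be found with s counters by bucket-counting the current candidate value range on each pass; this simple deterministic baseline needs O(log_s n) passes.

```python
import math

def kth_smallest_multipass(read_input, n, k, s):
    """Return the k-th smallest (1-indexed) of n integers from {1, ..., 2n},
    using s bucket counters plus O(1) extra variables; read_input() must
    yield the (read-only) input afresh on every call."""
    lo, hi = 1, 2 * n          # value range known to contain the answer
    r = k                      # rank of the answer among values in [lo, hi]
    while lo < hi:
        width = math.ceil((hi - lo + 1) / s)
        counts = [0] * s                       # the s counters
        for x in read_input():                 # one pass over the input
            if lo <= x <= hi:
                counts[(x - lo) // width] += 1
        seen = 0
        for j, c in enumerate(counts):         # bucket holding rank r
            if seen + c >= r:
                r -= seen
                lo, hi = lo + j * width, min(hi, lo + (j + 1) * width - 1)
                break
            seen += c
    return lo

data = [7, 3, 11, 3, 9, 1, 14, 5, 2, 8]        # n = 10, values in {1, ..., 20}
print(kth_smallest_multipass(lambda: iter(data), len(data), 5, s=4))   # -> 5
```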

    easySTORM: a robust, lower-cost approach to localisation and TIRF microscopy

    TIRF and STORM microscopy are super-resolving fluorescence imaging modalities whose current implementations on standard microscopes can entail significant complexity and cost. We present a straightforward, low-cost approach to implementing STORM and TIRF that takes advantage of multimode optical fibres and multimode diode lasers to provide the required excitation light. Combined with open-source software and relatively simple protocols to prepare samples for STORM, including the use of Vectashield for non-TIRF imaging, this approach enables TIRF and STORM imaging of cells labelled with appropriate dyes or expressing suitable fluorescent proteins to become widely accessible at low cost.

    Absolute quantification of the host-to-parasite DNA ratio in Theileria parva-infected lymphocyte cell lines

    Theileria parva is a tick-transmitted intracellular apicomplexan pathogen of cattle in sub-Saharan Africa that causes East Coast fever (ECF). ECF is an acute, fatal disease that kills over one million cattle annually, imposing a tremendous burden on African smallholder cattle farmers. The pathology and level of T. parva infection in its wildlife host, the African buffalo (Syncerus caffer), and in cattle are distinct. We have developed an absolute quantification method based on quantitative PCR (qPCR) in which recombinant plasmids containing single-copy genes specific to the parasite (apical membrane antigen 1 gene, ama1) or the host (hypoxanthine phosphoribosyltransferase 1, hprt1) are used as the quantification reference standards. Our study shows that T. parva and bovine cells are present in similar numbers in T. parva-infected lymphocyte cell lines and that, consequently, owing to its much smaller genome size, T. parva DNA comprises between 0.9% and 3% of the total DNA extracted from these lines. This absolute quantification assay of parasite and host genome copy number in a sample provides a simple and reliable method of assessing T. parva load in infected bovine lymphocytes, and is accurate over a wide range of host-to-parasite DNA ratios. Knowledge of the proportion of target DNA in a sample, as enabled by this method, is essential for efficient high-throughput genome sequencing of a variety of intracellular pathogens. This assay will also be very useful in future studies of interactions of distinct host and T. parva stocks and for fully characterizing the dynamics of ECF infection in the field.
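
    As an illustration of the kind of calculation such plasmid standards support, absolute copy numbers are read off a standard curve of quantification cycle (Cq) against log10 copies for each gene, and the host-to-parasite genome copy ratio follows directly. The slope, intercept, and Cq values below are hypothetical, not results from the study.

```python
def copies_from_cq(cq, slope, intercept):
    """Copy number from a qPCR standard curve Cq = slope * log10(copies) + intercept."""
    return 10 ** ((cq - intercept) / slope)

# Hypothetical standard-curve parameters (slope near -3.32, i.e. ~100% efficiency)
ama1_slope, ama1_intercept = -3.32, 38.0     # parasite single-copy gene (ama1)
hprt1_slope, hprt1_intercept = -3.35, 37.5   # host single-copy gene (hprt1)

# Hypothetical Cq values measured on the same DNA extract
ama1_cq, hprt1_cq = 21.4, 20.9

parasite_copies = copies_from_cq(ama1_cq, ama1_slope, ama1_intercept)
host_copies = copies_from_cq(hprt1_cq, hprt1_slope, hprt1_intercept)
print(f"parasite genome copies: {parasite_copies:.3e}")
print(f"host genome copies:     {host_copies:.3e}")
print(f"host-to-parasite copy ratio: {host_copies / parasite_copies:.2f}")
```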

    Single photon quantum non-demolition in the presence of inhomogeneous broadening

    Electromagnetically induced transparency (EIT) has often been proposed for generating nonlinear optical effects at the single-photon level, in particular as a means to effect a quantum non-demolition measurement of a single-photon field. Previous treatments have usually considered homogeneously broadened samples, but realisations in any medium will have to contend with inhomogeneous broadening. Here we reappraise an earlier scheme [Munro et al., Phys. Rev. A 71, 033819 (2005)] with respect to inhomogeneities and show an alternative mode of operation that is preferred in an inhomogeneous environment. We further show the implications of these results for a potential implementation in diamond containing nitrogen-vacancy colour centres. Our modelling shows that single-mode waveguide structures of length 200 μm in single-crystal diamond containing a dilute ensemble of only 200 NV⁻ centres are sufficient for quantum non-demolition measurements using EIT-based weak nonlinear interactions. Comment: 21 pages, 9 figures (some in colour) at low resolution for arXiv purposes
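
    A generic way to see the issue is to take the textbook weak-probe susceptibility of a Λ-type EIT medium and average it over a Gaussian spread of optical detunings, assuming the two-photon (spin) transition remains narrow. The sketch below does exactly that with illustrative parameters; it is not the specific scheme or the numbers of the paper.

```python
import numpy as np

def chi_eit(delta_p, omega_c, gamma_ge=1.0, gamma_gs=1e-3, shift=0.0):
    """Weak-probe susceptibility (arbitrary units) of a Lambda-type EIT medium
    whose optical transition is offset by `shift`; the control field is taken
    to be on resonance, so the two-photon detuning equals delta_p."""
    one_photon = delta_p - shift
    return 1j / (gamma_ge - 1j * one_photon
                 + (omega_c ** 2 / 4) / (gamma_gs - 1j * delta_p))

def averaged_absorption(delta_p, omega_c, sigma):
    """Im[chi] averaged over a Gaussian distribution (width sigma) of offsets."""
    shifts = np.linspace(-4 * sigma, 4 * sigma, 4001)
    weights = np.exp(-shifts ** 2 / (2 * sigma ** 2))
    weights /= weights.sum()
    chis = np.array([chi_eit(delta_p, omega_c, shift=s) for s in shifts])
    return float(np.imag(chis) @ weights)

# Line-centre absorption with the control off and on, for a homogeneous line
# and for an inhomogeneous width of 5 natural linewidths: broadening washes
# out the background absorption and hence the available EIT contrast.
for label, sigma in [("homogeneous", 1e-6), ("inhomogeneous", 5.0)]:
    off = averaged_absorption(0.0, omega_c=0.0, sigma=sigma)
    on = averaged_absorption(0.0, omega_c=2.0, sigma=sigma)
    print(f"{label:13s}: absorption control-off/on = {off:.4f} / {on:.6f}")
```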

    On-line construction of position heaps

    We propose a simple linear-time on-line algorithm for constructing the position heap of a string [Ehrenfeucht et al., 2011]. Our definition of the position heap differs slightly from the one proposed in [Ehrenfeucht et al., 2011] in that it considers the suffixes ordered from left to right. Our construction is based on classic suffix pointers and resembles Ukkonen's algorithm for suffix trees [Ukkonen, 1995]. Using suffix pointers, the position heap can be extended into the augmented position heap, which allows for a linear-time string matching algorithm [Ehrenfeucht et al., 2011]. Comment: to appear in Journal of Discrete Algorithms
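
    For orientation, here is a naive, worst-case quadratic construction of a position heap in the original formulation of Ehrenfeucht et al., where suffixes are inserted shortest-first and each suffix contributes exactly one node labelled with its starting position. This is just the definition made executable, not the left-to-right variant or the linear-time on-line algorithm proposed here.

```python
class Node:
    def __init__(self, position):
        self.position = position   # starting position of the suffix that created this node
        self.children = {}         # outgoing edges, labelled by single characters

def build_position_heap(text):
    """Naive position-heap construction: insert suffixes from shortest to longest,
    adding exactly one node per suffix (len(text) nodes plus the root)."""
    root = Node(None)
    for start in range(len(text) - 1, -1, -1):    # shortest suffix first
        node, i = root, start
        # Follow the longest existing path matching a prefix of text[start:].
        while i < len(text) and text[i] in node.children:
            node = node.children[text[i]]
            i += 1
        node.children[text[i]] = Node(start + 1)  # 1-indexed position label
    return root

def dump(node, path=""):
    if node.position is not None:
        print(f"{path} -> position {node.position}")
    for ch, child in sorted(node.children.items()):
        dump(child, path + ch)

dump(build_position_heap("abaab"))
```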