Sixty yet still active!
The British Orthoptic Society published the first British Orthoptic Journal in 1939; the second appeared in 1944 and, with the exception of 1946, the journal has appeared annually since. The editorial of that first issue outlines the events leading up to the formation of the British Orthoptic Society, and this and the subsequent history of the Society are described in the 1987 editorial, the year of the Society’s Golden Jubilee. In the president’s letter published in that first edition, Mary Maddox wrote: ‘This journal will afford a method of recording the progress of orthoptics.’
Minimising latency of pitch detection algorithms for live vocals on low-cost hardware
A pitch estimation device was proposed for live vocals to output appropriate pitch data through the musical instrument digital interface (MIDI). The aim was to achieve imperceptible latency while maintaining estimation accuracy. The projected target platform was low-cost, standalone hardware based around a microcontroller such as the Microchip PIC series. This study investigated, optimised and compared the performance of suitable algorithms for this application.
Performance was determined by two key factors: accuracy and latency. Many papers have been published over the past six decades assessing and comparing the accuracy of pitch detection algorithms on various signals, including vocals. However, very little information is available concerning the latency of pitch detection algorithms and the methods by which it can be minimised. Real-time audio introduces a further, sparsely studied latency challenge: minimising the length of sampled audio required by the algorithms in order to reduce overall latency.
Thorough testing was undertaken to determine the best-performing algorithm and the optimal parameter combination. Software modifications were implemented to facilitate accurate, repeatable, automated testing, building a comprehensive set of results encompassing a wide range of test conditions.
The results revealed that the infinite-peak-clipping autocorrelation function (IACF) performed better than the other autocorrelation functions tested, and also identified ideal parameter values or value ranges that provide the optimal latency/accuracy balance.
Although the results were encouraging, testing highlighted some fundamental issues with vocal pitch detection. Potential solutions are proposed for further development.
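The core of the IACF approach can be sketched briefly: clip the frame to three levels before autocorrelating, then pick the strongest peak lag inside the vocal range. This is a minimal illustration, not the study's implementation; the function name, the clipping threshold and the frame length below are illustrative choices.

```python
import numpy as np

def iacf_pitch(x, fs, fmin=80.0, fmax=1000.0, clip_frac=0.6):
    """Pitch estimate via an infinite-peak-clipping autocorrelation
    function (IACF): the frame is clipped to {-1, 0, +1} before
    autocorrelation, which cheapens the multiplies and suppresses
    formant structure.  clip_frac is an illustrative threshold, not a
    value taken from the study."""
    thr = clip_frac * np.max(np.abs(x))
    c = np.where(x > thr, 1.0, np.where(x < -thr, -1.0, 0.0))
    r = np.correlate(c, c, mode="full")[len(c) - 1:]   # lags >= 0
    lo, hi = int(fs / fmax), int(fs / fmin)            # plausible lag range
    lag = lo + np.argmax(r[lo:hi + 1])
    return fs / lag

fs = 44100
t = np.arange(int(0.03 * fs)) / fs       # one 30 ms frame, kept short for low latency
x = np.sin(2 * np.pi * 220.0 * t)        # 220 Hz test tone
f0 = iacf_pitch(x, fs)                   # close to 220 Hz
```

The frame length is the latency lever the abstract describes: shorter frames reduce delay but leave fewer periods of the waveform for the autocorrelation peak to emerge from.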
Spending Out - Making It Happen
While it may be of interest to a wider audience, this companion guide is focused on the practicalities of spending out and targeted at those foundations that have decided this is the path for them. By sharing the practical experience of those who are well into the process or have already completed it, we hope to make it easier for others wishing to follow in their footsteps.
CGIwithR: Facilities for processing web forms using R
CGIwithR is a package for use with the R statistical computing environment, to facilitate processing of information from web-based forms, and reporting of results in the Hypertext Markup Language (HTML), through the Common Gateway Interface (CGI). CGIwithR permits the straightforward use of R as a CGI scripting language. This paper serves as an extended user manual for CGIwithR, supplementary to the R help pages installed with the package.
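The round trip that CGIwithR manages, decoding the form fields that CGI passes in and writing an HTML response back, can be illustrated in any language. A minimal sketch follows in Python rather than R; `handle_form` and the field names are invented for illustration and are not part of the CGIwithR API.

```python
from urllib.parse import parse_qs

def handle_form(query_string):
    """Parse CGI form data and return an HTML report -- a generic
    illustration of the form-to-HTML round trip that CGIwithR makes
    available from within R scripts."""
    fields = parse_qs(query_string)
    rows = "".join(
        f"<tr><td>{k}</td><td>{', '.join(v)}</td></tr>"
        for k, v in sorted(fields.items())
    )
    return ("Content-type: text/html\n\n"
            "<html><body><table>" + rows + "</table></body></html>")

page = handle_form("name=Ada&score=42")   # header plus a two-row HTML table
```

In a real CGI deployment the query string would come from the web server (e.g. the QUERY_STRING environment variable for GET requests), and the output would be written to standard output.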
Quasi-variances in Xlisp-Stat and on the web
The most common summary of a fitted statistical model, a list of parameter estimates and standard errors, does not give the precision of estimated combinations of the parameters, such as differences or ratios. For this, covariances are also needed; but space constraints typically mean that the full covariance matrix cannot routinely be reported. In the important case of parameters associated with the discrete levels of an experimental factor or with a categorical classifying variable, the identifiable parameter combinations are linear contrasts. The QV Calculator computes "quasi-variances" which may be used as an alternative summary of the precision of the estimated parameters. The summary based on quasi-variances is simple and permits good approximation of the standard error of any desired contrast. The idea of such a summary has been suggested by Ridout (1989) and, under the name "floating absolute risk", by Easton, Peto & Babiker (1991). It applies to a wide variety of statistical models, including linear and nonlinear regressions, generalized-linear and GEE models, Cox proportional-hazard models for survival data, generalized additive models, etc. The QV Calculator is written in Xlisp-Stat (Tierney, 1990) and can be used either directly by users who have access to Xlisp-Stat or through a web interface by those who do not. The user either supplies the covariance matrix for the effect parameters of interest, or, if using Xlisp-Stat directly, can generate that matrix by interaction with a model object.
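The idea behind quasi-variances can be sketched as a small least-squares problem: choose one number per parameter so that sums of pairs reproduce the variances of pairwise differences. This is a simplified plain-scale sketch; the QV Calculator itself fits on a log-variance scale, and the function name below is invented.

```python
import numpy as np

def quasi_variances(V):
    """Least-squares quasi-variances: choose q so that q[i] + q[j]
    approximates Var(b_i - b_j) for every pair (i, j) of effect
    parameters with covariance matrix V."""
    p = V.shape[0]
    rows, targets = [], []
    for i in range(p):
        for j in range(i + 1, p):
            row = np.zeros(p)
            row[i] = row[j] = 1.0
            rows.append(row)
            targets.append(V[i, i] + V[j, j] - 2.0 * V[i, j])  # exact Var(b_i - b_j)
    q, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return q

# With uncorrelated estimates the quasi-variances are exact:
V = np.diag([1.0, 2.0, 3.0])
q = quasi_variances(V)
# se(b_1 - b_0) is then recovered as sqrt(q[0] + q[1])
```

The payoff is the compact summary the abstract describes: p quasi-variances stand in for the p(p+1)/2 entries of the covariance matrix when approximating the standard error of any contrast.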
Jeffreys-prior penalty, finiteness and shrinkage in binomial-response generalized linear models
Penalization of the likelihood by Jeffreys' invariant prior, or by a positive power thereof, is shown to produce finite-valued maximum penalized likelihood estimates in a broad class of binomial generalized linear models. The class of models includes logistic regression, where the Jeffreys-prior penalty is known additionally to reduce the asymptotic bias of the maximum likelihood estimator; and also models with other commonly used link functions such as probit and log-log. Shrinkage towards equiprobability across observations, relative to the maximum likelihood estimator, is established theoretically and is studied through illustrative examples. Some implications of finiteness and shrinkage for inference are discussed, particularly when inference is based on Wald-type procedures. A widely applicable procedure is developed for computation of maximum penalized likelihood estimates, by using repeated maximum likelihood fits with iteratively adjusted binomial responses and totals. These theoretical results and methods underpin the increasingly widespread use of reduced-bias and similarly penalized binomial regression models in many applied fields.
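The finiteness property is easy to demonstrate numerically. The sketch below fits a Jeffreys-penalized (Firth-type) logistic regression via modified-score Newton iterations; this is a standard equivalent formulation for the logistic link, not the paper's own repeated-ML-fit algorithm, and the function name is invented.

```python
import numpy as np

def firth_logistic(X, y, m, n_iter=50, tol=1e-8):
    """Maximum penalized likelihood for binomial logistic regression
    with the Jeffreys-prior penalty, via modified-score Newton
    iterations.  Sketch only: the paper's algorithm instead repeats
    ordinary ML fits with iteratively adjusted responses and totals;
    for the logistic link both target the same estimate."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        w = m * p * (1.0 - p)                          # binomial IRLS weights
        XtWX_inv = np.linalg.inv(X.T @ (w[:, None] * X))
        h = w * np.sum((X @ XtWX_inv) * X, axis=1)     # leverages
        score = X.T @ (y - m * p + h * (0.5 - p))      # Jeffreys-adjusted score
        step = XtWX_inv @ score
        beta = beta + step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Completely separated data: ordinary ML estimates diverge to infinity,
# while the penalized estimates stay finite, shrunk towards equiprobability.
X = np.column_stack([np.ones(4), np.array([1.0, 2.0, 3.0, 4.0])])
y = np.array([0.0, 0.0, 1.0, 1.0])
m = np.ones(4)
beta = firth_logistic(X, y, m)
```

On this separated dataset an unpenalized fit would push the slope towards infinity; the Jeffreys-adjusted score vanishes at a finite point instead.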
Dissipative solitons in pattern-forming nonlinear optical systems: cavity solitons and feedback solitons
Many dissipative optical systems support patterns. Dissipative solitons are generally found where a pattern coexists with a stable unpatterned state. We consider such phenomena in driven optical cavities containing a nonlinear medium (cavity solitons) and rather similar phenomena (feedback solitons) where a driven nonlinear optical medium is in front of a single feedback mirror. The history, theory, experimental status, and potential applications of such solitons are reviewed.
Blur point versus indistinguishable point in assessment of accommodation: objective and subjective findings in early presbyopes
Aim: To measure the distance from the eye and the refraction of the eye at the point at which print blurs and the point at which it becomes unreadable.
Methods: Subjective accommodation in 7 early presbyopic subjects (mean age 45 years), with no additional near correction, was tested using 6/12 reduced Snellen and 6/12 Lea symbols. The point at which blur was first noticed and the point at which the print became indistinguishable were noted in centimetres. Objective measures of refraction were taken at each of these points.
Results: Subjective and objective results for reduced Snellen and Lea symbols were similar (p = 0.91; p = 0.81), as were the points where the print was no longer distinguishable (p = 0.23; p = 0.72). The differences between the blur point and the indistinguishable point measured in centimetres for both the reduced Snellen text and the Lea symbols were statistically significant (p = 0.005; p = 0.0001). The objective measures for these points, however, were not statistically different (p = 0.32 and p = 0.63, respectively).
Conclusion: A clinically significant difference exists in the distance from the eyes between the point at which text blurs and the point at which it becomes indistinguishable. No significant change occurs in accommodation when measured objectively after the blur point. It is recommended that the end point of this test is the point at which print starts to blur.
Economics of organic fruit production (OF0151)
This is the final report of Defra project OF0151.
Despite very strong consumer demand for organic fruit, it is the least developed sector of the UK organic industry. The main constraint to growth in supply is the lack of organic fruit growers, especially those operating on a large enough scale to supply the wholesale, multiple and processing markets. The UK Organic Fruit Focus Group was set up in 1997 as a producer initiative to develop the market and production of UK organic fruit. At the first meeting of the group it was concluded that (a) the absence of written technical information on how to grow organic fruit, (b) the lack of experienced advisors, (c) the lack of fruit and (d) a lack of information on the economics of organic fruit were major barriers to grower confidence and hence to expanding production.
In June 1998 HDRA began a one year study into the Economics of Organic Fruit Production. The study aims to provide information on:
• the size of the organic fruit market and potential for future growth
• returns and costs of growing organic top and soft fruit
Information for this study has been obtained through contact with, and visits to, marketing organisations, fruit processors and growers. For information on the market, major buyers of organic fruit were contacted to ascertain quantities bought and market trends. In consultation with the ADAS Fruit Team and the Welsh Institute of Rural Studies, data collection forms were devised to enable full costing techniques (all costs allocated to different cost centres) to arrive at net margins and costs of producing organic fruit per hectare (acre) and per kg (lb). In determining the financial returns, average yields over a number of years (5-10) have been used rather than those related to a specific year, and where necessary costs were related to those yields.
At present there are very few specialised organic fruit growers, so the sample was small: dessert apple growers (5), culinary apple growers (3), pear growers (3) and strawberry growers (5). It was not possible to find any commercial data from growers of other organic fruit. Case study data from these growers of apples, pears and strawberries were used to provide ‘best possible estimates’ for the physical and financial performance of these organic fruit enterprises.
The general conclusions are that, despite low (lower than conventional) and sometimes variable yields, most organic fruit growers are currently able to generate economic returns. Profitability is related to current high prices (premiums of 60-100% over conventional) for fruit and the ability to sell the whole crop to various outlets. Although individual costs differ, the overall costs of production are similar between conventional and organic fruit. The profitability of organic fruit appears to be similar to or greater than that of average conventional production. Break-even budgets indicate that even if prices fell by approximately 20%, organic fruit production could still be profitable. Price premiums of approximately 40% are still required for organic fruit production to be profitable at current yields.
Current price premiums offer the potential for economically profitable returns; however, conventional growers are reluctant to convert. To give growers the confidence to take up the challenge of organic fruit production, they need encouragement from government and industry in the form of continued aid to assist conversion, more money for research to improve the quantity and quality of economic data available and to improve production techniques, and, finally, money to disseminate this information to growers. This report suggests that continued economic monitoring of converting and existing organic fruit farms should be undertaken. Fruit buyers should also encourage UK growers by offering them market incentives. Unless UK organic fruit growers receive this encouragement, the majority of organic fruit may continue to be imported.
TAPER: query-aware, partition-enhancement for large, heterogenous, graphs
Graph partitioning has long been seen as a viable approach to address Graph DBMS scalability. A partitioning, however, may introduce extra query processing latency unless it is sensitive to a specific query workload, and optimised to minimise inter-partition traversals for that workload. Additionally, it should also be possible to incrementally adjust the partitioning in reaction to changes in the graph topology, the query workload, or both. Because of their complexity, current partitioning algorithms fall short of one or both of these requirements, as they are designed for offline use and as one-off operations. The TAPER system aims to address both requirements, whilst leveraging existing partitioning algorithms. TAPER takes any given initial partitioning as a starting point, and iteratively adjusts it by swapping chosen vertices across partitions, heuristically reducing the probability of inter-partition traversals for a given workload of pattern-matching queries. Iterations are inexpensive thanks to time and space optimisations in the underlying support data structures. We evaluate TAPER on two different large test graphs and over realistic query workloads. Our results indicate that, given a hash-based partitioning, TAPER reduces the number of inter-partition traversals by around 80%; given an unweighted METIS partitioning, by around 30%. These reductions are achieved within 8 iterations and with the additional advantage of being workload-aware and usable online.
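The take-an-initial-partitioning-and-refine-it idea can be illustrated with a toy greedy refinement: move each vertex towards the partition holding most of its neighbours, under a balance cap. This is only a stand-in for TAPER's vertex swapping; TAPER scores candidate moves against a pattern-matching query workload, which this sketch ignores, and all names below are invented.

```python
from collections import Counter, defaultdict

def cut_edges(edges, part):
    """Count edges whose endpoints lie in different partitions."""
    return sum(1 for u, v in edges if part[u] != part[v])

def refine(nodes, edges, part, k, iters=8):
    """Greedy refinement of an initial k-way partitioning: move each
    vertex to the partition holding the strict majority of its
    neighbours, subject to a balance cap on partition sizes."""
    cap = len(nodes) // k + 1
    nbrs = defaultdict(list)
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)
    size = Counter(part.values())
    for _ in range(iters):
        moved = False
        for n in nodes:
            counts = Counter(part[m] for m in nbrs[n])
            best = max(counts, key=counts.get, default=part[n])
            if (best != part[n] and counts[best] > counts[part[n]]
                    and size[best] < cap):
                size[part[n]] -= 1
                size[best] += 1
                part[n] = best
                moved = True
        if not moved:
            break
    return part

# Two 4-vertex clusters joined by a single bridge edge (3, 4); a
# hash-style initial partitioning (vertex id mod 2) scatters both.
nodes = list(range(8))
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (1, 3),
         (4, 5), (4, 6), (5, 6), (6, 7), (5, 7), (3, 4)]
part = {n: n % 2 for n in nodes}
before = cut_edges(edges, part)          # 7 of 11 edges cross partitions
part = refine(nodes, edges, part, k=2)
after = cut_edges(edges, part)           # only the bridge edge remains cut
```

Even this crude heuristic recovers the two clusters from a hash-based start; TAPER's contribution is doing this kind of adjustment cheaply, online, and weighted by the query workload rather than raw edge counts.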
