Discriminative Segmental Cascades for Feature-Rich Phone Recognition
Discriminative segmental models, such as segmental conditional random fields
(SCRFs) and segmental structured support vector machines (SSVMs), have had
success in speech recognition via both lattice rescoring and first-pass
decoding. However, such models suffer from slow decoding, hampering the use of
computationally expensive features, such as segment neural networks or other
high-order features. A typical solution is to use approximate decoding, either
by beam pruning in a single pass or by beam pruning to generate a lattice
followed by a second pass. In this work, we study discriminative segmental
models trained with a hinge loss (i.e., segmental structured SVMs). We show
that beam search is not suitable for learning rescoring models in this
approach, though it gives good approximate decoding performance when the model
is already well-trained. Instead, we consider an approach inspired by
structured prediction cascades, which use max-marginal pruning to generate
lattices. We obtain a high-accuracy phonetic recognition system with several
expensive feature types: a segment neural network, a second-order language
model, and second-order phone boundary features.
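The cascade idea the abstract describes can be sketched as follows: score every candidate segment, compute each segment's max-marginal (the score of the best full segmentation passing through it), and keep only segments whose max-marginal clears a threshold interpolated between the best path score and the mean max-marginal. This is an illustrative toy implementation, not the authors' code; the scoring and threshold rule are simplified assumptions.

```python
# Toy sketch of max-marginal pruning for a segmental lattice (structured
# prediction cascades). Segments span frames (start, end) with a score;
# all numbers and the threshold rule are illustrative, not from the paper.

def max_marginal_prune(n_frames, segments, alpha=0.5):
    """segments: list of (start, end, score). Keep segments whose
    max-marginal >= alpha * best + (1 - alpha) * mean (cascade threshold)."""
    NEG = float("-inf")
    # fwd[t]: best score of any segmentation covering frames [0, t)
    fwd = [NEG] * (n_frames + 1)
    fwd[0] = 0.0
    for s, e, sc in sorted(segments, key=lambda x: x[1]):
        if fwd[s] > NEG:
            fwd[e] = max(fwd[e], fwd[s] + sc)
    # bwd[t]: best score of any segmentation covering frames [t, n)
    bwd = [NEG] * (n_frames + 1)
    bwd[n_frames] = 0.0
    for s, e, sc in sorted(segments, key=lambda x: -x[0]):
        if bwd[e] > NEG:
            bwd[s] = max(bwd[s], sc + bwd[e])
    best = fwd[n_frames]
    # Max-marginal of a segment: best path forced through that segment.
    mm = [fwd[s] + sc + bwd[e] for s, e, sc in segments]
    finite = [m for m in mm if m > NEG]
    mean = sum(finite) / max(1, len(finite))
    thresh = alpha * best + (1 - alpha) * mean
    return [seg for seg, m in zip(segments, mm) if m >= thresh]
```

Unlike beam pruning, the decision for each segment uses both past (`fwd`) and future (`bwd`) context, which is why max-marginal pruning gives lattices with guarantees that greedy beams lack.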
Multitask Learning with Low-Level Auxiliary Tasks for Encoder-Decoder Based Speech Recognition
End-to-end training of deep learning-based models allows for implicit
learning of intermediate representations based on the final task loss. However,
the end-to-end approach ignores the useful domain knowledge encoded in explicit
intermediate-level supervision. We hypothesize that using intermediate
representations as auxiliary supervision at lower levels of deep networks may
be a good way of combining the advantages of end-to-end training and more
traditional pipeline approaches. We present experiments on conversational
speech recognition where we use lower-level tasks, such as phoneme recognition,
in a multitask training approach with an encoder-decoder model for direct
character transcription. We compare multiple types of lower-level tasks and
analyze the effects of the auxiliary tasks. Our results on the Switchboard
corpus show that this approach improves recognition accuracy over a standard
encoder-decoder model on the Eval2000 test set.
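The training objective described here is a weighted sum: the main character-level loss from the decoder plus an auxiliary phoneme loss attached to a lower encoder layer. A minimal sketch, assuming per-step softmax outputs and an illustrative weighting hyperparameter (the names and weight are invented, not the paper's):

```python
# Hypothetical sketch of a multitask objective: main character loss plus a
# weighted auxiliary phoneme loss computed from a lower encoder layer.
# All names and the aux_weight value are illustrative assumptions.
import math

def cross_entropy(probs, target):
    """Negative log-probability of the target class."""
    return -math.log(probs[target])

def multitask_loss(char_probs, char_targets,
                   phone_probs, phone_targets, aux_weight=0.3):
    """char_probs: decoder output distributions (one per character step);
    phone_probs: intermediate-layer distributions (one per phoneme step)."""
    main = sum(cross_entropy(p, t)
               for p, t in zip(char_probs, char_targets)) / len(char_targets)
    aux = sum(cross_entropy(p, t)
              for p, t in zip(phone_probs, phone_targets)) / len(phone_targets)
    return main + aux_weight * aux
```

Because the auxiliary gradient enters at a lower layer, it shapes the intermediate representation directly, which is the mechanism the abstract hypothesizes combines end-to-end training with pipeline-style supervision.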
Granular rheology: measuring boundary forces with laser-cut leaf springs
In granular physics experiments, it is a persistent challenge to obtain the
boundary stress measurements necessary to provide a full rheological
characterization of the dynamics. Here, we describe a new technique by which
the outer boundary of a 2D Couette cell both confines the granular material and
provides spatially- and temporally- resolved stress measurements. This key
advance is enabled by desktop laser-cutting technology, which allows us to
design and cut linearly-deformable walls with a specified spring constant. By
tracking the position of each segment of the wall, we measure both the normal
and tangential stress throughout the experiment. This permits us to calculate
the amount of shear stress provided by basal friction, and thereby determine
accurate values of .
Comment: 4 pages, 5 figures, Powders and Grains 2017 conference
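The measurement principle is Hooke's law: each laser-cut wall segment acts as a calibrated spring, so its deflection gives the force on it, and dividing by the segment's contact area gives a stress. An illustrative calculation (the function, spring constants, and geometry are invented, not from the paper):

```python
# Illustrative conversion of measured wall-segment deflections into boundary
# stresses via Hooke's law F = k * x. Spring constants, geometry, and the
# function itself are invented examples, not values from the experiment.

def boundary_stress(normal_defl_m, tang_defl_m,
                    k_n, k_t, segment_len_m, thickness_m):
    """Return (normal stress, tangential stress) in Pa for one wall segment
    of a 2D cell, given deflections (m) and spring constants (N/m)."""
    area = segment_len_m * thickness_m    # contact area of the segment
    sigma_n = k_n * normal_defl_m / area  # normal stress
    tau = k_t * tang_defl_m / area        # tangential (shear) stress
    return sigma_n, tau
```

Tracking each segment independently is what yields the spatially and temporally resolved stress field the abstract highlights.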
Did they Find it? Developing a Revised Materials Availability Survey
The purpose of this paper is to report on work being done by Curtin University in Perth, Western Australia to bring up to date an old library idea: the “materials availability survey”.
Quality Assurance Improvements in Australian University Libraries
Purpose: The purpose of this paper is to examine the growth in quality assurance maturity within the six Australian and New Zealand university libraries which make up the Libraries of the Australian Technology Network (LATN).
Design/methodology/approach: The paper is based on benchmarking surveys of library quality assurance commissioned by LATN in 2005/2006, with a follow-up study in 2010. The author led the conduct and analysis of both surveys. The 2005/2006 study reviewed quality assurance practices at the member libraries, to draw out examples of best practice and identify gaps and possible areas for improvement within the libraries. It was based on a review of member libraries’ websites, a questionnaire completed by a nominee from each member library, and follow-up in-person interviews with each nominee and the University Librarian of each institution. In 2009/2010 the same questionnaire was re-administered to investigate whether changes had occurred in the intervening period, including what improvements had been made and where there were still gaps. Had the conduct of quality audits by the Australian Universities Quality Agency had an impact? Had members made improvements to their quality assurance processes based on the findings of the first study or for other reasons? To elicit additional information, follow-up interviews are being carried out in 2011.
Findings: In 2005/2006 the reviewers found three models of responsibility for quality assurance: centralised, within a manager's portfolio and devolved. Each was appropriate to a different level of quality maturity, with a centralised model considered to be most appropriate at the early stages of development.
Whereas in 2005/2006 only one library had a centralised model, by 2010 three libraries had adopted this model and one had moved on from it. The paper compares applications of these models in the libraries and looks at the extent to which growth in quality assurance in the libraries is associated with adoption of the centralised model. It distinguishes the formal creation and appointment of a quality officer position from the ad hoc individual efforts in quality which can and do occur in many libraries. In 2005/2006 only two libraries had a functioning and well-maintained quality framework, which the LATN reviewers considered to be a hallmark of best practice in quality assurance. By 2010 this number had doubled to four. The paper looks at the quality, planning and/or performance frameworks in place and whether they were selected or developed by the library or imposed by their parent university. The impact of the adoption of a framework on the development of quality policies, procedures and documentation to achieve comprehensiveness, standardisation and repeatability in quality assurance is considered. A notable change between the 2005/2006 and the 2010 surveys was the growth in individual work planning and performance review, which was identified by the LATN reviewers as a sector-wide gap in 2005/2006. Ideally, use of such plans and assessments should assist in taking quality beyond library management, to develop amongst the library staff a culture of continuous improvement.
Originality/value: The paper provides real examples of how quality assurance can and has been improved in libraries within a five-year timeframe. While it is based on the experience of Australian and New Zealand libraries, it addresses concerns and provides solutions which are appropriate internationally. It provides a range of options which an individual library could adopt depending on its own context.
Closing the gap: the maturing of quality assurance in Australian university libraries
A benchmarking review of the quality assurance practices of the libraries of the Australian Technology Network conducted in 2006 revealed exemplars of best practice, but also sector-wide gaps. A follow-up review in 2010 indicated the best practices that remain relevant. While some gaps persist, there has been improvement across the libraries and the development of greater “quality maturity”.
Interaction design and emotional wellbeing
The World Health Organisation has concluded that
emotional wellbeing is fundamental to our quality of
life. It enables us to experience life as meaningful and
is an essential component of social cohesion, peace and
stability in the living environment [21]. This workshop
will bring together a diverse community to consolidate
existing knowledge and identify new opportunities for
research on technologies designed to support emotional
wellbeing. The workshop will examine uses of
technology in mental health settings, but will also
consider the importance of emotional needs in physical
healthcare and wellbeing more generally. The design of
technology to provide social support and to extend
traditional care networks will be key workshop themes
E-Reserve as a solution to digital copyright management at Curtin University of Technology
This paper addresses how Curtin University Library & Information Service successfully adapted its E-Reserve system to accommodate the 2001 amendments to Australian copyright legislation, offering the University a solution to digital copyright management without sacrificing the needs of students, academic staff or the Library itself. The paper examines how the Library assisted Curtin to keep its copying under the educational provisions of the Copyright Act to a minimum. The Library's role in achieving necessary changes to University copyright policy is discussed, and its linking system to course/unit learning management systems is described. The paper considers the legislative requirements for educational electronic copying, including warning notices, authentication and institutional limits, and how these were incorporated into Curtin's E-Reserve. It also covers each of the fields universities are required to report on for the Electronic Use System monitoring scheme 2003-2007. To minimize Library workloads and inconvenience to staff and students, the Library determined to collect only some of this data year-round; data for other fields would be collected only during a monitoring period. The reasons for this decision, and how the data was successfully collected when the University was monitored in 2002-2003, are covered.
Combining isotonic regression and EM algorithm to predict genetic risk under monotonicity constraint
In certain genetic studies, clinicians and genetic counselors are interested
in estimating the cumulative risk of a disease for individuals with and without
a rare deleterious mutation. Estimating the cumulative risk is difficult,
however, when the estimates are based on family history data. Often, the
genetic mutation status in many family members is unknown; instead, only
estimated probabilities of a patient having a certain mutation status are
available. Also, ages of disease-onset are subject to right censoring. Existing
methods to estimate the cumulative risk using such family-based data only
provide estimation at individual time points, and are not guaranteed to be
monotonic or nonnegative. In this paper, we develop a novel method that
combines Expectation-Maximization and isotonic regression to estimate the
cumulative risk across the entire support. Our estimator is monotonic,
satisfies self-consistent estimating equations and has high power in detecting
differences between the cumulative risks of different populations. Application
of our estimator to a Parkinson's disease (PD) study provides the age-at-onset
distribution of PD in PARK2 mutation carriers and noncarriers, and reveals a
significant difference between the distribution in compound heterozygous
carriers compared to noncarriers, but not between heterozygous carriers and
noncarriers.
Comment: Published at http://dx.doi.org/10.1214/14-AOAS730 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org/)
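The isotonic half of the method the abstract describes is commonly solved with the pool-adjacent-violators algorithm (PAVA), which turns pointwise risk estimates (such as those an E-step might produce) into a monotone nondecreasing curve. A minimal sketch of that step, with illustrative weights; the EM part and the self-consistency equations are not reproduced here:

```python
# Sketch of the isotonic-regression step: the pool-adjacent-violators
# algorithm (PAVA) fits a nondecreasing sequence to pointwise estimates,
# as a cumulative risk curve requires. Weights are illustrative.

def pava(y, w=None):
    """Weighted least-squares fit of a nondecreasing sequence to y."""
    w = w or [1.0] * len(y)
    # Each block holds (weighted mean, total weight, count of pooled points).
    blocks = []
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # Pool adjacent blocks while monotonicity is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, n2 = blocks.pop()
            m1, w1, n1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, n1 + n2])
    out = []
    for m, _, n in blocks:
        out.extend([m] * n)
    return out
```

For example, raw estimates [0.1, 0.3, 0.2, 0.4] violate monotonicity at the middle pair; PAVA pools them to their mean, yielding a valid nondecreasing risk curve.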