Regression with Linear Factored Functions
Many applications that use empirically estimated functions face a curse of dimensionality, because the integrals over most function classes must be approximated by sampling. This paper introduces a novel regression algorithm that learns linear factored functions (LFF). This class of functions has structural properties that allow certain integrals to be solved analytically and point-wise products to be calculated in closed form. Applications like belief propagation and reinforcement learning can exploit these properties to break the curse and speed up computation. We derive a regularized greedy optimization scheme that learns factored basis functions during training. The novel regression algorithm performs competitively with Gaussian processes on benchmark tasks, and the learned LFF functions are very compact, with 4-9 factored basis functions on average.
Comment: Under review as conference paper at ECML/PKDD 201
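A minimal sketch of the factored-function idea, assuming Gaussian univariate factors purely for illustration (the paper learns its own basis during training; the class name and all values below are hypothetical):

import numpy as np

# Linear factored function: f(x) = sum_k w[k] * prod_d phi_{k,d}(x[d]).
# Each univariate factor here is an unnormalised Gaussian bump.
class LFF:
    def __init__(self, w, centers, widths):
        self.w = np.asarray(w, float)            # (K,)
        self.c = np.asarray(centers, float)      # (K, D)
        self.s = np.asarray(widths, float)       # (K, D)

    def __call__(self, x):
        # Each basis function factorises over the input dimensions.
        g = np.exp(-(np.asarray(x, float) - self.c) ** 2 / (2 * self.s ** 2))
        return self.w @ g.prod(axis=1)

    def integral(self):
        # The D-dimensional integral collapses into a product of
        # one-dimensional Gaussian integrals -- no sampling required.
        return self.w @ (np.sqrt(2 * np.pi) * self.s).prod(axis=1)

f = LFF(w=[0.5, 1.5],
        centers=[[0.0, 1.0], [2.0, -1.0]],
        widths=[[1.0, 0.5], [0.7, 1.0]])
print(f([0.1, 0.8]), f.integral())
# The point-wise product of two Gaussian LFFs is again a Gaussian LFF
# (with K1*K2 terms), the closure property belief propagation can exploit.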
Determination of Elastic Constants by Line-Focus V(Z) Measurements of Multiple Saw Modes
Line-focus acoustic microscopy (LFAM) provides a method to determine the elastic constants of homogeneous materials and thin-film/substrate configurations, see Refs. [1–5]. The elastic constants are determined from the velocities of surface acoustic waves, which are obtained from measurements of the V(z) curve. Generally, more than one elastic constant has to be determined. It is interesting to note that procuring sufficient data is sometimes more complicated for isotropic materials. For anisotropic solids, the velocity can be measured as a function of the angle defining the propagation direction in the surface to yield a sufficiently large data set. For thin-film/substrate configurations, measurements at various frequencies or for different film thicknesses may be carried out to obtain sufficient data. There are, however, obvious advantages to working with a single specimen at a single frequency. This can be done by considering the contributions of more than one leaky SAW mode to the V(z) curve.
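For context, the standard V(z) analysis extracts a leaky SAW velocity from the oscillation period Δz of the V(z) curve; a minimal sketch follows, with illustrative numbers that are not taken from the paper:

import numpy as np

# Standard V(z) relation: the curve oscillates with period dz, and the leaky
# SAW velocity follows from the Rayleigh critical angle theta_R via
# cos(theta_R) = 1 - v_water / (2 * f * dz). Values below are illustrative.
def saw_velocity(dz, f, v_water=1486.0):
    """Leaky SAW velocity [m/s] from V(z) period dz [m] at frequency f [Hz]."""
    cos_theta = 1.0 - v_water / (2.0 * f * dz)
    return v_water / np.sqrt(1.0 - cos_theta ** 2)

# e.g. dz = 33.4 um at 225 MHz -> roughly 3430 m/s, near the leaky
# Rayleigh velocity of fused silica.
print(saw_velocity(33.4e-6, 225e6))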
How does reviewing the evidence change veterinary surgeons' beliefs regarding the treatment of ovine footrot? A quantitative and qualitative study
Footrot is a widespread, infectious cause of lameness in sheep, with major economic and welfare costs. The aims of this research were: (i) to quantify how veterinary surgeons’ beliefs regarding the efficacy of two treatments for footrot changed following a review of the evidence; (ii) to obtain a consensus opinion following group discussions; (iii) to capture complementary qualitative data to place their beliefs within a broader clinical context. Grounded in a Bayesian statistical framework, probabilistic elicitation (the roulette method) was used to quantify the beliefs of eleven veterinary surgeons during two one-day workshops. There was considerable heterogeneity in veterinary surgeons’ beliefs before they listened to a review of the evidence. After hearing the evidence, seven participants quantifiably changed their beliefs. In particular, two participants who initially believed that foot trimming with topical oxytetracycline was the better treatment changed to entirely favour systemic and topical oxytetracycline instead. The results suggest that a substantial amount of the variation in beliefs related to differences in veterinary surgeons’ knowledge of the evidence. Although considerable differences in opinion remained after the evidence review, with several participants having non-overlapping 95% credible intervals, both groups did achieve a consensus opinion. Two key findings from the qualitative data were: (i) veterinary surgeons believed that farmers are unlikely to actively seek advice on lameness, suggesting a proactive veterinary approach is required; (ii) more attention could be given to improving the way in which veterinary advice is delivered to farmers. In summary, this study has: (i) demonstrated a practical method for probabilistically quantifying how veterinary surgeons’ beliefs change; (ii) revealed that the evidence that currently exists is capable of changing veterinary opinion; (iii) suggested that improved transfer of research knowledge into veterinary practice is needed; (iv) identified some potential obstacles to the implementation of veterinary advice by farmers.
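To make the elicitation step concrete, here is a hedged sketch of the roulette method: an expert distributes chips over probability bins, and a Beta distribution is moment-matched to the resulting histogram to give a 95% credible interval. The bin layout and chip counts are invented for illustration; this is not the authors' code.

import numpy as np
from scipy import stats

bins = np.arange(0.05, 1.0, 0.1)                    # bin midpoints 0.05 .. 0.95
chips = np.array([0, 0, 1, 2, 4, 6, 4, 2, 1, 0])    # 20 chips from one expert

p = chips / chips.sum()
mean = (p * bins).sum()
var = (p * (bins - mean) ** 2).sum()

# Moment-matched Beta(a, b): mean = a/(a+b), var = ab / ((a+b)^2 (a+b+1)).
common = mean * (1 - mean) / var - 1
a, b = mean * common, (1 - mean) * common

lo, hi = stats.beta.ppf([0.025, 0.975], a, b)
print(f"Beta({a:.2f}, {b:.2f}), 95% credible interval: [{lo:.2f}, {hi:.2f}]")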
Integration of visual and joint information to enable linear reaching motions
A new dynamics-driven control law was developed for a robot arm, based on a feedback control law that uses a linear transformation directly from work space to joint space. This was validated using a simulation of a two-joint planar robot arm, and an optimisation algorithm was used to find the optimum matrix to generate straight trajectories of the end-effector in the work space. We found that this linear matrix can be decomposed into the rotation matrix representing the orientation of the goal direction and the joint relation matrix (MJRM) representing the joint response to errors in the Cartesian work space. The decomposition of the linear matrix indicates the separation of path planning in terms of the direction of the reaching motion and the synergies of joint coordination. Once the MJRM is numerically obtained, the feedforward planning of the reaching direction allows us to provide asymptotically stable, linear trajectories in the entire work space through rotational transformation, completely avoiding the use of inverse kinematics. Our dynamics-driven control law suggests an interesting framework for interpreting human reaching motion control, an alternative to the dominant inverse-method-based explanations, avoiding the expensive computation of inverse kinematics and point-to-point control along desired trajectories.
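A kinematic sketch of the core idea, driving the joints with a fixed linear map from Cartesian error to joint velocity and using no inverse kinematics at run time. The map here is the inverse Jacobian frozen at the goal posture, a local stand-in for the optimised MJRM of the paper; link lengths and gains are illustrative.

import numpy as np

L1, L2 = 0.3, 0.3                                   # link lengths [m]

def fk(q):                                          # forward kinematics
    return np.array([L1*np.cos(q[0]) + L2*np.cos(q[0]+q[1]),
                     L1*np.sin(q[0]) + L2*np.sin(q[0]+q[1])])

def jac(q):                                         # Jacobian of fk
    s1, s12 = np.sin(q[0]), np.sin(q[0]+q[1])
    c1, c12 = np.cos(q[0]), np.cos(q[0]+q[1])
    return np.array([[-L1*s1 - L2*s12, -L2*s12],
                     [ L1*c1 + L2*c12,  L2*c12]])

q_goal = np.array([0.2, 1.0])
goal = fk(q_goal)
Lmap = 2.0 * np.linalg.inv(jac(q_goal))             # fixed linear map (illustrative)

q = np.array([0.5, 1.4])                            # nearby start posture
for _ in range(2000):
    q = q + 0.01 * (Lmap @ (goal - fk(q)))          # linear feedback, no IK
print(np.round(fk(q), 4), np.round(goal, 4))        # end-effector reaches the goal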
Quantum Bounds on Bell inequalities
We have determined the maximum quantum violation of 241 tight bipartite Bell inequalities with up to five two-outcome measurement settings per party by constructing the appropriate measurement operators in up to six-dimensional complex and eight-dimensional real component Hilbert spaces using numerical optimization. Of these inequalities, 129 have been introduced here. In 43 cases higher-dimensional component spaces gave larger violation than qubits, and on 3 occasions the maximum was achieved with six-dimensional spaces. We have also calculated upper bounds on these Bell inequalities using a recently proposed method. For all but 20 inequalities the best solution found matched the upper bound. Surprisingly, the simplest inequality of the set examined, with only three measurement settings per party, was not among them, despite the high dimensionality of the Hilbert space considered. We also computed detection threshold efficiencies for the maximally entangled qubit pair. These could be lowered in several instances if degenerate measurements were also allowed.
Comment: 12 pages, 4 tables; corrected Table I and modified Table III to comply with Table I; more detailed results are available at http://www.atomki.hu/atomki/TheorPhys/Bell_violation
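The abstract does not spell out the numerical optimisation; a common approach for such problems is a see-saw iteration, sketched here for the simplest case (CHSH with qubits), whose known maximum is Tsirelson's bound 2*sqrt(2). This is a generic illustration, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(1)
c = np.array([[1, 1], [1, -1]])                     # CHSH coefficients

def sgn(M):
    """The +/-1-valued observable maximising Tr[A M] for Hermitian M."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.where(w >= 0, 1.0, -1.0)) @ V.conj().T

def rand_obs():
    G = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    return sgn(G + G.conj().T)

best = 0.0
for _ in range(5):                                  # random restarts
    A = [rand_obs() for _ in range(2)]
    B = [rand_obs() for _ in range(2)]
    for _ in range(100):
        bell = sum(c[a, b] * np.kron(A[a], B[b]) for a in (0, 1) for b in (0, 1))
        w, V = np.linalg.eigh(bell)
        psi = V[:, -1]                              # optimal state: top eigenvector
        r = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
        # Optimal observables for the fixed state and the other party
        # (partial traces written as einsums over the reshaped state).
        A = [sgn(sum(c[a, b] * np.einsum('ikjl,lk->ij', r, B[b]) for b in (0, 1)))
             for a in (0, 1)]
        B = [sgn(sum(c[a, b] * np.einsum('ikjl,ji->kl', r, A[a]) for a in (0, 1)))
             for b in (0, 1)]
    bell = sum(c[a, b] * np.kron(A[a], B[b]) for a in (0, 1) for b in (0, 1))
    best = max(best, np.linalg.eigvalsh(bell)[-1])
print(best)                                         # -> about 2.8284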
Self Hyper-parameter Tuning for Stream Recommendation Algorithms
E-commerce platforms explore the interaction between users and digital content (user-generated streams of events) to build and maintain dynamic user preference models, which are used to make meaningful recommendations. However, the accuracy of these incremental models is critically affected by the choice of hyper-parameters. So far, the incremental recommendation algorithms used to process data streams have relied on human expertise for hyper-parameter tuning. In this work we apply our Self Hyper-Parameter Tuning (SPT) algorithm to incremental recommendation algorithms. SPT adapts the Nelder-Mead optimisation algorithm to perform hyper-parameter tuning. First, it creates three models with random hyper-parameter values and then, at dynamically sized intervals, assesses and applies the Nelder-Mead operators to update their hyper-parameters until the models converge. The main contribution of this work is the adaptation of the SPT method to incremental matrix factorisation recommendation algorithms. The proposed method was evaluated with well-known recommendation data sets. The results show that SPT systematically improves data stream recommendations.
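A hedged sketch of the idea for a 2-D hyper-parameter space (learning rate, regularisation): keep three candidate vectors, score each on the latest window of stream events, and apply Nelder-Mead operators to the worst one. The objective is a toy stand-in for scoring an incremental recommender, and the acceptance rule is simplified; this is not the authors' code.

import numpy as np

def window_error(hp, window):
    # Stand-in for evaluating an incremental matrix-factorisation model on
    # the latest stream window; a toy quadratic with optimum at (0.01, 0.1).
    lr, reg = hp
    return (lr - 0.01) ** 2 + (reg - 0.1) ** 2

def spt_step(simplex, window):
    # Order vertices by error, then try Nelder-Mead moves on the worst one:
    # reflection, expansion, outside contraction (simplified acceptance rule).
    best, mid, worst = sorted(simplex, key=lambda hp: window_error(hp, window))
    centroid = (best + mid) / 2
    for coef in (1.0, 2.0, 0.5):
        cand = centroid + coef * (centroid - worst)
        if window_error(cand, window) < window_error(worst, window):
            return [best, mid, cand]
    return [best, mid, (worst + best) / 2]          # shrink worst towards best

simplex = [np.array(hp) for hp in ([0.1, 0.5], [0.05, 0.01], [0.001, 0.2])]
for window in range(30):                            # dynamically sized intervals in SPT
    simplex = spt_step(simplex, window)
print(min(simplex, key=lambda hp: window_error(hp, 0)))  # -> near (0.01, 0.1)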
Predicting physical properties of woven fabrics via automated machine learning and textile design and finishing features
This paper presents a novel Machine Learning (ML) approach to support the creation of woven fabrics. Using data from a textile company, two CRoss-Industry Standard Process for Data Mining (CRISP-DM) iterations were executed, aiming to compare three input feature representation strategies related to fabric design and finishing processes. During the modeling stage of CRISP-DM, an Automated ML (AutoML) procedure was used to select the best regression model among six distinct state-of-the-art ML algorithms. A total of nine textile physical properties were modeled (e.g., abrasion, elasticity, pilling). Overall, the simpler yarn representation strategy obtained better predictive results. Moreover, for eight fabric properties (e.g., elasticity, pilling) the addition of finishing features improved the quality of the predictions. The best ML models obtained low predictive errors (from 2% to 7%) and are potentially valuable for the textile company, since they can be used to reduce the number of production attempts (saving time and costs).
This work was carried out within the project “TexBoost: less Commodities more Specialities” (reference POCI-01-0247-FEDER-024523), co-funded by Fundo Europeu de Desenvolvimento Regional (FEDER), through Portugal 2020 (P2020).
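A minimal stand-in for the AutoML model-selection step inside CRISP-DM: compare a few regressors by cross-validated error and keep the best. The fabric data is proprietary, so synthetic features stand in for the yarn and finishing inputs, and the candidate set is shorter than the six algorithms the paper compares.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for one physical property (e.g., elasticity).
X, y = make_regression(n_samples=300, n_features=10, noise=5.0, random_state=0)

candidates = {
    "ridge": Ridge(),
    "random_forest": RandomForestRegressor(random_state=0),
    "gradient_boosting": GradientBoostingRegressor(random_state=0),
}

# Pick the model with the lowest 5-fold cross-validated mean absolute error.
scores = {name: -cross_val_score(m, X, y, cv=5,
                                 scoring="neg_mean_absolute_error").mean()
          for name, m in candidates.items()}
best = min(scores, key=scores.get)
print(best, round(scores[best], 2))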
Simplified tabu search with random-based searches for bound constrained global optimization
This paper proposes a simplified version of the tabu search algorithm that solely uses randomly generated direction vectors in the exploration and intensification search procedures, in order to define a set of trial points while searching in the neighborhood of a given point. In the diversification procedure, points that are inside an already visited region with a relatively small visit frequency may be accepted, apart from those that are outside the visited regions. The numerical results produced show the robustness of the proposed method, and its efficiency when compared to other known metaheuristics available in the literature is encouraging.
FCT – Fundação para a Ciência e a Tecnologia (UIDB/00013/2020); FCT – Fundação para a Ciência e Tecnologia within the R&D Units Project Scope: UIDB/00319/2020, UIDB/00013/2020 and UIDP/00013/2020 of CMAT-UM.
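A hedged sketch of the random-based neighbourhood search: trial points are the current point plus random unit direction vectors scaled by a step size and clipped to the bounds, and visited regions are tracked on a coarse grid so diversification can prefer rarely visited cells. The step schedule, region size, and acceptance rule are illustrative, not the authors' parameters.

import numpy as np

rng = np.random.default_rng(0)
lo, hi = np.array([-5.0, -5.0]), np.array([5.0, 5.0])

def f(x):                                   # Himmelblau test function
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

def region(x):                              # coarse 1x1 grid cell of a point
    return tuple(((x - lo) // 1.0).astype(int))

visits = {}
x = rng.uniform(lo, hi)
fx = f(x)
best_x, best_f = x.copy(), fx
step = 1.0
for _ in range(400):
    d = rng.normal(size=(8, 2))
    d /= np.linalg.norm(d, axis=1, keepdims=True)   # random unit directions
    trials = np.clip(x + step * d, lo, hi)
    # Diversification: prefer trial points in rarely visited regions.
    trials = sorted(trials, key=lambda t: visits.get(region(t), 0))
    for t in trials:
        if f(t) < fx or visits.get(region(t), 0) < 2:
            x, fx = t, f(t)
            break
    visits[region(x)] = visits.get(region(x), 0) + 1
    if fx < best_f:
        best_x, best_f = x.copy(), fx
    step *= 0.99                                    # slowly intensify
print(np.round(best_x, 3), round(best_f, 4))        # near a Himmelblau minimum (f ~ 0)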
Understanding Variation in Sets of N-of-1 Trials.
A recent paper in this journal by Chen and Chen used computer simulations to examine a number of approaches to analysing sets of n-of-1 trials. We have examined such designs using a more theoretical approach, based on considering the purpose of the analysis and the structure, as regards randomisation, that the design uses. We show that different purposes require different analyses, and that these in turn may produce quite different results. Our approach to incorporating the randomisation, employed when the purpose is to test a null hypothesis of strict equality of the treatments, makes use of Nelder's theory of general balance. However, where the purpose is to make inferences about the effects for individual patients, we show that a mixed model is needed. There are strong parallels to the difference between fixed- and random-effects meta-analyses, and these are discussed.
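A hedged sketch of the mixed-model analysis for a set of n-of-1 trials: each patient contributes several treatment/control cycles, and the treatment effect gets a random slope per patient, so individual effects can be predicted as well as the average effect. The data are simulated purely for illustration; this is not the authors' analysis code.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
rows = []
for p in range(12):                          # 12 patients
    effect = rng.normal(1.0, 0.5)            # patient-specific treatment effect
    for c in range(4):                       # 4 treatment/control cycles each
        for treat in (0, 1):
            rows.append({"patient": p, "treat": treat,
                         "y": 10 + effect * treat + rng.normal(0, 1)})
df = pd.DataFrame(rows)

# Random intercept and random treatment slope per patient, so the model can
# both estimate the average effect and predict individual patients' effects.
fit = smf.mixedlm("y ~ treat", df, groups=df["patient"], re_formula="~treat").fit()
print(fit.summary())                         # fixed effect of treat near 1.0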