Investigating the missing data mechanism in quality of life outcomes: a comparison of approaches
Background: Missing data is classified as missing completely at random (MCAR), missing at
random (MAR) or missing not at random (MNAR). Knowing the mechanism is useful in identifying
the most appropriate analysis. The first aim was to compare different methods for identifying the
missing data mechanism and to determine whether they gave consistent conclusions. The second was to
investigate whether reminder-response data can be utilised to help identify the missing data mechanism.
Methods: Five clinical trial datasets that employed a reminder system at follow-up were used.
Some quality of life questionnaires were initially missing, but later recovered through reminders.
Four methods of determining the missing data mechanism were applied, under two response-data
scenarios: firstly, immediate responses only; secondly, all observed responses (including
reminder responses).
Results: In three of five trials the hypothesis tests found evidence against the MCAR assumption.
Logistic regression suggested MAR, but was able to use the reminder-collected data to highlight
potential MNAR data in two trials.
Conclusion: The four methods were consistent in determining the missingness mechanism. One
hypothesis test was preferred as it is applicable with intermittent missingness. Some inconsistencies between the two data scenarios were found: ignoring the reminder data could give a distorted view of the missingness mechanism, while utilising it allowed the possibility of MNAR to be considered.
Funded by the Chief Scientist Office of the Scottish Government Health Directorate, Research Training Fellowship (CZF/1/31).
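One common way to probe the MCAR assumption described above is to regress an indicator of missingness on observed covariates: if any observed variable predicts whether a response is missing, the data cannot be MCAR. A minimal sketch on simulated data (the variable names, the simulated MAR mechanism, and the plain gradient-descent fit are illustrative, not the trials' actual analysis):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
baseline = rng.normal(0.0, 1.0, n)  # observed baseline QoL score

# Simulate MAR: probability of a missing follow-up depends on the
# *observed* baseline score, so the data are not MCAR
p_miss = 1.0 / (1.0 + np.exp(-(-1.0 + 1.5 * baseline)))
missing = rng.random(n) < p_miss

# Logistic regression of the missingness indicator on baseline,
# fitted by plain gradient descent (no stats library assumed)
X = np.column_stack([np.ones(n), baseline])
y = missing.astype(float)
w = np.zeros(2)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / n

intercept, slope = w
print(round(slope, 2))  # clearly non-zero: evidence against MCAR
```

In practice one would fit this with a statistics package and supplement it with a formal procedure such as Little's MCAR test; the point is only that a predictive observed covariate rules out MCAR, while distinguishing MAR from MNAR needs extra information such as the reminder responses.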
Supply driven mortgage choice
Variable-rate mortgage contracts dominate the UK mortgage market (Miles, 2004). This dominance has important consequences for the transmission mechanism of monetary policy decisions and for systemic risk (Khandani et al., 2012; Fuster and Vickery, 2013). It raises an obvious concern: a mortgage market such as the UK's, where the major proportion of mortgage debt is either at a variable rate or fixed for less than two years (Badarinza et al., 2013; CML, 2012), is vulnerable to changes in the interest rate regime. Theoretically, mortgage choice is determined by both demand and supply factors, yet most of the existing literature has focused on the demand side, and empirical consideration of supply-side factors in mortgage choice decisions remains limited. This paper explores whether supply-side factors may partially explain observed (ex-post) mortgage type decisions. The empirical results indicate that lenders' profit motives and mortgage funding and pricing issues may have contributed to the preference for variable-rate contracts. Securitisation is found to increase gross mortgage lending volumes while reducing the share of variable-rate lending flows, showing that greater securitisation not only improves liquidity in the supply of mortgage funds but also has the potential to shift mortgage choices toward fixed-rate debt. The policy implications may involve a number of measures, including reconsidering capital requirements for fixed- as opposed to variable-rate mortgage debt, growing securitisation, and optimising mortgage pricing policies.
A review of RCTs in four medical journals to assess the use of imputation to overcome missing data in quality of life outcomes
Background: Randomised controlled trials (RCTs) are perceived as the gold-standard method for evaluating healthcare interventions, and increasingly include quality of life (QoL) measures. The observed results are susceptible to bias if a substantial proportion of outcome data are missing. The review aimed to determine whether imputation was used to deal with missing QoL outcomes. Methods: A random selection of 285 RCTs published during 2005/6 in the British Medical Journal, Lancet, New England Journal of Medicine and Journal of American Medical Association were identified. Results: QoL outcomes were reported in 61 (21%) trials. Six (10%) reported having no missing data, 20 (33%) reported ≤ 10% missing, eleven (18%) 11%–20% missing, and eleven (18%) reported >20% missing. Missingness was unclear in 13 (21%). Missing data were imputed in 19 (31%) of the 61 trials. Imputation was part of the primary analysis in 13 trials, but a sensitivity analysis in six. Last value carried forward was used in 12 trials and multiple imputation in two. Following imputation, the most common analysis method was analysis of covariance (10 trials). Conclusion: The majority of studies did not impute missing data and carried out a complete-case analysis. For those studies that did impute missing data, researchers tended to prefer simpler methods of imputation, despite more sophisticated methods being available.The Health Services Research Unit is funded by the Chief Scientist Office of the Scottish Government Health Directorate. Shona Fielding is also currently funded by the Chief Scientist Office on a Research Training Fellowship (CZF/1/31)
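Of the imputation methods the review tallies, last value carried forward (LOCF) is the simplest: a patient's most recent observed score stands in for later missing ones. A minimal numpy sketch on toy repeated-measures data (the scores and layout are invented for illustration):

```python
import numpy as np

# Toy repeated-measures QoL scores: rows = patients, columns = visits.
# np.nan marks a missed questionnaire.
qol = np.array([
    [70.0, 65.0, np.nan, np.nan],
    [55.0, np.nan, 50.0, 45.0],
    [80.0, 75.0, 72.0, 70.0],
])

def locf(scores):
    """Last value carried forward along each row (visit order)."""
    filled = scores.copy()
    for row in filled:
        for j in range(1, row.size):
            if np.isnan(row[j]):
                row[j] = row[j - 1]
    return filled

print(locf(qol))
```

Multiple imputation, by contrast, fills each gap several times with draws from a predictive model and pools the resulting analyses, propagating the uncertainty that LOCF ignores; the review found researchers nonetheless tended to prefer the simpler method.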
Flows and cohesion: balancing capabilities across an expanded union
The dynamics of the physical relocation of intellectual capital are seen in the flow of skilled workers across international boundaries and in internal movements within the increasingly integrated economy of the European Union. This article describes a research framework developed within the context of a globalised economy and its potential application to issues within the boundaries of the European Union.
Study protocol : the empirical investigation of methods to correct for measurement error in biobanks with dietary assessment
Dilation of the Giant Vortex State in a Mesoscopic Superconducting Loop
We have experimentally investigated the magnetisation of a mesoscopic aluminum loop at
temperatures well below the superconducting transition temperature. The flux quantisation of
the superconducting loop was investigated with a Hall magnetometer over a range of magnetic
field intensities. The magnetic field periodicity observed in the magnetisation measurements
is expected to take integer values of the superconducting flux quantum. A closer inspection
of the periodicity, however, reveals a sub-flux-quantum shift. We interpret this fine
structure as a consequence of a so-called giant vortex state nucleating towards either the
inner or the outer side of the loop. These findings are in agreement with recent theoretical
reports.
Comment: 12 pages, 5 figures. Accepted for publication in Phys. Rev.
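The flux quantum referred to in the abstract is the standard superconducting value Phi_0 = h/2e, the factor 2 arising because Cooper pairs carry charge 2e. A quick check of its magnitude:

```python
# Superconducting flux quantum Phi_0 = h / (2e): Cooper pairs carry
# charge 2e, so the loop's fluxoid is quantised in units of Phi_0.
h = 6.62607015e-34   # Planck constant, J s (exact in SI since 2019)
e = 1.602176634e-19  # elementary charge, C (exact in SI since 2019)

phi_0 = h / (2 * e)
print(f"{phi_0:.4e} Wb")  # ~2.0678e-15 Wb
```

Sub-flux-quantum shifts in the magnetisation period, as reported above, are therefore deviations from this h/2e spacing.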
Avoiding catastrophic failure in correlated networks of networks
Networks in nature do not act in isolation but instead exchange information
and depend on each other to function properly. An incipient theory of networks
of networks has shown that interconnecting random networks may very easily result in
abrupt failures. This theoretical finding bears an intrinsic paradox: if
natural systems organise in interconnected networks, how can they be so stable?
Here we provide a solution to this conundrum, showing that the stability of a
system of networks relies on the relation between the internal structure of a
network and its pattern of connections to other networks. Specifically, we
demonstrate that if inter-network connections are provided by hubs of the
network, and if there is a moderate degree of convergence of inter-network
connections, the system of networks is stable and robust to failure. We test
this theoretical prediction in two independent experiments on functional brain
networks (in task and resting states), which show that brain networks are
connected with a topology that maximises stability according to the theory.
Comment: 40 pages, 7 figures
Coupling models of cattle and farms with models of badgers for predicting the dynamics of bovine tuberculosis (TB)
Bovine TB is a major problem for the agricultural industry in several
countries. TB can be contracted and spread by species other than cattle, and
this can complicate disease control. In the UK and Ireland, badgers
are a recognised reservoir of infection and there has been substantial
discussion about potential control strategies. We present a coupling of
individual-based models of bovine TB in badgers and cattle, which aims to
capture the key details of the natural history of the disease and of both
species at approximately county scale. The model is spatially explicit: it
follows a very large number of cattle and badgers on a different grid size for
each species, and also includes winter housing. We show that the model can
replicate the reported dynamics of both cattle and badger populations as well
as the increasing prevalence of the disease in cattle. The parameter space used
as input to simulations was swept using Latin hypercube sampling, and
sensitivity analysis of model outputs was conducted using mixed-effect models.
By exploring a large and computationally intensive parameter space we show that,
of the available control strategies, it is the frequency of TB testing and
whether or not winter housing is practised that have the most significant
effects on the number of infected cattle, with the effect of winter housing
becoming stronger as farm size increases. Whether badgers were culled or not
explained about 5% of the variance in the number of infected cattle, while the
accuracy of the test employed to detect infected cattle explained less than 3%.
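Latin hypercube sampling, as used above for the parameter sweep, stratifies each parameter's range so that n samples cover all n equal-probability slices of every dimension at once. A minimal sketch with scipy (the three parameter names and bounds are invented for illustration, not the paper's actual inputs):

```python
import numpy as np
from scipy.stats import qmc

# Illustrative model parameters (assumed, not the paper's actual set):
# within-herd transmission rate, TB test interval (days), mean herd size
l_bounds = [0.001, 60.0, 50.0]
u_bounds = [0.05, 1460.0, 500.0]

sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(n=10)                  # 10 points in [0, 1)^3
params = qmc.scale(unit, l_bounds, u_bounds)  # map to parameter ranges

# Stratification check: each dimension's 10 equal bins hold one point each
for k in range(3):
    bins = sorted(int(u * 10) for u in unit[:, k])
    assert bins == list(range(10))
print(params.shape)  # (10, 3)
```

A mixed-effect sensitivity analysis would then regress the simulation outputs on these sampled parameters, with replicate runs at the same design point entering as a grouping factor.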
Electron quantum metamaterials in van der Waals heterostructures
In recent decades, scientists have developed the means to engineer synthetic
periodic arrays with feature sizes below the wavelength of light. When such
features are appropriately structured, electromagnetic radiation can be
manipulated in unusual ways, resulting in optical metamaterials whose function
is directly controlled through nanoscale structure. Nature, too, has adopted
such techniques -- for example in the unique coloring of butterfly wings -- to
manipulate photons as they propagate through nanoscale periodic assemblies. In
this Perspective, we highlight the intriguing potential of designer
sub-electron wavelength (as well as wavelength-scale) structuring of electronic
matter, which affords a new range of synthetic quantum metamaterials with
unconventional responses. Driven by experimental developments in stacking
atomically layered heterostructures -- e.g., mechanical pick-up/transfer
assembly -- atomic scale registrations and structures can be readily tuned over
distances smaller than characteristic electronic length-scales (such as
electron wavelength, screening length, and electron mean free path). Yet
electronic metamaterials promise far richer categories of behavior than those
found in conventional optical metamaterial technologies. This is because unlike
photons that scarcely interact with each other, electrons in subwavelength
structured metamaterials are charged, and strongly interact. As a result, an
enormous variety of emergent phenomena can be expected, and radically new
classes of interacting quantum metamaterials designed.