
    Challenge-Driven Innovation Policy: Towards a New Policy Toolkit

    Policymakers are increasingly embracing the idea of using industrial and innovation policy to tackle the ‘grand challenges’ facing modern societies. This article argues that through well-defined goals, or more specifically ‘missions’, focused on solving important societal challenges, policymakers can determine the direction of growth by making strategic investments across many different sectors and nurturing new industrial landscapes, which the private sector can develop further; this in turn induces cross-sectoral learning and increases macroeconomic stability. This ‘mission-oriented’ approach to industrial policy is not about ‘top down’ planning by an overbearing state; it is about providing a direction for growth, raising business expectations about future growth areas, and catalysing activity that would not otherwise happen. Nor is it about de-risking and levelling the playing field, or about supporting more competitive sectors over less competitive ones; since the market does not always ‘know best’, the aim is to tilt the playing field in the direction of desired societal goals, such as the Sustainable Development Goals (SDGs). Achieving this requires a different policy framework, which we call the ‘ROAR’ framework: strategic thinking about the desired direction of travel (Routes), the structure and capacity of public sector Organisations, the way in which policy is Assessed, and the incentive structure for both private and public sectors (Risks and Rewards). The article argues that if we want to take grand challenges such as the SDGs seriously as policy goals, market shaping should become the overarching approach followed in various policy fields.

    The impact of bidding aggregation levels on truckload rates

    Thesis (M.Eng. in Logistics)--Massachusetts Institute of Technology, Engineering Systems Division, 2010. Cataloged from student-submitted PDF version of thesis. Includes bibliographical references (p. 79-80). The objective of this thesis was to determine if line-haul rates are impacted by bid type, and if aggregation of bidding lanes can reduce costs for both shippers and carriers. Using regression analysis, we developed a model to isolate and test the cost effects that influence line-haul rate for long-haul shipments. We determined that aggregating low-volume point-to-point lanes into aggregated lanes can provide cost savings when lanes with origins and destinations in close proximity to each other can be bundled. In addition, bidding out region-to-region lanes can supplement point-to-point lanes by reducing the need to turn to the spot market. The model shows that bundling lanes can provide significant cost savings to a shipper because contract lanes of any type are on average less costly than spot moves. This thesis provides guidelines and suggestions for aggregation when creating bids during the first stage of the truckload procurement process. By Julia M. Collins and R. Ryan Quinlan. M.Eng. in Logistics.
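
    As an illustration of the regression approach described in this abstract, a minimal sketch in Python using statsmodels follows; the data file and the column names (rate_per_mile, distance, annual_volume, bid_type) are hypothetical stand-ins, not the thesis's actual variables.

        # Minimal sketch of a line-haul rate regression; the dataset and
        # column names are illustrative assumptions, not the thesis's data.
        import pandas as pd
        import statsmodels.formula.api as smf

        lanes = pd.read_csv("lanes.csv")  # hypothetical lane-level data

        # Dummy-coding the bid type (point-to-point vs. aggregated) lets its
        # coefficient isolate the cost effect of aggregation, holding
        # distance and volume fixed.
        model = smf.ols(
            "rate_per_mile ~ distance + annual_volume + C(bid_type)",
            data=lanes,
        ).fit()
        print(model.summary())

    The sign and significance of the bid-type coefficient would then indicate whether aggregated lanes are priced below comparable point-to-point lanes.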

    Neuromotor control during stair ambulation in individuals with patellofemoral osteoarthritis compared to asymptomatic controls

    Patellofemoral OA (PFOA) is characterized by patellofemoral pain during activities that load a flexed knee. Stair stepping ability is frequently impaired, yet little is known of the muscular recruitment strategies utilized during this task. Altered recruitment strategies may provide targets for clinical interventions. We aimed to determine if people with PFOA ascend and descend stairs with different muscular recruitment strategies compared to similarly aged healthy individuals. Twenty-two people with PFOA and 20 controls were recruited. Electromyographic recordings from gluteus maximus and medius, medial and lateral hamstrings, vastus medialis and lateralis, medial and lateral gastrocnemius, and soleus were acquired during stair ascent and descent. Force plate data were acquired to determine timing of foot placements and to characterize dynamic stability. Seventeen people with PFOA (59 ± 10 years, 73 ± 13 kg, 167 ± 9 cm) and 15 controls (57 ± 10 years, 73 ± 16 kg, 171 ± 11 cm) had complete data. People with PFOA demonstrated: longer vastii activation duration during descent (lateralis: p = 0.01; medialis: p = 0.02); earlier onset of vastus lateralis for ascent (p
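
    As a concrete illustration of how activation duration can be derived from surface EMG, a minimal baseline-plus-threshold sketch follows; the 3 SD threshold, 6 Hz envelope cutoff, 2 kHz sampling rate, and assumed quiet baseline period are generic conventions, not this study's actual processing pipeline.

        # Minimal sketch: activation duration from a rectified, low-pass
        # filtered EMG envelope using a baseline + 3 SD threshold.
        import numpy as np
        from scipy.signal import butter, filtfilt

        def activation_duration(emg, fs=2000.0, baseline_s=0.5):
            """Return seconds the EMG envelope spends above threshold."""
            rectified = np.abs(emg - np.mean(emg))
            b, a = butter(4, 6.0 / (fs / 2), btype="low")  # 6 Hz envelope
            envelope = filtfilt(b, a, rectified)
            baseline = envelope[: int(baseline_s * fs)]  # assumed quiet period
            threshold = baseline.mean() + 3 * baseline.std()
            return np.sum(envelope > threshold) / fs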

    Variable stroke timing of rubber fins' duty cycle improves force


    Towards a formalism for mapping the spacetimes of massive compact objects: Bumpy black holes and their orbits

    Observations have established that extremely compact, massive objects are common in the universe. It is generally accepted that these objects are black holes. As observations improve, it becomes possible to test this hypothesis in ever greater detail. In particular, it is or will be possible to measure the properties of orbits deep in the strong field of a black hole candidate (using X-ray timing or gravitational waves) and to test whether they have the characteristics of black hole orbits in general relativity. Such measurements can be used to map the spacetime of a massive compact object, testing whether the object's multipoles satisfy the strict constraints of the black hole hypothesis. Such a test requires that we compare against objects with the "wrong" multipole structure. In this paper, we present tools for constructing bumpy black holes: objects that are almost black holes, but that have some multipoles with the wrong value. The spacetimes which we present are good deep into the strong field of the object -- we do not use a large-r expansion, except to make contact with weak-field intuition. Also, our spacetimes reduce to the black hole spacetimes of general relativity when the "bumpiness" is set to zero. We propose bumpy black holes as the foundation for a null experiment: if black hole candidates are the black holes of general relativity, their bumpiness should be zero. By comparing orbits in a bumpy spacetime with those of an astrophysical source, observations should be able to test this hypothesis, stringently testing whether they are the black holes of general relativity. (Abridged) Comment: 16 pages + 2 appendices + 3 figures. Submitted to PR
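
    The "strict constraints of the black hole hypothesis" mentioned above are the no-hair relations, which in Geroch-Hansen form fix every mass and current multipole of a Kerr black hole in terms of its mass M and spin a (a standard result quoted here for context, not taken from the paper itself):

        M_\ell + i S_\ell = M (ia)^\ell , \qquad \ell = 0, 1, 2, \ldots

    so that M_0 = M, S_1 = Ma, M_2 = -Ma^2, and so on. Schematically, a bumpy black hole perturbs one of these multipoles, e.g. M_2 \to -Ma^2 (1 + \epsilon), and the null experiment tests whether \epsilon = 0; the paper's actual parameterisation of the bumps may differ from this illustration.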

    Multifactor Dimensionality Reduction Reveals a Three-Locus Epistatic Interaction Associated with Susceptibility to Pulmonary Tuberculosis

    Background: Identifying high-order genetic associations with non-additive (i.e., epistatic) effects in population-based studies of common human diseases is a computational challenge. Multifactor dimensionality reduction (MDR) is a machine learning method that was designed specifically for this problem. The goal of the present study was to apply MDR to mine high-order epistatic interactions in a population-based genetic study of tuberculosis (TB). Results: The study used a previously published data set consisting of 19 candidate single-nucleotide polymorphisms (SNPs) in 321 pulmonary TB cases and 347 healthy controls from Guinea-Bissau in Africa. The ReliefF algorithm was applied first to generate a smaller set of the five most informative SNPs. MDR with 10-fold cross-validation was then applied to look at all possible combinations of two, three, four and five SNPs. The MDR model with the best testing accuracy (TA) consisted of SNPs rs2305619, rs187084, and rs11465421 (TA = 0.588) in PTX3, TLR9 and DC-SIGN, respectively. A general 1000-fold permutation test of the null hypothesis of no association confirmed the statistical significance of the model (p = 0.008). An additional 1000-fold permutation test, designed specifically to test the linear null hypothesis that the association effects are only additive, confirmed the presence of non-additive (i.e., nonlinear) or epistatic effects (p = 0.013). An independent information-gain measure corroborated these results with a third-order epistatic interaction that was stronger than any lower-order associations. Conclusions: We have identified statistically significant evidence for a three-way epistatic interaction that is associated with susceptibility to TB. This interaction is stronger than any previously described one-way or two-way associations. This study highlights the importance of using machine learning methods that are designed to embrace, rather than ignore, the complexity of common diseases such as TB. We recommend that future studies of the genetics of TB take into account the possibility that high-order epistatic interactions might play an important role in disease susceptibility.
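
    For readers unfamiliar with MDR, its core step pools each multilocus genotype cell into a "high-risk" or "low-risk" class by comparing the cell's case/control ratio to the overall ratio, collapsing k SNPs into one binary attribute that is then scored (in real MDR, by cross-validated testing accuracy). The sketch below is a bare-bones illustration of that pooling step, not the published MDR software; all names are invented.

        # Bare-bones sketch of the MDR pooling step: label a multilocus
        # genotype cell high-risk if its case/control ratio exceeds the
        # overall ratio, then score the resulting binary attribute.
        from collections import defaultdict
        import numpy as np

        def mdr_accuracy(genotypes, labels, snp_idx):
            """genotypes: (n, n_snps) int array of 0/1/2; labels: 0/1 array."""
            cases, ctrls = defaultdict(int), defaultdict(int)
            for row, y in zip(genotypes, labels):
                cell = tuple(row[list(snp_idx)])
                (cases if y == 1 else ctrls)[cell] += 1
            overall = labels.sum() / max(1, len(labels) - labels.sum())
            high_risk = {c for c in set(cases) | set(ctrls)
                         if cases[c] / max(1, ctrls[c]) > overall}
            pred = np.array([tuple(row[list(snp_idx)]) in high_risk
                             for row in genotypes])
            return np.mean(pred == (labels == 1))

        # Exhaustive search over all 3-SNP models among 5 filtered SNPs
        # (real MDR uses 10-fold cross-validated testing accuracy):
        # best = max(itertools.combinations(range(5), 3),
        #            key=lambda s: mdr_accuracy(G, y, s))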

    Kantowski-Sachs String Cosmologies

    We present new exact solutions of the low-energy effective-action string equations with both dilaton $\phi$ and axion $H$ fields non-zero. The background universe is of Kantowski-Sachs type. We consider the possibility of a pseudoscalar axion field $h$ ($H = e^{\phi} (dh)^{*}$) that can be either time or space dependent. The case of time-dependent $h$ reduces to that of a stiff perfect-fluid cosmology. For space-dependent $h$ there is just one non-zero time-space-space component of the axion field $H$, and this corresponds to a distinguished direction in space which prevents the models from isotropising. Also, in the latter case, both the axion field $H$ and its tensor potential $B$ ($H = dB$) are dependent on time and space, yet the energy-momentum tensor remains time-dependent, as required by the homogeneity of the cosmological model. Comment: 23 pages, REVTEX, 6 figures available on request
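
    For reference, the Kantowski-Sachs line element (quoted here in its standard form, which the paper's solutions specialise) has two independent scale factors,

        ds^2 = -dt^2 + a^2(t)\, dr^2 + b^2(t)\left( d\theta^2 + \sin^2\theta \, d\phi^2 \right),

    and the single non-zero time-space-space component of $H$ in the space-dependent case singles out the $r$ direction, which is why these models cannot isotropise.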

    Heavy quark action on the anisotropic lattice

    We investigate the $O(a)$-improved quark action on an anisotropic lattice as a potential framework for the heavy quark, which may enable precision computation of hadronic matrix elements of heavy-light mesons. The relativity relations of heavy-light mesons as well as of heavy quarkonium are examined on a quenched lattice with spatial lattice cutoff $a_\sigma^{-1} \simeq 1.6$ GeV and anisotropy $\xi = 4$. We find that the bare anisotropy parameter tuned for the massless quark describes both the heavy-heavy and heavy-light mesons within 2% accuracy for the quark mass $a_\sigma m_Q < 0.8$, which covers the charm quark mass. This bare anisotropy parameter also successfully describes the heavy-light mesons in the quark mass region $a_\sigma m_Q \leq 1.2$ within the same accuracy. Beyond this region, the discretization effects seem to grow gradually. The anisotropic lattice is expected to extend by a factor $\xi$ the quark mass region in which the parameters in the action tuned for the massless limit are applicable for heavy-light systems with well-controlled systematic errors. Comment: 11 pages, REVTeX4, 11 eps figures
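
    The "relativity relations" referred to above amount to measuring the meson dispersion relation and checking the effective speed of light; schematically (this notation is generic, not necessarily the paper's),

        E^2(\mathbf{p}) = E^2(\mathbf{0}) + c_{\mathrm{eff}}^2\, \mathbf{p}^2 + O(\mathbf{p}^4),

    where the bare anisotropy parameter is tuned so that $c_{\mathrm{eff}} = 1$ for the massless quark, and the quoted 2% accuracy describes how well that single tuning continues to hold for heavy-heavy and heavy-light mesons.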

    A likelihood-based particle imaging filter using prior information

    Background: Particle imaging can increase precision in proton and ion therapy. Interactions with nuclei in the imaged object increase image noise and reduce image quality, especially for multinucleon ions that can fragment, such as helium. Purpose: This work proposes a particle imaging filter, referred to as the Prior Filter, based on using prior information in the form of an estimated relative stopping power (RSP) map and the principles of electromagnetic interaction, to identify particles that have undergone nuclear interactions. The particles identified as having undergone nuclear interactions are then excluded from the image reconstruction, reducing the image noise. Methods: The Prior Filter uses Fermi–Eyges scattering and Tschalär straggling theories to determine the likelihood that a particle only interacted electromagnetically. A threshold is then set to reject those particles with a low likelihood. The filter was evaluated and compared with a filter that estimates this likelihood based on the measured distribution of energy and scattering angle within pixels, commonly implemented as the 3σ filter. Reconstructed radiographs from simulated data of a 20-cm water cylinder and an anthropomorphic chest phantom were generated with both protons and helium ions to assess the effect of the filters on noise reduction. The simulation also allowed assessment of secondary-particle removal through the particle histories. Experimental data of the Catphan CTP404 Sensitometry phantom were acquired using the U.S. proton CT (pCT) collaboration prototype scanner. The proton and helium images were filtered with both the prior filtering method and a state-of-the-art method including an implementation of the 3σ filter. For both cases, a dE-E telescope filter, designed for this type of detector, was also applied. Results: The proton radiographs showed a small reduction in noise (1 mm of water-equivalent thickness [WET]), while the helium radiographs showed a larger reduction (up to 5–6 mm of WET) due to better secondary filtering. The proton and helium CT images reflected this, with similar noise at the center of the phantom (0.02 RSP) for the proton images, and an RSP noise of 0.03 for the proposed filter versus 0.06 for the 3σ filter in the helium images. Images reconstructed from data with a dose reduction of up to a factor of 9 maintained a lower noise level using the Prior Filter than the state-of-the-art filtering method. Conclusions: The proposed filter results in images with equal or reduced noise compared to those that have undergone a filtering method typical of current particle imaging studies. This work also demonstrates that the proposed filter maintains better performance than the state of the art with up to a nine-fold dose reduction.
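
    A minimal sketch of the likelihood test described above: given the EM-only widths predicted from the prior RSP map (a scattering width from Fermi–Eyges theory and a straggling width from Tschalär theory), a particle is kept only if its measured angle and energy loss are jointly probable. Everything below (function name, the two-term Gaussian form, the threshold value) is an illustrative assumption, not the paper's implementation.

        # Illustrative likelihood filter: keep a particle only if its exit
        # angle and energy loss are consistent with purely electromagnetic
        # interactions predicted from a prior RSP map.
        import numpy as np

        def keep_particle(theta, dE, sigma_theta, sigma_E, l_min=1e-3):
            """theta: measured scattering angle (rad); dE: measured minus
            expected energy loss; sigma_*: predicted EM-only widths."""
            # Chi-square with two terms, assuming independent Gaussians.
            chi2 = (theta / sigma_theta) ** 2 + (dE / sigma_E) ** 2
            likelihood = np.exp(-0.5 * chi2)  # unnormalised EM-only likelihood
            return likelihood > l_min  # reject probable nuclear events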