56 research outputs found

    Market orientation and vertical de-integration: creating customer and company value

    This thesis explores the relationship between a firm's supply chain and its degree of market orientation and economic performance. The results suggest that certain types of supply chain design - in particular those models that make for close links with the firm's customers - lead to superior marketing and shareholder value. Two sets of environmental forces have been particularly influential in reshaping supply chains over recent years. One is the enormous growth in production capacity, especially in the Far East, which has led to more industries operating with excess capacity. Production skills and resources were once seen as being at the heart of a firm's core capabilities and the source of its competitive advantage. Today, in more and more sectors, the key skill is marketing - creating customer preference in oversupplied markets through branding and customer relationship management. Downstream activities in the supply chain have risen in prominence compared to upstream activities. The second change has been the information revolution brought about by the computer and the Internet. This has lowered the transaction costs of integrating the activities performed by the different businesses constituting a supply chain and made it increasingly attractive to achieve control without ownership. Supply chains can now become networks integrated through seamless information exchanges. We explore these changes at the microeconomic level. The research draws upon the existing literature and on primary data including exploratory interviews, main-study in-depth interviews and survey data. Matched-pair samples of 20 high-performance and 20 low-performance business units based in the UK provided the main body of data. Data analysis involved four distinct phases: within-case and cross-case analysis for the qualitative data collected; exploratory factor analysis (EFA) to identify dimensions of influence as a method of integration; and discriminant analysis and Lambda to investigate the association between supply chain configuration typologies, market orientation and business performance. Two major contributions stem from this research. First, the interdisciplinary domain for supply chain configuration can be established. Whereas traditionally competitive advantage has been built through a focus on operations efficiency - streamlining processes to reduce cost - today increased communications, global markets and the speed at which Internet technologies are developing demand and facilitate an additional perspective for supply chain management: the effectiveness perspective. The concept of effectiveness brings the subject of supply chain management from the sphere of operations management into the domain of marketing strategy. From this perspective the building, maintenance and management of customer relationships becomes central to the supply chain configuration. Highly efficient production processes, where fiercely protected technical know-how enables the delivery of superior quality products, no longer act as a sustainable source of competitive advantage. To achieve such advantage, firms must focus on two principal activities: building brand value and carefully fostering relationships with key customers. For firms positioned upstream in the supply chain, building a strong brand identity potentially offers a means to integrate downstream with both customers and consumers. The second contribution comes from the association of supply chain configuration with other variables.
Our results show a relationship between market orientation, business performance and supply chain configuration. We conclude that companies are beginning to recognise opportunities that arise from using technology and information to blur traditional boundaries between suppliers, manufacturers and end users. We discuss how technology enables co-ordination across company boundaries to achieve new levels of efficiency and effectiveness, as well as extraordinary returns to investors. For example, a company, its suppliers and even its customers might begin to share information and activities to speed the design of a product and raise the likelihood of its success in the marketplace. This should enable suppliers to begin developing components before the overall product design is complete, providing vital and timely feedback regarding component specification, cost and time objectives. Equally, customers are able to review a product as it evolves and provide input on how it meets their needs. Managers must concern themselves with the design stages of the product and facilitate knowledge and information flows through the entire supply chain. Business seems to be on the threshold of a new era of inter-firm relationships. Supply chain customers sharing the same suppliers are able to provide leadership, encouraging shared distribution and payment/ordering systems. Overcapacity in firms forces such considerations. Collaborative approaches can drive down costs and ultimately offer improved services for consumers, making available the goods they want, where and when they want them. But this configuration of an interconnected, interdependent supply network requires much more openness. Inter-firm boundaries must become almost invisible. Trust, commitment, open communication and information sharing must permeate the culture of partnering firms. The sharing of real-time customer information both within and between firms facilitates the reduction of inventory and increases speed to market, reducing risk and increasing cost savings. Customer information provides a sound basis for segmenting markets, allowing the understanding of customer needs to develop in a deeper way. This customer closeness gives access to information critical to accurate forecasting, which is central to eliminating unnecessary costs and enabling firms to dramatically extend the value they deliver to customers, thus creating competitive advantage. Shrinking the time and the resources it takes to meet customers' needs, in a world where those needs are constantly changing, is the challenge. As Wayne Gretzky, the famous hockey player, explained, "the key to winning is getting first to where the puck is going next". The same could be said about succeeding in business. Listening to customers and then using and sharing this most valuable information resource throughout the supply chain will be the key.
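
    The four analysis phases described above can be sketched in outline. The snippet below is a minimal, hypothetical illustration of chaining exploratory factor analysis with discriminant analysis in Python using scikit-learn; the data, variable counts and factor structure are placeholders and do not reproduce the thesis dataset or its Lambda statistics.

```python
# Hypothetical sketch of an EFA -> discriminant analysis pipeline (phases 3 and 4).
# All data and dimensions are illustrative placeholders, not the thesis dataset.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# 40 business units (20 high-performance, 20 low-performance), 12 survey items
# covering market orientation and supply chain configuration.
X = rng.normal(size=(40, 12))
performance = np.repeat([1, 0], 20)  # 1 = high performer, 0 = low performer

# Exploratory factor analysis: reduce the survey items to a few dimensions of influence.
efa = FactorAnalysis(n_components=3, random_state=0)
factor_scores = efa.fit_transform(X)

# Discriminant analysis: how well do the extracted factors separate the two groups?
lda = LinearDiscriminantAnalysis()
lda.fit(factor_scores, performance)
print("In-sample classification accuracy:", lda.score(factor_scores, performance))
print("Discriminant function weights:", lda.coef_)
```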

    The Open Innovation Team: An Independent Evaluation of a Cabinet Office Initiative

    The Cabinet Office Open Innovation Team (‘OIT’) was set up in 2016 to help Whitehall departments generate analysis and ideas by deepening collaboration with academics. In practice, this requires the OIT to engage deeply with the policy work of multiple departments, and with the many academics who might be relevant and valuable to those policy makers. Through events, workshops, individual meetings, tailored reports and literature reviews, the OIT brings these new sources of academic expertise into Whitehall to help shape government thinking. The OIT is relatively small, with four full-time officials and a rotating group of PhD students on secondment from leading UK universities. Despite this size, the ability of the OIT to catalyse new relationships and knowledge sharing is already evident (see OIT reports to university partners). The OIT’s pilot phase ends in late 2018, and the team is now agreeing a second round of funding from university partners. These partners provide the financial resource to cover direct salary and operating expenses, whilst the Cabinet Office covers location and infrastructure costs, making this a unique business model within government in terms of its funding approach and ways of working. This review captures insights from 14 months of academic research with the OIT, in order to: ▪ make visible the promising practices developed by the OIT, giving stakeholders a greater understanding of its strategic choices, operating structures, and ways of creating value; ▪ shape management practice within the OIT through our analysis and recommendations; ▪ provide an evaluation of the OIT that yields actionable information for all stakeholders.

    Managing to make markets: Marketization and the conceptualization work of strategic nets in the life science sector

    This paper presents one of the first studies to identify and explain the marketization work of a strategic net. Through a study of the Stevenage Bioscience Catalyst - a strategic net formed to support the marketization of life science discoveries - we generate insights into the everyday work that makes marketization happen. Marketization is understood as the process that enables the conceptualisation, production and exchange of goods. Our findings focus on one specific form of marketization work found to be core to the strategic net: conceptualisation work. Three forms of conceptualisation work are identified: conceptualising actors' roles, conceptualising markets and conceptualising goods. These manifest as routinized, recursive practices. Our analysis reveals how these practices gather together multiple forms of scientific, technical and market knowledge to generate new market devices that transform market rules and conventions, and introduce new methods and instruments of valuation that change the market. In contrast to extant studies that claim a strategic net's activities influence markets, our findings position the conceptualisation work of the strategic net as constitutive of markets and the broader system of provision for ‘healthcare’ and ‘health futures’.

    Large-scale assessment of 7-11-year-olds’ cognitive and sensorimotor function within the Born in Bradford longitudinal birth cohort study

    Background: Cognitive ability and sensorimotor function are crucial aspects of children’s development, and are associated with physical and mental health outcomes and educational attainment. This paper describes cross-sectional sensorimotor and cognitive function data on over 15,000 children aged 7-10 years, collected as part of the Born in Bradford (BiB) longitudinal birth-cohort study. Methodological details of the large-scale data collection process are described, along with initial analyses of the data involving the relationship between cognition/sensorimotor ability and age and task difficulty, and associations between tasks. Method: Data collection was completed in 86 schools between May 2016 and July 2019. Children were tested at school, individually, using a tablet computer with a digital stylus or finger touch for input. Assessments comprised a battery of three sensorimotor tasks (Tracking, Aiming, & Steering) and five cognitive tasks (three Working Memory tasks, Inhibition, and Processing Speed), which took approximately 40 minutes. Results: Performance improved with increasing age and decreasing task difficulty for each task. Performance on all three sensorimotor tasks was correlated, as was performance on the three working memory tasks. In addition, performance on a composite working memory score correlated with performance on both inhibition and processing speed. Interestingly, within-age-group variation was much larger than between-age-group variation. Conclusions: The current project collected computerised measures of a range of cognitive and sensorimotor functions at 7-10 years of age in over 15,000 children. Performance varied as expected by age and task difficulty, and showed the predicted correlations between related tasks. Large within-age-group variation highlights the need to consider the profile of individual children in studying cognitive and sensorimotor development. These data can be linked to the wider BiB dataset, including measures of physical and mental health, biomarkers and genome-wide data, socio-demographic information, and routine data from local health and education services.
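
    As an illustration of the type of association analysis reported above, the sketch below computes a composite working-memory score and its correlations with inhibition and processing speed. The column names and data are invented for the example; the BiB dataset itself is not reproduced here.

```python
# Illustrative sketch only: synthetic data standing in for the BiB task scores.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "age_months": rng.integers(84, 132, size=n),   # roughly 7-10 years
    "wm_task1": rng.normal(size=n),
    "wm_task2": rng.normal(size=n),
    "wm_task3": rng.normal(size=n),
    "inhibition": rng.normal(size=n),
    "processing_speed": rng.normal(size=n),
})

# Composite working-memory score: mean of the z-scored working-memory tasks.
wm_cols = ["wm_task1", "wm_task2", "wm_task3"]
df["wm_composite"] = df[wm_cols].apply(lambda s: (s - s.mean()) / s.std()).mean(axis=1)

# Correlations between the composite score and the other cognitive measures.
print(df[["wm_composite", "inhibition", "processing_speed"]].corr())
```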

    CMB observations from the CBI and VSA: A comparison of coincident maps and parameter estimation methods

    We present coincident observations of the Cosmic Microwave Background (CMB) from the Very Small Array (VSA) and Cosmic Background Imager (CBI) telescopes. The consistency of the full datasets is tested in the map plane and the Fourier plane, prior to the usual compression of CMB data into flat bandpowers. Of the three mosaics observed by each group, two are found to be in excellent agreement. In the third mosaic, there is a 2 sigma discrepancy between the correlation of the data and the level expected from Monte Carlo simulations. This is shown to be consistent with increased phase calibration errors on VSA data during summer observations. We also consider the parameter estimation method of each group. The key difference is the use of the variance window function in place of the bandpower window function, an approximation used by the VSA group. A re-evaluation of the VSA parameter estimates, using bandpower windows, shows that the two methods yield consistent results. Comment: 10 pages, 6 figures. Final version. Accepted for publication in MNRAS.
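
    To make the distinction concrete: in one common convention (an illustrative assumption here, not a description of either group's pipeline), the expected bandpower is a window-weighted average of the model spectrum,

\[
\langle \mathcal{B}_b \rangle \;=\; \sum_{\ell} \frac{W_b(\ell)}{\ell}\,\frac{\ell(\ell+1)C_\ell}{2\pi},
\qquad \sum_{\ell} \frac{W_b(\ell)}{\ell} = 1 ,
\]

    so substituting a variance window for the bandpower window $W_b(\ell)$ changes the weighting applied to the model $C_\ell$, which is why re-evaluating the VSA estimates with bandpower windows is the relevant consistency check.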

    Cosmological parameter estimation using Very Small Array data out to l=1500

    We estimate cosmological parameters using data obtained by the Very Small Array (VSA) in its extended configuration, in conjunction with a variety of other CMB data and external priors. Within the flat $\Lambda$CDM model, we find that the inclusion of high resolution data from the VSA modifies the limits on the cosmological parameters as compared to those suggested by WMAP alone, while still remaining compatible with their estimates. We find that $\Omega_{\rm b}h^2=0.0234^{+0.0012}_{-0.0014}$, $\Omega_{\rm dm}h^2=0.111^{+0.014}_{-0.016}$, $h=0.73^{+0.09}_{-0.05}$, $n_{\rm S}=0.97^{+0.06}_{-0.03}$, $10^{10}A_{\rm S}=23^{+7}_{-3}$ and $\tau=0.14^{+0.14}_{-0.07}$ for WMAP and VSA when no external prior is included. On extending the model to include a running spectral index of density fluctuations, we find that the inclusion of VSA data leads to a negative running at a level of more than 95% confidence ($n_{\rm run}=-0.069\pm 0.032$), something which is not significantly changed by the inclusion of a stringent prior on the Hubble constant. Inclusion of prior information from the 2dF galaxy redshift survey reduces the significance of the result by constraining the value of $\Omega_{\rm m}$. We discuss the veracity of this result in the context of various systematic effects and also a broken spectral index model. We also constrain the fraction of neutrinos and find that $f_{\nu}<0.087$ at 95% confidence, which corresponds to $m_\nu<0.32\,{\rm eV}$ when all neutrino masses are equal. Finally, we consider the global best fit within a general cosmological model with 12 parameters and find consistency with other analyses available in the literature. The evidence for $n_{\rm run}<0$ is only marginal within this model.
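
    The quoted neutrino mass limit can be traced with the standard conversion between neutrino density and mass; the 93.1 eV factor and the assumption of three degenerate mass eigenstates are the illustrative inputs here:

\[
\Omega_\nu h^2 = \frac{\sum m_\nu}{93.1\,\mathrm{eV}}, \qquad
f_\nu = \frac{\Omega_\nu}{\Omega_{\rm dm}}
\;\Rightarrow\;
m_\nu \lesssim \frac{f_\nu\,\Omega_{\rm dm}h^2 \times 93.1\,\mathrm{eV}}{3}
\approx \frac{0.087 \times 0.11 \times 93.1\,\mathrm{eV}}{3} \approx 0.3\,\mathrm{eV},
\]

    which is in line with the quoted bound of $m_\nu<0.32\,{\rm eV}$ given the preferred value of $\Omega_{\rm dm}h^2$ in the joint fit.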

    High sensitivity measurements of the CMB power spectrum with the extended Very Small Array

    We present deep Ka-band ($\nu \approx 33$ GHz) observations of the CMB made with the extended Very Small Array (VSA). This configuration produces a naturally weighted synthesized FWHM beamwidth of $\sim 11$ arcmin which covers an $\ell$-range of 300 to 1500. On these scales, foreground extragalactic sources can be a major source of contamination to the CMB anisotropy. This problem has been alleviated by identifying sources at 15 GHz with the Ryle Telescope and then monitoring these sources at 33 GHz using a single baseline interferometer co-located with the VSA. Sources with flux densities $\gtrsim 20$ mJy at 33 GHz are subtracted from the data. In addition, we calculate a statistical correction for the small residual contribution from weaker sources that are below the detection limit of the survey. The CMB power spectrum corrected for Galactic foregrounds and extragalactic point sources is presented. A total $\ell$-range of 150-1500 is achieved by combining the complete extended array data with earlier VSA data in a compact configuration. Our resolution of $\Delta \ell \approx 60$ allows the first 3 acoustic peaks to be clearly delineated. This is achieved by using mosaiced observations in 7 regions covering a total area of 82 sq. degrees. There is good agreement with WMAP data up to $\ell=700$ where WMAP data run out of resolution. For higher $\ell$-values out to $\ell=1500$, the agreement in power spectrum amplitudes with other experiments is also very good despite differences in frequency and observing technique. Comment: 16 pages. Accepted in MNRAS (minor revisions).
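
    The statistical correction for sources below the subtraction threshold is, in the usual Poisson approximation, an integral over the differential source counts; the power-law slope shown is an assumed illustration, not the fitted 33 GHz counts:

\[
C_\ell^{\rm src} \;=\; \left(\frac{\partial B_\nu}{\partial T}\right)^{-2}
\int_0^{S_{\rm cut}} S^2\,\frac{dN}{dS}\,dS ,
\qquad \frac{dN}{dS} \propto S^{-2.15}\ \text{(assumed)},\quad S_{\rm cut} = 20\ \mathrm{mJy},
\]

    giving a flat (white-noise) contribution to $C_\ell$ that can be subtracted from each bandpower.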

    Radio source calibration for the VSA and other CMB instruments at around 30 GHz

    Accurate calibration of data is essential for the current generation of CMB experiments. Using data from the Very Small Array (VSA), we describe procedures which will lead to an accuracy of 1 percent or better for experiments such as the VSA and CBI. Particular attention is paid to the stability of the receiver systems, the quality of the site and frequent observations of reference sources. At 30 GHz the careful correction for atmospheric emission and absorption is shown to be essential for achieving 1 percent precision. The sources for which a 1 percent relative flux density calibration was achieved included Cas A, Cyg A, Tau A and NGC7027 and the planets Venus, Jupiter and Saturn. A flux density, or brightness temperature in the case of the planets, was derived at 33 GHz relative to Jupiter, which was adopted as the fundamental calibrator. A spectral index at ~30 GHz is given for each. Cas A, Tau A, NGC7027 and Venus were examined for variability. Cas A was found to be decreasing at $0.394 \pm 0.019$ percent per year over the period March 2001 to August 2004. In the same period Tau A was decreasing at $0.22 \pm 0.07$ percent per year. A survey of the published data showed that the planetary nebula NGC7027 decreased at $0.16 \pm 0.04$ percent per year over the period 1967 to 2003. Venus showed an insignificant ($1.5 \pm 1.3$ percent) variation with Venusian illumination. The integrated polarization of Tau A at 33 GHz was found to be $7.8 \pm 0.6$ percent at a position angle of $148^\circ \pm 3^\circ$. Comment: 13 pages, 15 figures, submitted to MNRAS.
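
    As a small worked example of how such secular decline rates enter a calibration, the sketch below scales a calibrator flux density between epochs. The reference flux value is an arbitrary placeholder; only the percentage decline rates are taken from the text above.

```python
# Illustrative scaling of a calibrator flux density using a constant fractional decline.
# The 100 Jy reference value is a placeholder; only the decline rates come from the text.
def scaled_flux(s_ref_jy: float, decline_percent_per_year: float, years_elapsed: float) -> float:
    """Flux density after `years_elapsed` years, assuming a constant fractional decline."""
    return s_ref_jy * (1.0 - decline_percent_per_year / 100.0) ** years_elapsed

s_ref = 100.0  # placeholder reference flux density at the first epoch [Jy]
print("Cas A after 3.4 yr:", scaled_flux(s_ref, 0.394, 3.4))  # ~0.394 %/yr decline
print("Tau A after 3.4 yr:", scaled_flux(s_ref, 0.22, 3.4))   # ~0.22 %/yr decline
```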

    Randomised controlled trial comparing intraoperative cell salvage and autotransfusion with standard care in the treatment of hip fractures: a protocol for the WHITE 9 study

    Introduction: People who sustain a hip fracture are typically elderly, frail and require urgent surgery. Hip fracture and the urgent surgery are associated with acute blood loss, compounding patients’ pre-existing comorbidities, including anaemia. Approximately 30% of patients require a donor blood transfusion in the perioperative period. Donor blood transfusions are associated with increased rates of infections, allergic reactions and longer lengths of stay. Furthermore, there is a substantial cost associated with the use of donor blood. Cell salvage and autotransfusion is a technique that recovers, washes and transfuses blood lost during surgery back to the patient. The objective of this study is to determine the clinical and cost-effectiveness of intraoperative cell salvage, compared with standard care, in improving the health-related quality of life (HRQoL) of patients undergoing hip fracture surgery. Methods and analysis: Multicentre, parallel group, two-arm, randomised controlled trial. Patients aged 60 years and older with a hip fracture treated with surgery are eligible. Participants will be randomly allocated on a 1:1 basis either to undergo cell salvage and autotransfusion or to follow the standard care pathway. Otherwise, all care will be in accordance with National Institute for Health and Care Excellence guidance. A minimum of 1128 patients will be recruited to obtain 90% power to detect a 0.075-point difference in the primary endpoint: EuroQol-5D-5L HRQoL at 4 months post-injury. Secondary outcomes will include complications, postoperative delirium, residential status, mobility, allogenic blood use, mortality and resource use. Ethics and dissemination: NHS ethical approval was provided on 14 August 2019 (19/WA/0197) and the trial registered (ISRCTN15945622). After the conclusion of this trial, a manuscript will be prepared for peer-review publication. Results will be disseminated in lay form to participants and the public. Trial registration number: ISRCTN15945622.
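
    As an outline of the kind of power calculation that lies behind a recruitment target such as 1128 participants, the sketch below uses a generic two-sample comparison. The assumed standard deviation of the 4-month EQ-5D-5L score is an illustrative guess and is not taken from the protocol, which may also allow for attrition and other design factors.

```python
# Generic two-sample power calculation sketch; the SD is an assumption for illustration,
# not a figure from the WHITE 9 protocol (which may also allow for attrition and clustering).
from statsmodels.stats.power import TTestIndPower

target_difference = 0.075   # minimum difference of interest in EQ-5D-5L utility at 4 months
assumed_sd = 0.38           # assumed SD of the outcome (illustrative only)
effect_size = target_difference / assumed_sd

n_per_arm = TTestIndPower().solve_power(effect_size=effect_size, power=0.90, alpha=0.05)
print(f"Approximately {2 * n_per_arm:.0f} participants in total, before any attrition allowance")
```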

    Impact of a referral management “gateway” on the quality of referral letters; a retrospective time series cross sectional review

    Background: Referral management centres (RMC) for elective referrals are designed to facilitate the primary-to-secondary care referral path, by improving the quality of referrals and easing pressures on finite secondary care services, without inadvertently compromising patient care. This study aimed to evaluate whether the introduction of an RMC which includes triage and feedback improved the quality of elective outpatient referral letters. Methods: Retrospective, time-series, cross-sectional review involving 47 general practices in one primary care trust (PCT) in South-East England. Comparison of a random sample of referral letters at baseline (n = 301) and after seven months of referral management (n = 280). Letters were assessed for inclusion of four core pieces of information which are used locally to monitor referral quality (blood pressure, body mass index, past medical history, medication history) and against research-based quality criteria for referral letters (provision of clinical information and clarity of reason for referral). Results: Following introduction of the RMC, the proportion of letters containing each of the core items increased compared to baseline. Statistically significant increases in the recording of ‘past medical history’ (from 71% to 84%, p < 0.001) and ‘medication history’ (78% to 87%, p = 0.006) were observed. Forty-four percent of letters met the research-based quality criteria at baseline, but there was no significant change in the quality of referral letters judged on these criteria across the two time periods. Conclusion: Introduction of the RMC improved the inclusion of past medical history and medication history in referral letters, but not other measures of quality. In approximately half of letters there remains room for further improvement.
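
    The before-and-after comparison of proportions reported above can be reproduced in outline with a standard two-proportion test; the counts below are reconstructed approximately from the quoted percentages and sample sizes, so the resulting p-value is only indicative.

```python
# Indicative re-analysis: counts are reconstructed approximately from the quoted
# percentages (71% -> 84%) and sample sizes (301, 280), so the p-value is illustrative.
from statsmodels.stats.proportion import proportions_ztest

letters_with_pmh = [round(0.71 * 301), round(0.84 * 280)]  # 'past medical history' recorded
letters_sampled = [301, 280]                               # baseline vs. after 7 months

z_stat, p_value = proportions_ztest(count=letters_with_pmh, nobs=letters_sampled)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")  # consistent with the reported p < 0.001
```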