105 research outputs found

    Multi-tasking computer control of video related equipment

    The flexibility, cost-effectiveness and widespread availability of personal computers now make it possible to integrate the previously separate elements of video post-production into a single device. Specifically, a personal computer such as the Commodore-Amiga can perform multiple, simultaneous tasks from a single unit. Relatively low cost, minimal space requirements and user-friendliness provide a favorable environment for the many phases of video post-production. Computers are well known for their ability to process numbers, text and graphics and to perform repetitive, tedious functions reliably and efficiently. These capabilities can now be applied as additions or alternatives to existing video post-production methods. A current example of computer-based video post-production technology is the RGB CVC (Computer and Video Creations) WorkSystem. A wide variety of integrated functions are made possible by an Amiga computer at the heart of the system.

    Inexperienced decision-makers' use of positive heuristics for marketing decisions

    Purpose: Research has reliably demonstrated that decision-makers, especially expert ones, use heuristics to make decisions under uncertainty. However, whether decision-makers with little or no experience also do so, and if so how, remains unknown. This research addresses the issue in a marketing context by studying how a group of young and generally inexperienced entrepreneurs decide when asked to set a price and choose a distribution channel in a scenario involving a hypothetical firm. Design/methodology/approach: The authors used think-aloud protocols to elicit data and then applied inductive procedures to code the data for analysis. Findings: The inexperienced entrepreneurs in the sample used three types of heuristics in their decision-making, forming a structured process that narrows in scope. Metacognitive heuristics, which specify a decision-making approach, were used first, followed by heuristics representing the criteria they considered, and finally heuristics detailing the execution of a selected option. The authors also found that heuristics relating to a market orientation, especially customer-centric criteria, were the most common, but these were balanced with ones representing an internal orientation or growth. Research limitations/implications: The generally inexperienced decision-makers the authors studied used heuristics in a structured way that helped them to select and balance several potentially conflicting decision-making criteria. As with most research using qualitative designs, the generalizability of these findings is unclear. Further research on the mechanisms by which relatively inexperienced decision-makers learn the heuristics they use is recommended. Originality/value: This research's novelty lies in its focus on heuristic use by nonexpert decision-makers under conditions of uncertainty and its findings about the scope of those heuristics and the order in which they are used. As the authors collected data from think-aloud protocols with relatively young entrepreneurs with limited experience, they also offer a description of the heuristics used by nascent entrepreneurs when making marketing decisions about pricing and channels. The most surprising conclusion is that even without relevant domain-specific knowledge, decision-makers can use heuristics in an ecologically rational way (i.e. structured to match the environment).

    Campus Vol III N 3

    Olwin, Lynn. The Vacuum. Prose. 2.
    Gilbert, Ralph and Terry Thurn. Backstage With Home of The Brave. Prose. 4.
    Marshall, Jim. Boy Meets Laundromat. Prose. 6.
    Cooperrider, Tom. From One Room. Prose. 7.
    Thurn, Terry. Evaluation of a Blind Date. Picture. 8.
    Wishard, Rod. The Case Presented. Prose. 10.
    Horyn, Gene. Tug of War With Time Clocks. Prose. 11.
    Gould, James and Jack Matthews. Cigarettes and Coke and Wild, Wild Coeds. Prose. 13.

    Genetic diversity among pandemic 2009 influenza viruses isolated from a transmission chain

    BACKGROUND: Influenza viruses such as swine-origin influenza A(H1N1) virus (A(H1N1)pdm09) generate genetic diversity due to the high error rate of their RNA polymerase, often resulting in mixed genotype populations (intra-host variants) within a single infection. This variation helps influenza to rapidly respond to selection pressures, such as those imposed by the immunological host response and antiviral therapy. We have applied deep sequencing to characterize influenza intra-host variation in a transmission chain consisting of three cases due to oseltamivir-sensitive viruses, and one derived oseltamivir-resistant case. METHODS: Following detection of the A(H1N1)pdm09 infections, we deep-sequenced the complete NA gene from two of the oseltamivir-sensitive virus-infected cases, and all eight gene segments of the viruses causing the remaining two cases. RESULTS: No evidence for the resistance-causing mutation (resulting in the NA H275Y substitution) was observed in the oseltamivir-sensitive cases. Furthermore, deep sequencing revealed a subpopulation of oseltamivir-sensitive viruses in the case carrying resistant viruses. We detected higher levels of intra-host variation in the case carrying oseltamivir-resistant viruses than in those infected with oseltamivir-sensitive viruses. CONCLUSIONS: Oseltamivir resistance was only detected after prophylaxis with oseltamivir, suggesting that the mutation was selected for as a result of antiviral intervention. The persisting oseltamivir-sensitive virus population in the case carrying resistant viruses suggests either that a small proportion survive the treatment, or that the oseltamivir-sensitive virus rapidly re-establishes itself in the virus population after the bottleneck. Moreover, the increased intra-host variation in the oseltamivir-resistant case is consistent with the hypothesis that the population diversity of an RNA virus can increase rapidly following a population bottleneck.
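    Detecting a minority subpopulation beneath the consensus base, as in the mixed sensitive/resistant case above, reduces to thresholding allele frequencies from deep-sequencing pileup counts against the platform's error rate. A toy sketch; the counts, the site, and the 1% threshold are invented for illustration and are not the study's data:

    ```python
    # Illustrative pileup counts at one NA site; made-up numbers,
    # not taken from the study's sequencing data
    counts = {"C": 9120, "T": 610, "A": 12, "G": 8}
    depth = sum(counts.values())
    freqs = {base: n / depth for base, n in counts.items()}

    # Call a minority variant only if its frequency clears a threshold
    # set above the per-base sequencing error rate (1% here, an assumption)
    error_threshold = 0.01
    consensus_freq = max(freqs.values())
    minority_variants = {b: f for b, f in freqs.items()
                         if f < consensus_freq and f >= error_threshold}
    ```

    With these counts the T allele (about 6%) is called as a genuine intra-host variant, while A and G fall below the error threshold and are discarded as noise.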

    The mortality rates and the space-time patterns of John Snow’s cholera epidemic map

    Background Snow’s work on the Broad Street map is widely known as a pioneering example of spatial epidemiology. It lacks, however, two significant attributes required in contemporary analyses of disease incidence: the population at risk and the progression of the epidemic over time. Although this has been repeatedly noted in the literature, no systematic investigation of these two aspects had previously been carried out. Using a series of historical documents, this study constructs its own data to revisit Snow’s study, examining the mortality rate at each street location and the space-time pattern of the cholera outbreak. Methods This study brings together records from a series of historical documents and prepares data on the estimated number of residents at each house location as well as space-time data on the victims; these are processed in GIS to facilitate spatial-temporal analysis. Mortality rates and the space-time pattern in the victims’ records are explored using Kernel Density Estimation and a network-based Scan Statistic, a recently developed method that detects significant concentrations of records, such as the date and place of victims, with respect to their distance from others along the street network. The results are visualised in map form on a GIS platform. Results Data on mortality rates and the space-time distribution of the victims were collected from various sources and successfully merged and digitised, allowing the production of new map outputs and a new interpretation of the 1854 cholera outbreak in London, covering more cases than Snow’s original report and adding new insights into their space-time distribution. They confirmed that areas in the immediate vicinity of the Broad Street pump indeed suffered excessively high mortality rates, which had been suspected for the past 160 years but remained unconfirmed. No distinctive pattern was found in the space-time distribution of the victims’ locations. Conclusions The high mortality rates identified around the Broad Street pump are consistent with Snow’s theory that cholera was transmitted through contaminated water. The absence of a clear space-time pattern also indicates the waterborne, rather than the then popular belief of airborne, nature of cholera. The GIS data constructed in this study have academic value and will support further research on Snow’s map.
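    The Kernel Density Estimation step used for the mortality surface can be sketched in a few lines. A minimal pure-NumPy Gaussian KDE; the clustered point cloud around a "pump" at the origin and the bandwidth of 0.5 are illustrative assumptions, not the study's actual coordinates or parameters:

    ```python
    import numpy as np

    def gaussian_kde_2d(points, query, bandwidth=0.5):
        """Gaussian kernel density estimate at each query point from 2-D samples."""
        # Squared distances, shape (n_points, n_query), via broadcasting
        d2 = np.sum((points[:, None, :] - query[None, :, :]) ** 2, axis=-1)
        kernel = np.exp(-d2 / (2.0 * bandwidth ** 2))
        # Normalise so the estimate integrates to 1 over the plane
        return kernel.sum(axis=0) / (len(points) * 2.0 * np.pi * bandwidth ** 2)

    rng = np.random.default_rng(0)
    # Hypothetical victim locations clustered around a pump at the origin
    pts = rng.normal(0.0, 0.5, size=(200, 2))
    near, far = gaussian_kde_2d(pts, np.array([[0.0, 0.0], [3.0, 3.0]]))
    ```

    Clustering of cases shows up directly as a higher density estimate near the pump (`near`) than at a distant location (`far`); the study's network-based Scan Statistic goes further by measuring distances along streets rather than as the crow flies.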

    A framework for the definition and interpretation of the use of surrogate endpoints in interventional trials

    Background: Interventional trials that evaluate treatment effects using surrogate endpoints have become increasingly common. This paper describes four linked empirical studies and the development of a framework for defining, interpreting and reporting surrogate endpoints in trials. Methods: As part of developing the CONSORT (Consolidated Standards of Reporting Trials) and SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials) extensions for randomised trials reporting surrogate endpoints, we undertook a scoping review, e-Delphi study, consensus meeting, and a web survey to examine current definitions and stakeholder (including clinicians, trial investigators, patients and public partners, journal editors, and health technology experts) interpretations of surrogate endpoints as primary outcome measures in trials. Findings: Current surrogate endpoint definitional frameworks are inconsistent and unclear. Surrogate endpoints are used in trials as substitutes for the treatment effects of an intervention on the target outcome(s) of ultimate interest, i.e. events measuring how patients feel, function, or survive. Traditionally the consideration of surrogate endpoints in trials has focused on biomarkers (e.g., HDL cholesterol, blood pressure, tumour response), especially in the medical product regulatory setting. Nevertheless, the concept of surrogacy in trials is potentially broader. Intermediate outcomes that include a measure of function or symptoms (e.g., angina frequency, exercise tolerance) can also be used as substitutes for target outcomes (e.g., all-cause mortality), thereby acting as surrogate endpoints. However, we found a lack of consensus among stakeholders on accepting and interpreting intermediate outcomes in trials as surrogate endpoints or target outcomes. In our assessment, patients and health technology assessment experts appeared more likely to consider intermediate outcomes to be surrogate endpoints than clinicians and regulators. 
Interpretation: There is an urgent need for better understanding and reporting on the use of surrogate endpoints, especially in the setting of interventional trials. We provide a framework for the definition of surrogate endpoints (biomarkers and intermediate outcomes) and target outcomes in trials to improve future reporting and aid stakeholders' interpretation and use of trial surrogate endpoint evidence. Funding: the SPIRIT-SURROGATE/CONSORT-SURROGATE project is funded by Medical Research Council Better Research Better Health (MR/V038400/1).

    Postglacial Colonization of Northern Coastal Habitat by Bottlenose Dolphins: A Marine Leading-Edge Expansion?

    Oscillations in the Earth’s temperature and the subsequent retreat and advance of ice sheets around the polar regions are thought to have played an important role in shaping the distribution and genetic structuring of contemporary high-latitude populations. After the Last Glacial Maximum (LGM), the retreat of the ice sheets would have enabled early colonizers to rapidly occupy suitable niches to the exclusion of other conspecifics, thereby reducing genetic diversity at the leading edge. Bottlenose dolphins (genus Tursiops) form distinct coastal and pelagic ecotypes, with finer-scale genetic structuring observed within each ecotype. We reconstruct the postglacial colonization of the Northeast Atlantic (NEA) by bottlenose dolphins using habitat modeling and phylogenetics. The AquaMaps model hindcasted suitable habitat for the LGM in the Atlantic lower latitude waters and parts of the Mediterranean Sea. The time-calibrated phylogeny, constructed from 86 complete mitochondrial genomes (including 30 generated for this study) using a multispecies coalescent model, suggests that the expansion into the available coastal habitat in the NEA happened via founder events starting ~15 000 years ago (95% highest posterior density interval: 4 900–26 400). The founders of the 2 distinct coastal NEA populations comprised as few as 2 maternal lineages that originated from the pelagic population. The low effective population size and genetic diversity estimated for the shared ancestral coastal population subsequent to divergence from the pelagic source population are consistent with a leading-edge expansion. These findings highlight the legacy of the Late Pleistocene glacial cycles on the genetic structuring and diversity of contemporary populations.

    Personalised Exercise-Rehabilitation FOR people with Multiple long-term conditions (PERFORM): protocol for a randomised feasibility trial

    Introduction: Personalised Exercise-Rehabilitation FOR people with Multiple long-term conditions (PERFORM) is a research programme that seeks to develop and evaluate a comprehensive exercise-based rehabilitation intervention designed for people with multimorbidity, the presence of multiple long-term conditions (MLTCs). This paper describes the protocol for a randomised trial to assess the feasibility and acceptability of the PERFORM intervention, study design and processes. Methods and analysis: A multicentre, parallel two-group randomised trial with individual 2:1 allocation to the PERFORM exercise-based intervention plus usual care (intervention) or usual care alone (control). The primary outcome of this feasibility trial will be to assess whether prespecified progression criteria (recruitment, retention, intervention adherence) are met to progress to the full randomised trial. The trial will be conducted across three UK sites and will recruit 60 people with MLTCs, defined as two or more LTCs, at least one of which has evidence of benefit from exercise. The PERFORM intervention comprises an 8-week (twice a week for 6 weeks and once a week for 2 weeks) supervised rehabilitation programme of personalised exercise training and self-management education delivered by trained healthcare professionals, followed by two maintenance sessions. Trial participants will be recruited over a 4.5-month period, and outcomes assessed at baseline (prerandomisation) and 3 months postrandomisation, including health-related quality of life, psychological well-being, symptom burden, frailty, exercise capacity, physical activity, sleep, cognition and serious adverse events. A mixed-methods process evaluation will assess acceptability, feasibility and fidelity of intervention delivery and feasibility of trial processes. An economic evaluation will assess the feasibility of data collection and estimate the costs of the PERFORM intervention. 
Ethics and dissemination: The trial has been given a favourable opinion by the West Midlands, Edgbaston Research Ethics Service (Ref: 23/WM/0057). Participants will be asked to give full, written consent to take part by trained researchers. Findings will be disseminated via journals, presentations and targeted communications to clinicians, commissioners, service users and patients and the public. Trial registration number: ISRCTN68786622. Protocol version 2.0 (16 May 2023).
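    The 2:1 individual allocation described above is commonly implemented with permuted blocks. A minimal sketch; the block size of 3 and the fixed seed are illustrative assumptions, as the abstract does not specify the randomisation mechanics:

    ```python
    import random

    def block_randomise(n, seed=42):
        """Permuted-block 2:1 allocation: each block of 3 holds two
        intervention and one control assignment in random order."""
        rng = random.Random(seed)
        out = []
        while len(out) < n:
            block = ["intervention", "intervention", "control"]
            rng.shuffle(block)  # randomise order within the block
            out.extend(block)
        return out[:n]

    alloc = block_randomise(60)  # the trial's planned sample size
    ```

    Because 60 divides evenly into blocks of 3, this guarantees exactly 40 intervention and 20 control assignments while keeping the sequence unpredictable within each block.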