149 research outputs found

    A hybrid keyword and patent class methodology for selecting relevant sets of patents for a technological field

    This paper presents a relatively simple, objective and repeatable method for selecting sets of patents that are representative of a specific technological domain. The methodology consists of using search terms to locate the most representative international and US patent classes and then determining the overlap of those classes to arrive at the final set of patents. Five different technological fields (computed tomography, solar photovoltaics, wind turbines, electric capacitors, electrochemical batteries) are used to test and demonstrate the proposed method. Comparison against traditional keyword searches and individual patent class searches shows that the method presented in this paper can find a set of patents with more relevance and completeness, and no more effort, than the other two methods. Follow-on procedures to potentially improve the relevance and completeness for specific domains are also defined and demonstrated. The method is compared to an expertly selected set of patents for an economic domain, and is shown not to be a suitable replacement for that particular use case. The paper also considers potential uses for this methodology and the underlying techniques, as well as limitations of the methodology. (SUTD-MIT International Design Centre)
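
    As a rough illustration of the class-overlap step described above, the Python sketch below runs a keyword search over a set of patent records, takes the most frequent international and US classes among the hits, and returns the patents that fall in both. The record fields, the single class per patent, and the top_n cut-off are simplifying assumptions for this example rather than details taken from the paper.

        from collections import Counter

        def select_patent_set(patents, keywords, top_n=3):
            """Sketch of a hybrid keyword / patent-class selection.
            Each patent is assumed to be a dict with 'title', 'abstract',
            'ipc' (international class) and 'uspc' (US class) fields."""
            # 1. Keyword search over title and abstract.
            hits = [p for p in patents
                    if any(k.lower() in (p["title"] + " " + p["abstract"]).lower()
                           for k in keywords)]
            # 2. Most representative classes in each classification system.
            top_ipc = {c for c, _ in Counter(p["ipc"] for p in hits).most_common(top_n)}
            top_uspc = {c for c, _ in Counter(p["uspc"] for p in hits).most_common(top_n)}
            # 3. Overlap: keep patents carrying both a top IPC and a top USPC class.
            return [p for p in patents if p["ipc"] in top_ipc and p["uspc"] in top_uspc]

    In practice a patent carries several classification codes, so the overlap test would be run over sets of codes per patent; the single-code fields above keep the sketch short.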

    Polymorphism discovery and allele frequency estimation using high-throughput DNA sequencing of target-enriched pooled DNA samples

    BACKGROUND: The central role of the somatotrophic axis in animal post-natal growth, development and fertility is well established. Therefore, the identification of genetic variants affecting quantitative traits within this axis is an attractive goal. However, large sample numbers are a pre-requisite for the identification of genetic variants underlying complex traits and, although technologies are improving rapidly, high-throughput sequencing of large numbers of complete individual genomes remains prohibitively expensive. Therefore, using a pooled DNA approach coupled with target enrichment and high-throughput sequencing, the aim of this study was to identify polymorphisms and estimate allele frequency differences across 83 candidate genes of the somatotrophic axis in 150 Holstein-Friesian dairy bulls divided into two groups divergent for genetic merit for fertility. RESULTS: In total, 4,135 SNPs and 893 indels were identified during the resequencing of the 83 candidate genes. Nineteen percent (n = 952) of variants were located within 5' and 3' UTRs. Seventy-two percent (n = 3,612) were intronic and 9% (n = 464) were exonic, including 65 indels and 236 SNPs resulting in non-synonymous substitutions (NSS). Significant (P < 0.01) mean allele frequency differentials between the low and high fertility groups were observed for 720 SNPs (58 NSS). Allele frequencies for 43 of the SNPs were also determined by genotyping the 150 individual animals (Sequenom® MassARRAY). No significant differences (P > 0.1) were observed between the two methods for any of the 43 SNPs across both pools (i.e., 86 tests in total). CONCLUSIONS: The results of the current study support previous findings on the use of DNA sample pooling and high-throughput sequencing as a viable strategy for polymorphism discovery and allele frequency estimation. Using this approach we have characterised the genetic variation within genes of the somatotrophic axis and related pathways, central to mammalian post-natal growth and development and subsequent lactogenesis and fertility. We have identified a large number of variants segregating at significantly different frequencies between cattle groups divergent for calving interval, plausibly harbouring causative variants contributing to heritable variation. To our knowledge, this is the first report describing sequencing of targeted genomic regions in any livestock species using groups with divergent phenotypes for an economically important trait.
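
    To make the pooled estimation concrete, here is a minimal Python sketch that treats the alternate-allele read fraction in each pool as the allele frequency estimate and compares the two pools with a two-proportion z-test. The read counts are invented and the z-test is an illustrative choice; the abstract does not state which test produced the P < 0.01 differentials.

        import math

        def pool_allele_frequency(alt_reads, total_reads):
            """Alternate-allele frequency in one DNA pool, estimated from read counts."""
            return alt_reads / total_reads

        def pool_differential(alt_low, total_low, alt_high, total_high):
            """Two-proportion z-test on alternate-allele read counts in the
            low- and high-fertility pools (illustrative choice of test)."""
            p1 = alt_low / total_low
            p2 = alt_high / total_high
            pooled = (alt_low + alt_high) / (total_low + total_high)
            se = math.sqrt(pooled * (1 - pooled) * (1 / total_low + 1 / total_high))
            z = (p1 - p2) / se
            p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided
            return p1, p2, p_value

        # Hypothetical counts at one SNP: 120/400 alternate reads in the low-fertility
        # pool versus 190/420 in the high-fertility pool.
        print(pool_differential(120, 400, 190, 420))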

    Activation of the innate immune receptor Dectin-1 upon formation of a 'phagocytic synapse'.

    Innate immune cells must be able to distinguish between direct binding to microbes and detection of components shed from the surface of microbes located at a distance. Dectin-1 (also known as CLEC7A) is a pattern-recognition receptor expressed by myeloid phagocytes (macrophages, dendritic cells and neutrophils) that detects β-glucans in fungal cell walls and triggers direct cellular antimicrobial activity, including phagocytosis and production of reactive oxygen species (ROS). In contrast to inflammatory responses stimulated upon detection of soluble ligands by other pattern-recognition receptors, such as Toll-like receptors (TLRs), these responses are only useful when a cell comes into direct contact with a microbe and must not be spuriously activated by soluble stimuli. In this study we show that, despite its ability to bind both soluble and particulate β-glucan polymers, Dectin-1 signalling is only activated by particulate β-glucans, which cluster the receptor in synapse-like structures from which the regulatory tyrosine phosphatases CD45 and CD148 (also known as PTPRC and PTPRJ, respectively) are excluded (Supplementary Fig. 1). The 'phagocytic synapse' now provides a model mechanism by which innate immune receptors can distinguish direct microbial contact from detection of microbes at a distance, thereby initiating direct cellular antimicrobial responses only when they are required.

    A study of the breakdown of the quasi-static approximation at high densities and its effect on the helium-like K ALPHA complex of nickel, iron, and calcium

    The General Spectral Modeling (GSM) code employs the quasi-static approximation, a standard, low-density methodology that assumes the ionization balance is separable from a determination of the excited-state populations that give rise to the spectra. GSM also allows for some states to be treated only as contributions to effective rates. While these two approximations are known to be valid at low densities, this work investigates using such methods to model high-density, non-LTE emission spectra and determines at what point the approximations break down by comparing to spectra produced by the LANL code ATOMIC, which makes no such approximations. As both approximations are used by other astrophysical and low-density modeling codes, the results should be of broad interest. He-like Kα emission spectra are presented for Ni, Fe, and Ca in order to gauge the effect of both approximations employed in GSM. This work confirms that at and above the temperature of maximum abundance of the He-like ionization stage, the range of validity for both approximations is sufficient for modeling the low- and moderate-density regimes one typically finds in astrophysical and magnetically confined fusion plasmas. However, a breakdown does occur at high densities; we obtain quantitative limits that are significantly higher than those of previous works. This work demonstrates that, while the range of validity for both approximations is sufficient to predict the density-dependent quenching of the z line, the approximations break down at higher densities. Thus these approximations should be used with greater care when modeling high-density plasmas such as those found in inertial confinement fusion and electromagnetic pinch devices. Comment: Accepted by Physical Review A (http://pra.aps.org/). 11 pages + LANL cover, 5 figures. Will update citation information as it becomes available. Abbreviated abstract is listed here.
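
    As a schematic of the quasi-static approximation referred to above (a textbook-style sketch in LaTeX, not equations taken from the paper): the full collisional-radiative rate equations are split by assuming the excited-state populations relax instantaneously relative to the ionization balance.

        % Full rate equations for the population n_i of level i,
        % with total (collisional plus radiative) rate coefficients R:
        \frac{dn_i}{dt} \;=\; \sum_{j \neq i} n_j R_{j \to i} \;-\; n_i \sum_{j \neq i} R_{i \to j}

        % Quasi-static approximation: for excited levels x,
        \frac{dn_x}{dt} \;\approx\; 0
        \quad\Longrightarrow\quad
        n_x = n_x\!\left(\{n_g\},\, N_e,\, T_e\right),

        % so the excited-state populations follow the ground/ion-stage populations
        % n_g, which are obtained from a separate ionization-balance calculation
        % using effective rates that fold in the eliminated excited states.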

    Determining level of care appropriateness in the patient journey from acute care to rehabilitation

    Background: The selection of patients for rehabilitation, and the timing of transfer from acute care, are important clinical decisions that impact on care quality and patient flow. This paper reports utilization review data on inpatients in acute care with stroke, hip fracture or elective joint replacement, and other inpatients referred for rehabilitation. It examines reasons why acute level of care criteria are not met and explores differences in decision making between acute care and rehabilitation teams around patient appropriateness and readiness for transfer. Methods: Cohort study of patients in a large acute referral hospital in Australia followed with the InterQual utilization review tool, modified to also include reasons why utilization criteria are not met. Additional data on team decision making about appropriateness for rehabilitation, and readiness for transfer, were collected on a subset of patients. Results: There were 696 episodes of care (7,189 bed days). Days meeting acute level of care criteria were 56% (stroke, hip fracture and joint replacement patients) and 33% (other patients, from the time of referral). Most inappropriate days in acute care were due to delays in processes/scheduling (45%) or to patients being more appropriate for rehabilitation or a lower level of care (30%). On the subset of patients, the acute care team and the utilization review tool deemed patients ready for rehabilitation transfer earlier than the rehabilitation team (means of 1.4, 1.3 and 4.0 days from the date of referral, respectively). From when deemed medically stable for transfer by the acute care team, 28% of patients became unstable. From when deemed stable by the rehabilitation team or utilization review, 9% and 11%, respectively, became unstable. Conclusions: A high proportion of patient days did not meet acute level of care criteria, due predominantly to inefficiencies in care processes or to patients being more appropriate for an alternative level of care, including rehabilitation. The rehabilitation team was the most accurate in determining ongoing medical stability, but at the cost of a longer acute stay. To avoid inpatients remaining in acute care in a state of 'terra nullius', clinical models which provide rehabilitation within acute care, and more efficient movement to a rehabilitation setting, are required. Utilization review could have a decision support role in the determination of medical stability.
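
    A small Python sketch of the day-level tally that a utilization review of this kind produces; the record layout and reason categories are invented for illustration, and the InterQual criteria themselves are not reproduced here.

        from collections import Counter

        def utilization_summary(bed_days):
            """bed_days: list of dicts like
            {'meets_acute_criteria': bool, 'reason_not_met': str or None}."""
            total = len(bed_days)
            meeting = sum(d["meets_acute_criteria"] for d in bed_days)
            reasons = Counter(d["reason_not_met"] for d in bed_days
                              if not d["meets_acute_criteria"])
            return {"percent_meeting_acute_criteria": 100 * meeting / total,
                    "reasons_for_non_acute_days": dict(reasons)}

        # Hypothetical review of five bed days for one patient.
        days = [
            {"meets_acute_criteria": True,  "reason_not_met": None},
            {"meets_acute_criteria": True,  "reason_not_met": None},
            {"meets_acute_criteria": False, "reason_not_met": "delay in process/scheduling"},
            {"meets_acute_criteria": False, "reason_not_met": "appropriate for rehabilitation"},
            {"meets_acute_criteria": False, "reason_not_met": "delay in process/scheduling"},
        ]
        print(utilization_summary(days))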

    De novo mutations in SMCHD1 cause Bosma arhinia microphthalmia syndrome and abrogate nasal development

    Bosma arhinia microphthalmia syndrome (BAMS) is an extremely rare and striking condition characterized by complete absence of the nose with or without ocular defects. We report here that missense mutations in the epigenetic regulator SMCHD1, mapping to the extended ATPase domain of the encoded protein, cause BAMS in all 14 cases studied. All mutations were de novo where parental DNA was available. Biochemical tests and in vivo assays in Xenopus laevis embryos suggest that these mutations may behave as gain-of-function alleles. This finding is in contrast to the loss-of-function mutations in SMCHD1 that have been associated with facioscapulohumeral muscular dystrophy (FSHD) type 2. Our results establish SMCHD1 as a key player in nasal development and provide biochemical insight into its enzymatic function that may be exploited for development of therapeutics for FSHD.

    Identifying the science and technology dimensions of emerging public policy issues through horizon scanning

    Public policy requires public support, which in turn implies a need to enable the public not just to understand policy but also to be engaged in its development. Where complex science and technology issues are involved in policy making, this takes time, so it is important to identify emerging issues of this type and prepare engagement plans. In our horizon scanning exercise, we used a modified Delphi technique [1]. A wide group of people with interests in the science and policy interface (drawn from policy makers, policy advisers, practitioners, the private sector and academics) elicited a long list of emergent policy issues in which science and technology would feature strongly and which would also necessitate public engagement as policies are developed. This was then refined to a short list of top priorities for policy makers. Thirty issues were identified within broad areas of business and technology; energy and environment; government, politics and education; health, healthcare, population and aging; information, communication, infrastructure and transport; and public safety and national security.
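
    A toy Python illustration of the refinement step described above: candidate issues are ranked by mean panel score and the highest-scoring ones are kept as the short list. The numeric scoring and cut-off are invented for the example; the actual exercise used iterative expert judgement in a modified Delphi process rather than a single numeric pass.

        def shortlist(scores, top_n=30):
            """scores: {issue: list of panel-member scores}. Rank issues by
            mean score and keep the top_n as the short list."""
            ranked = sorted(scores.items(),
                            key=lambda item: sum(item[1]) / len(item[1]),
                            reverse=True)
            return [issue for issue, _ in ranked[:top_n]]

        # Hypothetical scores from three panel members for four candidate issues.
        candidate_scores = {
            "issue A": [4, 5, 4],
            "issue B": [3, 4, 4],
            "issue C": [5, 5, 4],
            "issue D": [2, 3, 3],
        }
        print(shortlist(candidate_scores, top_n=2))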

    Stemming the Tide of Antibiotic Resistance (STAR): A protocol for a trial of a complex intervention addressing the 'why' and 'how' of appropriate antibiotic prescribing in general practice

    BACKGROUND: After some years of a downward trend, antibiotic prescribing rates in the community have tended to level out in many countries. There is also wide variation in antibiotic prescribing between general practices and between countries. There are still considerable further gains that could be made in reducing inappropriate antibiotic prescribing, but complex interventions are required. Studies to date have generally evaluated the effect of interventions on antibiotic prescribing in a single consultation, and pragmatic evaluations that assess maintenance of new skills are rare. This paper describes the protocol for a pragmatic, randomized evaluation of a complex intervention aimed at reducing antibiotic prescribing by primary care clinicians. METHODS AND DESIGN: We developed a Social Learning Theory-based, blended learning program (online learning, a practice-based seminar, and context-bound learning) called the STAR Educational Program. The 'why of change' is addressed by providing clinicians in general practice with information on antibiotic resistance in urine samples submitted by their practice and their antibiotic prescribing data, and by facilitating a practice-based seminar on the implications of these data. The 'how of change' is addressed through context-bound communication skills training and information on antibiotic indication and choice. This intervention will be evaluated in a trial involving 60 general practices, with the general practice as the unit of randomization and analysis (clinicians from each practice will either receive the STAR Educational Program or not). The primary outcome will be the number of antibiotic items dispensed over one year. An economic and process evaluation will also be conducted. DISCUSSION: This trial will be the first to evaluate the effectiveness of this type of theory-based, blended learning intervention aimed at reducing antibiotic prescribing by primary care clinicians. Novel aspects include feedback of practice-level data on antimicrobial resistance and prescribing, use of principles from motivational interviewing, training in enhanced communication skills that incorporates context-bound experience and reflection, and use of antibiotic dispensing over one year (as opposed to antibiotic prescribing in a single consultation) as the main outcome.
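
    To make the cluster design concrete, a minimal Python sketch of allocation with the general practice as the randomization unit; the practice identifiers, 1:1 ratio and unstratified allocation are assumptions for illustration, not details of the trial's actual randomization procedure.

        import random

        def randomize_practices(practice_ids, seed=None):
            """Allocate whole practices to the intervention or control arm, so
            every clinician in a practice shares that practice's allocation."""
            rng = random.Random(seed)
            shuffled = list(practice_ids)
            rng.shuffle(shuffled)
            half = len(shuffled) // 2
            return {"intervention": shuffled[:half], "control": shuffled[half:]}

        # Hypothetical identifiers for the 60 participating practices.
        practices = [f"practice_{i:02d}" for i in range(1, 61)]
        arms = randomize_practices(practices, seed=1)
        print(len(arms["intervention"]), len(arms["control"]))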