
    Science and Team Development

    This paper explores a new idea about the future development of science and teams, and predicts its possible applications in science, education, workforce development and research. The inter-relatedness of science and teamwork developments suggests a growing importance of team facilitators’ quality, as well as the criticality of detailed studies of teamwork processes and team consortiums to address the increasing complexity of exponential knowledge growth and work interdependency. In the future, it will become much easier to produce a highly specialised workforce, such as brain surgeons or genome engineers, than to identify, educate and develop individuals capable of the delicate and complex work of multi-team facilitation. Such individuals will become the new scientists of the millennium: possessing extraordinary knowledge in a variety of scientific fields, an unusual mix of abilities, highly developed interpersonal and teamwork skills, and visionary ideas for illuminating bold strategies for new scientific discoveries. The new scientists of the millennium, through team consortium facilitation, will be able to build bridges between the multitude of diverse and extremely specialised knowledge areas and interdependent functions to improve systems for the further benefit of mankind.

    Breaking Down Systemic Barriers Around African American Entrepreneurship

    Jul 1, 2020 The benefits of entrepreneurship are well documented. Startups are responsible for nearly all job growth in the United States, experience growth rates substantially higher than other businesses, can help eliminate poverty and have an outsized impact on overall economic productivity and GDP. Despite this, entrepreneurship and venture capital have been largely closed to minorities, with African Americans being impacted particularly hard. African Americans make up approximately 13 percent of the U.S. population, yet only 2.2 percent of small businesses are owned by them. By comparison, whites make up approximately 60 percent of the population but own 82.5 percent of all businesses. Black-owned businesses are also valued eight times lower than white-owned firms, and their annual revenue averages nine times less than that of white-owned businesses. Compounding this is the fact that only one percent of venture capital dollars go to Black entrepreneurs. All of this has helped contribute to our economy's growing inequality. On Tuesday, June 30th, the IDEA Center at the University of Notre Dame hosted a webinar conversation on how to break down systemic barriers around African American entrepreneurship. More specifically, our panel of speakers discussed what barriers exist for Black entrepreneurs (and why); what needs to be done to increase Black entrepreneurship; how models of Black entrepreneurship might look different from what traditional models expect; and how Notre Dame, other universities, investors, politicians, and attendees of the webinar can support these efforts now. This webinar featured Philip Gaskin, vice president of entrepreneurship at the Ewing Marion Kauffman Foundation; G. Marcus Cole, the Joseph A. Matson Dean and Professor of Law at Notre Dame Law School; Andrew Welters, CEO and partner at 5Lion Ventures; and Bryan Ritchie, the Vice President and Cathy and John Martin Associate Provost for Innovation at Notre Dame.

    Stability of response characteristics of a Delphi panel: application of bootstrap data expansion

    BACKGROUND: Delphi surveys with panels of experts in a particular area of interest have been widely utilized in the fields of clinical medicine, nursing practice, medical education and healthcare services. Despite this wide applicability of the Delphi methodology, there is no clear identification of what constitutes a sufficient number of Delphi survey participants to ensure stability of results. METHODS: The study analyzed the response characteristics from the first round of a Delphi survey conducted with 23 experts in healthcare quality and patient safety. The panel members had similar training and subject matter understanding of the Malcolm Baldrige Criteria for Performance Excellence in Healthcare. The raw data from the first round sampling, which usually contains the largest diversity of responses, were augmented via bootstrap sampling to obtain computer-generated results for two larger samples obtained by sampling with replacement. Response characteristics (mean, trimmed mean, standard deviation and 95% confidence intervals) for 54 survey items were compared for the responses of the 23 actual study participants and two computer-generated samples of 1000 and 2000 resampling iterations. RESULTS: The results from this study indicate that the response characteristics of a small expert panel in a well-defined knowledge area are stable in light of augmented sampling. CONCLUSION: Panels of similarly trained experts (who possess a general understanding in the field of interest) provide effective and reliable utilization of a small sample from a limited number of experts in a field of study to develop reliable criteria that inform judgment and support effective decision-making
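The bootstrap augmentation described above can be sketched in a few lines. The panel responses and the single Likert-scale item below are hypothetical stand-ins, not the study's data; only the procedure (resampling with replacement and comparing response characteristics) follows the abstract.

```python
import random
import statistics

def bootstrap_mean_ci(responses, n_resamples=1000, seed=0):
    """Augment a small panel's responses by bootstrap resampling
    (sampling with replacement) and return the characteristics the
    study compares: mean, SD, and a percentile 95% CI of the mean."""
    rng = random.Random(seed)
    boot_means = sorted(
        statistics.fmean(rng.choices(responses, k=len(responses)))
        for _ in range(n_resamples)
    )
    return {
        "mean": statistics.fmean(responses),
        "sd": statistics.stdev(responses),
        "ci95": (boot_means[int(0.025 * n_resamples)],
                 boot_means[int(0.975 * n_resamples) - 1]),
    }

# Hypothetical 1-5 ratings from a 23-member panel for one survey item
panel = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4, 5, 4,
         4, 3, 5, 4, 4, 5, 4, 3, 4, 5, 4]
stats_1000 = bootstrap_mean_ci(panel, n_resamples=1000)
stats_2000 = bootstrap_mean_ci(panel, n_resamples=2000)
```

Stability in the study's sense corresponds to the 1000- and 2000-iteration summaries agreeing closely with the raw panel's characteristics.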

    Magnetic Field-based Navigation of a Mobile Robot

    This thesis explores a unique system and method to control the position of a remote vehicle. A handheld cylindrical transmitter generates an alternating dipole field at a specific frequency. A self-contained robot utilizes three loop antennas mounted in each of the Cartesian axes to continually determine its position in the dipole field. The vehicle then maneuvers to maintain a specific positional relationship along the transmitter's axis. Since the robot's movement is determined by magnetic field sensing, there is no line-of-sight requirement. The robot's position can be maintained in the dark and behind walls as easily as in a bright open room. The system can be divided into four subsystems: the transmitter, the receiving electronics, the microcontroller and its decisions, and the mobile platform, with the latter three comprising the robot. The transmitter radiates the dipole field by passing an AC signal through a solenoid antenna. This field is detected, filtered, and amplified using a microprocessor-based automatic gain control. The microcontroller uses field information to control robot movement. Each subsystem is discussed at length, with detailed theory presented in the Appendices. The control methods presented in this thesis are proven not only by the underlying theory but also by operation of a successful embodiment. The embodiment employs simple electronics to perform all necessary processes while avoiding the use of expensive components. Example photos, schematics, software source code, and mechanical design are discussed in such detail that the reader may reproduce the working system. Suggested improvements and alternative embodiments are presented to encourage extension of this technology to more practical applications.
    School of Electrical & Computer Engineering
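The dipole-field geometry the robot exploits can be sketched numerically. The dipole moment and distances below are illustrative values, not figures from the thesis; the formula is the standard ideal magnetic dipole field.

```python
import math

MU0_OVER_4PI = 1e-7  # T*m/A

def dipole_field(m, r):
    """Magnetic flux density of an ideal dipole with moment m (A*m^2)
    at displacement r (m): B = (mu0/4pi) * (3(m.rhat)rhat - m) / |r|^3.
    The three Cartesian components are what three orthogonal loop
    antennas each sense, up to antenna gain."""
    rmag = math.sqrt(sum(c * c for c in r))
    rhat = tuple(c / rmag for c in r)
    m_dot_rhat = sum(mc * rc for mc, rc in zip(m, rhat))
    return tuple(MU0_OVER_4PI * (3.0 * m_dot_rhat * rh - mc) / rmag**3
                 for mc, rh in zip(m, rhat))

# On the dipole axis the field is twice as strong as at the same
# distance in the equatorial plane -- one cue a robot can use to find
# and hold station along the transmitter's axis.
m = (0.0, 0.0, 1.0)                       # moment along z
b_axial = dipole_field(m, (0.0, 0.0, 1.0))
b_equatorial = dipole_field(m, (1.0, 0.0, 0.0))
```

The 1/|r|^3 falloff also explains why received signal strength, after automatic gain control, encodes range to the transmitter.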

    The Evolution of the Dark Halo Spin Parameters lambda and lambda' in a LCDM Universe: The Role of Minor and Major Mergers

    The evolution of the spin parameter of dark halos and its dependence on the halo merging history is investigated in a set of dissipationless cosmological LCDM simulations. Special focus is placed on the differences between the two commonly used versions of the spin parameter, namely lambda = J*E^(1/2)/(G*M^(5/2)) (Peebles 1980) and lambda' = J/(sqrt(2)*M_vir*R_vir*V_vir) (Bullock et al. 2001). Though the distribution of the spin transfer rate, defined as the ratio of the spin parameters after and prior to a merger, is similar to a high degree for both lambda and lambda', we find considerable differences in the time evolution: while lambda' is roughly independent of redshift, lambda turns out to increase significantly with decreasing redshift. This distinct behaviour arises from small differences in the spin transfer during accretion events. The evolution of the spin parameter is strongly coupled with the virial ratio eta := 2*E_kin/|E_pot| of dark halos. Major mergers disturb halos and increase both their virial ratio and spin parameter for 1-2 Gyrs. At high redshifts (z=2-3) many halos are disturbed, with an average virial ratio of eta = 1.3 which approaches unity by z=0. We find that the redshift evolution of the spin parameters is dominated by the huge number of minor mergers rather than the rare major merger events.
    Comment: 10 pages, 11 figures, submitted to MNRAS
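The two spin-parameter definitions can be written out directly. The Milky Way-scale halo numbers below are illustrative, not values from the paper; the sketch also checks the known property that for a halo with total energy E = -M_vir*V_vir^2/2 (a truncated singular isothermal sphere, the case for which Bullock et al. motivated lambda') the two definitions coincide.

```python
import math

G = 4.30091e-6  # gravitational constant in kpc (km/s)^2 / Msun

def lam_peebles(J, E, M):
    """Peebles (1980): lambda = J |E|^(1/2) / (G M^(5/2))."""
    return J * math.sqrt(abs(E)) / (G * M**2.5)

def lam_bullock(J, M_vir, R_vir, V_vir):
    """Bullock et al. (2001): lambda' = J / (sqrt(2) M_vir R_vir V_vir)."""
    return J / (math.sqrt(2.0) * M_vir * R_vir * V_vir)

# Illustrative halo (assumed values, not from the paper)
M_vir = 1.0e12                          # Msun
R_vir = 250.0                           # kpc
V_vir = math.sqrt(G * M_vir / R_vir)    # km/s, circular velocity at R_vir
J = 0.035 * math.sqrt(2.0) * M_vir * R_vir * V_vir  # chosen so lambda' = 0.035
E = -0.5 * M_vir * V_vir**2             # truncated isothermal-sphere energy
```

Differences between the two parameters for real halos therefore trace how far the energy departs from this idealised relation, which is what the merger-driven evolution of eta probes.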

    Selection Bias Due to Loss to Follow Up in Cohort Studies

    Selection bias due to loss to follow up represents a threat to the internal validity of estimates derived from cohort studies. Over the last fifteen years, stratification-based techniques as well as methods such as inverse probability-of-censoring weighted estimation have been more prominently discussed and offered as a means to correct for selection bias. However, unlike correcting for confounding bias using inverse weighting, uptake of inverse probability-of-censoring weighted estimation as well as competing methods has been limited in the applied epidemiologic literature. To motivate greater use of inverse probability-of-censoring weighted estimation and competing methods, we use causal diagrams to describe the sources of selection bias in cohort studies employing a time-to-event framework when the quantity of interest is an absolute measure (e.g. absolute risk, survival function) or relative effect measure (e.g., risk difference, risk ratio). We highlight that whether a given estimate obtained from standard methods is potentially subject to selection bias depends on the causal diagram and the measure. We first broadly describe inverse probability-of-censoring weighted estimation and then give a simple example to demonstrate in detail how inverse probability-of-censoring weighted estimation mitigates selection bias and describe challenges to estimation. We then modify complex, real-world data from the University of North Carolina Center for AIDS Research HIV clinical cohort study and estimate the absolute and relative change in the occurrence of death with and without inverse probability-of-censoring weighted correction using the modified University of North Carolina data. We provide SAS code to aid with implementation of inverse probability-of-censoring weighted techniques
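A minimal sketch of inverse probability-of-censoring weighting follows. The toy cohort is hypothetical (it is not the University of North Carolina data, and the abstract's own code is in SAS): the censoring distribution is estimated with a Kaplan-Meier curve treating censoring as the "event," and each observed event is up-weighted by the inverse probability of having remained uncensored.

```python
def censor_km(times, censored, t):
    """Kaplan-Meier estimate, at time t, of the probability of
    remaining uncensored (censoring treated as the 'event')."""
    surv = 1.0
    for u in sorted(set(times)):
        if u > t:
            break
        at_risk = sum(1 for x in times if x >= u)
        cens = sum(1 for x, c in zip(times, censored) if x == u and c)
        surv *= 1.0 - cens / at_risk
    return surv

def ipcw_risk(times, events, censored, t):
    """IPCW estimate of the risk of the event by time t: each observed
    event is weighted by 1/K(t_i-), the inverse probability of being
    uncensored just before the event time."""
    eps = 1e-9
    total = sum(1.0 / censor_km(times, censored, ti - eps)
                for ti, ev in zip(times, events) if ev and ti <= t)
    return total / len(times)

# Hypothetical five-person cohort; event and censoring are exclusive
times    = [1.0, 2.0, 3.0, 4.0, 5.0]
events   = [1,   0,   1,   0,   1]
censored = [0,   1,   0,   1,   0]
```

With no censoring the estimator reduces to the crude proportion with an event; with censoring, the weights reconstruct the risk that a complete-case analysis would understate.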

    Behavior change interventions: the potential of ontologies for advancing science and practice

    A central goal of behavioral medicine is the creation of evidence-based interventions for promoting behavior change. Scientific knowledge about behavior change could be more effectively accumulated using "ontologies." In information science, an ontology is a systematic method for articulating a "controlled vocabulary" of agreed-upon terms and their inter-relationships. It involves three core elements: (1) a controlled vocabulary specifying and defining existing classes; (2) specification of the inter-relationships between classes; and (3) codification in a computer-readable format to enable knowledge generation, organization, reuse, integration, and analysis. This paper introduces ontologies, provides a review of current efforts to create ontologies related to behavior change interventions and suggests future work. This paper was written by behavioral medicine and information science experts and was developed in partnership between the Society of Behavioral Medicine's Technology Special Interest Group (SIG) and the Theories and Techniques of Behavior Change Interventions SIG. In recent years significant progress has been made in the foundational work needed to develop ontologies of behavior change. Ontologies of behavior change could facilitate a transformation of behavioral science from a field in which data from different experiments are siloed into one in which data across experiments could be compared and/or integrated. This could facilitate new approaches to hypothesis generation and knowledge discovery in behavioral science
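The three core elements listed above can be made concrete with a toy, hypothetical mini-ontology (the class names and definitions below are illustrative inventions, not an excerpt from any published behavior change ontology): a controlled vocabulary of defined classes, typed inter-relationships, and a computer-readable codification in JSON.

```python
import json

# Hypothetical mini-ontology of behavior change techniques
ontology = {
    "classes": {
        "BehaviorChangeTechnique": {
            "definition": "An observable, replicable component of an "
                          "intervention designed to alter behavior."},
        "GoalSetting": {
            "definition": "Agreeing on a goal defined in terms of the "
                          "behavior to be achieved."},
        "SelfMonitoring": {
            "definition": "Establishing a method for a person to record "
                          "their own behavior."},
    },
    "relationships": [
        {"subject": "GoalSetting", "predicate": "is_a",
         "object": "BehaviorChangeTechnique"},
        {"subject": "SelfMonitoring", "predicate": "is_a",
         "object": "BehaviorChangeTechnique"},
    ],
}

def subclasses_of(onto, parent):
    """Query the relationship layer: all classes declared is_a parent."""
    return sorted(r["subject"] for r in onto["relationships"]
                  if r["predicate"] == "is_a" and r["object"] == parent)

serialized = json.dumps(ontology)  # codification for reuse and integration
```

Because the codification is machine-readable, data annotated against it in different experiments can be queried and integrated with the same tooling, which is the de-siloing benefit the paper argues for.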

    Doxycycline vs. penicillin G benzathine for the treatment of syphilis in patients with HIV

    Background: Syphilis is a highly prevalent sexually transmitted infection which can lead to serious health complications if not treated effectively. Historically, penicillin G benzathine has been the primary agent used to treat syphilis infections, with doxycycline being an alternative agent for patients who cannot tolerate penicillin antibiotics. The efficacy of doxycycline in treating syphilis in patients with HIV, however, has not been well documented despite its use as an alternative treatment. The objective of this study is to compare the ability of these two agents to treat syphilis, specifically in patients infected with human immunodeficiency virus (HIV). Methods: This study has been approved by the facility’s Institutional Review Board. It retrospectively examined a cohort of patients to compare the effectiveness of penicillin G benzathine vs. doxycycline in resolving syphilis infection, utilizing diagnosis codes for specific types of syphilis (primary, secondary, tertiary, early latent, late latent). An information technology specialist working with the institution was instructed to run a report of patients from the electronic medical record who received a prescription for doxycycline or were administered an intramuscular injection of penicillin G benzathine with a diagnosis of syphilis and HIV. The population of patients was collected from October 2020 to present. The information technology specialist developed the report and removed any identifiable patient information prior to forwarding it to the primary research team. Clinical pharmacy residents reviewed all patients in the report for inclusion in the study based on set inclusion and exclusion criteria. The primary endpoint assessed was resolution of syphilis infection, with secondary endpoints looking at reported adverse reactions to treatment, reinfection with syphilis, and incomplete initial treatment.
All members of the research team were kept up to date on the progress of the trial, as relevant to their level of involvement. Results: Data analysis for this trial is still ongoing. The total number of patients included in the study was 134, with 21 patients having received doxycycline and 113 having been administered penicillin G benzathine. The primary outcome occurred in 18/21 (85.71%) of patients in the doxycycline arm, while 103/113 (91.15%) of patients in the penicillin group saw resolution of syphilis. Adverse effects were rarely reported, with gastrointestinal symptoms being the only type of reaction noted: one patient reported diarrhea after taking doxycycline, and another reported nausea after receiving an injection of penicillin G benzathine. Conclusion: Doxycycline was associated with a lower rate of infection resolution. This could be due to the uneven distribution of patients between the two arms of the study. Given that doxycycline's place in therapy is primarily patients with severe penicillin allergy, it is not surprising that far more patients were treated with penicillin G benzathine than with doxycycline. The scarcity of reported adverse effects is encouraging, as reactions would likely have been stated had they occurred. Other endpoints will be analyzed in the future. Further, larger-scale studies are needed to determine whether doxycycline is truly inferior to penicillin in treating syphilis infection.
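The reported counts can be turned into a crude between-arm comparison. The counts come from the abstract; the Wald interval for the risk difference is our illustration, not an analysis the study reports, and the normal approximation is rough for an arm as small as n=21.

```python
import math

def risk_difference(s1, n1, s2, n2, z=1.96):
    """Difference in resolution proportions with a Wald 95% CI
    (normal approximation; crude for small arms)."""
    p1, p2 = s1 / n1, s2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, (diff - z * se, diff + z * se)

# Counts from the abstract: penicillin 103/113, doxycycline 18/21
diff, (lo, hi) = risk_difference(103, 113, 18, 21)
```

The interval spans zero, which is consistent with the abstract's caution that larger studies are needed before concluding doxycycline is inferior.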

    Parametric mixture models to evaluate and summarize hazard ratios in the presence of competing risks with time-dependent hazards and delayed entry

    Get PDF
    In the analysis of survival data, there are often competing events that preclude an event of interest from occurring. Regression analysis with competing risks is typically undertaken using a cause-specific proportional hazards model. However, modern alternative methods exist for the analysis of the subdistribution hazard with a corresponding subdistribution proportional hazards model. In this paper, we introduce a flexible parametric mixture model as a unifying method to obtain estimates of the cause-specific and subdistribution hazards and hazard ratio functions. We describe how these estimates can be summarized over time to give a single number that is comparable to the hazard ratio that is obtained from a corresponding cause-specific or subdistribution proportional hazards model. An application to the Women’s Interagency HIV Study is provided to investigate injection drug use and the time to either the initiation of effective antiretroviral therapy, or clinical disease progression as a competing event
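The relationship between the two hazard scales can be seen in a small special case of the mixture formulation. The exponential cause-conditional distributions and parameter values below are an illustrative assumption, not the flexible parametric family used in the paper: with probability p a subject eventually fails from cause 1, and the cause-specific hazard divides the cause-1 subdensity by overall survival while the subdistribution hazard divides it by one minus the cause-1 cumulative incidence.

```python
import math

def mixture_hazards(t, p, rate1, rate2):
    """Two-cause exponential mixture: returns the cause-1
    cause-specific and subdistribution hazards at time t."""
    f1 = p * rate1 * math.exp(-rate1 * t)      # cause-1 subdensity
    F1 = p * (1.0 - math.exp(-rate1 * t))      # cause-1 cumulative incidence
    F2 = (1.0 - p) * (1.0 - math.exp(-rate2 * t))
    S = 1.0 - F1 - F2                          # overall survival
    return f1 / S, f1 / (1.0 - F1)             # (cause-specific, subdistribution)
```

Because S <= 1 - F1, the cause-specific hazard is never below the subdistribution hazard: the competing cause depletes the cause-specific risk set, and summarizing either function over time is what yields the single number comparable to a proportional hazards estimate.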

    Ultra-fine dark matter structure in the Solar neighbourhood

    The direct detection of dark matter on Earth depends crucially on its density and its velocity distribution on a milliparsec scale. Conventional N-body simulations are unable to access this scale, making the development of other approaches necessary. In this paper, we apply the method developed in Fantin et al. 2008 to a cosmologically-based merger tree, transforming it into a useful instrument to reproduce and analyse the merger history of a Milky Way-like system. The aim of the model is to investigate the implications of any ultra-fine structure for the current and next generation of directional dark matter detectors. We find that the velocity distribution of a Milky Way-like galaxy is almost smooth, due to the overlap of many streams of particles generated by multiple mergers. Only the merger of a 10^10 Msun halo can generate significant features in the ultra-local velocity distribution, detectable at the resolution attainable by current experiments.
    Comment: 9 pages, 6 figures, accepted for publication in MNRAS
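The smoothing-by-overlap argument can be illustrated with a 1-D toy model. All numbers below (stream width, velocity range, stream counts) are assumptions for illustration, not values from the paper: each merger deposits a cold stream, i.e. a narrow Gaussian in velocity space, and the local distribution is the superposition of all streams.

```python
import random
import statistics

def stream_velocities(n_streams, per_stream, sigma=20.0, vmax=400.0, seed=1):
    """Draw 1-D velocities (km/s) from n_streams cold streams: narrow
    Gaussians of width sigma around random central velocities. Many
    overlapping streams sum to a nearly smooth distribution; a few
    massive streams leave detectable lumps."""
    rng = random.Random(seed)
    v = []
    for _ in range(n_streams):
        centre = rng.uniform(-vmax, vmax)
        v.extend(rng.gauss(centre, sigma) for _ in range(per_stream))
    return v

many = stream_velocities(200, 50)   # minor-merger dominated: near smooth
few = stream_velocities(3, 50)      # few massive accretions: lumpy
```

Histogramming `many` versus `few` shows why only a single massive (here 10^10 Msun-scale) merger leaves features a directional detector could resolve, while the aggregate of minor mergers does not.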