    Leading the Way: Catholic School Leaders and Action Research

    Recent research extols the value of problem-based learning strategies in exemplary school leadership preparation programs as one way to provide school leaders with the appropriate tools to systematically use data to make important decisions. The purpose of this study was to address the current gap between the noted importance of problem-based learning strategies in leadership preparation programs and the demonstrated effect these strategies have on the knowledge, skills, behaviours, and values of school leaders. The study employed a longitudinal mixed-methods research design to examine discrete action research skills, behaviours, and values of 44 candidates enrolled in a Master of Arts in Educational Administration degree program. Inferential analysis of the pre- and post-test survey data indicated a statistically significant increase in self-reported preparedness and capacity for all but two of the 14 core research activities assessed on the survey instrument. There are powerful and potentially long-lasting outcomes for leadership candidates who complete a full cycle of action research as part of a principal preparation program. This study allows some tentative mapping of the actual skills, behaviours, and values that school leaders may evince as a result of deep exposure to practitioner-driven action research.

    Large-scale inflatable structures for tunnel protection: a review of the Resilient Tunnel Plug project

    The protection of underground civil infrastructure continues to be a high priority for transportation and transit security agencies. In particular, rail transit tunnels running under bodies of water are susceptible to disruptions due to flooding caused by extraordinary climatic events such as hurricanes or other events resulting from human activities. Several events have taken place in the past decades that have demonstrated the need to mitigate vulnerabilities or, at least, minimize the consequences of catastrophic events. Although it is impossible to prevent all situations that can lead to flooding, damage can be substantially decreased by reducing the area affected by the event. To minimize the effects of an event, a possible approach is to compartmentalize the tunnel system by creating temporary barriers that can contain the propagation of flooding until a more permanent solution can be implemented. One way to create a temporary barrier is by the deployment of a large-scale inflatable structure, also known as an inflatable plug. In such an application, the inflatable structure is prepared for placement, either permanently or temporarily, and maintained ready for deployment, inflation, and pressurization when needed. The internal plug pressure imparts a normal force against the tunnel wall surface, with the friction between the plug and tunnel surfaces opposing axial movement of the plug. The sealing effectiveness depends on the ability of the inflatable structure to self-deploy and fit, without human intervention, to the intricacies of the perimeter of the conduit being sealed. Primary design constraints include keeping the stowed plug clear of the dynamic envelope of the trains and being able to withstand the pressure of the flooding water. This work presents a compilation of the main aspects of the activities completed for the development of large-scale inflatable structures as part of the Resilient Tunnel Plug (RTP) Project. The main test results and lessons learned are presented to demonstrate the viability of implementing large-scale inflatable plugs for the containment of flooding in rail tunnel systems. Over 400 coupon and specimen tests, 200 reduced-scale tests, and 100 full-scale tests were conducted to demonstrate the efficacy of the design of different prototypes over a 10-year research and development project. The culmination of the work was 12 large-scale flooding demonstrations in which the inflatable tunnel plug was shown to deploy remotely and withstand a simulated flooding event.
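    The holding mechanism described in the abstract reduces to a static force balance: inflation pressure generates a normal force over the contact area, and friction along that area must resist the axial load from the flood head. The sketch below checks that balance; the geometry, pressures, and friction coefficient are illustrative assumptions, not RTP project values.

```python
import math

def plug_holds(tunnel_diameter_m, contact_length_m, internal_pressure_pa,
               flood_pressure_pa, friction_coeff):
    """Return True if friction capacity exceeds the axial hydrostatic load.

    The internal pressure presses the membrane against the tunnel wall over
    the contact area; friction over that area resists axial sliding driven
    by the flood pressure acting on the plug's circular cross-section.
    (Simplified model with illustrative inputs, not RTP design data.)
    """
    radius = tunnel_diameter_m / 2.0
    axial_load = flood_pressure_pa * math.pi * radius ** 2          # N
    contact_area = math.pi * tunnel_diameter_m * contact_length_m   # m^2
    # Net contact pressure is internal pressure minus the flood pressure.
    normal_force = (internal_pressure_pa - flood_pressure_pa) * contact_area
    friction_capacity = friction_coeff * normal_force               # N
    return friction_capacity >= axial_load

# Example: 5 m tunnel, 10 m contact length, plug inflated to 150 kPa against
# a 100 kPa flood head, friction coefficient 0.3.
print(plug_holds(5.0, 10.0, 150e3, 100e3, 0.3))  # True
```

    The example makes the design trade-off visible: a longer contact length or higher inflation pressure raises friction capacity linearly, while the flood load grows with the tunnel cross-section.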

    Pastors’ Views of Parents and the Parental Role in Catholic Schools

    Over 300 years of official Church teachings and documents affirm the importance of the home-school relationship, yet relatively little research has systematically explored the need for and value of parent involvement in the school community. This study is a secondary analysis of survey data collected for the Notre Dame Study of U.S. Pastors (Nuzzi, Frabutt, & Holter, 2008) and examines pastors’ views of parents and the parental role in Catholic schools. The article closes with recommendations for action based upon analysis of the quantitative and qualitative data trends from pastors’ responses.

    OWL2Vec*: Embedding of OWL Ontologies

    Semantic embedding of knowledge graphs has been widely studied and used for prediction and statistical analysis tasks across various domains such as Natural Language Processing and the Semantic Web. However, less attention has been paid to developing robust methods for embedding OWL (Web Ontology Language) ontologies. In this paper, we propose a language model-based ontology embedding method named OWL2Vec*, which encodes the semantics of an ontology by taking into account its graph structure, lexical information, and logic constructors. Our empirical evaluation with three real-world datasets suggests that OWL2Vec* benefits from these three different aspects of an ontology in class membership prediction and class subsumption prediction tasks. Furthermore, OWL2Vec* often significantly outperforms the state-of-the-art methods in our experiments.
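    As a rough illustration of how an ontology's graph structure can feed a language model, the sketch below generates predicate-aware random walks over a hypothetical set of triples; each walk becomes a token sequence for a word2vec-style model. The toy triples and walk parameters are assumptions for illustration, and the lexical and logical-constructor documents that OWL2Vec* also builds are omitted here.

```python
import random

TRIPLES = [  # (subject, predicate, object) -- hypothetical toy ontology
    ("Espresso", "subClassOf", "Coffee"),
    ("Coffee", "subClassOf", "Beverage"),
    ("Espresso", "hasOrigin", "Italy"),
]

def random_walks(triples, walks_per_node=2, walk_length=4, seed=0):
    """Generate predicate-aware random walks, one token sequence per walk."""
    rng = random.Random(seed)
    out_edges = {}
    for s, p, o in triples:
        out_edges.setdefault(s, []).append((p, o))
    walks = []
    for node in sorted(out_edges):
        for _ in range(walks_per_node):
            walk, current = [node], node
            for _ in range(walk_length):
                if current not in out_edges:   # dead end: stop the walk
                    break
                pred, nxt = rng.choice(out_edges[current])
                walk.extend([pred, nxt])       # keep predicates as tokens
                current = nxt
            walks.append(walk)
    return walks

corpus = random_walks(TRIPLES)
print(corpus[0])  # ['Coffee', 'subClassOf', 'Beverage']
```

    In the full method such walk sentences, together with lexical documents built from labels and comments, would be used to train the embedding model.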

    Application of Commercial Non-Dispersive Infrared Spectroscopy Sensors for Sub-Ambient Carbon Dioxide Detection

    Monitoring carbon dioxide (CO2) concentration within a spacecraft or spacesuit is critically important to ensuring the safety of the crew. Carbon dioxide uniquely absorbs light at wavelengths of 3.95 micrometers and 4.26 micrometers. As a result, non-dispersive infrared (NDIR) spectroscopy can be employed as a reliable and inexpensive method for the quantification of CO2 within the atmosphere. A multitude of commercial-off-the-shelf (COTS) NDIR sensors exist for CO2 quantification. The COTS sensors provide reasonable accuracy so long as the measurements are attained under conditions close to the calibration conditions of the sensor (typically 21.1 °C and 1 atm). However, as pressure deviates from atmospheric to the pressures associated with a spacecraft (8.0-10.2 psia) or spacesuit (4.1-8.0 psia), the error in the measurement grows increasingly large. In addition to pressure and temperature dependencies, the infrared transmissivity through a volume of gas also depends on the composition of the gas. As the composition is not known a priori, accurate sub-ambient detection must rely on iterative sensor compensation techniques. This manuscript describes the development of recursive compensation algorithms for sub-ambient detection of CO2 with COTS NDIR sensors. In addition, the basis of the exponential loss in accuracy is developed theoretically, considering the thermal, Doppler, and Lorentz broadening effects which arise as a result of the temperature, pressure, and composition of the gas mixture under analysis. As a result, this manuscript provides an approach to employing COTS sensors at sub-ambient conditions and may also lend insight into designing future NDIR sensors for aerospace applications.
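    Because the composition is not known a priori, compensation must iterate: estimate the concentration, update the broadening correction, and repeat. The sketch below is a generic fixed-point version of that idea; the correction model and its coefficients are illustrative assumptions, not the manuscript's algorithm.

```python
# Hedged sketch of recursive pressure compensation for an NDIR CO2 reading.
# The gain model and coefficients below are illustrative, not the paper's.

CAL_PRESSURE_KPA = 101.325  # sensor calibrated near 1 atm

def absorption_gain(pressure_kpa, co2_ppm, k_pressure=1.0, k_self=2e-6):
    """Illustrative broadening gain: apparent absorption scales with total
    pressure (Lorentz broadening) and with CO2 self-broadening."""
    return (pressure_kpa / CAL_PRESSURE_KPA) ** k_pressure * (1.0 + k_self * co2_ppm)

def compensate(raw_ppm, pressure_kpa, iterations=20):
    """Iteratively solve raw = true * gain(pressure, true) for the true
    concentration, since the gain itself depends on the unknown answer."""
    true_ppm = raw_ppm
    for _ in range(iterations):
        true_ppm = raw_ppm / absorption_gain(pressure_kpa, true_ppm)
    return true_ppm

# At a spacesuit-like pressure (~55 kPa) the uncompensated reading
# under-reports, so the compensated value comes out well above the raw one.
print(round(compensate(2000.0, 55.0), 1))
```

    The loop converges quickly here because the self-broadening term is a small perturbation; a real sensor model would also fold in temperature and the Doppler contribution.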

    Peripheral blood marker of residual acute leukemia after hematopoietic cell transplantation using multi-plex digital droplet PCR

    BACKGROUND Relapse remains the primary cause of death after hematopoietic cell transplantation (HCT) for acute leukemia. The ability to identify minimal/measurable residual disease (MRD) via the blood could identify patients earlier, when immunologic interventions may be more successful. We evaluated a new test that could quantify blood tumor mRNA as leukemia MRD surveillance using droplet digital PCR (ddPCR). METHODS The multiplex ddPCR assay was developed using tumor cell lines positive for the tumor-associated antigens (TAA: WT1, PRAME, BIRC5), with homeostatic ABL1. On IRB-approved protocols, RNA was isolated from mononuclear cells from acute leukemia patients after HCT (n = 31 subjects; n = 91 specimens) and healthy donors (n = 20). ddPCR simultaneously quantitated mRNA expression of WT1, PRAME, BIRC5, and ABL1, and the TAA/ABL1 blood ratio was measured in patients with and without active leukemia after HCT. RESULTS Tumor cell lines confirmed quantitation of TAAs. In patients with active acute leukemia after HCT (MRD+ or relapse; n = 19), the blood levels of WT1/ABL1, PRAME/ABL1, and BIRC5/ABL1 exceeded those of healthy donors (p < 0.0001, p = 0.0286, and p = 0.0064, respectively). Active disease status was associated with TAA positivity (1+ TAA vs 0 TAA) with an odds ratio of 10.67 (p = 0.0070, 95% confidence interval 1.91-59.62). The area under the curve was 0.7544. Changes in ddPCR correlated with disease response captured on standard-of-care tests, accurately denoting positive or negative disease burden in 15/16 cases (94%). Of patients with MRD+ or relapsed leukemia after HCT, 84% were positive for at least one TAA/ABL1 in the peripheral blood. In summary, we have developed a new method for blood MRD monitoring of leukemia after HCT and present preliminary data that the TAA/ABL1 ratio may serve as a novel surrogate biomarker for relapse of acute leukemia after HCT.
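    The positivity call described above (at least one TAA/ABL1 ratio above its healthy-donor-derived cutoff) can be sketched as follows. The transcript counts and thresholds below are hypothetical values for illustration, not the study's data.

```python
TAAS = ("WT1", "PRAME", "BIRC5")

def taa_ratios(counts):
    """Normalize each tumor-associated antigen transcript count by the
    homeostatic ABL1 count, as in a multiplex ddPCR readout."""
    abl1 = counts["ABL1"]
    return {taa: counts[taa] / abl1 for taa in TAAS}

def is_positive(counts, thresholds):
    """Call the specimen positive if at least one TAA/ABL1 ratio exceeds
    its (hypothetical) healthy-donor-derived cutoff."""
    ratios = taa_ratios(counts)
    return any(ratios[taa] > thresholds[taa] for taa in TAAS)

# Illustrative specimen and cutoffs (not the study's values).
specimen = {"WT1": 120, "PRAME": 4, "BIRC5": 30, "ABL1": 1000}
cutoffs = {"WT1": 0.05, "PRAME": 0.02, "BIRC5": 0.04}
print(is_positive(specimen, cutoffs))  # True (WT1/ABL1 = 0.12 > 0.05)
```

    Normalizing to ABL1 controls for variation in RNA input between specimens, which is what lets ratios from different blood draws be compared.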

    The Iterative Signature Algorithm for the analysis of large scale gene expression data

    We present a new approach for the analysis of genome-wide expression data. Our method is designed to overcome the limitations of traditional techniques when applied to large-scale data. Rather than allotting each gene to a single cluster, we assign both genes and conditions to context-dependent and potentially overlapping transcription modules. We provide a rigorous definition of a transcription module as the object to be retrieved from the expression data. We establish an efficient algorithm that searches for the modules encoded in the data by iteratively refining sets of genes and conditions until they match this definition. Each iteration involves a linear map, induced by the normalized expression matrix, followed by the application of a threshold function. We argue that our method is in fact a generalization of Singular Value Decomposition, which corresponds to the special case where no threshold is applied. We show analytically that for noisy expression data our approach leads to better classification due to the implementation of the threshold. This result is confirmed by numerical analyses based on in silico expression data. We briefly discuss results obtained by applying our algorithm to expression data from the yeast S. cerevisiae.
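    The iteration described above (a linear map induced by the normalized expression matrix, followed by a threshold function) can be sketched on a toy matrix. The matrix, seed, and threshold values below are illustrative assumptions; the paper's treatment normalizes gene-wise and condition-wise matrices separately.

```python
import numpy as np

def isa_module(E, gene_seed, t_g=0.5, t_c=1.0, iters=20):
    """Refine gene and condition scores by alternating the linear map
    induced by the normalized expression matrix with a hard threshold,
    until they describe a self-consistent transcription module."""
    Ez = (E - E.mean(axis=0)) / E.std(axis=0)       # z-score each condition
    s_g = gene_seed.astype(float)
    for _ in range(iters):
        s_c = Ez.T @ s_g                             # score conditions
        s_c = np.where(s_c > t_c * s_c.std(), s_c, 0.0)
        s_g = Ez @ s_c                               # score genes
        s_g = np.where(s_g > t_g * s_g.std(), s_g, 0.0)
        norm = np.linalg.norm(s_g)
        if norm == 0:                                # module dissolved
            break
        s_g /= norm
    return s_g, s_c

# Toy data: genes 0-2 are co-expressed under conditions 0 and 1.
E = np.array([[5., 5., 0., 1.],
              [6., 4., 1., 0.],
              [5., 6., 0., 0.],
              [0., 1., 2., 0.],
              [1., 0., 0., 2.],
              [0., 0., 1., 1.]])
genes, conds = isa_module(E, np.array([1., 0., 0., 0., 0., 0.]))
print(np.nonzero(genes)[0], np.nonzero(conds)[0])
```

    With the threshold removed, the alternating multiplications reduce to power iteration toward the leading singular vectors, which is the sense in which the method generalizes Singular Value Decomposition.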