29,571 research outputs found

    What's So Funny 'Bout Peace, Love, and Understanding

    Last Saturday I stood on Stine Lake with a group of friends to pray and spread a message of “Peace for Syria.” This event was sponsored by the Newman Association in response to Pope Francis’s request that “Christians, and our brothers and sisters of other religions and every man and woman of good will, cry out forcefully: Violence and war are never the way to peace!” Students of all religions and backgrounds came to support us, and it was a very rewarding day for me as a Catholic and as a human longing for world peace and understanding. [excerpt]

    Leaf cuticular morphology links Platanaceae and Proteaceae

    Int. J. Plant Sci. 166(5):843–855. © 2005 by The University of Chicago. The leaf cuticular morphology of extant species of Platanus was investigated using light and scanning electron microscopy. All species are shown to possess trichome bases of the same type as those commonly found in Proteaceae. Of particular significance are compound forms that consist of an annular surface scar associated with more than one underlying epidermal cell. These are found on the adaxial leaf surfaces of all species of Platanus and are also clearly evident on the abaxial surface of Platanus orientalis. This type of trichome base is therefore interpreted as the first detected nonreproductive morphological synapomorphy linking Proteaceae and Platanaceae. Also, the laterocytic, sometimes paracytic, or anomocytic arrangement of subsidiary cells in Platanus is distinct from the general state in Proteaceae, which is brachyparacytic and presumably derived. In Bellendena, possibly the most basal genus of extant Proteaceae, subsidiary cell arrangements resemble those of Platanus. These results are discussed with respect to leaf fossil records of Proteales, where it is concluded that the combination of brachyparacytic stomata and compound trichome bases is strong evidence for Proteaceae. Raymond J. Carpenter, Robert S. Hill, and Gregory J. Jorda

    A Gedanken spacecraft that operates using the quantum vacuum (Dynamic Casimir effect)

    Conventional rockets are not a suitable technology for deep space missions. Chemical rockets require a very large mass of propellant, travel very slowly compared to light speed, and require significant energy to maintain operation over periods of years. For example, the 722 kg Voyager spacecraft required 13,600 kg of propellant to launch and would take about 80,000 years to reach the nearest star, Proxima Centauri, about 4.3 light years away. There have been various attempts at developing ideas on which one might base a spacecraft that would permit deep space travel, such as spacewarps. In this paper we consider another suggestion from science fiction and explore how the quantum vacuum might be utilized in the creation of a novel spacecraft. The spacecraft is based on the dynamic Casimir effect, in which electromagnetic radiation is emitted when an uncharged mirror is properly accelerated in the vacuum. The radiative reaction produces a dissipative force on the mirror that tends to resist the acceleration of the mirror. This force can be used to accelerate a spacecraft attached to the mirror. We also show that, in principle, one could obtain the power to operate the accelerated mirror in such a spacecraft using energy extracted from the quantum vacuum using the standard Casimir effect with a parallel plate geometry. Unfortunately the method as currently conceived generates a minuscule thrust and is no more practical than a spacewarp, yet it does provide an interesting demonstration of our current understanding of the physics of the quantized electromagnetic field in vacuum. Comment: 18 pages, 3 figures
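    As a rough consistency check on the figures quoted in this abstract (the numbers below are derived, not part of the original), the quoted distance and travel time imply a cruise speed close to Voyager's actual ~17 km/s:

```python
# Illustrative check: does ~80,000 years to cover ~4.3 light years
# match a Voyager-class cruise speed of roughly 17 km/s?
LIGHT_YEAR_M = 9.461e15          # metres per light year
YEAR_S = 3.156e7                 # seconds per year

distance_m = 4.3 * LIGHT_YEAR_M  # distance to Proxima Centauri
time_s = 80_000 * YEAR_S         # quoted travel time

speed_km_s = distance_m / time_s / 1e3
print(f"implied cruise speed: {speed_km_s:.1f} km/s")  # ~16 km/s
```

    The result, about 16 km/s or 5e-5 of light speed, is consistent with the abstract's point that chemical propulsion is hopelessly slow for interstellar distances.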

    Quantitative Risk Analysis using Real-time Data and Change-point Analysis for Data-informed Risk Prediction

    Incidents in highly hazardous process industries (HHPI) are a major concern for various stakeholders due to the impact on human lives, the environment, and potentially huge financial losses. Because process activities, locations, and products are unique, the risk analysis techniques applied in the HHPI have evolved over the years. Unfortunately, limitations of the various quantitative risk analysis (QRA) methods currently employed mean that alternative or improved methods are required. This research has developed one such method, called the Big Data QRA Method. This method relies entirely on big data techniques and real-time process data to identify the point at which process risk is imminent and to quantify the contribution of other components interacting up to the time index of the risk. Unlike existing QRA methods, which are static and based on unvalidated assumptions and data from single case studies, the big data method is dynamic and can be applied to most process systems. This alternative method is my original contribution to science and the practice of risk analysis. The detailed procedure, provided in Chapter 9 of this thesis, applies multiple change-point analysis and other big data techniques: (a) time series analysis, (b) data exploration and compression, (c) decision tree modelling, and (d) linear regression modelling. Since the distributional properties of process data can change over time, the big data approach was found to be more appropriate. Considering the unique conditions, activities, and process systems used within the HHPI, the dust fire and explosion incidents at the Imperial Sugar Factory and New England Wood Pellet LLC, both of which occurred in the USA, were found to be suitable case histories to guide the evaluation of data in this research. Data analysis was performed using open source software packages in R Studio.
Based on the investigation, the multiple-change-point analysis packages strucchange and changepoint were found to be successful at detecting early signs of deteriorating conditions of components in process equipment and the main process risk. One such process component is a bearing, which was suspected as the source of ignition that led to the dust fire and explosion at the Imperial Sugar Factory. As a result, this research applies the big data QRA method procedure to bearing vibration data to predict early deterioration of bearings and the final period when a bearing's performance begins the final phase of deterioration to failure. Model-based identification of these periods provides an indication of whether the condition of a mechanical part in process equipment at a particular moment represents an unacceptable risk. The procedure starts with selection of process operation data based on the findings of an incident investigation report on the case history of a known process incident. As the defining components of risk, both the frequency and the consequences associated with the risk were obtained from the incident investigation reports. Acceptance criteria for the risk can be applied to the periods between the risks detected by the two change-point packages. The method was validated with two case study datasets to demonstrate its applicability as a procedure for QRA. The procedure was then tested with two other case study datasets as examples of its application as a QRA method. The insight obtained from the validation and the applied examples led to the conclusion that big data techniques can be applied to real-time process data for risk assessment in the HHPI.
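    The core idea behind change-point detection on vibration data can be sketched in a few lines. The thesis itself uses the R packages strucchange and changepoint; the pure-Python CUSUM statistic below is only a minimal stand-in for that machinery, applied to a hypothetical bearing-vibration amplitude series:

```python
# Minimal change-point sketch (illustrative; not the R packages used in the
# thesis). A CUSUM statistic locates the index where the mean of a
# vibration-amplitude series shifts, mimicking early detection of bearing
# deterioration before failure.

def cusum_changepoint(series):
    """Return the index where a mean shift most likely begins."""
    n = len(series)
    mean = sum(series) / n
    cusum, best_k, best_val = 0.0, 0, 0.0
    # The cumulative sum of deviations from the global mean peaks in
    # magnitude at the last point of the old regime.
    for k, x in enumerate(series):
        cusum += x - mean
        if abs(cusum) > best_val:
            best_val, best_k = abs(cusum), k
    return best_k + 1  # first index of the new regime

# Hypothetical RMS vibration data: stable near 1.0, then shifting to ~3.0
healthy = [1.0, 1.1, 0.9, 1.0, 1.2, 0.8, 1.1, 1.0]
degraded = [3.0, 3.2, 2.9, 3.1, 3.0, 2.8]
cp = cusum_changepoint(healthy + degraded)
print(f"change point detected at index {cp}")  # index 8, first degraded sample
```

    In the thesis's framing, the interval between such detected change points is where risk-acceptance criteria would be applied.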

    The Global 21-cm Signal in the Context of the High-z Galaxy Luminosity Function

    Motivated by recent progress in studies of the high-$z$ Universe, we build a new model for the global 21-cm signal that is explicitly calibrated to measurements of the galaxy luminosity function (LF) and further tuned to match the Thomson scattering optical depth of the cosmic microwave background, $\tau_e$. Assuming that the $z \lesssim 8$ galaxy population can be smoothly extrapolated to higher redshifts, the recent decline in best-fit values of $\tau_e$ and the inefficient heating induced by X-ray binaries (HMXBs; the presumptive sources of the X-ray background at high $z$) imply that the entirety of cosmic reionization and reheating occurs at redshifts $z \lesssim 12$. In contrast to past global 21-cm models, whose $z \sim 20$ ($\nu \sim 70$ MHz) absorption features and strong $\sim 25$ mK emission features were driven largely by the assumption of efficient early star formation and X-ray heating, our new fiducial model peaks in absorption at $\nu \sim 110$ MHz at a depth of $\sim -160$ mK and has a negligible emission component. As a result, a strong emission signal would provide convincing evidence that HMXBs are not the only drivers of cosmic reheating. Shallow absorption troughs should accompany strong heating scenarios, but could also be caused by a low escape fraction of Lyman-Werner photons. Generating signals with troughs at $\nu \lesssim 95$ MHz requires a floor in the star-formation efficiency in halos below $\sim 10^{9}\,M_{\odot}$, which is equivalent to steepening the faint end of the galaxy LF. These findings demonstrate that the global 21-cm signal is a powerful complement to current and future galaxy surveys and efforts to better understand the interstellar medium in high-$z$ galaxies. Comment: 17 pages, 9 figures, in press
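    The frequencies quoted in this abstract map to redshift through the 21-cm line's rest frequency of about 1420.4 MHz, via nu = 1420.4 / (1 + z) MHz. A quick check (derived here, not from the abstract) recovers the quoted correspondences:

```python
# Map observed 21-cm frequencies to the redshift of the emitting gas.
REST_MHZ = 1420.4  # rest frequency of the neutral-hydrogen 21-cm line

def redshift(nu_mhz):
    """Redshift at which the 21-cm line is observed at nu_mhz."""
    return REST_MHZ / nu_mhz - 1.0

# Frequencies quoted in the abstract:
for nu in (70, 95, 110):
    print(f"nu = {nu:3d} MHz  ->  z = {redshift(nu):.1f}")
```

    This reproduces the abstract's pairings: 70 MHz corresponds to z ~ 19, and the fiducial absorption peak at 110 MHz to z ~ 12, consistent with reionization and reheating confined to z ≲ 12.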

    Multiprocessor sparse L/U decomposition with controlled fill-in

    Generation of the maximal compatibles of pivot elements for a class of small sparse matrices is studied. The algorithm involves a binary tree search and has a complexity exponential in the order of the matrix. Different strategies for selection of a set of compatible pivots based on the Markowitz criterion are investigated. The competing issues of parallelism and fill-in generation are studied and results are provided. A technique for obtaining an ordered compatible set directly from the ordered incompatible table is given. This technique generates a set of compatible pivots with the property of generating few fills. A new heuristic algorithm is then proposed that combines the idea of an ordered compatible set with a limited binary tree search to generate several sets of compatible pivots in linear time. Finally, an elimination set to reduce the matrix is selected. Parameters are suggested to obtain a balance between parallelism and fill-ins. Results of applying the proposed algorithms on several large application matrices are presented and analyzed.
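    The two ingredients the abstract combines, the Markowitz criterion and a compatible pivot set, can be illustrated with a small sketch. This is not the paper's algorithm: it simply scores each candidate pivot by the Markowitz cost (r_i - 1)(c_j - 1) and greedily collects pivots that share no row or column, so they could be eliminated in parallel; the matrix pattern below is an invented example:

```python
# Illustrative sketch (not the paper's algorithm): rank candidate pivots of a
# sparse matrix by the Markowitz criterion, then greedily pick a "compatible"
# set -- pivots sharing no row or column, eligible for parallel elimination.

def compatible_pivots(nonzeros, n):
    rows = [0] * n  # nonzero count per row
    cols = [0] * n  # nonzero count per column
    for i, j in nonzeros:
        rows[i] += 1
        cols[j] += 1
    # Markowitz cost (r_i - 1)(c_j - 1) bounds the fill-in that eliminating
    # pivot (i, j) can generate; prefer cheap pivots first.
    candidates = sorted(nonzeros,
                        key=lambda p: (rows[p[0]] - 1) * (cols[p[1]] - 1))
    used_rows, used_cols, chosen = set(), set(), []
    for i, j in candidates:
        if i not in used_rows and j not in used_cols:
            chosen.append((i, j))
            used_rows.add(i)
            used_cols.add(j)
    return chosen

# Invented 4x4 sparsity pattern, listed as (row, col) positions of nonzeros
nz = [(0, 0), (0, 3), (1, 1), (2, 1), (2, 2), (3, 0), (3, 3)]
pivots = compatible_pivots(nz, 4)
print(pivots)  # a compatible set covering all four rows and columns
```

    A real implementation must also re-count nonzeros after each elimination stage and, as the abstract notes, trade the size of the compatible set (parallelism) against the fill-in its pivots generate.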