
    Creation as an Ecumenical Problem: Renewed Belief through Green Experience

    In a secular context, the divided churches have lost both members and a sense of creaturehood. The author explores whether green experience of nature can be a path toward a renewed sense of creaturehood. Bernard Lonergan’s distinction between faith and belief allows for identifying a primordial faith that interprets the cosmos as numinous. Ignatius of Loyola’s Spiritual Exercises interpret primordial faith through the biblical word of God as Creator. Why not develop local ecumenical experiments in reevangelization that address green experience?

    From genotypes to organisms: State-of-the-art and perspectives of a cornerstone in evolutionary dynamics

    Understanding how genotypes map onto phenotypes, fitness, and eventually organisms is arguably the next major missing piece in a fully predictive theory of evolution. We refer to this generally as the problem of the genotype-phenotype map. Though we are still far from achieving a complete picture of these relationships, our current understanding of simpler questions, such as the structure induced in the space of genotypes by sequences mapped to molecular structures, has revealed important facts that deeply affect the dynamical description of evolutionary processes. Empirical evidence supporting the fundamental relevance of features such as phenotypic bias is mounting as well, while the synthesis of conceptual and experimental progress leads to questioning current assumptions on the nature of evolutionary dynamics; cancer progression models and synthetic biology approaches are notable examples. This work takes a critical and constructive look at our current knowledge of how genotypes map onto molecular phenotypes and organismal functions, and discusses theoretical and empirical avenues to broaden and improve this comprehension. As a final goal, this community should aim at deriving an updated picture of evolutionary processes soundly relying on the structural properties of genotype spaces, as revealed by modern techniques of molecular and functional analysis. Comment: 111 pages, 11 figures; uses the elsarticle LaTeX class.
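    The abstract's central objects, many-to-one genotype-phenotype maps and the phenotypic bias they induce, are easy to illustrate in a few lines. The sketch below is purely illustrative and is not the paper's model: it enumerates a small binary genotype space and uses a crude run-length "fold" as a stand-in phenotype, showing how a few phenotypes absorb most genotypes.

```python
# Toy illustration of a many-to-one genotype-phenotype (GP) map and
# phenotypic bias: many genotypes, few phenotypes, very uneven sizes.
# The "fold" function is an illustrative stand-in, not a biophysical model.
from collections import Counter
from itertools import product

L = 12  # genotype length (binary alphabet for simplicity)

def fold(genotype):
    """Map a genotype to a coarse 'phenotype': the multiset of run
    lengths in the sequence (a crude proxy for secondary structure)."""
    runs, count = [], 1
    for a, b in zip(genotype, genotype[1:]):
        if a == b:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return tuple(sorted(runs))

# Exhaustively enumerate genotype space and count phenotype abundances.
sizes = Counter(fold(g) for g in product("01", repeat=L))
ranked = sizes.most_common()
print(f"{2**L} genotypes -> {len(ranked)} phenotypes")
print("largest phenotype:", ranked[0])   # absorbs a large share of genotypes
print("smallest phenotype:", ranked[-1]) # supported by very few genotypes
```

    Even this crude map reproduces the qualitative point: phenotype abundances span orders of magnitude, so unbiased mutation in genotype space produces strongly biased exploration of phenotype space.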

    A multiscale systems perspective on cancer, immunotherapy, and Interleukin-12

    Monoclonal antibodies represent some of the most promising molecular targeted immunotherapies. However, understanding the mechanisms by which tumors evade elimination by the host immune system presents a significant challenge for developing effective cancer immunotherapies. The interaction of cancer cells with the host is a complex process that is distributed across a variety of time and length scales. The time scales range from the dynamics of protein refolding (i.e., microseconds) to the dynamics of disease progression (i.e., years). The length scales span the farthest reaches of the human body (i.e., meters) down to the range of molecular interactions (i.e., nanometers). Limited ranges of time and length scales are used experimentally to observe and quantify changes in physiology due to cancer. Translating knowledge obtained from the limited scales observed experimentally to predict patient response is an essential prerequisite for the rational design of cancer immunotherapies that improve clinical outcomes. In studying multiscale systems, engineers use systems analysis and design to identify important components in a complex system and to test conceptual understanding of the integrated system behavior using simulation. The objective of this review is to summarize interactions between the tumor and cell-mediated immunity from a multiscale perspective. Interleukin-12 and its role in coordinating antibody-dependent cell-mediated cytotoxicity is used to illustrate the different time and length scales that underpin cancer immunoediting. An underlying theme in this review is the potential role that simulation can play in translating knowledge across scales.
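    To make the review's theme concrete, here is a minimal sketch of the kind of simulation it advocates: a generic two-variable tumor-effector ODE model in the style of Kuznetsov et al. The model form and all parameter values are illustrative assumptions, not taken from the paper.

```python
# A generic tumor-effector ODE sketch (illustrative parameters only).
import numpy as np
from scipy.integrate import solve_ivp

def tumor_immune(t, y, s=0.1, p=0.6, g=10.0, m=0.02,
                 d=0.1, a=1.0, b=1e-3, n=1.0):
    E, T = y  # effector cells, tumor cells (arbitrary units)
    dE = s + p * E * T / (g + T) - m * E * T - d * E  # recruitment - exhaustion - death
    dT = a * T * (1 - b * T) - n * E * T              # logistic growth - immune killing
    return [dE, dT]

sol = solve_ivp(tumor_immune, (0, 200), [0.1, 1.0],
                t_eval=np.linspace(0, 200, 400))
print("final effector, tumor:", sol.y[0, -1], sol.y[1, -1])
```

    Such a deterministic compartment model captures only the slow, tissue-level end of the spectrum; the review's point is that it would need to be coupled to faster, finer-grained descriptions (e.g., receptor-level kinetics) to translate knowledge across scales.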

    Network and systems medicine: Position paper of the European Collaboration on Science and Technology action on Open Multiscale Systems Medicine

    Introduction: Network and systems medicine has rapidly evolved over the past decade, thanks to computational and integrative tools, which stem in part from systems biology. However, major challenges and hurdles are still present regarding validation and translation into clinical application and decision making for precision medicine. Methods: In this context, the Collaboration on Science and Technology Action on Open Multiscale Systems Medicine (OpenMultiMed) reviewed the available advanced technologies for multidimensional data generation and integration in an open-science approach, as well as key clinical applications of network and systems medicine and the main issues and opportunities for the future. Results: The development of multi-omic approaches as well as new digital tools provides a unique opportunity to explore complex biological systems and networks at different scales. Moreover, the application of the findable, accessible, interoperable, and reusable (FAIR) principles and the adoption of standards increase data availability and sharing for multiscale integration and interpretation. These innovations have led to the first clinical applications of network and systems medicine, particularly in the field of personalized therapy and drug dosing. Broadening the application of network and systems medicine will now require greater engagement of patients and health care providers, as well as educating new generations of medical doctors and biomedical researchers, in order to shift current organ- and symptom-based medical concepts toward network- and systems-based ones for more precise diagnoses, interventions, and, ideally, prevention. Conclusion: In this dynamic setting, the health care system will also have to evolve, if not revolutionize, in terms of organization and management.
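    As a toy illustration of the multiscale network integration discussed here (not a method from the position paper), the sketch below fuses two hypothetical omic layers into one graph with networkx and asks which nodes bridge the layers; the gene names and edges are made up for the example.

```python
# Fuse two omic layers into a single annotated graph (illustrative only).
import networkx as nx

G = nx.Graph()
# Transcriptomic layer: gene co-expression edges (hypothetical)
G.add_edges_from([("TP53", "MDM2"), ("MDM2", "CDKN1A")], layer="transcriptome")
# Proteomic layer: protein interaction edges (hypothetical)
G.add_edges_from([("TP53", "EP300"), ("EP300", "CDKN1A")], layer="proteome")

# A simple cross-scale query: which nodes connect evidence from both layers?
for node in G:
    layers = {G.edges[e]["layer"] for e in G.edges(node)}
    if len(layers) > 1:
        print(node, "bridges layers:", sorted(layers))
```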

    Quantifying the Generation of T Cell Immunity using a Systems Biology Approach.

    The immune system is our defense against pathogens. Quantitatively predicting its response to foreign stimulation is key toward developing tools to interfere with or prevent infection (e.g., vaccines and immunotherapies). I use a systems biology approach and develop computational models describing dynamics occurring within lymph nodes, the sites where activated immune cells are generated. These effector cells circulate out into blood and to sites of infection, participating in immunity. I study, both quantitatively and qualitatively, the dynamics of immune cells during a generalized infection as well as during infection with Mycobacterium tuberculosis (Mtb). The models predict that their 3-dimensional configuration enables lymph nodes to support rare antigen-specific T cells in efficiently searching for antigen-bearing dendritic cells, and that this efficiency is not reduced when the precursor frequency increases over a wide range. The models also predict strategies to manipulate the differentiation of immune cells to maximize specific subtypes of T cell populations, depending on different immunomodulation goals. When coupled with Mtb infection models, our models are able to assist vaccine design by finding correlations between immune cell subsets and protection against Mtb, and also help identify mechanisms controlling different disease outcomes at the host level.
    PhD thesis, Bioinformatics, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/113563/1/changgon_1.pd
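    A minimal sketch of the search problem the thesis models, under strong simplifying assumptions: a single T cell performing a random walk on a periodic 3-D lattice until it contacts an antigen-bearing dendritic cell. Lattice size, dendritic cell numbers, and movement rules are illustrative, not the thesis's calibrated model.

```python
# T cell as a random walker on a 3-D torus; DCs as fixed lattice sites.
import random

def time_to_contact(size=20, n_dcs=50, max_steps=200_000, rng=random):
    dcs = {(rng.randrange(size), rng.randrange(size), rng.randrange(size))
           for _ in range(n_dcs)}
    pos = (rng.randrange(size), rng.randrange(size), rng.randrange(size))
    for step in range(max_steps):
        if pos in dcs:
            return step
        # move one lattice step along a random axis, periodic boundaries
        axis, delta = rng.randrange(3), rng.choice((-1, 1))
        pos = tuple((p + delta) % size if i == axis else p
                    for i, p in enumerate(pos))
    return None  # no contact within the step budget

times = [time_to_contact() for _ in range(20)]
hits = [t for t in times if t is not None]
print(f"contacts: {len(hits)}/20, mean steps: {sum(hits) / len(hits):.0f}")
```

    Repeating such runs while varying the number of searchers or targets is the simplest way to probe the scaling question the thesis asks: how search efficiency behaves as precursor frequency changes.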

    Failed futures, broken promises, and the prospect of cybernetic immortality: toward an abundant sociological history of cryonic suspension, 1962-1979

    This dissertation offers an interpretation of cryonic suspension, or "cryonics," the practice of preserving human corpses by perfusing them with chemical protectants and gradually subjecting them, at the pronouncement of legal death, to extremely low temperatures (−320 °F, −196 °C), which are then controlled and maintained over the long term by liquid-nitrogen-filled "cryocapsules." Cryonics is ultimately motivated by the hope that medicine will at some future point achieve the requisite kinds and levels of technology to facilitate the rejuvenation and "reanimation" of the "deanimated," those who lie in cryonic suspension. The interpretation of cryonic suspension that I set forth departs quite abruptly from existing academic engagements with the practice—it is rooted in a wealth of previously unutilized archival materials from the 1960s and 70s, all of which are virtually inaccessible to those operating outside the cryonics community. The interpretation cuts across, takes as its substantive focus, and is periodized with respect to three different though related moments in the history of cryonic suspension: 1) the emergence of cryonics in 1962 and the previously unexamined ties of the practice to the postwar science of cybernetics and NASA’s Cyborg Spaceflight Program; 2) the subsequent performance and material instantiation of cryonics, marked by the plights of those who froze and were frozen throughout the American 1960s and 70s; and, tied to and fomented by the lattermost especially, 3) catastrophic failure, marked by the collapse of the Cryonics Society of New York in 1974, and the discovery, in 1979, of several abandoned, thawed, and radically decomposed cryonics "patients" interred in the Cryonics Society of California’s underground "cryo-crypt" at the Oakwood Memorial Park Cemetery in Chatsworth, California; what is infamously known in cryonics circles as the "Chatsworth Scandal." The dissertation as such offers several novel interpretive claims about cryonic suspension, all of which take shape in sustained dialogue with cultural studies of science and technology, and especially the history of cybernetics. The dissertation’s principal theoretical intervention involves deploying these claims to offer an alternative to prevailing interpretations of cryonic suspension, both popular and academic, as an unintelligible pseudoscientific "anomaly." I argue to the contrary that cryonic suspension emerged in a space produced by what Anthony Giddens and especially Zygmunt Bauman regard as the principal constitutive feature of modern social life—the ultimately futile yet pervasive modern impulse to sequester death, dying, and the dead from the realm of the living. I furthermore argue that this distinctly modern logic of sequestration is replicated in the reigning epistemic norms and practices that shape sociological theory and research proper, in that academic sociology, whatever its professed stripes and leanings, tends overwhelmingly to regard death, narrowly conceived in decidedly modern terms as an "end of life event," as being only marginally important to apprehending the shape of the modern social, when in fact death's sequestration constitutes the very social realities upon which sociologists tend to train their analytical focus.
The key to the intervention I make with respect to cryonic suspension's intelligibility thus hinges upon recognizing that the otherwise seemingly "anomalous" practice emerged in a space produced by the institutional shortcomings of death's sequestration under western modernity, and thus presents a lived reality that places considerable strain upon the conceptual comfort zones of modern epistemology and historiography. It is in this sense that cryonic suspension, as I argue following Robert Orsi, evidences an abundant phenomenon. Instead of "passing over in silence" the epistemic discomfort presented by cryonic suspension's abundance, the narrative accounts of cryonics that I develop are pressed into the service of countering those authorized ways of knowing that safely accord with modernity's sequestration of death. I thus opt for an historical sociological treatment of cryonics, one centered on death's sequestration—that is to say, an abundant sociological history of cryonic suspension.

    Acute myocardial infarction patient data to assess healthcare utilization and treatments.

    The goal of this study is to use a data mining framework to assess the three main treatments for acute myocardial infarction: thrombolytic therapy, percutaneous coronary intervention (percutaneous angioplasty), and coronary artery bypass surgery. The need for a data mining framework in this study arises from the use of real-world data rather than the highly clean and homogeneous data found in most clinical trials and epidemiological studies. The assessment is based on determining a profile of patients undergoing an episode of acute myocardial infarction, determining resource utilization by treatment, and creating a model that predicts resource utilization and cost for each treatment. Text mining is used to find a subset of input attributes that characterize subjects who undergo the different treatments for acute myocardial infarction, as well as distinct resource utilization profiles. Classical statistical methods are used to evaluate the results of text clustering. The features selected by supervised learning are used to build predictive models for resource utilization and are compared with the features selected by traditional statistical methods for a predictive model with the same outcome. Sequence analysis is used to determine the sequence of treatment of acute myocardial infarction. The resulting sequence is used to construct a probability tree that defines the basis for a cost-effectiveness analysis comparing acute myocardial infarction treatments. To determine effectiveness, survival analysis methodology is implemented to assess the occurrence of death during hospitalization, the likelihood of a repeated episode of acute myocardial infarction, and the length of time between the reoccurrence of an episode of acute myocardial infarction or the occurrence of death. The complexity of this study stems mainly from the data source used: administrative data from insurance claims, a source not originally designed for the study of health outcomes or health resource utilization. However, by transforming record tables from many-to-many relations to one-to-one relations, the data became useful in tracking the evolution of disease and disease outcomes. Also, by transforming tables from a wide format to a long format, the records became analyzable by many data mining algorithms. Moreover, this study contributed to the fields of applied mathematics and public health by implementing a sequence analysis on consecutive procedures to determine the sequence of events that describe the evolution of a hospitalization for acute myocardial infarction. The same data transformation and algorithm can be used in the study of rare diseases whose evolution is not well understood.
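    The wide-to-long transformation described above is straightforward to demonstrate. The sketch below uses pandas on a made-up claims extract; all column names and values are hypothetical.

```python
# Reshape a wide claims table (one column per procedure slot) into the
# long, one-row-per-event form that mining and sequence analysis expect.
import pandas as pd

wide = pd.DataFrame({
    "patient_id": [101, 102],
    "proc_1": ["thrombolysis", "PCI"],
    "proc_2": ["PCI", "CABG"],
    "proc_3": [None, None],
})

long = (wide.melt(id_vars="patient_id", var_name="slot", value_name="procedure")
            .dropna(subset=["procedure"])
            .sort_values(["patient_id", "slot"]))
print(long)
```

    Each row of the long table is a single (patient, procedure) event in order, which is exactly the input shape needed to derive treatment sequences and, from them, a probability tree over treatment pathways.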

    The Toowoomba adult trauma triage tool

    Since the introduction of the Australasian Triage Scale (ATS), there has been considerable variation in its application, and improved uniformity in its application by triage nurses is required. A reproducible, reliable, and valid method to classify the illness acuity of Emergency Department (ED) patients, such that a triage category 3 assigned by one nurse means the same as a triage category 3 assigned by another, not only in the same ED but also in another institution, would be of considerable value to emergency nurses. This has been the driving motivation behind the development of the Toowoomba Adult Trauma Triage Tool (TATTT). It is hoped that the TATTT will support emergency nurses working in this challenging area by promoting standardisation and decreasing subjectivity in the triage decision process.
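    One standard way to quantify the between-nurse consistency the TATTT aims to improve is an inter-rater agreement statistic such as Cohen's kappa. The sketch below computes kappa for two hypothetical nurses' ATS ratings; the data are invented, and the statistic is a common choice rather than necessarily the one used in this work.

```python
# Cohen's kappa: agreement between two raters, corrected for chance.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa, pb = Counter(rater_a), Counter(rater_b)
    expected = sum(pa[c] * pb[c] for c in pa) / n**2  # chance agreement
    return (observed - expected) / (1 - expected)

# ATS categories (1-5) assigned to the same ten patients by two nurses
nurse_1 = [3, 2, 4, 3, 5, 1, 3, 2, 4, 3]
nurse_2 = [3, 3, 4, 2, 5, 1, 3, 2, 5, 3]
print(f"kappa = {cohens_kappa(nurse_1, nurse_2):.2f}")
```

    A kappa near 1 would indicate that the two nurses assign categories almost identically; values well below 1 quantify exactly the kind of triage variability the tool is designed to reduce.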