Bioethics: Reincarnation of Natural Philosophy in Modern Science
The theory of the evolution of complex systems that include humans, and the algorithm for
constructing it, are a synthesis of evolutionary epistemology, philosophical anthropology and a
concrete scientific empirical basis in modern (transdisciplinary) science. «Transdisciplinary» in
this context is interpreted as a completely new epistemological situation, one fraught with the
initiation of a civilizational crisis. The philosophy and ideology of technogenic civilization rest on
the possibility of an unambiguous demarcation between (1) public value discourse and descriptive
scientific discourse, and (2) the object and the subject of the cognitive process. Neither of these
attributes remains valid. For mass, everyday consciousness and for the institutional philosophical
tradition it is intuitively obvious that, having gained the ability to control the evolutionary process,
Homo sapiens has come close to the borders of its own biological and cultural identity. The
spontaneous coevolutionary interaction between the «subject» (rational living organisms) and the
«object» (the material world) is a teleological trend of movement toward the complete
rationalization of the World as It Is and its merger with the World as It Ought to Be. From this
follows the stratification of the global evolutionary process into selective and semantic
(teleological) components that are coevolutionary and therefore ontologically inseparable. With
the entry of anthropogenic civilization into the stage of the information society, firstly, the
post-academic phase of the historical evolution of scientific rationality began, whose attributes are
a specific methodology of scientific knowledge, a scientific ethos and an ontology. Bioethics as a
phenomenon of intellectual culture represents the natural-philosophical core of modern post-
academic (human-dimensional) science, in which the principle of the ethical neutrality of
scientific theory is inapplicable, and elements of public-axiological and scientific-descriptive
discourses are integrated into a single logical construction. As a result, hermeneutics precedes
epistemology not only methodologically but also substantively, and natural philosophy is
regaining the status of the backbone of the theory of evolution, in an explicit form.
Efficient sampling for Bayesian inference of conjunctive Bayesian networks
Motivation: Cancer development is driven by the accumulation of advantageous mutations and subsequent clonal expansion of cells harbouring these mutations, but the order in which mutations occur remains poorly understood. Advances in genome sequencing and the soon-arriving flood of cancer genome data produced by large cancer sequencing consortia hold the promise to elucidate cancer progression. However, new computational methods are needed to analyse these large datasets. Results: We present a Bayesian inference scheme for Conjunctive Bayesian Networks, a probabilistic graphical model in which mutations accumulate according to partial order constraints and cancer genotypes are observed subject to measurement noise. We develop an efficient MCMC sampling scheme specifically designed to overcome local optima induced by dependency structures. We demonstrate the performance advantage of our sampler over traditional approaches on simulated data and show the advantages of adopting a Bayesian perspective when reanalysing cancer datasets and comparing our results to previous maximum-likelihood-based approaches. Availability: An R package including the sampler and examples is available at http://www.cbg.ethz.ch/software/bayes-cbn. Contact: [email protected]
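The core idea of the abstract above, that mutations accumulate subject to partial order constraints and are observed through noise, can be sketched compactly. The following is a hypothetical simplification (not the authors' R package): a toy poset over four mutations, a compatibility check, and a marginal likelihood that sums over order-compatible true genotypes under a symmetric bit-flip noise model with rate `eps`.

```python
import itertools

# Assumed toy poset: parents that must be present before each mutation.
cover = {"B": {"A"}, "C": {"A"}, "D": {"B", "C"}}
events = ["A", "B", "C", "D"]

def compatible(genotype):
    """A genotype satisfies the partial order if every present mutation
    also has all of its predecessors present."""
    present = {e for e, bit in zip(events, genotype) if bit}
    return all(cover.get(e, set()) <= present for e in present)

def likelihood(observed, eps=0.05):
    """Marginal likelihood of an observed noisy genotype: sum over all
    order-compatible true genotypes, assuming a uniform prior on them."""
    lattice = [g for g in itertools.product([0, 1], repeat=len(events))
               if compatible(g)]
    total = 0.0
    for true in lattice:
        flips = sum(o != t for o, t in zip(observed, true))
        total += (eps ** flips) * ((1 - eps) ** (len(events) - flips))
    return total / len(lattice)

print(compatible((1, 1, 0, 0)))  # True: B's prerequisite A is present
print(compatible((0, 1, 0, 0)))  # False: B without A violates the order
```

An MCMC sampler like the one described would propose moves over the poset structure itself; here only the fixed-poset likelihood is shown.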
Global parameter identification of stochastic reaction networks from single trajectories
We consider the problem of inferring the unknown parameters of a stochastic
biochemical network model from a single measured time-course of the
concentration of some of the involved species. Such measurements are available,
e.g., from live-cell fluorescence microscopy in image-based systems biology. In
addition, fluctuation time-courses from, e.g., fluorescence correlation
spectroscopy provide additional information about the system dynamics that can
be used to more robustly infer parameters than when considering only mean
concentrations. Estimating model parameters from a single experimental
trajectory enables single-cell measurements and quantification of cell--cell
variability. We propose a novel combination of an adaptive Monte Carlo sampler,
called Gaussian Adaptation, and efficient exact stochastic simulation
algorithms that allows parameter identification from single stochastic
trajectories. We benchmark the proposed method on a linear and a non-linear
reaction network at steady state and during transient phases. In addition, we
demonstrate that the present method also provides an ellipsoidal volume
estimate of the viable part of parameter space and is able to estimate the
physical volume of the compartment in which the observed reactions take place.
Comment: Article in print as a book chapter in Springer's "Advances in Systems Biology".
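The Gaussian Adaptation sampler mentioned above can be illustrated with a minimal sketch. This is an assumed simplification, not the authors' implementation: candidates are drawn from an adaptive multivariate Gaussian, samples whose cost beats a shrinking threshold are accepted, and the mean and covariance adapt toward accepted samples; the objective here is a stand-in distance rather than a stochastic-simulation fit.

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(theta, target=np.array([2.0, 0.5])):
    # Stand-in objective: in the paper this would compare simulated and
    # measured trajectory statistics; here it is distance to a known optimum.
    return float(np.linalg.norm(theta - target))

def gaussian_adaptation(steps=2000):
    mean = np.zeros(2)
    cov = np.eye(2)
    threshold = cost(mean)
    for _ in range(steps):
        theta = rng.multivariate_normal(mean, cov)
        if cost(theta) < threshold:
            # Accepted: move mean toward the sample, adapt covariance
            # toward the accepted direction, tighten the threshold.
            mean = 0.9 * mean + 0.1 * theta
            d = theta - mean
            cov = 0.9 * cov + 0.1 * np.outer(d, d)
            threshold = 0.9 * threshold + 0.1 * cost(theta)
        else:
            cov *= 0.99  # contract the search on rejection
    return mean, cov

mean, cov = gaussian_adaptation()
print(mean)  # estimate drifts toward the assumed optimum
```

The accepted-sample covariance is what provides the ellipsoidal estimate of the viable parameter region mentioned in the abstract.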
A Category Theoretical Argument Against the Possibility of Artificial Life
One of Robert Rosen's main contributions to the scientific community is summarized in his book 'Life itself'. There Rosen presents a theoretical framework to define living systems; given this definition, he goes on to show that living systems are not realisable in computational universes. Despite being well known and often cited, Rosen's central proof has so far not been evaluated by the scientific community. In this article we review the essence of Rosen's ideas leading up to his rejection of the possibility of real artificial life in silico. We also evaluate his arguments and point out that some of Rosen's central notions are ill-defined. The conclusion of this article is that Rosen's central proof is wrong.
Management of object-oriented action-based distributed programs
PhD Thesis
This thesis addresses the problem of managing the runtime behaviour of distributed
programs. The thesis of this work is that management is fundamentally
an information processing activity and that the object model, as applied to action-based
distributed systems and database systems, is an appropriate representation
of the management information. In this approach, the basic concepts of classes,
objects, relationships, and atomic transition systems are used to form object
models of distributed programs. Distributed programs are collections of objects
whose methods are structured using atomic actions, i.e., atomic transactions.
Object models are formed of two submodels, each representing a fundamental
aspect of a distributed program. The structural submodel represents a static
perspective of the distributed program, and the control submodel represents a
dynamic perspective of it. Structural models represent the program's objects,
classes and their relationships. Control models represent the program's object
states, events, guards and actions: a transition system. Resolution of queries on
the distributed program's object model enables the management system to control
certain activities of distributed programs.
At a different level of abstraction, the distributed program can be seen as a
reactive system where two subprograms interact: an application program and a
management program; they interact only through sensors and actuators. Sensors
are methods used to probe an object's state and actuators are methods used
to change an object's state. The management program can prod the
application program into action by activating sensors and actuators available at
the interface of the application program. Actions are determined by management
policies that are encoded in the management program. This way of structuring
the management system encourages a clear modularization of application and
management distributed programs, allowing a better separation of concerns. Management
concerns can be dealt with by the management program, while functional
concerns can be assigned to the application program.
The object-oriented action-based computational model adopted by the management
system provides a natural framework for the implementation of fault-tolerant
distributed programs. Object orientation provides modularity and extensibility
through object encapsulation. Atomic actions guarantee the consistency of
the objects of the distributed program despite concurrency and failures. Replication
of the distributed program provides increased fault-tolerance by guaranteeing
the consistent progress of the computation, even though some of the replicated
objects can fail.
A prototype management system based on the management theory proposed
above has been implemented atop Arjuna, an object-oriented programming system
which provides a set of tools for constructing fault-tolerant distributed programs. The management system is composed of two subsystems: Stabilis, a
management system for structural information, and Vigil, a management system
for control information. Example applications have been implemented to illustrate
the use of the management system and gather experimental evidence to give
support to the thesis.
CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico, Brazil);
BROADCAST (Basic Research On Advanced Distributed Computing: from Algorithms to SysTems)
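The sensor/actuator separation described in the thesis abstract can be sketched in a few lines. The names below are illustrative inventions, not identifiers from Stabilis or Vigil: the management program touches the application object only through a sensor (state probe) and an actuator (state change), with the policy encoded entirely on the management side.

```python
class QueueServer:
    """Application object: a bounded work queue."""
    def __init__(self, capacity=10):
        self.items = []
        self.capacity = capacity
        self.accepting = True

    # Sensor: probes the object's state without changing it.
    def sensor_load(self):
        return len(self.items) / self.capacity

    # Actuator: changes the object's state on the manager's behalf.
    def actuator_set_accepting(self, flag):
        self.accepting = flag

    def submit(self, item):
        if self.accepting and len(self.items) < self.capacity:
            self.items.append(item)
            return True
        return False

def management_policy(server, high=0.8, low=0.2):
    """Management program: 'shed load when nearly full, resume when
    drained', expressed only via sensors and actuators."""
    load = server.sensor_load()
    if load >= high:
        server.actuator_set_accepting(False)
    elif load <= low:
        server.actuator_set_accepting(True)

server = QueueServer()
for i in range(9):
    server.submit(i)
management_policy(server)
print(server.accepting)  # prints False: 90% full, so the manager stops intake
```

Because the policy lives outside the application object, it can be replaced without touching the application code, which is the modularization the thesis argues for.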
A Fuzzy Predictable Load Balancing Approach in Cloud Computing
Cloud computing is a new paradigm for hosting and delivering services on demand over the internet. It is an example of an ultimately virtualized system, and a natural evolution for data centers that employ automated systems management, workload balancing, and virtualization technologies. Live virtual machine (VM) migration is a technique for achieving load balancing in a cloud environment by transferring an active, overloaded VM from one physical host to another without disrupting the VM. In this study, to eliminate whole-VM migration from the load balancing process, we propose a Fuzzy Predictable Load Balancing (FPLB) approach that confronts the problem of an overloaded VM by assigning the extra tasks from the overloaded VM to another, similar VM instead of migrating the whole VM. In addition, we propose a Fuzzy Prediction Method (FPM) to predict VM migration times. The approach also contains a multi-objective optimization model to migrate these tasks to a new VM host. In the proposed FPLB approach there is no need to pause the VM during migration. Furthermore, considering that live VM migration, in contrast to task migration, takes longer to complete and needs more idle capacity on the host physical machine (PM), the proposed approach significantly reduces time, idle memory, and cost consumption.
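The task-migration idea can be illustrated with a toy sketch. This is a hypothetical simplification, not the paper's model: a piecewise-linear fuzzy membership scores how "overloaded" a VM is, and tasks are moved from fuzzily overloaded VMs to the lightest VM instead of migrating the VM itself. Thresholds and utilizations are invented for illustration.

```python
def overload_degree(cpu_util, low=0.6, high=0.9):
    """Fuzzy membership in 'overloaded': 0 below low, 1 above high,
    linear in between (assumed membership function)."""
    if cpu_util <= low:
        return 0.0
    if cpu_util >= high:
        return 1.0
    return (cpu_util - low) / (high - low)

def rebalance(vms, threshold=0.5):
    """Move tasks off fuzzily overloaded VMs onto the lightest VM."""
    for vm in vms:
        while overload_degree(vm["util"]) > threshold and vm["tasks"]:
            target = min(vms, key=lambda v: v["util"])
            if target is vm:
                break  # nowhere lighter to move work
            task_cost = vm["tasks"].pop()
            vm["util"] -= task_cost
            target["tasks"].append(task_cost)
            target["util"] += task_cost

vms = [
    {"name": "vm1", "util": 0.95, "tasks": [0.1, 0.1, 0.2]},
    {"name": "vm2", "util": 0.30, "tasks": [0.3]},
]
rebalance(vms)
print(round(vms[0]["util"], 2), round(vms[1]["util"], 2))
```

No VM is paused here; only task entries move, which is the contrast with whole-VM live migration that the abstract draws.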
Multi-Objective Optimization of the Design Parameters of a Shell and Tube Type Heat Exchanger Based on Economic and Size Consideration
A heat exchanger is a device that is used to transfer heat between two or more fluids that are at different temperatures. These are essential elements in a wide range of systems, including the human body, automobiles, computers, power plants and comfort heating/cooling equipment. The most commonly used type is the shell and tube heat exchanger. Owing to their wide utilization, their cost minimization is an important target, and instead of traditional iterative procedures we implement a software-based (MATLAB) genetic algorithm in order to minimize the total cost of the equipment, including capital investment and the sum of discounted annual energy expenditures including pumping. Simultaneously, the minimization of the length of the heat exchanger is also targeted. The multi-objective algorithm searches for the optimal values of design variables such as outer tube diameter, outer shell diameter and baffle spacing, for two types of tube layout arrangement (triangular and square) with the number of tube passes being two or four.
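The Pareto bookkeeping that a multi-objective search over (cost, length) relies on can be sketched briefly. The cost and length formulas below are hypothetical placeholders, not the paper's correlations, and the variable bounds are invented; only the dominance filter over the three named design variables is the point.

```python
import random

random.seed(1)

def objectives(d_o, D_s, B):
    # Placeholder surrogates (assumed, for illustration): larger shells
    # raise capital cost, smaller tubes and baffle spacing raise pumping
    # cost; required length trades off against both diameters.
    cost = 5000 * D_s + 200 / d_o + 50 / B
    length = 2.0 / (d_o * D_s) + 0.5 / B
    return cost, length

def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    front = []
    for d in designs:
        fd = objectives(*d)
        if not any(dominates(objectives(*e), fd) for e in designs if e != d):
            front.append(d)
    return front

# Random candidate designs within assumed bounds (metres):
# (outer tube diameter, outer shell diameter, baffle spacing)
designs = [(random.uniform(0.01, 0.05),
            random.uniform(0.3, 1.0),
            random.uniform(0.05, 0.5)) for _ in range(50)]
front = pareto_front(designs)
print(len(front), "non-dominated designs out of", len(designs))
```

A genetic algorithm would evolve the candidate set between dominance filters; random sampling stands in for that step here.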