TinkerCell: Modular CAD Tool for Synthetic Biology
Synthetic biology brings together concepts and techniques from engineering
and biology. In this field, computer-aided design (CAD) is needed to bridge
the gap between computational modeling and biological data. TinkerCell is an
application created to serve as a CAD tool for synthetic biology. It is a
visual modeling tool that supports a
hierarchy of biological parts. Each part in this hierarchy consists of a set of
attributes that define the part, such as sequence or rate constants. Models
constructed from these parts can be analyzed by various C and Python programs
hosted by TinkerCell via an extensive C and Python API. TinkerCell supports the
notion of modules, which are networks with interfaces. Modules can be connected
to one another to form larger modular
networks. Because TinkerCell associates parameters and equations in a model
with their respective part, parts can be loaded from databases along with their
parameters and rate equations. The modular network design can be used to
exchange modules as well as test the concept of modularity in biological
systems. The flexible modeling framework along with the C and Python API allows
TinkerCell to serve as a host to numerous third-party algorithms. TinkerCell is
a free and open-source project under the Berkeley Software Distribution
license. Downloads, documentation, and tutorials are available at
www.tinkercell.com. Comment: 23 pages, 20 figures.
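A minimal sketch of the part/module idea described above, assuming a hypothetical representation in Python (this is not TinkerCell's actual C/Python API; all class, attribute, and function names below are illustrative):

    # Hypothetical sketch: parts carry attributes such as a sequence or rate
    # constants, and modules are networks of parts exposing an interface that
    # lets them be connected into larger modular networks.
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class Part:
        name: str
        attributes: Dict[str, object] = field(default_factory=dict)  # e.g. sequence, rate constants

    @dataclass
    class Module:
        name: str
        parts: List[Part] = field(default_factory=list)
        interface: List[str] = field(default_factory=list)  # species shared with other modules

    def connect(a: Module, b: Module) -> Module:
        # Join two modules on their common interface species to build a larger network.
        shared = [s for s in a.interface if s in b.interface]
        return Module(name=a.name + "+" + b.name,
                      parts=a.parts + b.parts,
                      interface=shared)

    # Dummy example values, for illustration only.
    sensor = Module("sensor", parts=[Part("pTet", {"sequence": "ATGC", "strength": 0.5})],
                    interface=["TetR"])
    reporter = Module("reporter", parts=[Part("GFP", {"k_deg": 0.01})], interface=["TetR"])
    circuit = connect(sensor, reporter)
    print(circuit.name, [p.name for p in circuit.parts], circuit.interface)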
The Hopfield model and its role in the development of synthetic biology
Neural network models make extensive use of
concepts coming from physics and engineering. How do scientists
justify the use of these concepts in the representation of
biological systems? How is evidence for or against the use of
these concepts produced in the application and manipulation
of the models? This article shows that neural network models are evaluated
differently depending on the
scientific context and its modeling practice. In the case of
the Hopfield model, the different modeling practices related to
theoretical physics and neurobiology played a central role in
how the model was received and used in the different scientific
communities. In theoretical physics, where the Hopfield model
has its roots, mathematical modeling is much more common and
established than in neurobiology, which is strongly experiment-driven. These
differences in modeling practice contributed to the development of the new
field of synthetic biology, which introduced a third type of model that
combines mathematical modeling with experimentation on biological systems
and, by doing so, mediates between the different modeling practices.
Causality, Information and Biological Computation: An algorithmic software approach to life, disease and the immune system
Biology has taken strong steps towards becoming a computer science aiming at
reprogramming nature after the realisation that nature herself has reprogrammed
organisms by harnessing the power of natural selection and the digital
prescriptive nature of replicating DNA. Here we further unpack ideas related to
computability, algorithmic information theory and software engineering, in the
context of the extent to which biology can be (re)programmed, and how we may
go about doing so in a more systematic way with all the tools and concepts
offered by theoretical computer science in a translation exercise from
computing to molecular biology and back. These concepts provide a means of
hierarchical organization, thereby blurring previously clear-cut lines between
concepts like matter and life, or between tumour types that are otherwise taken
as different yet may not, in fact, have different causes. This does not diminish
the properties of life or make its components and functions less interesting.
On the contrary, this approach makes for a more encompassing and integrated
view of nature, one that subsumes observer and observed within the same system,
and can generate new perspectives and tools with which to view complex diseases
like cancer, approaching them afresh from a software-engineering viewpoint that
casts evolution in the role of programmer, cells as computing machines, DNA and
genes as instructions and computer programs, viruses as hacking devices, the
immune system as a software debugging tool, and diseases as an
information-theoretic battlefield where all these forces deploy. We show how
information theory and algorithmic programming may explain fundamental
mechanisms of life and death. Comment: 30 pages, 8 figures. Invited chapter
contribution to Information and Causality: From Matter to Life. Sara I. Walker,
Paul C.W. Davies and George Ellis (eds.), Cambridge University Press.
A Process Algebraical Approach to Modelling Compartmentalized Biological Systems
This paper introduces Protein Calculus, a modeling language designed for encoding and calculating the behaviors of compartmentalized biological systems. The formalism combines, in a unified framework, two successful computational paradigms: process algebras and membrane systems. The goal of Protein Calculus is to provide a formal tool for transforming information collected from in vivo experiments into coded definitions of the different types of proteins, complexes of proteins, and membrane-organized systems of such entities. Using this encoded information as input, our calculus computes, in silico, the possible behaviors of a living system. This is the preliminary version of a paper published in the Proceedings of the International Conference of Computational Methods in Sciences and Engineering (ICCMSE), American Institute of Physics, AIP Proceedings, N 2: 642-646, 2007 (http://scitation.aip.org/dbt/dbt.jsp?KEY=APCPCS&Volume=963&Issue=2).
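As a rough illustration of the encoding idea, assuming a hypothetical term representation (this is not the Protein Calculus syntax; all names below are illustrative), a compartmentalized system of proteins can be written as nested terms whose possible next states are enumerated by applying a rewrite rule:

    # Hypothetical sketch: a membrane-organized system as nested compartments of
    # proteins, with one toy rule (complexation of two proteins in the same
    # compartment) used to enumerate reachable next states.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Compartment:
        name: str
        proteins: List[str] = field(default_factory=list)            # free proteins inside the membrane
        children: List["Compartment"] = field(default_factory=list)  # nested compartments

    def complexation_steps(c: Compartment) -> List[Compartment]:
        # Every state reachable by binding two proteins of this compartment into a complex.
        states = []
        for i in range(len(c.proteins)):
            for j in range(i + 1, len(c.proteins)):
                rest = [p for k, p in enumerate(c.proteins) if k not in (i, j)]
                states.append(Compartment(c.name, rest + [c.proteins[i] + ":" + c.proteins[j]], c.children))
        return states

    cell = Compartment("cell", proteins=["RecA", "LexA", "LexA"])  # dummy contents
    for state in complexation_steps(cell):
        print(state.proteins)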