Growth-rate-dependent dynamics of a bacterial genetic oscillator
Gene networks exhibiting oscillatory dynamics are widespread in biology. The
minimal regulatory designs giving rise to oscillations have been implemented
synthetically and studied by mathematical modeling. However, most of the
available analyses generally neglect the coupling of regulatory circuits with
the cellular "chassis" in which the circuits are embedded. For example, the
intracellular macromolecular composition of fast-growing bacteria changes with
growth rate. As a consequence, important parameters of gene expression, such as
ribosome concentration or cell volume, are growth-rate dependent, ultimately
coupling the dynamics of genetic circuits with cell physiology. This work
addresses the effects of growth rate on the dynamics of a paradigmatic example
of a genetic oscillator, the repressilator. Using empirical growth-rate
dependences of parameters in bacteria, we show that the repressilator dynamics
can switch between oscillations and convergence to a fixed point depending on
the cell's state of growth, and thus on the nutrients the cell is fed. The physical
support of the circuit (type of plasmid or gene positions on the chromosome)
also plays an important role in determining the oscillation stability and the
growth-rate dependence of period and amplitude. This analysis has potential
application in the field of synthetic biology, and suggests that the coupling
between endogenous genetic oscillators and cell physiology can have substantial
consequences for their functionality.
Comment: 14 pages, 9 figures (revised version, accepted for publication)
Modelling the evolution of transcription factor binding preferences in complex eukaryotes
Transcription factors (TFs) exert their regulatory action by binding to DNA
with specific sequence preferences. However, different TFs can partially share
their binding sequences due to their common evolutionary origin. This
`redundancy' of binding defines a way of organizing TFs in `motif families' by
grouping TFs with similar binding preferences. Since these ultimately define
the TF target genes, the motif family organization entails information about
the structure of transcriptional regulation as it has been shaped by evolution.
Focusing on the human TF repertoire, we show that a one-parameter evolutionary
model of the Birth-Death-Innovation type can explain the empirical
partitioning of TFs into motif families, and allows us to highlight the relevant
evolutionary forces at the origin of this organization. Moreover, the model
allows us to pinpoint a few deviations from the neutral scenario it assumes: three
over-expanded families (including HOX and FOX genes), a set of `singleton' TFs
for which duplication seems to be selected against, and a higher-than-average
rate of diversification of the binding preferences of TFs with a Zinc Finger
DNA binding domain. Finally, a comparison of the TF motif family organization
in different eukaryotic species suggests an increase of redundancy of binding
with organism complexity.
Comment: 14 pages, 5 figures. Minor changes. Final version, accepted for publication.
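A minimal simulation can convey the Birth-Death-Innovation mechanism invoked in this abstract: families grow by duplication, shrink by deletion, and new singleton families appear by innovation. The sketch below assumes neutral (equal) per-gene birth and death rates and a single innovation parameter `theta`; the names and values are hypothetical, not the paper's fitted model.

```python
import random

def bdi_family_sizes(n_events=20000, theta=5.0, seed=1):
    """Simulate a neutral Birth-Death-Innovation process and return the
    final family-size distribution. Per-gene birth and death rates are
    equal; theta sets the innovation rate (illustrative values)."""
    rng = random.Random(seed)
    families = [1]                         # start with one singleton family
    for _ in range(n_events):
        total = sum(families)
        r = rng.random() * (2 * total + theta)
        if r < total:                      # birth: duplicate a random gene
            i = rng.choices(range(len(families)), weights=families)[0]
            families[i] += 1
        elif r < 2 * total:                # death: delete a random gene
            i = rng.choices(range(len(families)), weights=families)[0]
            families[i] -= 1
            if families[i] == 0:
                families.pop(i)
            if not families:               # re-seed if everything died out
                families = [1]
        else:                              # innovation: new singleton family
            families.append(1)
    return families

sizes = bdi_family_sizes()
# Neutral BDI dynamics yield many small families and few large ones.
frac_singletons = sum(1 for s in sizes if s == 1) / len(sizes)
```

Comparing the simulated size distribution with an empirical repartition of TFs into motif families is the kind of test the abstract describes; over-expanded families would show up as outliers against this neutral baseline.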
Stochastic timing in gene expression for simple regulatory strategies
Timing is essential for many cellular processes, from cellular responses to
external stimuli to the cell cycle and circadian clocks. Many of these
processes are based on gene expression. For example, an activated gene may be
required to reach, within a precise time, a threshold level of expression that
triggers a specific downstream process. However, gene expression is subject to
stochastic fluctuations, naturally inducing an uncertainty in this
threshold-crossing time with potential consequences on biological functions and
phenotypes. Here, we consider such "timing fluctuations", and we ask how they
can be controlled. Our analytical estimates and simulations show that, for an
induced gene, timing variability is minimal if the threshold level of
expression is approximately half of the steady-state level. Timing fluctuations
can be reduced by increasing the transcription rate, while they are insensitive
to the translation rate. In the presence of self-regulatory strategies, we show
that self-repression reduces timing noise for threshold levels that have to be
reached quickly, while self-activation is optimal at long times. These results
lay a framework for understanding stochasticity of endogenous systems such as
the cell cycle, as well as for the design of synthetic trigger circuits.
Comment: 10 pages, 5 figures
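The threshold-crossing statistics discussed above can be probed with a simple Gillespie simulation of an induced gene: constant production, first-order degradation, and the first time the copy number reaches a threshold recorded over many runs. The rates and thresholds below are illustrative assumptions, not the paper's parameters.

```python
import random
import statistics

def first_passage(threshold, rng, k=50.0, gamma=1.0):
    """Gillespie simulation of protein copy number n (production at
    rate k, degradation at rate gamma*n); returns the first time n
    reaches the threshold. Steady state is k/gamma = 50 molecules."""
    n, t = 0, 0.0
    while n < threshold:
        rate = k + gamma * n
        t += rng.expovariate(rate)         # waiting time to next event
        if rng.random() < k / rate:
            n += 1                         # production event
        else:
            n -= 1                         # degradation event
    return t

def timing_cv(threshold, runs=2000, seed=0):
    """Coefficient of variation of the threshold-crossing time."""
    rng = random.Random(seed)
    times = [first_passage(threshold, rng) for _ in range(runs)]
    return statistics.stdev(times) / statistics.mean(times)

# Relative timing noise at a low threshold, at half the steady state,
# and close to the steady state.
cv_low = timing_cv(5)
cv_half = timing_cv(25)
cv_near_ss = timing_cv(45)
```

In this toy setting the relative timing noise is smallest near half the steady-state level, consistent with the abstract's claim: low thresholds suffer from small-number noise, while thresholds near saturation are crossed when the mean trajectory has almost flattened.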
Gene autoregulation via intronic microRNAs and its functions
Background: MicroRNAs, post-transcriptional repressors of gene expression,
play a pivotal role in gene regulatory networks. They are involved in core
cellular processes, and their dysregulation is associated with a broad range of
human diseases. This paper focuses on a minimal microRNA-mediated regulatory
circuit, in which a protein-coding gene (host gene) is targeted by a microRNA
located inside one of its introns. Results: Autoregulation via intronic
microRNAs is widespread in the human regulatory network, as confirmed by our
bioinformatic analysis, and can perform several regulatory tasks despite its
simple topology. Our analysis, based on analytical calculations and
simulations, indicates that this circuitry alters the dynamics of the host gene
expression, can induce complex responses implementing adaptation and Weber's
law, and efficiently filters fluctuations propagating from the upstream network
to the host gene. A fine-tuning of the circuit parameters can optimize each of
these functions. Interestingly, they are all related to gene expression
homeostasis, in agreement with the increasing evidence suggesting a role of
microRNA regulation in conferring robustness to biological processes. In
addition to model analysis, we present a list of bioinformatically predicted
candidate circuits in human for future experimental tests. Conclusions: The
results presented here suggest a potentially relevant functional role for
negative self-regulation via intronic microRNAs, in particular as a homeostatic
control mechanism of gene expression. Moreover, the map of circuit functions in
terms of experimentally measurable parameters, resulting from our analysis, can
be a useful guideline for possible applications in synthetic biology.
Comment: 29 pages and 7 figures in the main text, 18 pages of Supporting Information
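The buffering role of the host-gene/intronic-miRNA loop can already be seen at the level of deterministic steady states: because the miRNA is co-transcribed with its own target, a rise in transcription also raises the repressor. The sketch below is a minimal rate-equation caricature with made-up rate constants, not the model analyzed in the paper.

```python
def steady_states(k, g_m=1.0, g_mi=0.5, c=0.1):
    """Deterministic steady state of a minimal intronic-miRNA circuit:
    host mRNA m and miRNA mi are co-transcribed at rate k, and the
    miRNA degrades the host mRNA at rate c*mi*m (illustrative rates)."""
    mi = k / g_mi                    # miRNA steady state
    m = k / (g_m + c * mi)           # host mRNA under miRNA repression
    m_unreg = k / g_m                # same gene without the miRNA
    return m, m_unreg

# A two-fold jump in transcription changes the regulated mRNA by much
# less than two-fold: the loop buffers upstream fluctuations.
m1, u1 = steady_states(k=10.0)
m2, u2 = steady_states(k=20.0)
fold_reg = m2 / m1
fold_unreg = u2 / u1
```

With these numbers the unregulated gene responds with a factor 2 while the regulated one responds with only 1.2, a deterministic shadow of the fluctuation filtering described in the abstract.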
Statistics of shared components in complex component systems
Many complex systems are modular. Such systems can be represented as
"component systems", i.e., sets of elementary components, such as LEGO bricks
in LEGO sets. The bricks found in a LEGO set reflect a target architecture,
which can be built following a set-specific list of instructions. In other
component systems, instead, the underlying functional design and constraints
are not obvious a priori, and their detection is often a challenge of both
scientific and practical importance, requiring a clear understanding of
component statistics. Importantly, some quantitative invariants appear to be
common to many component systems, most notably a common broad distribution of
component abundances, which often resembles the well-known Zipf's law. Such
"laws" affect in a general and non-trivial way the component statistics,
potentially hindering the identification of system-specific functional
constraints or generative processes. Here, we specifically focus on the
statistics of shared components, i.e., the distribution of the number of
components shared by different system-realizations, such as the common bricks
found in different LEGO sets. To account for the effects of component
heterogeneity, we consider a simple null model, which builds
system-realizations by random draws from a universe of possible components.
Under general assumptions on abundance heterogeneity, we provide analytical
estimates of component occurrence, which quantify exhaustively the statistics
of shared components. Surprisingly, this simple null model can positively
explain important features of empirical component-occurrence distributions
obtained from data on bacterial genomes, LEGO sets, and book chapters. Specific
architectural features and functional constraints can be detected from
occurrence patterns as deviations from these null predictions, as we show for
the illustrative case of the "core" genome in bacteria.
Comment: 18 pages, 7 main figures, 7 supplementary figures
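The null model described in this abstract amounts to building each system-realization by random draws from a universe with heterogeneous component abundances. A minimal sketch, with Zipf-like abundances and arbitrary (illustrative) sizes:

```python
import random

def sample_realizations(n_real=200, size=100, universe=1000, seed=2):
    """Null model: each realization (e.g. a genome or a LEGO set) draws
    'size' components from a universe whose abundances follow a
    Zipf-like law p_i proportional to 1/i. Sizes are illustrative."""
    rng = random.Random(seed)
    weights = [1.0 / (i + 1) for i in range(universe)]
    return [set(rng.choices(range(universe), weights=weights, k=size))
            for _ in range(n_real)]

def occurrence_fractions(realizations, universe=1000):
    """Fraction of realizations in which each component occurs."""
    counts = [0] * universe
    for r in realizations:
        for c in r:
            counts[c] += 1
    return [cnt / len(realizations) for cnt in counts]

occ = occurrence_fractions(sample_realizations())
# High-abundance components form a "core" shared by nearly all
# realizations, while most components occur only rarely.
core = sum(1 for o in occ if o > 0.9)
rare = sum(1 for o in occ if 0 < o < 0.1)
```

Even this purely random model produces a small shared core and a long tail of rarely shared components; as the abstract argues, genuine functional constraints must be read off as deviations from such a baseline.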
Zipf and Heaps laws from dependency structures in component systems
Complex natural and technological systems can be considered, on a
coarse-grained level, as assemblies of elementary components: for example,
genomes as sets of genes, or texts as sets of words. On one hand, the joint
occurrence of components emerges from architectural and specific constraints in
such systems. On the other hand, general regularities may unify different
systems, such as the broadly studied Zipf and Heaps laws, respectively
concerning the distribution of component frequencies and their number as a
function of system size. Dependency structures (i.e., directed networks
encoding the dependency relations between the components in a system) were
proposed recently as a possible organizing principle underlying some of the
observed regularities. However, the consequences of this assumption were
explored only in binary component systems, where solely the presence or absence
of components is considered, and multiple copies of the same component are not
allowed. Here, we consider a simple model that generates, from a given ensemble
of dependency structures, a statistical ensemble of sets of components,
allowing for components to appear with any multiplicity. Our model is a minimal
extension that is memoryless, and therefore accessible to analytical
calculations. A mean-field analytical approach (analogous to the "Zipfian
ensemble" in the linguistics literature) captures the relevant laws describing
the component statistics as we show by comparison with numerical computations.
In particular, we recover a power-law Zipf rank plot, with a set of core
components, and a Heaps law displaying three consecutive regimes (linear,
sub-linear and saturating) that we characterize quantitatively.
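A toy version of such a generative model can be written down directly: fix a dependency structure, then repeatedly draw a component and include it together with everything it depends on, keeping multiplicities. The tree-shaped structure, the sizes, and the uniform draws below are simplifying assumptions for illustration only.

```python
import random
from collections import Counter

def random_dependency_tree(n=200, seed=3):
    """Each component i > 0 depends on one random earlier component,
    giving a tree-shaped dependency structure (an illustrative choice)."""
    rng = random.Random(seed)
    parent = {0: None}
    for i in range(1, n):
        parent[i] = rng.randrange(i)
    return parent

def generate_bag(parent, draws=5000, seed=4):
    """Memoryless generation: draw a component uniformly at random and
    add it together with all components it depends on, allowing
    components to appear with any multiplicity."""
    rng = random.Random(seed)
    bag = Counter()
    n = len(parent)
    for _ in range(draws):
        c = rng.randrange(n)
        while c is not None:
            bag[c] += 1
            c = parent[c]
    return bag

bag = generate_bag(random_dependency_tree())
freqs = sorted(bag.values(), reverse=True)
# The root is required by every draw, so it tops a broad, Zipf-like
# rank-frequency plot, with "core" components just below it.
```

Since every dependency chain ends at the root, the root's count equals the number of draws exactly, while leaf components appear rarely; this is the mechanism behind the core-plus-power-law rank plot described in the abstract.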
Heaps' law, statistics of shared components and temporal patterns from a sample-space-reducing process
Zipf's law is a hallmark of several complex systems with a modular structure,
such as books composed by words or genomes composed by genes. In these
component systems, Zipf's law describes the empirical power law distribution of
component frequencies. Stochastic processes based on a sample-space-reducing
(SSR) mechanism, in which the number of accessible states reduces as the system
evolves, have been recently proposed as a simple explanation for the ubiquitous
emergence of this law. However, many complex component systems are
characterized by other statistical patterns beyond Zipf's law, such as a
sublinear growth of the component vocabulary with the system size, known as
Heaps' law, and specific statistics of shared components. This work shows,
with analytical calculations and simulations, that these statistical properties
can emerge jointly from a SSR mechanism, thus making it an appropriate
parameter-poor representation for component systems. Several alternative (and
equally simple) models, for example based on the preferential attachment
mechanism, can also reproduce Heaps' and Zipf's laws, suggesting that
additional statistical properties should be taken into account to select the
most-likely generative process for a specific system. Along this line, we will
show that the temporal component distribution predicted by the SSR model is
markedly different from the one emerging from the popular rich-gets-richer
mechanism. A comparison with empirical data from natural language indicates
that the SSR process can be chosen as a better candidate model for text
generation based on this statistical property. Finally, a limitation of the SSR
model in reproducing the empirical "burstiness" of word appearances in texts
will be pointed out, thus indicating a possible direction for extensions of the
basic SSR process.
Comment: 14 pages, 4 figures
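The SSR mechanism itself is simple to simulate: from state i the process jumps uniformly to a strictly lower state, and restarts from the top once state 1 is reached. The sketch below (illustrative sizes) reproduces the Zipf-like visit frequencies p(i) proportional to 1/i that make SSR a candidate explanation for Zipf's law.

```python
import random
from collections import Counter

def ssr_visits(n_states=100, restarts=20000, seed=5):
    """Sample-space-reducing process: from state i, jump uniformly to
    a state in {1, ..., i-1}; once state 1 is reached, restart from
    n_states. Visit frequencies approach Zipf's law p(i) ~ 1/i."""
    rng = random.Random(seed)
    visits = Counter()
    for _ in range(restarts):
        i = n_states
        while i > 1:
            i = rng.randrange(1, i)   # sample space shrinks each jump
            visits[i] += 1
    return visits

v = ssr_visits()
# State 1 is visited exactly once per cascade; state 2 in about half
# of the cascades; state i in roughly a fraction 1/i of them.
```

The same simulated process can then be interrogated for the further statistics the abstract discusses, such as vocabulary growth (Heaps' law) or the temporal order in which states first appear, which is where SSR and rich-gets-richer models diverge.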