Identification of heat exchange process in the evaporators of absorption refrigerating units under conditions of uncertainty
The operation of the evaporators of absorption refrigerating units in the secondary condensation block of an ammonia synthesis unit typical for Ukraine is analysed. The need to minimise the secondary condensation temperature by creating an automated adaptive system of optimal programmed control is substantiated. Equations are established for numerical estimation of the uncertainty of the evaporator heat load and of the heat-transfer coefficient. Algorithmic support is developed for solving the identification problems and building a mathematical model. The technical structure of the automated system for their implementation is defined.
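The abstract's actual uncertainty equations are not reproduced here, but the standard relation behind an evaporator heat load, Q = U * A * dT_lm, together with quadrature combination of independent relative uncertainties, can be sketched as follows. All numerical values and function names are illustrative assumptions, not taken from the paper.

```python
import math

def log_mean_temp_diff(dt_in, dt_out):
    """Log-mean temperature difference between inlet and outlet (K)."""
    if math.isclose(dt_in, dt_out):
        return dt_in
    return (dt_in - dt_out) / math.log(dt_in / dt_out)

def heat_load(u, area, dt_lm):
    """Evaporator heat load Q = U * A * dT_lm, in watts."""
    return u * area * dt_lm

def relative_uncertainty(*rel_terms):
    """Combine independent relative uncertainties in quadrature."""
    return math.sqrt(sum(r * r for r in rel_terms))

# Illustrative numbers only (not from the paper):
u = 850.0       # heat-transfer coefficient, W/(m^2*K), assumed +/- 5%
area = 12.0     # heat-exchange area, m^2, assumed +/- 1%
dt_lm = log_mean_temp_diff(8.0, 4.0)   # K, assumed +/- 3%

q = heat_load(u, area, dt_lm)
rel_q = relative_uncertainty(0.05, 0.01, 0.03)
print(f"Q = {q / 1000:.1f} kW +/- {100 * rel_q:.1f}%")
```

The quadrature rule assumes the three relative uncertainties are independent; correlated errors would need the full covariance treatment.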
Applying Formal Methods to Networking: Theory, Techniques and Applications
Despite its great importance, modern network infrastructure is remarkable for
the lack of rigor in its engineering. The Internet which began as a research
experiment was never designed to handle the users and applications it hosts
today. The lack of formalization of the Internet architecture meant limited
abstractions and modularity, especially for the control and management planes,
requiring a new protocol to be built from scratch for every new need. This led
to an unwieldy ossified Internet architecture resistant to any attempts at
formal verification, and an Internet culture where expediency and pragmatism
are favored over formal correctness. Fortunately, recent work in the space of
clean slate Internet design---especially, the software defined networking (SDN)
paradigm---offers the Internet community another chance to develop the right
kind of architecture and abstractions. This has also led to a great resurgence
in interest of applying formal methods to specification, verification, and
synthesis of networking protocols and applications. In this paper, we present a
self-contained tutorial of the formidable amount of work that has been done in
formal methods, and present a survey of its applications to networking.
Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials
Systems biology in animal sciences
Systems biology is a rapidly expanding field of research and is applied in a number of biological disciplines. In animal sciences, omics approaches are increasingly used, yielding vast amounts of data, but systems biology approaches that extract from these data an understanding of biological processes and animal traits are not yet frequently used. This paper aims to explain what systems biology is and which areas of animal sciences could benefit from systems biology approaches. Systems biology aims to understand whole biological systems working as a unit, rather than investigating their individual components. Therefore, systems biology can be considered a holistic approach, as opposed to reductionism. The recently developed ‘omics’ technologies enable biological sciences to characterize the molecular components of life with ever increasing speed, yielding vast amounts of data. However, biological functions do not follow from the simple addition of the properties of system components, but rather arise from the dynamic interactions of these components. Systems biology combines statistics, bioinformatics and mathematical modeling to integrate and analyze large amounts of data in order to extract a better understanding of the biology from these huge data sets and to predict the behavior of biological systems. A ‘system’ approach and mathematical modeling in biological sciences are not new in themselves, as they were used in biochemistry, physiology and genetics long before the name systems biology was coined. However, the present combination of mass biological data and computational and modeling tools is unprecedented and truly represents a major paradigm shift in biology. Significant advances have been made using systems biology approaches, especially in the study of bacterial and eukaryotic cells and in human medicine. Similarly, progress is being made with ‘system approaches’ in animal sciences, providing exciting opportunities to predict and modulate animal traits.
Method for finding metabolic properties based on the general growth law. Liver examples. A general framework for biological modeling
We propose a method for finding metabolic parameters of cells, organs and
whole organisms, which is based on the earlier discovered general growth law.
Based on the obtained results and analysis of available biological models, we
propose a general framework for modeling biological phenomena and discuss how
it can be used in Virtual Liver Network project. The foundational idea of the
study is that growth of cells, organs, systems and whole organisms, besides
biomolecular machinery, is influenced by biophysical mechanisms acting at
different scale levels. In particular, the general growth law uniquely defines
distribution of nutritional resources between maintenance needs and biomass
synthesis at each phase of growth and at each scale level. We exemplify the
approach considering metabolic properties of growing human and dog livers and
liver transplants. A procedure for verification of obtained results has been
introduced too. We found that two examined dogs have high metabolic rates
consuming about 0.62 and 1 gram of nutrients per cubic centimeter of liver per
day, and verified this using the proposed verification procedure. We also
evaluated consumption rate of nutrients in human livers, determining it to be
about 0.088 gram of nutrients per cubic centimeter of liver per day for males,
and about 0.098 for females. This noticeable difference can be explained by
evolutionary development, which required females to have greater liver
processing capacity to support pregnancy. We also found how much nutrients go
to biomass synthesis and maintenance at each phase of liver and liver
transplant growth. Obtained results demonstrate that the proposed approach can
be used for finding metabolic characteristics of cells, organs, and whole
organisms, which can further serve as important inputs for many applications in
biology (protein expression), biotechnology (synthesis of substances), and
medicine.
Comment: 20 pages, 6 figures, 4 tables
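The per-volume consumption rates reported in the abstract imply whole-organ daily figures that are easy to sanity-check. In the sketch below the rates are taken from the abstract, while the organ volumes are illustrative assumptions, not data from the paper.

```python
# Daily nutrient consumption implied by the reported per-volume rates.
rates = {            # g of nutrients per cm^3 of liver per day (from the abstract)
    "human male": 0.088,
    "human female": 0.098,
    "dog A": 0.62,
    "dog B": 1.0,
}
volumes_cm3 = {      # assumed organ volumes; NOT from the abstract
    "human male": 1500.0,
    "human female": 1400.0,
    "dog A": 300.0,
    "dog B": 300.0,
}

for subject, rate in rates.items():
    total = rate * volumes_cm3[subject]
    print(f"{subject}: about {total:.0f} g of nutrients per day")
```

Under these assumed volumes, a male human liver would process on the order of a hundred grams of nutrients per day, while the dogs' much higher per-volume rates yield comparable totals from far smaller organs.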
Sciduction: Combining Induction, Deduction, and Structure for Verification and Synthesis
Even with impressive advances in automated formal methods, certain problems
in system verification and synthesis remain challenging. Examples include the
verification of quantitative properties of software involving constraints on
timing and energy consumption, and the automatic synthesis of systems from
specifications. The major challenges include environment modeling,
incompleteness in specifications, and the complexity of underlying decision
problems.
This position paper proposes sciduction, an approach to tackle these
challenges by integrating inductive inference, deductive reasoning, and
structure hypotheses. Deductive reasoning, which leads from general rules or
concepts to conclusions about specific problem instances, includes techniques
such as logical inference and constraint solving. Inductive inference, which
generalizes from specific instances to yield a concept, includes algorithmic
learning from examples. Structure hypotheses are used to define the class of
artifacts, such as invariants or program fragments, generated during
verification or synthesis. Sciduction constrains inductive and deductive
reasoning using structure hypotheses, and actively combines inductive and
deductive reasoning: for instance, deductive techniques generate examples for
learning, and inductive reasoning is used to guide the deductive engines.
We illustrate this approach with three applications: (i) timing analysis of
software; (ii) synthesis of loop-free programs, and (iii) controller synthesis
for hybrid systems. Some future applications are also discussed.
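The interplay the abstract describes, in which a structure hypothesis fixes the class of artifacts, induction generalizes from examples, and deduction checks candidates and supplies counterexamples back to the learner, can be illustrated with a toy counterexample-guided loop. This is a deliberately simple sketch, not the paper's actual algorithms: the structure hypothesis restricts candidates to functions of the form x -> x + c, the "deductive engine" is an exhaustive check over a finite domain, and the specification is invented for the example.

```python
# Toy counterexample-guided loop in the spirit of sciduction.
DOMAIN = range(-10, 10)

def spec(x):
    """The behaviour we want to match (illustrative, not from the paper)."""
    return x + 7

def induce(examples):
    """Inductive step: fit c in the hypothesis class x -> x + c
    from the most recent input/output example."""
    x, y = examples[-1]
    return y - x

def verify(c):
    """Deductive step: exhaustive check over the finite domain.
    Returns None if the candidate is correct, else a counterexample."""
    for x in DOMAIN:
        if x + c != spec(x):
            return x
    return None

examples = [(0, spec(0))]        # seed example
while True:
    c = induce(examples)
    cex = verify(c)
    if cex is None:
        break
    examples.append((cex, spec(cex)))

print(f"synthesised: x -> x + {c}")
```

Real instantiations replace the exhaustive check with an SMT solver or static analyser and the one-example fit with genuine learning, but the loop shape, induce, verify, feed the counterexample back, is the same.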
Quantitative Verification: Formal Guarantees for Timeliness, Reliability and Performance
Computerised systems appear in almost all aspects of our daily lives, often in safety-critical scenarios such as embedded control systems in cars and aircraft or medical devices such as pacemakers and sensors. We are thus increasingly reliant on these systems working correctly, despite often operating in unpredictable or unreliable environments. Designers of such devices need ways to guarantee that they will operate in a reliable and efficient manner.
Quantitative verification is a technique for analysing quantitative aspects of a system's design, such as timeliness, reliability or performance. It applies formal methods, based on a rigorous analysis of a mathematical model of the system, to automatically prove certain precisely specified properties, e.g. "the airbag will always deploy within 20 milliseconds after a crash" or "the probability of both sensors failing simultaneously is less than 0.001".
The ability to formally guarantee quantitative properties of this kind is beneficial across a wide range of application domains. For example, in safety-critical systems, it may be essential to establish credible bounds on the probability with which certain failures or combinations of failures can occur. In embedded control systems, it is often important to comply with strict constraints on timing or resources. More generally, being able to derive guarantees on precisely specified levels of performance or efficiency is a valuable tool in the design of, for example, wireless networking protocols, robotic systems or power management algorithms, to name but a few.
This report gives a short introduction to quantitative verification, focusing in particular on a widely used technique called model checking, and its generalisation to the analysis of quantitative aspects of a system such as timing, probabilistic behaviour or resource usage.
The intended audience is industrial designers and developers of systems such as those highlighted above who could benefit from the application of quantitative verification, but lack expertise in formal verification or modelling.
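As a minimal illustration of the kind of property quantitative verification checks, the sketch below computes the probability of reaching a failure state within k steps of a small discrete-time Markov chain and compares it against a bound, in the spirit of a probabilistic model checker's reachability query. The three-state chain and all transition probabilities are illustrative assumptions, not a model of any real device.

```python
# Minimal sketch of a probabilistic reachability query on a
# discrete-time Markov chain with states OK -> DEGRADED -> FAILED.
STATES = ["OK", "DEGRADED", "FAILED"]
P = {  # transition probabilities (illustrative; each row sums to 1)
    "OK":       {"OK": 0.999, "DEGRADED": 0.001, "FAILED": 0.0},
    "DEGRADED": {"OK": 0.0,   "DEGRADED": 0.95,  "FAILED": 0.05},
    "FAILED":   {"OK": 0.0,   "DEGRADED": 0.0,   "FAILED": 1.0},  # absorbing
}

def prob_failed_within(k, start="OK"):
    """Probability of being in FAILED after at most k steps
    (FAILED is absorbing, so this equals the reachability probability)."""
    dist = {s: (1.0 if s == start else 0.0) for s in STATES}
    for _ in range(k):
        nxt = {s: 0.0 for s in STATES}
        for s, mass in dist.items():
            for t, p in P[s].items():
                nxt[t] += mass * p
        dist = nxt
    return dist["FAILED"]

bound = 0.001
p = prob_failed_within(20)
print(f"P(fail within 20 steps) = {p:.6f}; bound {bound} holds: {p < bound}")
```

Tools such as PRISM perform this kind of computation symbolically and for much richer models (continuous time, costs, rewards); the point here is only the shape of the question: a precise probability, checked against a precise threshold.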