
    Are the tunnel ventilation systems adapted for the different risk situations?

    The ventilation design criteria for both road and rail tunnels are based on the design fire defined by standards and on general knowledge of smoke propagation. The problem with such an approach is that it considers only the impact of smoke propagation and dispersion inside the tunnel on the safety ventilation, excluding other possible accidents. However, other situations, such as a toxic gas release, are possible, and even if the aim is not to design the ventilation for dangerous phenomena with a lower occurrence frequency, it must be ensured that the ventilation system does not increase the consequences of the accident. This paper focuses mainly on the problem of toxic gas dispersion. Because of the large variety of dangerous materials that can transit through a tunnel, the probability of an accident involving a toxic transport cannot be neglected. In the worst-case scenario, such as a massive release of highly toxic gas, ventilation is useless, because the quantity of toxic gas would cause a large number of deaths inside the tunnel. However, when the release is smaller and ventilation can be used, a toxic gas, being generally a heavy or cold gas, will of course behave differently from smoke, and the ventilation system may not be adapted to such a situation. This case has scarcely been studied. In this study, both an experimental approach and numerical tools were used to improve the global understanding of dense gas dispersion in underground infrastructure such as road tunnels. The experimental work was carried out with argon in the INERIS fire gallery, a 50 m long 1/3-scale tunnel, under different leak conditions, in order to characterise the natural behaviour of a dense gas. This work also enabled a comparison between the experiments and CFD calculations with the FDS code for the particular application of dense gas dispersion.
The work was extended to other configurations and geometries in order to simulate real-scale situations with different kinds of gases: a highly toxic dense gas such as chlorine, a light gas stored as a liquid at very low temperature such as ammonia, and a gas which remains liquid at ambient temperature and pressure and drains into an evaporating pool, such as acrolein. This work considers the natural behaviour of these gases and the influence of longitudinal ventilation both inside and outside of the tunnel.
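The dense-gas behaviour described above is commonly characterised by a densimetric Richardson (or Froude) number comparing buoyancy to the ventilation flow. The sketch below is an illustrative calculation for an argon release in air; the layer height and ventilation velocity are assumptions made here, not values from the INERIS tests.

```python
# Toy Richardson-number estimate for a dense argon layer in a ventilated tunnel.
# H and U are assumed for illustration, not taken from the study.
g = 9.81            # gravitational acceleration, m/s^2
rho_air = 1.204     # air density at 20 degC, kg/m^3
rho_argon = 1.661   # argon density at 20 degC, kg/m^3
H = 1.0             # characteristic height of the dense gas layer, m (assumed)
U = 1.5             # longitudinal ventilation velocity, m/s (assumed)

g_prime = g * (rho_argon - rho_air) / rho_air   # reduced gravity of the layer
Ri = g_prime * H / U**2                          # bulk Richardson number
print(round(Ri, 3))
# Ri >> 1 suggests a stratified dense layer; Ri << 1 suggests well-mixed flow.
```

A value near or above one indicates that buoyancy competes with the ventilation, which is why a dense toxic gas can behave very differently from hot smoke.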

    Logic Programming Applications: What Are the Abstractions and Implementations?

    This article presents an overview of applications of logic programming, classifying them based on the abstractions and implementations of logic languages that support the applications. The three key abstractions are join, recursion, and constraint. Their essential implementations are for-loops, fixed points, and backtracking, respectively. The corresponding kinds of applications are database queries, inductive analysis, and combinatorial search, respectively. We also discuss language extensions and programming paradigms, summarize example application problems by application area, and touch on example systems that support variants of the abstractions with different implementations.
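As a rough illustration of two of these abstractions, the sketch below evaluates a Datalog-style transitive-closure query: the rule body is a join written as nested for-loops, and the recursion is computed as a fixed-point iteration. The relation and rule are toy examples chosen here, not taken from the article.

```python
# edge(X, Y) facts for a small directed chain
edges = {(1, 2), (2, 3), (3, 4)}

def transitive_closure(edges):
    paths = set(edges)              # path(X, Y) :- edge(X, Y).
    while True:
        # join: path(X, Z) :- path(X, Y), edge(Y, Z).
        new = {(x, z) for (x, y) in paths for (y2, z) in edges if y == y2}
        if new <= paths:            # fixed point reached: nothing new derived
            return paths
        paths |= new

print(sorted(transitive_closure(edges)))
```

The loop terminates because the derived relation grows monotonically inside a finite universe, which is exactly why fixed-point iteration implements recursion for this class of queries.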

    Dataflow Programming and Acceleration of Computationally-Intensive Algorithms

    The volume of unstructured textual information continues to grow due to recent technological advancements. This has resulted in an exponential growth of information generated in various formats, including blogs, posts, social networking, and enterprise documents. Numerous Enterprise Architecture (EA) documents are also created daily, such as reports, contracts, agreements, frameworks, architecture requirements, designs, and operational guides. Processing and computing this massive amount of unstructured information requires substantial computing capabilities and new techniques. It is critical to manage this unstructured information through a centralized knowledge management platform. Knowledge management is the process of managing information within an organization; it involves creating, collecting, organizing, and storing information in a way that makes it easily accessible and usable. The research involved the development of a textual knowledge management system, and two use cases were considered for extracting textual knowledge from documents. The first case study focused on the safety-critical documents of a railway enterprise. Safety is of paramount importance in the railway industry, and several EA documents, including manuals, operational procedures, and technical guidelines, contain critical information. Digitalization of these documents is essential for analysing the vast amount of textual knowledge they contain, in order to improve the safety and security of railway operations. A case study was conducted between the University of Huddersfield and the Rail Safety and Standards Board (RSSB) to analyse EA safety documents using natural language processing (NLP). A graphical user interface was developed that includes various document-processing features such as semantic search, document mapping, text summarization, and visualization of key trends.
For the second case study, open-source data was utilized and textual knowledge was extracted. Several features were also developed, including kernel distribution, analysis of key trends, and sentiment analysis of words (such as unique, positive, and negative words) within the documents. Additionally, a heterogeneous framework was designed using CPUs/GPUs and FPGAs to analyse the computational performance of document mapping.
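A minimal sketch of the semantic-search feature mentioned above, using plain TF-IDF cosine scoring over a toy corpus. The documents, vocabulary, and scoring details are illustrative assumptions made here, not the RSSB system itself.

```python
import math
from collections import Counter

# Toy document corpus (assumed for illustration)
corpus = {
    "manual": "railway safety procedures for signal maintenance",
    "contract": "supplier agreement for rolling stock delivery",
    "guide": "operational guide covering safety checks on track",
}

def tfidf_vectors(corpus):
    n = len(corpus)
    # document frequency of each term
    df = Counter(w for text in corpus.values() for w in set(text.split()))
    vecs = {}
    for name, text in corpus.items():
        tf = Counter(text.split())
        vecs[name] = {w: tf[w] * math.log(n / df[w]) for w in tf}
    return vecs

def search(query, vecs):
    q = Counter(query.split())
    def score(v):
        dot = sum(q[w] * v.get(w, 0.0) for w in q)
        norm = math.sqrt(sum(x * x for x in v.values())) or 1.0
        return dot / norm
    return max(vecs, key=lambda name: score(vecs[name]))

vecs = tfidf_vectors(corpus)
print(search("safety signal procedures", vecs))
```

A production system would use learned embeddings rather than raw TF-IDF, but the retrieval loop, score a query against every document vector and return the best match, has the same shape.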

    Detecting and explaining unfairness in consumer contracts through memory networks

    Recent work has demonstrated how data-driven AI methods can support consumer protection through the automated analysis of legal documents. However, a shortcoming of data-driven approaches is poor explainability. We posit that in this domain useful explanations of classifier outcomes can be provided by resorting to legal rationales. We thus consider several configurations of memory-augmented neural networks where rationales are given a special role in the modeling of context knowledge. Our results show that rationales not only improve classification accuracy, but also offer meaningful, natural-language explanations of otherwise opaque classifier outcomes.
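A hedged sketch of the attention step at the heart of a memory network: a clause encoding attends over a memory of rationale vectors, and the attention weights themselves indicate which rationale "explains" the outcome. The vectors and rationale labels below are toy assumptions, not the paper's model or data.

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attend(query, memory):
    # dot-product attention of a clause embedding over rationale embeddings
    scores = [sum(q * m for q, m in zip(query, slot)) for slot in memory]
    return softmax(scores)

# Toy rationale memory: each slot is a (hypothetical) rationale embedding
rationales = ["unilateral change", "jurisdiction", "content removal"]
memory = [[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]]
clause = [0.9, 0.1]   # toy encoding of a clause about unilateral changes

weights = attend(clause, memory)
best = rationales[max(range(len(weights)), key=weights.__getitem__)]
print(best)
```

The explanatory value comes from reading the attention distribution back as a pointer into human-written rationales, rather than from inspecting opaque hidden states.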

    Microeconomic Structure determines Macroeconomic Dynamics. Aoki defeats the Representative Agent

    Masanao Aoki developed a new methodology for a basic problem of economics: rigorously deducing macroeconomic dynamics as emerging from the interactions of many individual agents. This includes the deduction of the fractal / intermittent fluctuations of macroeconomic quantities from the granularity of the meso-economic collective objects (large individual wealth, highly productive geographical locations, emergent technologies, emergent economic sectors) into which the micro-economic agents self-organize. In particular, we present some theoretical predictions, which have also met extensive validation from empirical data in a wide range of systems:
    - The fractal Levy exponent of stock market index fluctuations equals the Pareto exponent of the investors' wealth distribution. The origin of the macroeconomic dynamics is therefore found in the granularity induced by the wealth / capital of the wealthiest investors.
    - Economic cycles follow a Schumpeterian 'creative destruction' pattern whereby the maxima are cusp-shaped while the minima are smooth. Between the cusps, the cycle consists of the sum of two 'crossing exponentials': one decaying and the other increasing.
    This unification of short-term market fluctuations and long-term economic cycles within the same theoretical framework offers the perspective of a genuine conceptual synthesis between micro- and macroeconomics. Joining another giant of contemporary science, Phil Anderson, Aoki emphasized the role of rare, large fluctuations in the emergence of macroeconomic phenomena out of microscopic interactions, and in particular their non-self-averaging character, in the language of statistical physics. In this light, we present a simple stochastic multi-sector growth model. Comment: 42 pages, 6 figures
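Schematically, the two predictions above can be written as follows; the symbols are chosen here for illustration and are not necessarily the paper's own notation:

```latex
% Levy exponent of index fluctuations = Pareto exponent of wealth distribution
\alpha_{\mathrm{Levy}} = \alpha_{\mathrm{Pareto}}
% between two cusps, the cycle is a sum of two "crossing" exponentials,
% one decaying and one growing
C(t) = A\, e^{-t/\tau_1} + B\, e^{+t/\tau_2}, \qquad A, B, \tau_1, \tau_2 > 0
```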

    A construction method of urban road risky vehicles based on dynamic knowledge graph

    The growth of the Internet of Things makes it possible to share information on risky vehicles openly and freely. How to create dynamic knowledge graphs of continually changing risky vehicles has emerged as a crucial technology for identifying risky vehicles, and as a research hotspot in both artificial intelligence and domain knowledge graphs. The node information of the risky-vehicle knowledge graph is not rich, so the graph structure plays the major role in its dynamic changes. This paper presents a fusion algorithm based on a relational graph convolutional network (R-GCN) and Long Short-Term Memory (LSTM) to build the dynamic knowledge graph of risky vehicles, and conducts a comparative experiment on the link-prediction task. The results showed that the fusion algorithm based on R-GCN and LSTM performed better than other methods such as GCN, DynGEM, ROLAND, and RE-GCN, with a MAP of 0.2746 and an MRR of 0.1075. To further verify the proposed algorithm, classification experiments were carried out on the risky-vehicle dataset, using accuracy, precision, recall, and F-score as evaluation metrics; the values were 0.667, 0.034, 0.422, and 0.52, respectively.
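MRR and MAP, the two link-prediction metrics reported above, are standard ranking measures. The sketch below shows how they are typically computed from ranked candidate lists; the rankings are toy data chosen here, not the paper's results.

```python
def reciprocal_rank(ranking, target):
    # 1 / position of the true link in the ranked candidate list
    return 1.0 / (ranking.index(target) + 1)

def mean_reciprocal_rank(rankings, targets):
    return sum(reciprocal_rank(r, t)
               for r, t in zip(rankings, targets)) / len(targets)

def average_precision(ranking, relevant):
    # precision averaged over the positions of the relevant items
    hits, score = 0, 0.0
    for i, item in enumerate(ranking, start=1):
        if item in relevant:
            hits += 1
            score += hits / i
    return score / len(relevant)

# Toy rankings: each query ranks candidate tail nodes for a missing link
rankings = [["v2", "v1", "v3"], ["v1", "v3", "v2"]]
targets = ["v1", "v1"]
print(mean_reciprocal_rank(rankings, targets))   # (1/2 + 1/1) / 2 = 0.75
```

MAP is then the mean of `average_precision` over all queries, which is why both metrics lie in [0, 1] with higher values meaning the true links are ranked nearer the top.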