3,547 research outputs found

    Towards safer mining: the role of modelling software to find missing persons after a mine collapse

    Get PDF
    Purpose. The purpose of the study is to apply science and technology to determine the most likely location of a container in which three miners were trapped after the Lily Mine disaster. Following the collapse of the crown pillar at Lily Mine (Barberton, Mpumalanga, South Africa) on 5 February 2016, there was a national outcry to find the three miners who were trapped in a surface container lamp room that disappeared into the sinkhole formed during the surface collapse. Methods. At a visit to Lily Mine on 9 March, the Witwatersrand Mining Institute (WMI) suggested a two-way strategy for finding the container in which the miners were trapped and buried: first, to test temporal 3D modelling software to locate the container, which is the subject of this paper, and second, to use scientific measurement and testing technologies. The overall methodology was first to request academia and research entities within the University to supply the WMI with ideas, which were compiled into a list as responses came in; these were then scrutinized and literature gathered for a conceptual study of which ideas were likely to work. The screening and preliminary testing of the software are discussed in this article. Findings. For modelling purposes the collapse was divided into three distinct phases: the crater collapse, the failure of the western slope, and the slide hazard on the southern slopes. Software technologies were identified that can simulate the movement of the container during the first two phases, and modelling visualized in ParaView indicated the likely position of the container. The southern slope was analysed in ArcGIS to produce slope hazard maps for the area, together with underground rescue maps showing evacuation routes. The findings are that software modelling is likely to locate the present position of the container, but accurate data and a combination of different advanced software packages will be required, at considerable cost. Originality. This paper presents original work on how software technology can be used to locate missing miners. Practical implications. The two approaches were not likely to recover the miners alive because of the considerable time interval, but they will alert the rescue team and mine workers when they come into close proximity to them. The results of the article were obtained without the support of any project or funding.

    On the classification and evaluation of prefetching schemes

    Get PDF
    Abstract available: p. [2]

    Working Sets Past and Present

    Get PDF

    A Scalable Cluster-based Infrastructure for Edge-computing Services

    Get PDF
    In this paper we present a scalable and dynamic intermediary infrastructure, SEcS (acronym of "Scalable Edge computing Services"), for developing and deploying advanced Edge computing services, by using a cluster of heterogeneous machines. Our goal is to address the challenges of the next-generation Internet services: scalability, high availability, fault-tolerance and robustness, as well as programmability and quick prototyping. The system is written in Java and is based on IBM's Web Based Intermediaries (WBI) [71], developed at IBM Almaden Research Center.

    Geomatics for Mobility Management. A comprehensive database model for Mobility Management

    Get PDF
    In urban and metropolitan contexts, Traffic Operations Centres (TOCs) use technologies such as Geographic Information Systems (GIS) and Intelligent Transport Systems (ITS) to tackle urban mobility issues. Usually in TOCs, various isolated systems are maintained in parallel (stored in different databases), and data come from different sources: a challenge in transport management is to transfer disparate data into a unified data management system that preserves access to legacy data and allows multi-thematic analysis. This integration between systems is important for wise policy decisions. This study aims to design a comprehensive and general spatial data model that allows the integration and visualization of traffic components and measures. The activity is focused on the case study of the 5T Agency in Turin, a TOC that manages traffic regulation, public transit fleets and information to users in the metropolitan area of Turin and the Piedmont Region. In particular, the agency has built up over the years a wide system of ITS technologies that continuously acquires measures and traffic information, which are used to deploy information services to citizens and public administrations. However, the spatial nature of these data is not fully exploited in daily operational activity, which makes information integration difficult. Indeed, the agency lacks a complete GIS that includes all the management information in an organized, spatial and "horizontal" vision. The main research question concerns the integration of different kinds of data in a single GIS spatial data model. Spatial data interoperability is critical and particularly challenging because geographic data definitions in legacy databases can vary widely: different data formats and standards, data inconsistencies, different spatial and temporal granularities, and different methods and rules that relate measures, events and physical infrastructure. The idea is not to replace the existing, efficient systems, but to build on top of them a GIS that spans the different software and DBMS platforms and demonstrates how a spatial, horizontal vision of urban mobility issues can be useful for policy and strategy decisions. The modelling activity takes its reference from a review of transport standards and results in a general database schema, which can be reused by other TOCs in their activities, helping integration and coordination between different TOCs. The final output of the research is an ArcGIS geodatabase, tailored to 5T data requirements, which enables customised representation of private traffic elements and measures. Specific custom scripts have been developed to allow the extraction and temporal aggregation of traffic measures and events. The solution proposed allows the reuse of data and measures for custom purposes, without the need to know the entire ITS environment in depth. In addition, the proposed ArcGIS geodatabase solution is optimised for limited-power computing environments. A case study was examined in depth to evaluate the suitability of the database: a comparison between damages detected by Emergency Mapping Services (EMS) and Traffic Message Channel traffic events was conducted, evaluating the utility of 5T historical traffic-event information from the Piedmont floods of November 2016 for EMS services.
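    The abstract above mentions custom scripts for the extraction and temporal aggregation of traffic measures. The following is a minimal, hypothetical sketch of that kind of temporal aggregation; the column names, file name and 15-minute interval are assumptions, and pandas stands in for the agency's actual scripts rather than reproducing them.

```python
# Illustrative sketch only: aggregates per-sensor traffic measures into
# fixed time intervals. Column names (sensor_id, timestamp, flow, speed)
# and the file name are hypothetical, not taken from the 5T geodatabase.
import pandas as pd

def aggregate_traffic_measures(csv_path: str, freq: str = "15min") -> pd.DataFrame:
    """Load raw traffic measures and aggregate them per sensor and time bin."""
    df = pd.read_csv(csv_path, parse_dates=["timestamp"])
    aggregated = (
        df.groupby(["sensor_id", pd.Grouper(key="timestamp", freq=freq)])
          .agg(mean_flow=("flow", "mean"),    # average flow per interval
               mean_speed=("speed", "mean"),  # average speed per interval
               samples=("flow", "count"))     # raw records per interval
          .reset_index()
    )
    return aggregated

if __name__ == "__main__":
    summary = aggregate_traffic_measures("traffic_measures.csv")
    print(summary.head())
```

    In a real deployment the same grouping could equally be pushed into the geodatabase itself; the script form is shown only to make the aggregation step concrete.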

    An accurate prefetching policy for object oriented systems

    Get PDF
    PhD Thesis. In the latest high-performance computers, there is a growing requirement for accurate prefetching (AP) methodologies for advanced object management schemes in virtual memory and migration systems. The major issue in achieving this goal is finding a simple way of accurately predicting the objects that will be referenced in the near future and grouping them so that they can be fetched at the same time. The basic notion of AP involves building relationships for logically grouping related objects and prefetching them, rather than relying on their physical grouping and on demand fetching, as is done in existing restructuring or grouping schemes. In this way, AP tries to overcome some of the shortcomings of physical grouping methods. Prefetching also makes use of the properties of object-oriented languages to build inter- and intra-object relationships as a means of logical grouping. This thesis describes how these relationships can be established at compile time and how they can be used for accurate object prefetching in virtual memory systems. In addition, AP performs control-flow and data-dependency analysis to reinforce the relationships and to find the dependencies of a program. The user program is decomposed into prefetching blocks which contain all the information needed for block prefetching, such as long branches and function calls at major branch points. The proposed prefetching scheme is implemented by extending a C++ compiler and evaluated on a virtual memory simulator. The results show a significant reduction both in the number of page faults and in memory pollution. In particular, AP can suppress many page faults that occur during transition phases, which are unmanageable by other fetching strategies. AP can be applied to local and distributed virtual memory systems so as to reduce the fault rate by fetching groups of objects at the same time and consequently lessening operating system overheads. British Council
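    As a toy illustration of the contrast the thesis draws between demand fetching and fetching logically grouped objects as a block, the sketch below simulates faults under both policies. The object trace, block map and memory capacity are invented for illustration and do not reproduce the thesis's compile-time analysis or its C++ compiler extension.

```python
# Toy comparison of demand fetching vs. fetching a whole "prefetching block"
# of logically related objects on each fault, with LRU eviction.
from collections import OrderedDict

def simulate(trace, blocks, capacity, prefetch=True):
    """Return the number of faults for a reference trace.

    trace    -- sequence of object ids in reference order
    blocks   -- dict mapping object id -> tuple of logically related objects
    capacity -- maximum number of objects resident in memory (LRU eviction)
    """
    memory = OrderedDict()           # object id -> True, kept in LRU order
    faults = 0
    for obj in trace:
        if obj in memory:
            memory.move_to_end(obj)  # refresh LRU position on a hit
            continue
        faults += 1
        group = blocks[obj] if prefetch else (obj,)
        for member in group:         # bring in the whole block (or just obj)
            memory[member] = True
            memory.move_to_end(member)
            while len(memory) > capacity:
                memory.popitem(last=False)   # evict least recently used
    return faults

# Invented example: two groups of related objects and a repetitive trace.
blocks = {}
for o in "abc":
    blocks[o] = ("a", "b", "c")
for o in "xy":
    blocks[o] = ("x", "y")
trace = list("abcabcxyxyabc")
print("demand fetching  :", simulate(trace, blocks, capacity=4, prefetch=False))
print("block prefetching:", simulate(trace, blocks, capacity=4, prefetch=True))
```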

    Challenging local realism with human choices

    Full text link
    A Bell test is a randomized trial that compares experimental observations against the philosophical worldview of local realism. A Bell test requires spatially distributed entanglement, fast and high-efficiency detection and unpredictable measurement settings. Although technology can satisfy the first two of these requirements, the use of physical devices to choose settings in a Bell test involves making assumptions about the physics that one aims to test. Bell himself noted this weakness in using physical setting choices and argued that human `free will' could be used rigorously to ensure unpredictability in Bell tests. Here we report a set of local-realism tests using human choices, which avoids assumptions about predictability in physics. We recruited about 100,000 human participants to play an online video game that incentivizes fast, sustained input of unpredictable selections and illustrates Bell-test methodology. The participants generated 97,347,490 binary choices, which were directed via a scalable web platform to 12 laboratories on five continents, where 13 experiments tested local realism using photons, single atoms, atomic ensembles, and superconducting devices. Over a 12-hour period on 30 November 2016, participants worldwide provided a sustained data flow of over 1,000 bits per second to the experiments, which used different human-generated data to choose each measurement setting. The observed correlations strongly contradict local realism and other realistic positions in bipartite and tripartite scenarios. Project outcomes include closing the `freedom-of-choice loophole' (the possibility that the setting choices are influenced by `hidden variables' to correlate with the particle properties), the utilization of video-game methods for rapid collection of human-generated randomness, and the use of networking techniques for global participation in experimental science.
    Comment: This version includes minor changes resulting from reviewer and editorial input. Abstract shortened to fit within the arXiv limit.
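    For readers unfamiliar with Bell-test analysis, the sketch below shows the standard CHSH statistic that such experiments estimate from setting choices and measurement outcomes; local realism bounds |S| <= 2, while quantum mechanics allows up to 2*sqrt(2). The record format and the tiny data set are fabricated for illustration and are not the paper's actual analysis pipeline.

```python
# Illustrative CHSH estimate from a list of trial records
# (setting_a, setting_b, outcome_a, outcome_b), settings in {0, 1},
# outcomes in {-1, +1}. In the reported experiments the settings came
# from human-generated bits rather than physical random number generators.
from collections import defaultdict

def chsh_value(records):
    """Estimate S = E(0,0) + E(0,1) + E(1,0) - E(1,1)."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for a, b, x, y in records:
        sums[(a, b)] += x * y        # product of the +/-1 outcomes
        counts[(a, b)] += 1
    E = {pair: sums[pair] / counts[pair] for pair in counts}  # correlators
    return E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)]

# Tiny fabricated example, one trial per setting pair, just to exercise the
# arithmetic; it is not a physically realisable set of statistics.
records = [(0, 0, 1, 1), (0, 1, 1, 1), (1, 0, -1, -1), (1, 1, 1, -1)]
print("S =", chsh_value(records))
```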

    Traffic locality oriented route discovery algorithms for mobile ad hoc networks

    Get PDF
    There has been a growing interest in Mobile Ad hoc Networks (MANETs), motivated by advances in wireless technology and the range of potential applications that might be realised with such technology. Due to the lack of an infrastructure and their dynamic nature, MANETs demand a new set of networking protocols to harness the full benefits of these versatile communication systems. A great deal of research activity has been devoted to developing on-demand routing algorithms for MANETs. The route discovery processes used in most on-demand routing algorithms, such as Dynamic Source Routing (DSR) and Ad hoc On-demand Distance Vector (AODV), rely on simple flooding as the broadcasting technique for route discovery. Although simple flooding is easy to implement, it dominates the routing overhead, leading to the well-known broadcast storm problem that results in packet congestion and excessive collisions. A number of routing techniques have been proposed to alleviate this problem, some of which aim to improve the route discovery process by restricting the broadcast of route request packets to only the essential part of the network. Ideally, a route discovery should stop when a receiving node reports a route to the required destination; however, this cannot be achieved efficiently without the use of external resources, such as GPS location devices. In this thesis, a new locality-oriented route discovery approach is proposed and exploited to develop three new algorithms that improve the route discovery process in on-demand routing protocols. These algorithms are motivated by the fact that various patterns of traffic locality occur quite naturally in MANETs, since groups of nodes communicate frequently with each other to accomplish common tasks. Some of these algorithms manage to reduce end-to-end delay while incurring lower routing overhead compared to existing approaches such as the simple flooding used in AODV. The three algorithms are based on a revised concept of traffic locality in MANETs which relies on identifying a dynamic zone around a source node, where the zone radius depends on the distribution of the nodes with which that source is "mostly" communicating. The traffic locality concept developed in this research forms the basis of our Traffic Locality Route Discovery Approach (TLRDA), which aims to improve the route discovery process in on-demand routing protocols. A neighbourhood region is generated for each active source node, containing "most" of its destinations, so that from the source node's perspective the whole network is divided into two non-overlapping regions, neighbourhood and beyond-neighbourhood, centred at the source node. Route requests are processed normally in the neighbourhood region according to the routing algorithm used. Outside this region, however, various measures are taken to impede such broadcasts and, ultimately, stop them when they have outlived their usefulness. The approach is adaptive: the boundary of each source node's neighbourhood is continuously updated to reflect the communication behaviour of the source node. TLRDA is the basis for the three new route discovery algorithms, namely: Traffic Locality Route Discovery Algorithm with Delay (TLRDA-D), Traffic Locality Route Discovery Algorithm with Chase (TLRDA-C), and Traffic Locality Expanding Ring Search (TL-ERS).
In TLRDA-D, any route request that is currently travelling in its source node's beyond-neighbourhood region is deliberately delayed to give priority to unfulfilled route requests. In TLRDA-C, this approach is augmented with chase packets that target the associated route requests once the requested route has been discovered. In TL-ERS, the search is conducted by covering three successive rings: the first ring covers the source node's neighbourhood region, and an unsatisfied route request in this ring triggers the generation of the second ring, whose radius is double that of the first; otherwise, the third ring covers the whole network and the algorithm finally resorts to flooding. Detailed performance evaluations are provided using both mathematical and simulation modelling to investigate the performance behaviour of the TLRDA-D, TLRDA-C, and TL-ERS algorithms and demonstrate their relative effectiveness against existing approaches. Our results reveal that TLRDA-D and TLRDA-C manage to minimize end-to-end packet delays, while TLRDA-C and TL-ERS exhibit low routing overhead. Moreover, the results indicate that equipping AODV with our new route discovery algorithms greatly enhances its performance in terms of end-to-end delay, routing overhead, and packet loss.
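    The following is a minimal sketch of the TL-ERS three-ring search described above: a TTL-limited route request is tried over the source's neighbourhood ring first, then over a ring of double that radius, and only then over the whole network. The broadcast_route_request hook is a hypothetical stand-in for the real AODV/MANET broadcast machinery, which this sketch does not implement.

```python
# Sketch of a TL-ERS-style expanding ring search (three successive rings).

def tl_ers_discover(source, destination, neighbourhood_radius,
                    network_diameter, broadcast_route_request):
    """Return the route found, or None if even network-wide flooding fails.

    broadcast_route_request(source, destination, ttl) -> route or None
    (hypothetical callback that performs a TTL-limited route request)
    """
    rings = [
        neighbourhood_radius,        # ring 1: the traffic-locality neighbourhood
        2 * neighbourhood_radius,    # ring 2: double the radius of the first
        network_diameter,            # ring 3: whole network (plain flooding)
    ]
    for ttl in rings:
        route = broadcast_route_request(source, destination, ttl)
        if route is not None:
            return route             # stop as soon as a ring satisfies the request
    return None
```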

    PADAMOT : project overview report

    Get PDF
    Background and relevance to radioactive waste management. International consensus confirms that placing radioactive wastes and spent nuclear fuel deep underground in a geological repository is the generally preferred option for their long-term management and disposal. This strategy provides a number of advantages compared to leaving them on or near the Earth's surface. These advantages come about because, for a well-chosen site, the geosphere can provide:
    • a physical barrier that can negate or buffer against the effects of surface-dominated natural disruptive processes such as deep weathering, glaciation, river and marine erosion or flooding, asteroid/comet impact and earthquake shaking etc.
    • long and slow groundwater return pathways from the facility to the biosphere, along which retardation, dilution and dispersion processes may operate to reduce radionuclide concentration in the groundwater.
    • a stable and benign geochemical environment to maximise the longevity of the engineered barriers such as the waste containers and backfill in the facility.
    • a natural radiation shield around the wastes.
    • a mechanically stable environment in which the facility can be constructed and will afterwards be protected.
    • an environment which reduces the likelihood of the repository being disturbed by inadvertent human intrusion such as land use changes, construction projects, drilling, quarrying and mining etc.
    • protection against the effects of deliberate human activities such as vandalism, terrorism and war etc.
    However, safety considerations for storing and disposing of long-lived radioactive wastes must take into account various scenarios that might affect the ability of the geosphere to provide the functionality listed above. Therefore, in order to provide confidence in the ability of a repository to perform within the deep geological setting at a particular site, a demonstration of geosphere "stability" needs to be made. Stability is defined here as the capacity of a geological and hydrogeological system to minimise the impact of external influences on the repository environment, or at least to account for them in a manner that would allow their impacts to be evaluated and accounted for in any safety assessments. A repository should be sited where the deep geosphere is a stable host in which the engineered containment can continue to perform according to design and in which the surrounding hydrogeological, geomechanical and geochemical environment will continue to operate as a natural barrier to radionuclide movement towards the biosphere. However, over the long periods of time during which long-lived radioactive wastes will pose a hazard, environmental change at the surface has the potential to disrupt the stability of the geosphere, and therefore the causes of environmental change and their potential consequences need to be evaluated. As noted above, environmental change can include processes such as deep weathering, glaciation, and river and marine erosion. It can also lead to changes in groundwater boundary conditions through alternating recharge/discharge relationships. One of the key drivers of environmental change is climate variability. The question then arises: how can geosphere stability be assessed with respect to changes in climate? Key issues raised in connection with this are:
    • What evidence is there that 'going underground' eliminates the extreme conditions that storage on the surface would be subjected to in the long term?
    • How can the additional stability and safety of the deep geosphere be demonstrated with evidence from the natural system?
    As a corollary, the capacity of repository sites deep underground in stable rock masses to mitigate potential impacts of future climate change on groundwater conditions needs to be tested and demonstrated. To date, generic scenarios for groundwater evolution relating to climate change are weakly constrained by data and process understanding. Hence, the possibility of site-specific changes of groundwater conditions in the future can only be assessed and demonstrated by studying groundwater evolution in the past. Stability of groundwater conditions in the past is an indication of future stability, though both the climatic and geological contexts must be taken into account in making such an assertion.

    A Computational Model for Quantum Measurement

    Full text link
    Is the dynamical evolution of physical systems objectively a manifestation of information processing by the universe? We find that an affirmative answer has important consequences for the measurement problem. In particular, we calculate the amount of quantum information processing involved in the evolution of physical systems, assuming a finite degree of fine-graining of Hilbert space. This assumption is shown to imply that there is a finite capacity to sustain the immense entanglement that measurement entails. When this capacity is overwhelmed, the system's unitary evolution becomes computationally unstable and the system suffers an information transition (`collapse'). Classical behaviour arises from the rapid cycles of unitary evolution and information transitions. Thus, the fine-graining of Hilbert space determines the location of the `Heisenberg cut', the mesoscopic threshold separating the microscopic, quantum system from the macroscopic, classical environment. The model can be viewed as a probabilistic complement to decoherence that completes the measurement process by turning decohered improper mixtures of states into proper mixtures. It is shown to provide a natural resolution to the measurement problem and the basis problem.
    Comment: 24 pages; REVTeX4; published version