    A New Data Layout For Set Intersection on GPUs

    Set intersection is at the core of a variety of problems, e.g. frequent itemset mining and sparse boolean matrix multiplication. It is well known that large speed gains can, for some computational problems, be obtained by using a graphics processing unit (GPU) as a massively parallel computing device. However, GPUs require highly regular control flow and memory access patterns, and for this reason previous GPU methods for intersecting sets have used a simple bitmap representation. This representation requires excessive space on sparse data sets. In this paper we present a novel data layout, "BatMap", that is particularly well suited for parallel processing, and is compact even for sparse data. Frequent itemset mining is one of the most important applications of set intersection. As a case study on the potential of BatMaps we focus on frequent pair mining, which is a core special case of frequent itemset mining. The main finding is that our method is able to achieve speedups over both Apriori and FP-growth when the number of distinct items is large and the density of the problem instance is above 1%. Previous implementations of frequent itemset mining on GPUs have not been able to show speedups over the best single-threaded implementations. Comment: A version of this paper appears in Proceedings of IPDPS 201
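
    The paper's BatMap layout is not described here in enough detail to reproduce, but the baseline it improves on is: the abstract notes that earlier GPU methods represent each set as a dense bitmap so that intersection becomes a regular, branch-free AND over machine words. The sketch below illustrates that bitmap baseline (and why it wastes space on sparse sets) in plain NumPy on the CPU; the function names and toy transactions are invented for illustration.

```python
import numpy as np

def to_bitmap(item_ids, universe_size):
    """Pack a sparse set of item ids into a dense bit vector.

    This is the simple bitmap layout cited as the usual GPU-friendly
    representation: intersection becomes a regular, branch-free AND over
    words, but space grows with the universe size rather than with the
    number of elements actually present.
    """
    words = np.zeros((universe_size + 63) // 64, dtype=np.uint64)
    for i in item_ids:
        words[i // 64] |= np.uint64(1) << np.uint64(i % 64)
    return words

def bitmap_intersection_size(a, b):
    """Count common elements of two bitmaps via word-wise AND plus popcount."""
    common = np.bitwise_and(a, b)
    # np.unpackbits operates on uint8, so reinterpret the 64-bit words first
    return int(np.unpackbits(common.view(np.uint8)).sum())

# Toy example: two transactions over a universe of 256 items
t1 = to_bitmap([3, 17, 42, 200], universe_size=256)
t2 = to_bitmap([17, 42, 99], universe_size=256)
print(bitmap_intersection_size(t1, t2))  # -> 2 (items 17 and 42 in common)
```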

    Production Scheduling of an Open-pit Mining Complex with Waste Dump Constraints

    The research work aims to solve the production scheduling problem for open-pit mining complexes. It establishes a Mixed-Integer Programming (MIP) model that maximises the net present value of future cash flows and satisfies reserve, production capacity, mining block precedence, waste disposal, stockpiling, and pit sequence constraints. The model is validated and implemented with a real-world case study.
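
    As a rough illustration of the kind of model described, the sketch below sets up a tiny NPV-maximising scheduling MIP with reserve, capacity and block-precedence constraints in PuLP. It is not the authors' formulation: the waste-dump, stockpiling and pit-sequence constraints are omitted, and the block values, tonnages, precedences, capacity and discount rate are invented.

```python
# A minimal sketch (not the authors' model) of an NPV-maximising open-pit
# scheduling MIP with block precedence and mining-capacity constraints.
from pulp import LpProblem, LpMaximize, LpVariable, LpBinary, lpSum

blocks = {0: 5.0, 1: 3.0, 2: -1.0, 3: 8.0}   # block id -> undiscounted value ($M), invented
tonnes = {0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0}     # block id -> mass (Mt)
preds  = {3: [0, 1, 2]}                        # block 3 may only be mined after blocks 0, 1, 2
periods = [0, 1, 2]
capacity = 2.0                                 # mining capacity, Mt per period
rate = 0.10                                    # discount rate

prob = LpProblem("open_pit_schedule", LpMaximize)
# x[b][t] = 1 if block b is mined in period t
x = {b: {t: LpVariable(f"x_{b}_{t}", cat=LpBinary) for t in periods} for b in blocks}

# Objective: discounted value of mined blocks (net present value)
prob += lpSum(blocks[b] / (1 + rate) ** t * x[b][t] for b in blocks for t in periods)

# Reserve constraint: each block mined at most once
for b in blocks:
    prob += lpSum(x[b][t] for t in periods) <= 1

# Mining capacity per period
for t in periods:
    prob += lpSum(tonnes[b] * x[b][t] for b in blocks) <= capacity

# Precedence: a block may be mined in period t only if every predecessor
# has been mined in period t or earlier
for b, ps in preds.items():
    for p in ps:
        for t in periods:
            prob += (lpSum(x[b][s] for s in periods if s <= t)
                     <= lpSum(x[p][s] for s in periods if s <= t))

prob.solve()
for b in blocks:
    for t in periods:
        if x[b][t].value() > 0.5:
            print(f"mine block {b} in period {t}")
```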

    Quantification of uncertainty of geometallurgical variables for mine planning optimisation

    Interest in geometallurgy has increased significantly over the past 15 years or so because of the benefits it brings to mine planning and operation. Its use and integration into design, planning and operation is becoming increasingly critical especially in the context of declining ore grades and increasing mining and processing costs. This thesis, comprising four papers, offers methodologies and methods to quantify geometallurgical uncertainty and enrich the block model with geometallurgical variables, which contribute to improved optimisation of mining operations. This enhanced block model is termed a geometallurgical block model. Bootstrapped non-linear regression models by projection pursuit were built to predict grindability indices and recovery, and quantify model uncertainty. These models are useful for populating the geometallurgical block model with response attributes. New multi-objective optimisation formulations for block caving mining were formulated and solved by a meta-heuristics solver focussing on maximising the project revenue and, at the same time, minimising several risk measures. A novel clustering method, which is able to use both continuous and categorical attributes and incorporate expert knowledge, was also developed for geometallurgical domaining which characterises the deposit according to its metallurgical response. The concept of geometallurgical dilution was formulated and used for optimising production scheduling in an open-pit case study. Thesis (Ph.D.) (Research by Publication) -- University of Adelaide, School of Civil, Environmental and Mining Engineering, 201
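
    The sketch below illustrates the general bootstrap idea mentioned above for quantifying model uncertainty: refit a regression model on resampled data and read prediction intervals off the ensemble spread. A generic scikit-learn regressor stands in for the thesis's projection pursuit models, and the feature and response arrays are synthetic.

```python
# Bootstrap-based uncertainty quantification for a regression model predicting a
# geometallurgical response (e.g. a grindability index) from assay features.
# A generic regressor is used as a stand-in; the data below are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                      # 200 samples, 5 assay features
y = X @ np.array([1.0, -0.5, 0.3, 0.0, 2.0]) + rng.normal(scale=0.5, size=200)

n_boot = 100
models = []
for _ in range(n_boot):
    idx = rng.integers(0, len(X), size=len(X))     # resample with replacement
    models.append(GradientBoostingRegressor().fit(X[idx], y[idx]))

# Prediction uncertainty for new blocks: spread of the bootstrap ensemble
X_new = rng.normal(size=(10, 5))
preds = np.stack([m.predict(X_new) for m in models])   # shape (n_boot, n_blocks)
mean = preds.mean(axis=0)
lo, hi = np.percentile(preds, [5, 95], axis=0)          # 90% bootstrap interval
for b in range(len(X_new)):
    print(f"block {b}: {mean[b]:.2f}  [{lo[b]:.2f}, {hi[b]:.2f}]")
```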

    The significance of identifying potential failure mechanisms from conceptual to design level for open pit rock slopes

    Abstract: The quality and quantity of geotechnical data often expand over time, which may increase the reliability and correspondingly reduce the uncertainty of input parameters. In this paper, the geotechnical model is built on data obtained from four different consultants over 15 years, spanning from conceptual study to design. Stability conditions are investigated through the Limit Equilibrium Method and compared to numerical analysis using the Finite Difference Method. Three critical profiles, based on areas of known concern, are analysed. Kinematically admissible joint orientations are incorporated as Ubiquitous Joint models and materials are modelled with the Mohr-Coulomb failure criterion. Limit Equilibrium Method results revealed that profile A is the most critical slope, with a significant probability of planar and wedge failure at stack angle level. Safety factors for large-scale planar failure of profile B, although stable, remain below the acceptance criteria for the overall slope angle, which prompted numerical analysis. Profile C was deemed stable and no further analyses were required. Good agreement was found between the methods of analysis in terms of safety factors and failure surfaces. The Finite Difference Method computed lower safety factors, to the point of critical stability, for profile B. A reduction in overall slope angle of 12° for this profile increases the safety factor to an acceptable value and reduces the probability of failure from a previous 14% to 2%. The lowered range in probability suggests a reduction in result variability and thus an increased level of confidence in data and analysis
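
    For readers unfamiliar with the limit-equilibrium checks referred to above, the sketch below computes the classical factor of safety for simple planar sliding on a single dry joint (resisting over driving forces under the Mohr-Coulomb criterion). It is a textbook-style formula, not the paper's model, and the input values are illustrative only.

```python
# Limit-equilibrium factor of safety for simple planar sliding on one joint
# (dry slope, no tension crack). Illustrative only; not the paper's analysis.
import math

def planar_fs(cohesion_kpa, friction_deg, plane_dip_deg, block_weight_kn, plane_area_m2):
    """FS = (c*A + W*cos(psi)*tan(phi)) / (W*sin(psi)) for a dry planar failure."""
    psi = math.radians(plane_dip_deg)   # dip of the sliding plane
    phi = math.radians(friction_deg)    # Mohr-Coulomb friction angle
    resisting = cohesion_kpa * plane_area_m2 + block_weight_kn * math.cos(psi) * math.tan(phi)
    driving = block_weight_kn * math.sin(psi)
    return resisting / driving

# Illustrative input values only
fs = planar_fs(cohesion_kpa=25.0, friction_deg=35.0, plane_dip_deg=40.0,
               block_weight_kn=5000.0, plane_area_m2=60.0)
print(f"factor of safety: {fs:.2f}")  # values below the acceptance criterion would trigger redesign
```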

    Review of deep learning approaches in solving rock fragmentation problems

    One of the most significant challenges of the mining industry is resource yield estimation from visual data. An example would be identification of the rock chunk size distribution parameters in an open pit. Solving this task allows one to estimate blasting quality and other parameters of open-pit mining. This task is of the utmost importance, as it is critical to achieving optimal operational efficiency, reducing costs and maximizing profits in the mining industry. The task is known as rock fragmentation estimation and is typically tackled using computer vision techniques such as instance segmentation or semantic segmentation. These problems are often solved using deep learning convolutional neural networks. One of the key requirements for an industrial application is often the need for real-time operation. Fast computation and accurate results are required for practical tasks. Thus, the efficient utilization of computing power to process high-resolution images and large datasets is essential. Our survey is focused on the recent advancements in rock fragmentation, blast quality estimation, particle size distribution estimation and other related tasks. We consider most of the recent results in this field applied to open pits, conveyor belts and other types of working conditions. Most of the reviewed papers cover the period 2018-2023. However, the most significant of the older publications are also considered. A review of publications reveals their specificity, promising trends and best practices in this field. To place the rock fragmentation problems in a broader context and propose future research topics, we also discuss state-of-the-art achievements in real-time computer vision and parallel implementations of neural networks
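
    A common final step in the pipelines surveyed is turning a segmentation mask into a fragment size distribution. The sketch below shows one minimal, hypothetical way to do this with connected-component labelling; the mask, the pixel scale and the count-based P80 approximation are all simplifications for illustration, and the segmentation network itself is out of scope.

```python
# Post-processing sketch: estimate a fragment size distribution from a binary
# segmentation mask via connected components. Mask and pixel scale are synthetic.
import numpy as np
from scipy import ndimage

def size_distribution(mask, mm_per_pixel):
    """Return equivalent-circle diameters (mm) of fragments in a binary mask."""
    labels, n = ndimage.label(mask)                          # connected components
    areas_px = ndimage.sum(mask, labels, index=range(1, n + 1))
    diam_px = 2.0 * np.sqrt(np.asarray(areas_px) / np.pi)    # equivalent-circle diameter
    return diam_px * mm_per_pixel

# Synthetic mask with two "fragments"
mask = np.zeros((100, 100), dtype=np.uint8)
mask[10:30, 10:30] = 1
mask[50:90, 50:70] = 1

diameters = size_distribution(mask, mm_per_pixel=2.0)
# Count-based 80th percentile, a simplification of the mass-weighted P80 passing size
p80 = np.percentile(diameters, 80)
print(f"fragments: {len(diameters)}, P80 ≈ {p80:.1f} mm")
```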

    Evaluating the efficiency of the genetic algorithm in designing the ultimate pit limit of open-pit mines

    The large-scale open-pit mine production planning problem is NP-hard; that is, it cannot be solved in a reasonable computational time. To address this, various methods, including metaheuristics, have been proposed to reduce the computation time. One of these methods is the genetic algorithm (GA), which can provide near-optimal solutions to the problem in a shorter time. This paper aims to evaluate the efficiency of the GA technique based on the pit values and computational times compared with other methods of designing the ultimate pit limit (UPL). In other words, in addition to evaluating the GA for UPL design, other proposed methods for UPL design are also compared. Determining the UPL of an open-pit mine is the first step in production planning. A UPL solver selects blocks whose total economic value is maximum while meeting the slope constraints. In this regard, various methods have been proposed, which can be classified into three general categories: Operational Research (OR), heuristic, and metaheuristic. The GA, categorized as a metaheuristic method, a Linear Programming (LP) model as an OR method, and the Floating Cone (FC) algorithm as a heuristic method, were employed to determine the UPL of open-pit mines. Since the LP method provides the exact answer, it is taken as the baseline. The results of the GA were then validated against the results of LP and compared with the results of FC. This paper used the Marvin mine block model, with 53271 blocks and eight levels, as a case study. Comparing the UPL values of the three methods revealed that the LP model yielded the highest value, followed by the GA, with the FC algorithm yielding the lowest. However, the GA provided the results in a shorter time than LP, which is more important in large-scale production planning problems. By performing a sensitivity analysis on two GA parameters, crossover and mutation probability, the GA's UPL value was improved to 20940, which is only 8% less than LP's UPL value
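
    To make the metaheuristic side of the comparison concrete, the sketch below shows a toy GA over a binary mine/leave vector with a penalty for precedence (slope) violations, including the crossover and mutation probabilities that the sensitivity analysis varies. It is not the paper's implementation; the block values, precedence arcs and GA parameters are invented.

```python
# Toy GA for an ultimate-pit-limit style problem: fitness = total economic value
# of selected blocks minus a penalty for precedence (slope) violations.
# Block values, precedences and GA parameters below are invented.
import random

values = [4, -1, 3, -2, 6, -1]                 # economic value per block
preds = {4: [0, 1], 5: [2, 3]}                 # block -> overlying blocks that must also be mined
POP, GENS, P_CROSS, P_MUT = 30, 200, 0.8, 0.05

def fitness(ch):
    value = sum(v for v, g in zip(values, ch) if g)
    violations = sum(1 for b, ps in preds.items() if ch[b] and not all(ch[p] for p in ps))
    return value - 100 * violations            # heavy penalty keeps pits feasible

def crossover(a, b):
    if random.random() > P_CROSS:
        return a[:]
    cut = random.randrange(1, len(a))          # single-point crossover
    return a[:cut] + b[cut:]

def mutate(ch):
    return [1 - g if random.random() < P_MUT else g for g in ch]

pop = [[random.randint(0, 1) for _ in values] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[: POP // 2]                    # truncation selection
    children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                for _ in range(POP - len(elite))]
    pop = elite + children

best = max(pop, key=fitness)
print("best pit:", best, "value:", fitness(best))
```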