25 research outputs found

    Robust and parallel mesh reconstruction from unoriented noisy points.

    Sheung, Hoi. Thesis (M.Phil.)--Chinese University of Hong Kong, 2009. Includes bibliographical references (p. 65-70). Abstract also in Chinese. Table of contents:
    Abstract
    Acknowledgements
    List of Figures
    List of Tables
    Chapter 1 --- Introduction
    Chapter 1.1 --- Main Contributions
    Chapter 1.2 --- Outline
    Chapter 2 --- Related Work
    Chapter 2.1 --- Volumetric reconstruction
    Chapter 2.2 --- Combinatorial approaches
    Chapter 2.3 --- Robust statistics in surface reconstruction
    Chapter 2.4 --- Down-sampling of massive points
    Chapter 2.5 --- Streaming and parallel computing
    Chapter 3 --- Robust Normal Estimation and Point Projection
    Chapter 3.1 --- Robust Estimator
    Chapter 3.2 --- Mean Shift Method
    Chapter 3.3 --- Normal Estimation and Projection
    Chapter 3.4 --- Moving Least Squares Surfaces
    Chapter 3.4.1 --- Step 1: local reference domain
    Chapter 3.4.2 --- Step 2: local bivariate polynomial
    Chapter 3.4.3 --- Simpler Implementation
    Chapter 3.5 --- Robust Moving Least Squares by Forward Search
    Chapter 3.6 --- Comparison with RMLS
    Chapter 3.7 --- K-Nearest Neighborhoods
    Chapter 3.7.1 --- Octree
    Chapter 3.7.2 --- Kd-Tree
    Chapter 3.7.3 --- Other Techniques
    Chapter 3.8 --- Principal Component Analysis
    Chapter 3.9 --- Polynomial Fitting
    Chapter 3.10 --- Highly Parallel Implementation
    Chapter 4 --- Error Controlled Subsampling
    Chapter 4.1 --- Centroidal Voronoi Diagram
    Chapter 4.2 --- Energy Function
    Chapter 4.2.1 --- Distance Energy
    Chapter 4.2.2 --- Shape Prior Energy
    Chapter 4.2.3 --- Global Energy
    Chapter 4.3 --- Lloyd's Algorithm
    Chapter 4.4 --- Clustering Optimization and Subsampling
    Chapter 5 --- Mesh Generation
    Chapter 5.1 --- Tight Cocone Triangulation
    Chapter 5.2 --- Clustering Based Local Triangulation
    Chapter 5.2.1 --- Initial Surface Reconstruction
    Chapter 5.2.2 --- Cleaning Process
    Chapter 5.2.3 --- Comparisons
    Chapter 5.3 --- Computing Dual Graph
    Chapter 6 --- Results and Discussion
    Chapter 6.1 --- Results of Mesh Reconstruction from Noisy Point Cloud
    Chapter 6.2 --- Results of Clustering Based Local Triangulation
    Chapter 7 --- Conclusions
    Chapter 7.1 --- Key Contributions
    Chapter 7.2 --- Factors Affecting Our Algorithm
    Chapter 7.3 --- Future Work
    Chapter A --- Building Neighborhood Table
    Chapter A.1 --- Building Neighborhood Table in Streaming
    Chapter B --- Publications
    Bibliography

    Meshless Voronoi on the GPU

    We propose a GPU algorithm that computes a 3D Voronoi diagram. Our algorithm is tailored for applications that use only the geometry of the Voronoi cells, such as Lloyd's relaxation in meshing, or some numerical schemes used in fluid simulations and astrophysics. Since these applications require only the geometry of the Voronoi cells, they do not need the combinatorial mesh data structure computed by the classical algorithms (Bowyer-Watson). Thus, by exploiting the specific spatial distribution of the point sets used in this type of application, our algorithm computes each cell independently, in parallel, based on its nearest neighbors. In addition, we show how to compute integrals over the Voronoi cells by decomposing them on the fly into tetrahedra, without needing to compute any global combinatorial information. The advantages of our algorithm are that it is fast, very simple to implement, has constant memory usage per thread, and needs no synchronization primitives. These specificities make it particularly efficient on the GPU: it gains one order of magnitude over the fastest state-of-the-art multi-core CPU implementations. To ease the reproducibility of our results, the full documented source code is included in the supplemental material.
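The per-cell idea the abstract describes can be sketched in 2D (the paper works in 3D on the GPU): each cell starts as a bounding region and is clipped against the perpendicular bisector of its site and each neighboring point, with no global data structure. This is an illustrative pure-Python sketch under our own assumptions (unit-square bound, all other points as "neighbors"), not the authors' implementation:

```python
def clip_halfplane(poly, a, b, c):
    """Keep the part of convex polygon `poly` where a*x + b*y <= c."""
    out = []
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        d1 = a * x1 + b * y1 - c
        d2 = a * x2 + b * y2 - c
        if d1 <= 0:
            out.append((x1, y1))
        if d1 * d2 < 0:  # edge crosses the bisector: keep the intersection
            t = d1 / (d1 - d2)
            out.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    return out

def voronoi_cell(site, others, bbox=(0.0, 0.0, 1.0, 1.0)):
    """Compute one 2D Voronoi cell independently of all the others,
    by clipping a bounding box against one bisector per other point."""
    x0, y0, x1, y1 = bbox
    cell = [(x0, y0), (x1, y0), (x1, y1), (x0, y1)]
    sx, sy = site
    for (px, py) in others:
        # Points q closer to `site` than to (px, py) satisfy
        # (p - s) . q <= (|p|^2 - |s|^2) / 2
        a, b = px - sx, py - sy
        c = (px * px + py * py - sx * sx - sy * sy) / 2.0
        cell = clip_halfplane(cell, a, b, c)
    return cell
```

Because each call to `voronoi_cell` touches only its own site and neighbors, cells can be computed in parallel with constant memory per thread, which is the property the paper exploits on the GPU.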

    Various Deep Learning Techniques Involved In Breast Cancer Mammogram Classification – A Survey

    Breast cancer is one of the most common and most rapidly spreading diseases in the world, and most cases occur in females. It can be controlled with early detection, which helps manage many cases and lowers the death rate. Numerous studies have been conducted on breast cancer, and machine learning is the method used most frequently in this research: algorithms such as decision trees, KNN, SVM, and naive Bayes each perform well in their respective settings. However, a more recently developed method, deep learning, is now being used to classify breast cancer; it overcomes several limitations of classical machine learning. Deep learning techniques such as convolutional neural networks, recurrent neural networks, and deep belief networks are frequently used in data science, and deep learning algorithms outperform classical machine learning algorithms because they extract the most informative features of the images automatically. Our study employs a CNN to classify the images, as CNN is the most widely used technique for image classification, and our research is based on it.
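The feature extraction that the survey credits to CNNs is, at its core, a 2D convolution sliding a small kernel over the image. A minimal NumPy sketch of that one operation (a generic "valid" cross-correlation, as CNN libraries compute it; not code from any surveyed paper):

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2D convolution (strictly, cross-correlation, which is what
    most CNN libraries actually compute): slide `kernel` over `image` and
    take the elementwise-product sum at each position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out
```

For example, the kernel `[[-1, 1]]` responds only where neighboring pixel intensities differ, i.e. at edges; stacks of such learned kernels are what a CNN uses to build up mammogram features.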

    A Comparison of Steering Techniques for Virtual Crowd Simulations


    Robust Eigen-Filter Design for Ultrasound Flow Imaging Using a Multivariate Clustering

    Blood flow visualization is a challenging task in the presence of tissue motion. Unsuppressed tissue clutter produces flashing artefacts in ultrasound flow imaging that hamper blood flow detection by dominating part of the blood flow signal in certain challenging clinical imaging applications, ranging from cardiac imaging (maximal tissue vibrations) to microvascular flow imaging (very low blood flow speeds). Conventional clutter filtering techniques perform poorly since blood and tissue clutter echoes share similar spectral characteristics. Eigen-based filtering was recently introduced and has shown good clutter rejection performance; however, flow detection performance in eigen filtering suffers if tissue and flow signal subspaces overlap after eigen components are projected to a single signal feature space for clutter rank selection. To address this issue, a novel multivariate clustering-based singular value decomposition (SVD) filter design is developed. The proposed multivariate clustering-based filter robustly detects and removes non-blood eigen components by leveraging three key spatiotemporal statistics: singular value magnitude, spatial correlation, and the mean Doppler frequency of singular vectors. A better clutter suppression framework is necessary for high-frame-rate (HFR) ultrasound imaging since it is more susceptible to tissue motion due to poorer spatial resolution (tissue clutter bleeds into flow pixels easily). Hence, to test the clutter rejection performance of the proposed filter, HFR plane wave data was acquired from an in vitro flow phantom testbed and in vivo from a subject's common carotid artery and jugular vein region induced with extrinsic tissue motion (voluntary probe motion).
The proposed method was able to adaptively detect and preserve blood eigen components, and enabled fully automatic identification of the eigen components corresponding to tissue clutter, blood, and noise, removing the dependency on the operator for optimal rank selection. The flow detection efficacy of the proposed multivariate clustering-based SVD filter was statistically evaluated and compared with current clutter rank estimation methods using receiver operating characteristic (ROC) analysis. Results for both in vitro and in vivo experiments showed that the multivariate clustering-based SVD filter yielded the highest area under the ROC curve at both peak systole (0.98 in vitro; 0.95 in vivo) and end diastole (0.96 in vitro; 0.93 in vivo) in comparison with other clutter rank estimation methods, signifying its improved flow detection capability. The impact of this work lies in the automated and adaptive (in contrast to a fixed cut-off) selection of eigen components, which can potentially help overcome the flow detection challenges associated with fast tissue motion in cardiovascular imaging and slow flow in microvascular imaging, the latter being critical for cancer diagnosis.
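The underlying SVD (eigen) filter that the paper builds on can be sketched in NumPy: stack the slow-time frames into a Casorati matrix, take its SVD, and discard the largest singular components (tissue clutter) and, optionally, the smallest (noise). This sketch uses manually chosen ranks; the paper's actual contribution, the multivariate clustering that selects those ranks automatically from singular value magnitude, spatial correlation, and mean Doppler frequency, is not reproduced here:

```python
import numpy as np

def svd_clutter_filter(frames, tissue_rank, noise_rank):
    """Basic SVD eigen-filter for ultrasound flow imaging.

    frames: (n_frames, h, w) slow-time stack of beamformed frames.
    Zeroes the `tissue_rank` largest singular components (tissue clutter)
    and the `noise_rank` smallest (noise); returns the blood-flow estimate.
    """
    n, h, w = frames.shape
    casorati = frames.reshape(n, h * w).T            # (pixels, frames)
    U, s, Vt = np.linalg.svd(casorati, full_matrices=False)
    keep = np.ones_like(s)
    keep[:tissue_rank] = 0.0                         # suppress tissue clutter
    if noise_rank > 0:
        keep[-noise_rank:] = 0.0                     # suppress noise floor
    filtered = (U * (s * keep)) @ Vt                 # rebuild without them
    return filtered.T.reshape(n, h, w)
```

Tissue is concentrated in the largest singular components because it is strong and highly correlated across frames; with a fixed `tissue_rank` cut-off this filter fails exactly in the overlap cases the paper targets.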

    Analyzing Clustered Latent Dirichlet Allocation

    Dynamic Topic Models (DTM) are a way to extract time-variant information from a collection of documents. The only available implementation of this is slow, taking days to process a corpus of 533,588 documents. In order to see how topics - both their key words and their proportional size in all documents - change over time, we analyze Clustered Latent Dirichlet Allocation (CLDA) as an alternative to DTM. This algorithm is based on existing parallel components, using Latent Dirichlet Allocation (LDA) to extract topics at local times, and k-means clustering to combine topics from different time periods. This method is two orders of magnitude faster than DTM and allows for more freedom of experiment design. Results show that most topics generated by this algorithm are similar to those generated by DTM at both the local and global level, as measured by the Jaccard index and the Sørensen-Dice coefficient, and that this method's perplexity compares favorably to DTM. We also explore tradeoffs in CLDA method parameters.
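The two overlap measures used in that evaluation compare topics by their sets of key words. A small self-contained sketch (the keyword sets in the usage note are hypothetical, not from the paper):

```python
def jaccard(a, b):
    """Jaccard index |A ∩ B| / |A ∪ B| between two keyword sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

def dice(a, b):
    """Sørensen-Dice coefficient 2|A ∩ B| / (|A| + |B|)."""
    a, b = set(a), set(b)
    return 2 * len(a & b) / (len(a) + len(b)) if (a or b) else 1.0
```

For example, topics with top words `{"tax", "law", "court"}` and `{"tax", "law", "judge"}` have a Jaccard index of 2/4 = 0.5 and a Dice coefficient of 4/6 ≈ 0.67; Dice always weighs the shared words at least as heavily as Jaccard does.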

    Bitcoin, the controversial cryptocurrency: an insightful overview and its context in the Portuguese market

    Before Bitcoin's appearance, all payment systems faced the requirement of being controlled by an entity, as none had shown the ability to operate in a decentralized structure. Bitcoin is also a digital currency and consequently has no intrinsic value; however, it has shown an enormous price increase over the past few years, with that growth especially pronounced in the current year. Bitcoin therefore emerges as a disruptor of the current financial system. The main objective of this dissertation is to analyse the key concepts of Bitcoin, from which the most controversial topics emerge. Those topics are then further scrutinized through quantitative and qualitative research, always regarding both the present and future impacts of Bitcoin. The research includes statistical data from Bitcoin intermediaries and other financial data providers. Additionally, due to the lack of literature covering the Portuguese market, special attention is given to this country. Despite the lack of credibility among most financial leaders, which has been a limitation on Bitcoin's implementation process, this cryptocurrency is a technological landmark and, so far, shows no signs of slowing down.

    Artificial Intelligence for Small Satellites Mission Autonomy

    Space mission engineering has always been recognized as a very challenging and innovative branch of engineering: since the beginning of the space race, numerous milestones, key successes and failures, improvements, and connections with other engineering domains have been reached. Despite its relatively young age, the discipline of space engineering has not developed evenly over time: alternating leading nations, shifts in public and private interests, and changing allocations of resources to different domains and goals are all examples of an intrinsic dynamism that has characterized this discipline. The dynamism is even more striking in the last two decades, in which several factors contributed to the fervour of the period. Two of the most important were certainly the increased presence and push of the commercial and private sector, and the overall intent of reducing the size of spacecraft while maintaining comparable levels of performance. A key example of the second driver is the introduction, in 1999, of a new category of space systems called CubeSats. Envisioned and designed to ease universities' access to space, by standardizing spacecraft development and by ensuring high probabilities of acceptance as piggyback customers in launches, the standard was quickly adopted not only by universities but also by agencies and private companies. CubeSats turned out to be a disruptive innovation, and the space mission ecosystem was deeply changed by them. New mission concepts and architectures are being developed: CubeSats are now flown as secondary payloads of bigger missions, and constellations are being deployed in Low Earth Orbit to perform observation missions at a performance level once considered achievable only by traditional, full-sized spacecraft.
CubeSats, and small satellite technology more generally, had to overcome important challenges in the last few years that were constraining and reducing the diffusion and adoption potential of smaller spacecraft for scientific and technology demonstration missions. Among these challenges were: the miniaturization of propulsion technologies, to enable concepts such as rendezvous and docking, or interplanetary missions; the improvement of the telecommunication state of the art for small satellites, to enable the downlink to Earth of all the data acquired during the mission; and the miniaturization of scientific instruments, to exploit CubeSats in more meaningful, scientific ways. With the size reduction and the consolidation of the technology, many aspects of a space mission shrink in consequence, among them costs and development and launch times. One important aspect that has not been shown to scale accordingly is operations: even small satellite missions need human operators and capable ground control centres. In addition, with the possibility of constellations or interplanetary distributed missions, a redesign of how operations are managed is required, to cope with the innovation in space mission architectures. The present work addresses the issue of operations for small satellite missions. The thesis presents research, carried out at several institutions (Politecnico di Torino, MIT, NASA JPL), aimed at improving the autonomy level of space missions, and in particular of small satellites. The key technology exploited in the research is Artificial Intelligence, a branch of computer science that has attracted intense interest in disciplines such as medicine, security, image recognition, and language processing, and is currently making its way into space engineering as well.
The thesis focuses on three topics, and three related applications have been developed and are presented here: autonomous operations by means of event detection algorithms, intelligent failure detection on small satellite actuator systems, and decision-making support through intelligent tradespace exploration during the preliminary design of space missions. The Artificial Intelligence technologies explored are: Machine Learning, in particular Neural Networks; Knowledge-based Systems, in particular Fuzzy Logic; and Evolutionary Algorithms, in particular Genetic Algorithms. The thesis covers the domain (small satellites), the technology (Artificial Intelligence), and the focus (mission autonomy), and presents three case studies that demonstrate the feasibility of employing Artificial Intelligence to enhance how missions are currently operated and designed.
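Of the techniques the abstract lists, a genetic algorithm is the most compact to sketch. The following minimal generational GA over bit-strings (tournament selection, one-point crossover, bit-flip mutation) is our own illustrative code, not the thesis's implementation; the toy fitness function stands in for a mission design evaluation in tradespace exploration:

```python
import random

def genetic_search(fitness, n_bits=16, pop_size=30, generations=60,
                   p_mut=0.02, seed=0):
    """Minimal generational genetic algorithm over bit-strings.

    fitness: maps a list of 0/1 genes to a score to maximize
    (e.g. a design-evaluation function in tradespace exploration).
    Returns the best individual found.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            # tournament selection: best of 3 random individuals, twice
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)           # one-point crossover
            child = p1[:cut] + p2[cut:]
            # bit-flip mutation with probability p_mut per gene
            child = [b ^ (rng.random() < p_mut) for b in child]
            nxt.append(child)
        pop = nxt
        best = max(pop + [best], key=fitness)        # keep the elite
    return best
```

With `fitness=sum` (the classic OneMax toy problem) the search converges to a nearly all-ones string; in a real tradespace study the bit-string would encode discrete design choices and the fitness would score the resulting mission architecture.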