345 research outputs found
3D visualization and mapping of choroid thickness based on optical coherence tomography: A step-by-step geometric approach
Although bodily organs are inherently 3D, medical diagnosis often relies on their 2D representation. For instance, sectional images of the eye (especially of its posterior part) based on optical coherence tomography (OCT) provide internal views, from which the ophthalmologist makes medical decisions about 3D eye structures. In the process, the physician is forced to mentally synthesize the underlying 3D context, which can be both time-consuming and stressful. Against this backdrop, can such 2D sections be arranged and presented in their natural 3D form for faster and stress-free diagnosis? In this paper, we consider ailments affecting choroid thickness and address the aforementioned question at two levels: in terms of 3D visualization and 3D mapping. In particular, we exploit the spherical geometry of the eye, align OCT sections on a nominal sphere, and extract the choroid by peeling off the inner and outer layers. At each step, we render our intermediate results on a 3D lightfield display, which provides a natural visual representation. Finally, the thickness variation of the extracted choroid is spatially mapped and observed on a lightfield display as well as using 3D visualization software on a regular 2D terminal. Consequently, we identified choroid depletion around the optic disc based on the test OCT images. We believe that the proposed technique would provide ophthalmologists with a tool for making faster diagnostic decisions with less stress.
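The core geometric step, placing each planar OCT section onto a nominal spherical eye model, can be sketched as follows. This is a minimal illustration rather than the authors' implementation; the 12 mm nominal radius and the angular parameterization of the B-scans are assumptions.

```python
import numpy as np

def place_bscan_on_sphere(ascan_angles_deg, bscan_elev_deg, depth_mm,
                          eye_radius_mm=12.0):
    """Map one OCT B-scan onto a nominal spherical eye model.

    ascan_angles_deg : azimuth of each A-scan column (degrees)
    bscan_elev_deg   : elevation of this B-scan on the sphere (degrees)
    depth_mm         : depth of each axial sample, measured inward from
                       the nominal retinal surface (mm)
    Returns an (n_ascans, n_depths, 3) array of Cartesian points.
    """
    az = np.deg2rad(np.asarray(ascan_angles_deg, dtype=float))[:, None]
    el = np.deg2rad(bscan_elev_deg)
    # Samples deeper along an A-scan sit at a smaller radius.
    r = eye_radius_mm - np.asarray(depth_mm, dtype=float)[None, :]
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = np.broadcast_to(r * np.sin(el), x.shape)
    return np.stack([x, y, z], axis=-1)

# Usage: a B-scan with 5 A-scans spanning +/-10 degrees, 3 depth samples.
points = place_bscan_on_sphere(np.linspace(-10, 10, 5), 0.0, [0.0, 0.5, 1.0])
print(points.shape)  # (5, 3, 3)
```

Stacking several B-scans placed this way at their respective elevations yields the aligned 3D volume from which the choroid can then be peeled out.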
A Survey on Data Integration in Data Warehouse
Data warehousing embraces the technology of integrating data from multiple distributed data sources and using it in an annotated and aggregated form to support business decision-making and enterprise management. Although many techniques have been revisited or newly developed in the context of data warehouses, such as view maintenance and OLAP, little attention has been paid to data mining techniques for supporting the most important and costly task of data integration for data warehouse design.
Survey of Current Network Intrusion Detection Techniques
The significance of network security has grown enormously, and a number of devices have been introduced to enhance the security of a network. NIDS is a retrofit approach for providing a sense of security in existing computers and data networks, while allowing them to operate in their current open mode. The goal of a network intrusion detection system is to identify, preferably in real time, unauthorized use, misuse and abuse of computer systems by insiders as well as by outside perpetrators. This paper presents a taxonomy of intrusion detection systems, which is then used to survey and classify a number of research prototypes. Keywords: Security, Intrusion Detection, Misuse and Anomaly Detection, Pattern Matching
Mining Frequent Item Sets in Data Streams Using the "Éclat Algorithm"
Frequent pattern mining is the process of mining a set of items or patterns from a large database. The resulting frequent itemsets satisfy the minimum support threshold; a frequent pattern is a pattern that occurs frequently in a dataset. Association rule mining is defined as finding association rules that satisfy the predefined minimum support and confidence in a given database. A frequent itemset must appear in at least the minimum-support fraction of the transactions of that database. Discovering frequent itemsets plays a very important role in mining association rules, sequence rules, web log mining and many other interesting patterns among complex data. A data stream is a real-time, continuous, ordered sequence of items; it is an uninterrupted flow of a long sequence of data. Some real-world examples of data streams are sensor network data, telecommunication data, transactional data and scientific surveillance systems. These sources produce trillions of updates every day, so it is very difficult to store the entire data, and some mining process is required. Data mining is the non-trivial process of identifying valid, original, potentially useful and ultimately understandable patterns in data; it is an extraction of hidden predictive information from large databases. There are many algorithms used to find frequent itemsets. Among them, the Apriori algorithm is the first classical algorithm used to find frequent itemsets. Apart from Apriori, many algorithms have been proposed, but they are similar to Apriori: they are based on pruning and candidate generation, and they take more memory and time to find the frequent itemsets. In this paper, we study how the Éclat algorithm is used in data streams to find frequent itemsets. The Éclat algorithm does not require candidate generation.
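For concreteness, here is a minimal sketch of the Éclat idea (the basic batch algorithm, not the paper's stream-specific variant): the database is stored vertically as tidsets, and support counting becomes set intersection, avoiding Apriori-style candidate generation.

```python
from collections import defaultdict

def eclat(transactions, min_support):
    """Frequent-itemset mining with Éclat: items are stored vertically as
    tidsets (transaction-ID sets), and the support of an extended itemset
    is the size of the intersection of its members' tidsets."""
    # Build the vertical layout: item -> set of transaction ids.
    tidsets = defaultdict(set)
    for tid, transaction in enumerate(transactions):
        for item in transaction:
            tidsets[item].add(tid)

    frequent = {}

    def recurse(prefix, items):
        for i, (item, tids) in enumerate(items):
            if len(tids) >= min_support:
                itemset = prefix + (item,)
                frequent[itemset] = len(tids)
                # Extend the prefix by intersecting with later items only.
                suffix = [(other, tids & other_tids)
                          for other, other_tids in items[i + 1:]]
                recurse(itemset, suffix)

    recurse((), sorted(tidsets.items(), key=lambda kv: kv[0]))
    return frequent

# Usage: min_support of 2 transactions.
data = [{"a", "b", "c"}, {"a", "c"}, {"a", "d"}, {"b", "c"}]
print(eclat(data, 2))
# {('a',): 3, ('a', 'c'): 2, ('b',): 2, ('b', 'c'): 2, ('c',): 3}
```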
Implementation of Anomaly Based Network Intrusion Detection by Using Q-learning Technique
A Network Intrusion Detection System (NIDS) is an intrusion detection system that tries to discover malicious activity such as denial-of-service attacks, port scans or even attempts to break into computers by monitoring network traffic. Data mining techniques make it possible to search large amounts of data for characteristic rules and patterns. If applied to network monitoring data recorded on a host or in a network, they can be used to detect intrusions, attacks or anomalies. We propose a machine learning method cascading Principal Component Analysis (PCA) and Q-learning to classify anomalous and normal activities in a computer network. This paper investigates the use of PCA to reduce high-dimensional data and to improve predictive performance. On the reduced data, representing a density region of normal or anomalous instances, Q-learning strategies are applied to create agents that can adapt to unknown, complex environments. We attempted to create an agent that would learn to explore an environment and identify the malicious activity within it. We obtained interesting results where agents were able to re-adapt their learning quickly to new traffic and network information compared to other machine learning methods such as supervised and unsupervised learning. Keywords: Intrusion, Anomaly Detection, Data Mining, KDD Cup'99, PCA, Q-learning
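A minimal sketch of such a PCA-to-Q-learning cascade is given below. It is illustrative only: the reward-driven update is simplified to a one-step, bandit-style Q-update, and the synthetic data, discretization and all parameter values are assumptions rather than the paper's setup.

```python
import numpy as np
from sklearn.decomposition import PCA

# Stand-in data: X mimics (n_samples, 41) KDD Cup'99-style records,
# y in {0, 1} with 1 meaning anomalous. Not the paper's dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 41))
y = (X[:, 0] + X[:, 1] > 1).astype(int)

# Step 1: PCA reduces the 41-dimensional records to two components.
Z = PCA(n_components=2).fit_transform(X)

# Step 2: discretize the reduced space into states for tabular Q-learning.
bins = np.linspace(Z.min(), Z.max(), 8)
states = np.ravel_multi_index(
    [np.digitize(Z[:, 0], bins), np.digitize(Z[:, 1], bins)], (9, 9))

# Step 3: Q-learning over actions {0: flag normal, 1: flag anomalous},
# rewarded +1 for a correct label and -1 otherwise.
Q = np.zeros((81, 2))
alpha, epsilon = 0.1, 0.1
for epoch in range(20):
    for s, label in zip(states, y):
        a = rng.integers(2) if rng.random() < epsilon else Q[s].argmax()
        reward = 1.0 if a == label else -1.0
        Q[s, a] += alpha * (reward - Q[s, a])  # one-step, stateless update

pred = Q[states].argmax(axis=1)
print("training accuracy:", (pred == y).mean())
```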
Quantitative shadow compensated optical coherence tomography of choroidal vasculature
Conventionally rendered optical coherence tomography (OCT) images of the posterior segment contain shadows which influence the visualization of deep structures such as the choroid. The purpose of this study was to determine whether OCT shadow compensation (SC) alters the appearance of the choroid and the apparent choroidal vascularity index (CVI), an OCT-derived estimated ratio of luminal to total choroidal volume. All scans were shadow compensated using a previously published algorithm and binarized using a novel validated algorithm, and the binarized choroid was extracted to estimate CVI. On 27 raw swept-source OCT volume scans of healthy subjects, the effect of SC on CVI was established both qualitatively and quantitatively. In shadow compensated scans, the choroid was visualized with greater brightness than the neurosensory retina, and the masking of deep tissues by retinal blood vessels was greatly reduced. Among study subjects, a significant mean difference in CVI of -0.13 was observed between raw and shadow compensated scans. Conventionally acquired OCT underestimates both choroidal reflectivity and the calculated CVI. Quantitative analysis based on subjective grading demonstrated that SC increased the contrast between stromal and luminal regions and that the binarized regions agree with true tissue regions. Further study is warranted to determine the effects of SC on CVI in diseased eyes.
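Once the scans have been binarized and the choroid segmented, the CVI itself is a simple voxel ratio. A minimal sketch (the array names and mask layout are assumptions, not the study's implementation):

```python
import numpy as np

def choroidal_vascularity_index(luminal_mask, choroid_mask):
    """Estimate CVI from a binarized OCT volume.

    luminal_mask : boolean array, True where a voxel was binarized as
                   luminal (dark vessel lumen)
    choroid_mask : boolean array, True for every voxel inside the
                   segmented choroid (luminal + stromal)
    Returns luminal volume / total choroidal volume.
    """
    luminal = np.count_nonzero(luminal_mask & choroid_mask)
    total = np.count_nonzero(choroid_mask)
    return luminal / total if total else float("nan")
```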
Improving Compressive Strength of Cement Concrete Mix by Using M-Sand and Bamboo Fiber
In the current world, concrete has become a very important part of the construction industry, and the materials used in making concrete have evolved due to better quality of cement and better grades of coarse aggregates. Sand is an important part of concrete. It is mainly procured from natural sources, so the grade of sand is not under our control. In this experimental work, concrete cubes of M-20, M-25 and M-30 grade were cast to analyze different properties of concrete such as compressive strength and workability. In this study, M-sand is considered as a replacement of natural sand at 50, 70 and 90% by weight of sand in the concrete design mix, with 1% bamboo fiber as an admixture. The study is carried out at the ages of 7 and 28 days. In this work, the general properties of fresh and hardened concrete were tested and the outcomes were analyzed. Concrete is a central material for the construction industry; however, in the present era, where construction is expanding rapidly and structures are reaching ever greater heights, it is also negatively affecting our environment. It is therefore vital to use some alternative materials in concrete to minimize cost and to enhance the properties of concrete for better stability of the structures.
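As a worked example of the replacement levels described above (purely illustrative: the fine-aggregate content per cubic metre and the assumption that the 1% fiber dosage is by weight of sand are placeholders, not values reported by the study):

```python
# Quantities of fine aggregate per cubic metre of concrete for each
# M-sand replacement level. The 700 kg/m^3 total is an assumed figure.
fine_aggregate_total = 700.0   # kg of fine aggregate per m^3 (assumed)
fiber_dosage = 0.01            # 1% bamboo fiber, assumed by weight of sand

for replacement in (0.50, 0.70, 0.90):
    m_sand = fine_aggregate_total * replacement
    natural_sand = fine_aggregate_total - m_sand
    bamboo_fiber = fine_aggregate_total * fiber_dosage
    print(f"{replacement:.0%} M-sand: {m_sand:.0f} kg M-sand, "
          f"{natural_sand:.0f} kg natural sand, "
          f"{bamboo_fiber:.1f} kg bamboo fiber")
```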
An Efficient Image Segmentation Approach through Enhanced Watershed Algorithm
Image segmentation is a significant task for image analysis, sitting at the middle layer of image engineering. The purpose of segmentation is to decompose the image into parts that are meaningful with respect to a particular application. The proposed system enhances the morphological watershed method for degraded images. The proposed algorithm merges the morphological watershed result with an enhanced edge-detection result obtained by preprocessing the degraded images. As a post-processing step, a color histogram algorithm is applied to each of the segmented regions obtained, enhancing the overall performance of the watershed algorithm. Keywords – Segmentation, watershed, color histogram
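One way such a pipeline can be realized with standard tooling is sketched below. This OpenCV-based version follows the common marker-based watershed recipe and is not the authors' exact algorithm; the thresholds, kernel sizes and histogram bin counts are assumptions.

```python
import cv2
import numpy as np

def enhanced_watershed(image_bgr):
    """Denoise, enhance edges, derive markers, then run watershed."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)       # preprocessing

    # Edge enhancement: suppress Canny edge pixels in the Otsu foreground.
    edges = cv2.Canny(blurred, 50, 150)
    _, binary = cv2.threshold(blurred, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    binary = cv2.bitwise_and(binary, cv2.bitwise_not(edges))

    # Sure foreground from the distance transform, sure background by dilation.
    dist = cv2.distanceTransform(binary, cv2.DIST_L2, 5)
    _, sure_fg = cv2.threshold(dist, 0.5 * dist.max(), 255, 0)
    sure_fg = sure_fg.astype(np.uint8)
    sure_bg = cv2.dilate(binary, np.ones((3, 3), np.uint8), iterations=3)
    unknown = cv2.subtract(sure_bg, sure_fg)

    # Label the sure regions and let watershed flood the unknown pixels.
    _, markers = cv2.connectedComponents(sure_fg)
    markers = markers + 1              # reserve 0 for the unknown region
    markers[unknown == 255] = 0
    return cv2.watershed(image_bgr, markers)  # boundaries are marked -1

def region_color_histogram(image_bgr, markers, label, bins=8):
    """Post-processing: coarse color histogram of one segmented region,
    usable for merging perceptually similar neighboring regions."""
    mask = (markers == label).astype(np.uint8)
    hist = cv2.calcHist([image_bgr], [0, 1, 2], mask,
                        [bins] * 3, [0, 256] * 3)
    return cv2.normalize(hist, hist).flatten()
```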
Content Based Image Retrieval by Using Interactive Relevance Feedback Technique - A Survey
Due to the rapid increase in storing and capturing multimedia data with digital devices, Content Based Image Retrieval (CBIR) plays a very important role in the field of image processing. Although wide-ranging studies have been done in the field of CBIR, image retrieval from multimedia databases is still a very complicated and open problem. This paper provides a review of CBIR based on some well-known techniques such as Interactive Genetic Algorithms, Relevance Feedback (RF), Neural Networks and so on. Relevance feedback can effectively enhance the ability of CBIR by reducing the semantic gap between low-level and high-level features. Interactivity in CBIR can also be achieved with the help of Genetic Algorithms. The GA is a branch of evolutionary computation which makes the retrieval process more interactive, so that the user can obtain improved results from the database by comparing the query image with its evaluation. The result of traditional implicit feedback can also be improved by neuro-fuzzy-logic-based implicit feedback. This paper covers all aspects of Relevance Feedback (RF), Interactive Genetic Algorithms and Neural Networks in Content Based Image Retrieval, various RF techniques and applications of CBIR.
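As a concrete illustration of the relevance feedback loop, here is a classic Rocchio-style query update, one common RF scheme rather than a method from any specific surveyed paper; the feature vectors and the weight values are assumptions.

```python
import numpy as np

def rocchio_update(query, relevant, nonrelevant,
                   alpha=1.0, beta=0.75, gamma=0.15):
    """Move the query vector toward images the user marked relevant
    and away from those marked non-relevant, shrinking the semantic
    gap between the low-level features and the user's intent."""
    q = alpha * np.asarray(query, dtype=float)
    if len(relevant):
        q += beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        q -= gamma * np.mean(nonrelevant, axis=0)
    return q

# Usage: the vectors stand in for low-level features such as color histograms.
query = np.array([0.2, 0.5, 0.3])
relevant = np.array([[0.3, 0.6, 0.1], [0.25, 0.55, 0.2]])
nonrelevant = np.array([[0.9, 0.05, 0.05]])
print(rocchio_update(query, relevant, nonrelevant))
```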
DOI: 10.17762/ijritcc2321-8169.15075
