70 research outputs found
An Efficient Medical Image Processing Approach Based on a Cognitive Marine Predators Algorithm
Image processing aims to enhance image quality so that images are easier for both people and machines to interpret. Medical image processing shares many concepts with biomedical signal processing and involves evaluation, enhancement, and presentation; medical imaging focuses on acquiring images for both diagnostic and therapeutic purposes. The existing Marine Predator Algorithm exhibits several disadvantages when automated optimization algorithms are applied to the problem of ECG classification. The proposed method follows this pipeline: data collection; image preprocessing using histogram equalization; segmentation using the Otsu threshold algorithm; feature extraction using the contour method; feature selection using the Neighborhood Component Analysis (NCA) algorithm; and optimization using the proposed Cognitive Marine Predators Algorithm (CMPA). Using CMPA, base layers are fused with the best feasible parameters, producing enhanced, high-quality output images. Finally, the image processing performance is analyzed. The proposed approach overcomes the drawbacks of existing algorithms and efficiently improves the quality of medical images.
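The abstract names two standard stages, histogram equalization and Otsu thresholding, without giving details; the CMPA optimizer itself is not specified. The sketch below implements only those two named stages from their textbook definitions, on a synthetic image (the image and all variable names are assumptions for illustration, not the paper's data or method).

```python
import numpy as np

def histogram_equalize(img):
    # Map grey levels through the normalized cumulative histogram (CDF).
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    lut = np.round((cdf - cdf_min) / (img.size - cdf_min) * 255).astype(np.uint8)
    return lut[img]

def otsu_threshold(img):
    # Choose the threshold that maximizes between-class variance.
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    total = img.size
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0 = hist[:t].sum() / total
        w1 = 1.0 - w0
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = (np.arange(t) * hist[:t]).sum() / hist[:t].sum()
        mu1 = (np.arange(t, 256) * hist[t:]).sum() / hist[t:].sum()
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

rng = np.random.default_rng(0)
# Synthetic "scan": dark background with one brighter region of interest.
img = rng.normal(60, 10, (64, 64))
img[20:44, 20:44] += 100
img = np.clip(img, 0, 255).astype(np.uint8)

eq = histogram_equalize(img)      # preprocessing stage
t = otsu_threshold(eq)            # segmentation stage
mask = eq >= t                    # foreground = bright region
```

Equalization is monotone, so it preserves the ordering of grey levels; Otsu then separates the two modes of the bimodal histogram.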
Involving the Calculations of Machine Learning to Propose Strategy for the Expectation Examination in Data Mining
Data mining is the process of identifying patterns and trends in large, complex data volumes. Conventional data processing is not adequate for such complex data, so data mining is used to handle these large volumes. This paper explains the data mining process and the kinds of data mining, and also discusses the pros and cons of data mining and the issues related to its implementation. Data mining helps solve various business problems by analyzing data and information; it is used to discover patterns, determine dependencies among the available data sets, and predict future trends that inform critical business decisions. Machine learning is the concept of training systems so that, in the future, they train themselves based on previous learning experiences. The paper explains the machine learning methodologies used for data mining.
An Experimental Analysis of Different Database Categories According to the Scope of Applications and Its Advantages
This paper explains the concept and meaning of a database. To store data, organisations need a system with the capacity to hold all the data and information related to the organisation; such a system should be easy to access and easy to maintain. A system that can store such data, and from which the information can be accessed easily, is called a database. Each organisation has its own unique set of information, so the type of database it needs will also differ, and a variety of databases are available in the market. This paper focuses on the varieties of databases and their advantages and challenges.
Storage Security and Predictable Folder Structures in Cloud Computing
The open nature of HTML content, and of the URLs used to access the resources that render a page, leaves the folder structure and location of those files vulnerable to robots, external hackers, and malicious insider attacks, typically referred to as an XSS attack. A malicious user can study the HTML structure, discover the pattern or folder structure of the stored files, and, with the help of robots or crawlers, try to access the rest of the files residing on the server, whether or not he is authorized to obtain them. Such attacks vary from simple ones, in which only the resources referenced in the web page content are stolen, to those in which the directories are crawled and all the resources at those locations are accessed, listed, or used. An XSS attack is easy to launch with little effort, while its damage is severe in the case of the cloud.
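A common countermeasure to the predictable-folder-structure problem described above (one of several, and not necessarily the paper's proposal) is to store files under random, non-guessable keys so that one leaked URL reveals nothing about sibling files. A minimal stdlib sketch, with the function name and path chosen for illustration:

```python
import secrets
from pathlib import PurePosixPath

def opaque_storage_key(filename: str) -> str:
    # Keep only the extension; replace the guessable directory path and
    # name with a cryptographically random token, so a crawler cannot
    # enumerate neighbouring files from one observed URL.
    ext = PurePosixPath(filename).suffix
    return secrets.token_urlsafe(16) + ext

key_a = opaque_storage_key("uploads/2024/q1-report.pdf")
key_b = opaque_storage_key("uploads/2024/q1-report.pdf")
```

The mapping from opaque key back to the logical path would live in an access-controlled database, so the server can still authorize each request.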
Motif and Conglomeration of Software Process Improvement Model
Software process improvement (SPI) is defined as an organized and regular process of improving software development so that a business body or company can produce and deliver quality software on deadline and within its financial limits. In the same concern, this work produces an established model for that purpose, fully dedicated to the development of quality-oriented software and to the welfare of the business. The idea is based on the experience of the software project work of companies [1]. The work encloses estimation, improvement of the software process, and the reasons and causes that sway the process of SPI. The decisive aim was to build a model that companies doing development work could apply in practice. A complete description of software principles, the software process, and improvement models is given. A generic process model is proposed that is beneficial for small as well as large organizations. Many models already exist, such as CMM, Six Sigma, and IDEAL; in the given work, many drawbacks of these models are removed to increase performance, and many new steps and policies are applied, eliminating the boundaries of the previous models. The model has eight steps which, if applied by any small or large company, promise a quality product with better performance compared to other models. A mixture of corrective actions has been used; the model is a growing, step-by-step procedure that still uses the conventional method, and it limits the risk factor to a good extent.
Finest Execution Time Approach for Optimal Execution Time in Mobile and Cloud Computing
This is a time when modernism and innovation in new technology and trends are at their highest point. Mobile computing has become the technology in which interaction between man and computer is carried even while the usual operations, such as data, voice, and video, are transferred. The second dimension we take into account is the growing trend of cloud computing: a huge arrangement of hardware and applications that makes up the cloud environment. Smart mobile devices (SMDs) are constrained in many respects, such as energy and power and computing resources, so there are many issues that can be considered challenges, for example energy saving and task execution time. One important issue is application offloading, which is related to the shortage of resources: fundamentally, offloading is the procedure by which heavy, computation-intensive tasks are migrated to the cloud. We have done our work in this context. In this paper we propose an algorithm known as the FET algorithm, which stands for Finest Execution Time; it is used to reduce the total execution time of the tasks required to be finished at the SMD. We have taken the length of the task as the core parameter of this algorithm, and the algorithm works in two passes. We have validated our results by comparison with existing methods; our algorithm gives a benefit of 10 to 13 percent in total execution time.
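The abstract states only that FET uses task length as its core parameter; it does not give the algorithm's formulas. The sketch below therefore shows the generic idea behind length-based offloading decisions, not FET itself: estimate the finish time of each task locally versus on the cloud (including a transfer overhead) and send it to whichever side is faster. All parameter names and numbers are assumptions for illustration.

```python
def offload_plan(task_lengths, local_speed, cloud_speed, offload_overhead):
    # Pass 1: estimate local and remote finish time per task
    # (length in instructions, speed in instructions per second).
    # Pass 2: assign each task to whichever side finishes it sooner.
    plan = []
    for length in task_lengths:
        t_local = length / local_speed
        t_cloud = length / cloud_speed + offload_overhead
        plan.append("cloud" if t_cloud < t_local else "local")
    return plan

# Short tasks stay on the device; long ones amortize the transfer cost.
plan = offload_plan([100, 2000, 50, 5000], local_speed=10.0,
                    cloud_speed=100.0, offload_overhead=30.0)
```

With these numbers, only the tasks long enough to outweigh the 30-second offload overhead are migrated.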
Advancing the Potential of Routing Protocol in Mobile Ad Hoc Network
An ad hoc network consists of nodes with wireless radios in a multi-hop network environment [3]. Messages can be sent anywhere, within limits, with the help of intermediate nodes. Broadcasting in mobile ad hoc networks (MANETs) is the process of sending a message from one node to the other nodes of the network [1]. It has far-reaching applications in MANETs and provides significant control and route administration for all types of protocols, be they unicast or multicast. Finding a robust routing protocol has become an important topic in networking research. MANET routing protocols such as DSR and AODV use routing information and location routing to establish routes [5]. There are many problems in broadcasting in MANETs, for reasons such as variable and unpredictable node characteristics, fluctuation of signal strength, channel contention, and packet collision [4]. This study addresses these problems with a neighbor-coverage-based protocol to reduce routing overhead in MANETs. A connectivity factor is also discussed, arranging the neighbor coverage system to provide density adaptation [7]. The AODV protocol can play an important role in the optimizing mechanism. This paper presents a new type of rebroadcasting, evaluated against several performance metrics using the NS-2 simulator [9].
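The core of a neighbor-coverage-based rebroadcast scheme, as referenced above, is a per-node decision: forward a broadcast only if it would reach neighbors the sender's own transmission did not already cover. A minimal set-based sketch of that decision rule (node names and the exact rule are illustrative, not the paper's protocol):

```python
def should_rebroadcast(sender, my_neighbors, sender_neighbors):
    # Rebroadcast only if this node reaches at least one neighbor that
    # neither the sender itself nor its one-hop broadcast already covers.
    uncovered = my_neighbors - sender_neighbors - {sender}
    return bool(uncovered)

# Node B hears a broadcast from A. A's transmission covered {B, C};
# B additionally reaches D, so B should forward the packet.
decision = should_rebroadcast("A", {"A", "C", "D"}, {"B", "C"})

# If B reaches no one new, it stays silent and saves overhead.
decision2 = should_rebroadcast("A", {"A", "C"}, {"B", "C"})
```

Suppressing redundant rebroadcasts this way is what reduces the routing overhead and channel contention mentioned in the abstract.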
CBIR by Using Features of Shape and Color
Geometrical features are a key issue in content-based image retrieval (CBIR). In prior work, various texture features have been proposed in the literature, including statistical and spectral methods, but in many cases texture is not captured precisely. The most critical texture feature in an image is edge information. Recently, work on multi-scale analysis, particularly curvelet research, has offered a good opportunity to extract more accurate texture features for image retrieval. The curvelet has shown promising performance, although it was originally proposed for image de-noising. In this paper, a new image feature based on the curvelet transform is proposed: we apply the discrete curvelet transform to the texture image and, from the transformed images, compute low-order statistics. Images are then represented using the extracted texture features. We discuss the design, implementation, and performance analysis of Tamara's new statistical-feature-based image retrieval system. One of our major contributions is a new scalable image retrieval scheme using shape- and color-based features, which is shown to scale to high-dimensional image data.
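The abstract's curvelet statistics are not reproducible from its description, but its other stated contribution, color-based retrieval features, can be sketched from first principles: quantize each image into a joint color histogram and rank images by histogram distance. The bin count, test images, and L1 distance below are assumptions for illustration, not the paper's exact scheme.

```python
import numpy as np

def color_histogram(img, bins=8):
    # Quantize each RGB channel into `bins` levels and build the joint
    # (bins**3)-bin histogram, normalized to sum to 1.
    q = (img.astype(int) * bins) // 256
    idx = q[..., 0] * bins * bins + q[..., 1] * bins + q[..., 2]
    hist = np.bincount(idx.ravel(), minlength=bins ** 3).astype(float)
    return hist / hist.sum()

def histogram_distance(h1, h2):
    # L1 distance between two normalized histograms (0 = identical).
    return np.abs(h1 - h2).sum()

rng = np.random.default_rng(1)
red = np.zeros((32, 32, 3), np.uint8); red[..., 0] = 200
blue = np.zeros((32, 32, 3), np.uint8); blue[..., 2] = 200
noisy_red = np.clip(red.astype(int) + rng.integers(-10, 10, red.shape),
                    0, 255).astype(np.uint8)

d_same = histogram_distance(color_histogram(red), color_histogram(noisy_red))
d_diff = histogram_distance(color_histogram(red), color_histogram(blue))
```

A query image would simply be ranked against the database by this distance; shape features (e.g. contour statistics) can be concatenated to the same vector.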
Review of Linguistic Text Steganographic Methods
Steganography is a method of concealing confidential data in a cover file such that an attacker cannot detect the clandestine data. Steganography exploits a cover message, for example text, picture, audio, or video, to conceal a secret message. Text steganography is a procedure to conceal textual content inside text messages of the same type. Linguistic steganography is language-based steganography, which proposes more advanced methods for hiding secret messages in text. Initially, linguistic text steganographic techniques were developed only for the English language, but nowadays different regional languages, such as Hindi, are also used to hide information. This paper reviews the different linguistic text steganographic methods for the Hindi and English languages.
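To make the hide-in-text idea concrete, here is one simple, well-known text steganography scheme (zero-width character encoding), offered only as an illustration of the general principle; it is not one of the specific Hindi or English linguistic methods the review covers.

```python
# Zero-width space / zero-width non-joiner encode the bits 0 and 1.
# Both are invisible in most renderers, so the cover text looks unchanged.
ZW0, ZW1 = "\u200b", "\u200c"

def hide(cover: str, secret: str) -> str:
    # Append the secret as 8-bit characters encoded in zero-width marks.
    bits = "".join(f"{ord(c):08b}" for c in secret)
    payload = "".join(ZW1 if b == "1" else ZW0 for b in bits)
    return cover + payload

def reveal(stego: str) -> str:
    # Extract only the zero-width marks and decode them back to text.
    bits = "".join("1" if ch == ZW1 else "0"
                   for ch in stego if ch in (ZW0, ZW1))
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

msg = hide("Meeting at noon.", "hi")
```

Linguistic methods differ from this in that they hide bits in the language itself (synonym choice, word order, diacritics in Hindi) rather than in invisible characters, which makes them more robust to retyping.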
- …