8 research outputs found
Dynamic Slice of Aspect Oriented Program: A Comparative Study
Aspect Oriented Programming (AOP) is an emerging technology for separating crosscutting concerns, which are very difficult to handle in object-oriented programming (OOP). AOP is particularly suitable where code scattering and code tangling arise. Because of AOP-specific language features such as join points, pointcuts, advice and introduction, existing slicing algorithms for procedural or object-oriented programs cannot be applied to AOP directly. This paper surveys different types of program slicing approaches for AOP using a very simple example, and also presents a new approach to compute the dynamic slice of an AOP program. The complexity of the proposed algorithm is better than that of some existing algorithms.
Rough ACO: A Hybridized Model for Feature Selection in Gene Expression Data
Dimensionality reduction of a feature set is a common preprocessing step in pattern recognition, classification applications and compression schemes. Rough Set Theory is one of the popular methods used for this purpose and can be shown to be optimal under different optimality criteria. This paper proposes a novel method for dimensionality reduction that chooses a subset of the original features containing most of the essential information, by hybridizing Ant Colony Optimization (ACO) with Rough Set Theory; we call this method Rough ACO. The proposed method is successfully applied to choose the best feature combinations, and the Upper and Lower Approximations are then used to find the reduced set of features from gene expression data.
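A minimal sketch of the general idea behind such a hybrid, not the paper's algorithm: ants sample candidate feature subsets guided by pheromone, and each subset is scored with a rough-set dependency measure built from lower approximations. All parameters (number of ants, iterations, subset size, evaporation rate) and the toy data are assumptions made for illustration only.

```python
import random
from collections import defaultdict

def rough_dependency(rows, labels, subset):
    """Gamma(B): fraction of samples in the positive region, i.e. in equivalence
    classes (w.r.t. feature subset B) consistent with a single decision label."""
    eq_classes = defaultdict(list)
    for i, row in enumerate(rows):
        eq_classes[tuple(row[f] for f in subset)].append(i)
    positive = sum(len(m) for m in eq_classes.values()
                   if len({labels[i] for i in m}) == 1)
    return positive / len(rows)

def rough_aco(rows, labels, n_features, subset_size=2, n_ants=10, n_iters=20,
              evaporation=0.1, seed=0):
    rng = random.Random(seed)
    pheromone = [1.0] * n_features
    best_subset, best_score = None, -1.0
    for _ in range(n_iters):
        for _ in range(n_ants):
            # Sample `subset_size` distinct features, weighted by pheromone.
            weights = pheromone[:]
            subset = []
            for _ in range(subset_size):
                r, acc = rng.uniform(0, sum(weights)), 0.0
                for f, w in enumerate(weights):
                    acc += w
                    if r <= acc:
                        subset.append(f)
                        weights[f] = 0.0
                        break
            score = rough_dependency(rows, labels, subset)
            if score > best_score:
                best_subset, best_score = sorted(subset), score
        # Evaporate, then reinforce pheromone on the best subset found so far.
        pheromone = [(1 - evaporation) * p for p in pheromone]
        for f in best_subset:
            pheromone[f] += best_score
    return best_subset, best_score

# Toy discretised "expression" matrix: 6 samples x 4 features, plus class labels.
rows = [[1, 0, 1, 0], [1, 1, 1, 0], [0, 0, 0, 1],
        [0, 1, 0, 1], [1, 0, 1, 1], [0, 1, 0, 0]]
labels = [1, 1, 0, 0, 1, 0]
print(rough_aco(rows, labels, n_features=4))
```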
Building Scalable Cyber-Physical-Social Networking Infrastructure Using IoT and Low Power Sensors
Wireless sensors are an important component in developing the Internet of Things (IoT) sensing infrastructure. Enormous numbers of sensors are connected with each other to form a network (a wireless sensor network) that completes the IoT infrastructure, and these deployed sensors have limited energy and processing capabilities. The IoT infrastructure is a key factor in building cyber-physical-social networking infrastructure, where all the sensing devices transmit data towards the cloud data center. Routing data towards the cloud data center using such low power sensors is still a challenging task. To prolong the lifetime of the IoT sensing infrastructure and build a scalable cyber infrastructure, sensing optimization and energy efficient data routing are required. To address these issues, this paper proposes a novel rendezvous data routing protocol for low power sensors. The proposed method divides the sensing area into a number of clusters to lessen energy consumption through data accumulation and aggregation, so that a smaller amount of data is streamed into the network. Another major reason to select cluster-based data routing is to reduce the control overhead. Finally, the proposed method is simulated in the Castalia simulator to observe its performance. The results show that the proposed method is energy efficient and prolongs the network lifetime for a scalable IoT infrastructure.
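As a rough illustration of the clustering-and-aggregation idea (not the paper's rendezvous protocol), the sketch below divides a sensing field into grid cells, elects the highest-energy node in each cell as cluster head, and aggregates member readings so only one packet per cluster travels on toward the sink. Field size, grid dimensions, node energies and readings are all made-up values.

```python
import random
from collections import defaultdict

random.seed(1)
FIELD, GRID = 100.0, 4          # 100 m x 100 m field split into a 4 x 4 cluster grid

# Each node: (x, y, residual_energy_in_joules)
nodes = [(random.uniform(0, FIELD), random.uniform(0, FIELD), random.uniform(0.5, 2.0))
         for _ in range(40)]

def cluster_of(node):
    """Map a node to a grid cell, dividing the sensing area into clusters."""
    x, y, _ = node
    return (min(int(x / (FIELD / GRID)), GRID - 1),
            min(int(y / (FIELD / GRID)), GRID - 1))

clusters = defaultdict(list)
for n in nodes:
    clusters[cluster_of(n)].append(n)

# Elect the node with the highest residual energy in each cluster as its head;
# members send readings to the head, which aggregates (here: averages) them.
for cell, members in sorted(clusters.items()):
    head = max(members, key=lambda n: n[2])
    readings = [random.gauss(25.0, 1.0) for _ in members]   # fake sensor samples
    aggregate = sum(readings) / len(readings)
    print(f"cluster {cell}: {len(members)} nodes, head energy {head[2]:.2f} J, "
          f"aggregated reading {aggregate:.2f}")
```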
Deep learning model for elevating internet of things intrusion detection
The internet of things (IoT) greatly impacts daily life by enabling efficient data exchange between objects and servers. However, cyber-attacks pose a serious threat to IoT devices. Intrusion detection systems (IDS) are vital for safeguarding networks, and machine learning methods are increasingly used to enhance security. Continuous improvement in accuracy and performance is crucial for effective IoT security. Deep learning not only outperforms traditional machine learning methods but also holds untapped potential for strengthening IDS. This paper introduces a deep learning framework tailored for anomaly detection within IoT networks, leveraging bidirectional long short-term memory (BiLSTM) and gated recurrent unit (GRU) architectures. The hyperparameters of the proposed model are optimized using the JAYA optimization technique. The models are validated on the IoT-23 and MQTTset datasets. Several performance metrics, including accuracy, precision, recall, F-score, true negative rate (TNR), false positive rate (FPR), and false negative rate (FNR), are used to assess the effectiveness of the proposed model. The empirical results are compared with prevailing approaches to intrusion detection for IoT, and the proposed method achieves superior accuracy compared with existing methods.
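A minimal sketch of a stacked BiLSTM + GRU binary classifier of the kind described above, written with Keras. The layer sizes, dropout rate, sequence shape and Adam optimizer are placeholder assumptions; the paper's JAYA-based hyperparameter tuning is not reproduced here.

```python
import tensorflow as tf

def build_ids_model(timesteps: int, n_features: int) -> tf.keras.Model:
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(timesteps, n_features)),
        # Bidirectional LSTM reads each window of traffic features in both directions.
        tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True)),
        # GRU layer summarises the sequence into a fixed-size representation.
        tf.keras.layers.GRU(32),
        tf.keras.layers.Dropout(0.3),
        # Sigmoid output: probability that the traffic window is anomalous.
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy",
                           tf.keras.metrics.Precision(),
                           tf.keras.metrics.Recall()])
    return model

model = build_ids_model(timesteps=10, n_features=20)
model.summary()
```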
Aspect oriented programs: Issues and perspective
Aspect oriented programming (AOP) helps programmers separate crosscutting concerns. All programming methodologies support separation and encapsulation of concerns, but in object-oriented programming (OOP) crosscutting concerns are distributed among objects, and they are hard to address because the relevant code is scattered across different objects. In AOP, crosscutting concerns are addressed using a single entity called an aspect. This paper discusses a variety of existing slicing techniques for AOP and presents a novel method to compute the dynamic slice of an AOP program, using an Aspect Oriented System Dependence Graph (AOSDG) to represent the program. The complexity of this new approach is equal to or better than that of certain prevailing approaches. Keywords: Aspect, AOP, Slicing, Crosscutting, AOSD
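To illustrate the generic idea behind dependence-graph-based slicing (not the paper's AOSDG construction or its specific algorithm), a slice can be computed as backward reachability from a slicing criterion over data and control dependence edges. The statement names and edges below are a toy example.

```python
from collections import defaultdict, deque

def backward_slice(dep_edges, criterion):
    """dep_edges: (source, target) pairs meaning `target` depends on `source`.
    Returns every statement the criterion transitively depends on."""
    preds = defaultdict(set)
    for src, dst in dep_edges:
        preds[dst].add(src)
    slice_set, worklist = {criterion}, deque([criterion])
    while worklist:
        node = worklist.popleft()
        for p in preds[node]:
            if p not in slice_set:
                slice_set.add(p)
                worklist.append(p)
    return slice_set

# Toy dependence edges for statements s1..s6, where s6 stands for a statement in
# an advice that depends, via a join point, on values computed in the base code.
edges = [("s1", "s3"), ("s2", "s3"), ("s3", "s5"), ("s4", "s5"), ("s5", "s6")]
print(sorted(backward_slice(edges, "s6")))   # -> ['s1', 's2', 's3', 's4', 's5', 's6']
```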
Prediction of Plasma Membrane Cholesterol from 7-Transmembrane Receptor Using Hybrid Machine Learning Algorithm
Research on G-protein coupled receptors (GPCRs) has been carried out over the past decades. A GPCR is also known as a 7-transmembrane (7TM) receptor. From a biological perspective, GPCRs form a large protein family with respective subfamilies and mediate different physiological phenomena such as taste, smell and vision. The main functionality of these 7TM receptors is signal transduction among various cells. The cell membrane plays a significant role in the human body, which is made up of trillions of cells with dissimilar functionality, and the membrane is composed of different components. GPCRs are reported to be modulated by membrane cholesterol through interaction with the cholesterol recognition amino acid consensus (L/V-X(1-5)-Y-X(1-5)-R/K) (CRAC) motif or the reverse orientation of CRAC (R/K-X(1-5)-Y-X(1-5)-L/V) (CARC) motif present in the TM helices. Among these components, cholesterol is one that is regulated by membrane proteins. Here we take GPCRs as the membrane proteins that modulate membrane cholesterol. According to cell biology, GPCRs regulate a wide diversity of vital cellular processes and are targeted by a large fraction of approved drugs. In this paper we concentrate our investigation on membrane proteins and membrane cholesterol. A hybrid algorithm consisting of spectral clustering and a support vector machine is proposed for predicting membrane cholesterol from GPCRs. Spectral clustering works on graph nodes to calculate the cluster points: it builds a similarity matrix, projects the data points into a low-dimensional space, and on that basis finally constructs the cluster centres. A supervised learning method is then used for solving the regression and classification problems. From the analysis we find that our result shows better prediction accuracy and time complexity when compared with two existing models, fuzzy c-means (FCM) and rough set with FCM.
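One simple way to pair spectral clustering with an SVM is sketched below on synthetic data. Appending the cluster assignment as an extra input feature is an assumption made for this sketch; the paper's exact coupling of the two stages, and its GPCR/cholesterol features, are not reproduced here.

```python
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for the real feature matrix and labels.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Stage 1: spectral clustering builds a similarity graph over the samples,
# embeds it in a low-dimensional spectral space, and assigns cluster labels.
clusters = SpectralClustering(n_clusters=2, affinity="rbf", random_state=0).fit_predict(X)

# Stage 2: a supervised SVM is trained on the original features plus the cluster label.
X_aug = np.column_stack([X, clusters])
X_tr, X_te, y_tr, y_te = train_test_split(X_aug, y, test_size=0.3, random_state=0)
svm = SVC(kernel="rbf").fit(X_tr, y_tr)
print("held-out accuracy:", round(svm.score(X_te, y_te), 3))
```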