
    Ontological approach for database integration

    Database integration is one of the research areas that has gained considerable attention from researchers. Its goal is to represent the data from different database sources in one unified form. Two obstacles stand in the way of database integration: the distribution of data and its heterogeneity. The Web addresses the distribution problem; for heterogeneity, several approaches can be used, such as data warehouses and federated databases. The problem with these two approaches is their lack of semantics, so our approach exploits the Semantic Web methodology. The hybrid ontology method can be used to solve the database integration problem. In this method two elements are available, the source (database) and the domain ontology, but the local ontology is missing; to ensure the success of the method, the local ontologies must be produced. Our approach obtains the semantics from the logical model of the database to generate the local ontology, and then validation and enhancement are acquired from the semantics obtained from the conceptual model of the database. The approach is therefore applied in two phases: generation and validation-enrichment.
    In the generation phase, we utilise reverse engineering techniques to capture the semantics hidden in the SQL definitions, reproduce the logical model of the database, and finally apply our transformation system to generate an ontology. The transformation system generates all the concepts: classes, relationships and axioms. Firstly, the class-creation process combines many rules that participate together to produce classes. Our rules solve problems such as fragmentation and hierarchy, eliminate the superfluous classes of multi-valued attribute relations, and handle neglected cases such as relationships with additional attributes; the final class-creation rule covers generic relations. The rules for relationships between concepts are generated while eliminating relationships between integrated concepts. Finally, many rules transform relationship and attribute constraints into axioms in the ontological model. The formal rules of our approach are domain independent, and the approach produces a generic ontology that is not restricted to a specific ontology language. The rules consider the gap between the database model and the ontological model; therefore, some database constructs have no equivalent in the ontological model.
    The second phase consists of the validation and enrichment processes. The best way to validate the transformation result is to use the semantics obtained from the conceptual model of the database. In validation, the domain expert captures missing or superfluous concepts (classes or relationships). In enrichment, the generalisation method can be applied to classes that share common attributes, and complex or composite attributes can be represented as classes. We implemented the transformation system in a tool called SQL2OWL to show the correctness and functionality of our approach. The evaluation of the system showed the success of the proposed approach and used several techniques. Firstly, a comparative study was held between the results produced by our approach and similar approaches. The second evaluation technique is a weighted scoring system that specifies the criteria affecting the transformation system. The final evaluation technique is a scoring scheme: we assess the quality of the transformation system by applying a compliance measure to show the strength of our approach compared to existing approaches. Finally, the measures of success that our approach considers are system scalability and completeness.
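    As an illustration of the kind of class-creation rule described above, the sketch below (not the authors' formal rules) maps a relational table to an OWL class, foreign keys to object properties, and the remaining columns to data properties, emitted in OWL functional-style syntax. The simple schema model, rule and names are hypothetical.

```python
# Hypothetical sketch of one table-to-ontology transformation rule.
from dataclasses import dataclass, field

@dataclass
class Table:
    name: str
    columns: list
    foreign_keys: dict = field(default_factory=dict)  # column -> referenced table

def table_to_owl(table: Table) -> list:
    """Emit OWL axioms (functional-style syntax) for a single table."""
    axioms = [f"Declaration(Class(:{table.name}))"]
    # Each foreign key becomes an object property between the two classes
    for col, ref in table.foreign_keys.items():
        prop = f"has{ref}"
        axioms.append(f"Declaration(ObjectProperty(:{prop}))")
        axioms.append(f"ObjectPropertyDomain(:{prop} :{table.name})")
        axioms.append(f"ObjectPropertyRange(:{prop} :{ref})")
    # Remaining columns become data properties of the class
    for col in table.columns:
        if col not in table.foreign_keys:
            axioms.append(f"Declaration(DataProperty(:{col}))")
            axioms.append(f"DataPropertyDomain(:{col} :{table.name})")
    return axioms

# Example: an Employee table with a foreign key to Department
employee = Table("Employee", ["emp_id", "name", "dept_id"], {"dept_id": "Department"})
print("\n".join(table_to_owl(employee)))
```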

    An Enhanced Source Location Privacy based on Data Dissemination in Wireless Sensor Networks (DeLP)

    A Wireless Sensor Network is a network of a large number of nodes with limited power and computational capabilities. It has the potential for event monitoring in unattended locations where there is a chance of unauthorized access. The work presented here identifies and addresses the problem of eavesdropping in the exposed environment of a sensor network, which makes it easy for an adversary to trace packets back to the originating source node, thereby compromising contextual privacy. Our scheme provides an enhanced three-level security system for source location privacy. The base station is at the center of a square grid of four quadrants and is surrounded by a ring of flooding nodes, which act as the first step in confusing the adversary. A fake node is deployed in the quadrant opposite the actual source and starts reporting to the base station. The selection of a phantom node in another quadrant, using our algorithm, provides the third level of confusion. The results show that DeLP reduces energy utilization by 50%, increases the safety period by 26%, provides a six times higher packet delivery ratio, and achieves a further 15% decrease in packet delivery delay compared with the tree-based scheme. It also provides a 334% longer safety period than phantom routing, while lagging behind in other parameters owing to the simplicity of the phantom scheme. This work illustrates the privacy protection of the source node, and the designed procedure may be useful in designing more robust algorithms for location privacy.
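    A minimal sketch of the quadrant idea described above, not the authors' algorithm: with the base station at the origin of the square grid, the fake node's quadrant is chosen diagonally opposite the source's quadrant, and the phantom node's quadrant is drawn from the remaining two. The grid geometry and the random choice are illustrative assumptions.

```python
# Hypothetical decoy-quadrant selection around a base station at the origin.
import random

def quadrant(x: float, y: float) -> int:
    """Quadrant (1-4) of a node relative to the base station at the origin."""
    if x >= 0 and y >= 0: return 1
    if x < 0 and y >= 0: return 2
    if x < 0 and y < 0: return 3
    return 4

def select_decoy_quadrants(source_xy):
    src_q = quadrant(*source_xy)
    fake_q = ((src_q + 1) % 4) + 1           # diagonally opposite quadrant
    remaining = [q for q in (1, 2, 3, 4) if q not in (src_q, fake_q)]
    phantom_q = random.choice(remaining)     # third level of confusion
    return src_q, fake_q, phantom_q

print(select_decoy_quadrants((12.0, -7.5)))  # e.g. (4, 2, 1) or (4, 2, 3)
```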

    Resource Efficient Authentication and Session Key Establishment Procedure for Low-Resource IoT Devices

    The Internet of Things (IoT) can include many resource-constrained devices, most of which need to communicate securely with their network managers, which are more resource-rich devices in the IoT network. We propose a resource-efficient security scheme that includes authentication of devices with their network managers, authentication between devices on different networks, and an attack-resilient key establishment procedure. Using the Automated Validation of Internet Security Protocols and Applications tool-set, we analyse several attack scenarios to determine the security soundness of the proposed solution, and then evaluate its performance analytically and experimentally. The performance analysis shows that the proposed solution occupies little memory and consumes little energy during the authentication and key generation processes. Moreover, it protects the network from well-known attacks (man-in-the-middle attacks, replay attacks, impersonation attacks, key compromise attacks and denial-of-service attacks).
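    The following sketch illustrates the general pattern of lightweight authentication and session-key establishment, assuming a pre-shared key between a device and its network manager. It is not the paper's protocol; the message format, nonce sizes and HKDF-style derivation are assumptions.

```python
# Hypothetical challenge-response authentication plus session-key derivation.
import hmac, hashlib, os

PSK = os.urandom(32)  # pre-shared key provisioned at enrolment (assumption)

def hkdf_sha256(key: bytes, info: bytes, length: int = 32) -> bytes:
    """Single-block HKDF-style derivation (sufficient for a 32-byte key)."""
    prk = hmac.new(b"\x00" * 32, key, hashlib.sha256).digest()
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()[:length]

# Device -> manager: nonce_d; manager -> device: nonce_m + proof of PSK knowledge
nonce_d, nonce_m = os.urandom(16), os.urandom(16)
proof_m = hmac.new(PSK, b"manager" + nonce_d + nonce_m, hashlib.sha256).digest()

# Device verifies the proof; fresh nonces on both sides resist replay attacks
assert hmac.compare_digest(
    proof_m, hmac.new(PSK, b"manager" + nonce_d + nonce_m, hashlib.sha256).digest())

# Both sides can now derive the same session key from the PSK and the nonces
session_key = hkdf_sha256(PSK, b"session" + nonce_d + nonce_m)
```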

    Cooperative Volunteer Protocol to Detect Non-Line of Sight Nodes in Vehicular Ad hoc Networks

    A Vehicular Ad hoc Network (VANET) is a special type of Mobile Ad hoc Network (MANET) application that impacts wireless communications and Intelligent Transport Systems (ITSs). VANETs are employed to develop safety applications for vehicles, creating a safer and less cluttered environment on the road. The many remaining challenges relating to VANETs have encouraged researchers to investigate this field further. For example, issues pertaining to routing protocols, such as delivering warning messages to vehicles facing Non-Line of Sight (NLOS) situations without causing a broadcast storm and channel contention, are regarded as a serious dilemma, especially in congested environments. This prompted the design of an efficient routing-protocol mechanism capable of broadcasting warning messages from emergency vehicles to vehicles under NLOS conditions, reducing overhead and increasing the packet delivery ratio with reduced time delay and channel utilisation. This work used a cooperative approach to develop the routing protocol named the Cooperative Volunteer Protocol (CVP), which uses volunteer vehicles to disseminate the warning message from the source to the target vehicle experiencing an NLOS situation. A novel architecture has been developed by utilising the concept of a Context-Aware System (CAS), which clarifies the on-board unit (OBU) components and their interaction with each other to collect data and make decisions based on the sensed circumstances. The simulation results showed that the proposed protocol outperforms the GRANT protocol on several metrics, such as packet delivery ratio, neighbourhood awareness, channel utilisation, overhead, and latency. The results also showed that the proposed CVP can successfully detect NLOS situations and resolve them effectively and efficiently for both the intersection scenario in urban areas and the highway scenario.
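    A minimal, hypothetical sketch of the cooperative idea described above: when the target vehicle is in an NLOS situation, the source picks a neighbouring volunteer that has line of sight to both ends and asks it to relay the warning message. The line-of-sight test and the distance-based selection are illustrative assumptions, not the CVP specification.

```python
# Hypothetical volunteer selection for relaying a warning to an NLOS vehicle.
from dataclasses import dataclass

@dataclass
class Vehicle:
    vid: str
    position: tuple   # (x, y) in metres
    neighbours: set   # vehicle ids currently in line of sight

def select_volunteer(source: Vehicle, target: Vehicle, candidates: list):
    """Pick the volunteer closest to the target among those that see both ends."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    eligible = [v for v in candidates
                if source.vid in v.neighbours and target.vid in v.neighbours]
    return min(eligible, key=lambda v: dist(v.position, target.position), default=None)

# Example: the target is hidden behind a building, so the source relays via v2
src = Vehicle("src", (0, 0), {"v1", "v2"})
tgt = Vehicle("tgt", (120, 40), {"v2"})
v1 = Vehicle("v1", (30, 5), {"src"})
v2 = Vehicle("v2", (80, 30), {"src", "tgt"})
print(select_volunteer(src, tgt, [v1, v2]).vid)   # -> "v2"
```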

    Automated deep bottleneck residual 82-layered architecture with Bayesian optimization for the classification of brain and common maternal fetal ultrasound planes

    Despite a worldwide decline in maternal mortality over the past two decades, a significant gap persists between low- and high-income countries, with 94% of maternal mortality concentrated in low- and middle-income nations. Ultrasound serves as a prevalent diagnostic tool in prenatal care for monitoring fetal growth and development. Nevertheless, acquiring standard fetal ultrasound planes with accurate anatomical structures proves challenging and time-intensive, even for skilled sonographers. Therefore, an automated computer-aided diagnostic (CAD) system is required for determining common maternal-fetal planes from ultrasound images. A new deep learning architecture based on a residual bottleneck mechanism is proposed that is 82 layers deep. The proposed architecture adds three residual blocks, each including two highway paths and one skip connection. In addition, a 3 × 3 convolutional layer is added before each residual block. In the training process, several hyperparameters are initialized using Bayesian optimization (BO) rather than manual initialization. Deep features are extracted from the average pooling layer and used to perform the classification. Because the classification process increased the computational time, we propose an improved search-based moth flame optimization algorithm for optimal feature selection. The data are then classified using neural network classifiers based on the selected features. The experimental phase involved the analysis of ultrasound images, specifically fetal brain and common maternal-fetal images. The proposed method achieved 78.5% and 79.4% accuracy for fetal brain planes and common maternal-fetal planes, respectively. Comparison with several pre-trained neural networks and state-of-the-art (SOTA) optimization algorithms shows improved accuracy.
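    A minimal PyTorch sketch, not the authors' implementation, of one bottleneck residual block with two highway paths and an identity skip connection, preceded by a 3 × 3 convolutional layer as described above. The channel counts and exact layer composition are assumptions.

```python
# Hypothetical bottleneck residual block: 3x3 pre-conv, two highway paths, skip.
import torch
import torch.nn as nn

class BottleneckResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # 3x3 convolutional layer placed before the residual block
        self.pre_conv = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        # Highway path 1: 1x1 -> 3x3 bottleneck
        self.path1 = nn.Sequential(
            nn.Conv2d(channels, channels // 2, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 2, channels, kernel_size=3, padding=1),
        )
        # Highway path 2: parallel 3x3 path
        self.path2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.pre_conv(x)
        # Sum the two highway paths with the identity skip connection
        return self.relu(self.path1(x) + self.path2(x) + x)

# Usage: stack three such blocks, as in the described design
blocks = nn.Sequential(*[BottleneckResidualBlock(64) for _ in range(3)])
out = blocks(torch.randn(1, 64, 64, 64))
```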

    Hybridizing cost saving with trust for blockchain technology adoption by financial institutions

    Distributed Ledger Technology (DLT) is transforming the financial industry and leading to a rise in modern banking systems. As in developed nations, disruptive technology is necessary to advance the traditional banking system in emerging economies. The present study investigates the critical factors that influence a user's intention to accept blockchain technology in financial institutions. The proposed model is based on Technology Acceptance Model (TAM) constructs extended with trust and cost saving, and is tested using structural equation modelling. Findings from an online survey of 188 practitioners working in Malaysia's financial sector confirm that all constructs, except the effect of trust on perceived usefulness, have a significant impact on blockchain implementation. Moreover, cost saving matters most in the adoption of this disruptive technology by financial institutions. Based on the findings, the subsequent theoretical and practical implications are assessed, albeit with notable limitations.

    A novel bottleneck residual and self-attention fusion-assisted architecture for land use recognition in remote sensing images

    The massive yearly population growth is causing hazards to spread swiftly around the world and to have a detrimental impact on both human life and the world economy. By enabling accurate early prediction, remote sensing helps safeguard the globe against weather-related threats and natural disasters. Convolutional neural networks, a hallmark of deep learning, have more recently been used to reliably identify land use in remote sensing images. This work proposes a novel bottleneck residual and self-attention fusion-assisted architecture for land use recognition from remote sensing images. First, we propose using a fast neural-style approach to generate cloud-effect satellite images; for the neural style, we propose a 5-layered residual-block CNN to estimate the loss of the neural-style images. After that, we propose two novel architectures, a 3-layered bottleneck CNN and a 3-layered bottleneck self-attention CNN, for the classification of land use images. Both architectures are trained on the original dataset and on the neural-style-generated dataset. Subsequently, features are extracted from the deep layers and merged using an innovative serial approach based on weighted entropy. A novel Chimp Optimization technique is then applied to the fused features to remove redundant and superfluous data and further refine them. Finally, the selected features are classified with neural network classifiers. The experimental procedure yielded accuracy rates of 99.0% and 99.4% on the two datasets, respectively. When compared with state-of-the-art (SOTA) methods, the outcomes generated by the proposed framework demonstrated enhanced precision and accuracy.
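    A minimal sketch of serial (concatenation-based) feature fusion weighted by entropy, in the spirit of the approach described above; the histogram-based entropy estimate and the per-vector weighting are illustrative assumptions, not the authors' exact formulation.

```python
# Hypothetical weighted-entropy serial fusion of two deep-feature vectors.
import numpy as np

def entropy_weight(features: np.ndarray, bins: int = 32) -> float:
    """Normalized Shannon entropy of a feature vector (histogram-based estimate)."""
    hist, _ = np.histogram(features, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum() / np.log2(bins))

def serial_entropy_fusion(f1: np.ndarray, f2: np.ndarray) -> np.ndarray:
    """Weight each feature vector by its entropy, then concatenate serially."""
    return np.concatenate([entropy_weight(f1) * f1, entropy_weight(f2) * f2])

# Example: fuse features from the bottleneck CNN and the self-attention CNN
fused = serial_entropy_fusion(np.random.rand(1024), np.random.rand(1024))
print(fused.shape)  # (2048,)
```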