
    A Review of Real World Big Data Processing Structure: Problems and Solutions

    The type and volume of data in human society are growing at an astonishing pace, driven by emerging services such as cloud computing, the Internet of Things, and location-based services: the era of big data has arrived. As data has become a fundamental resource, how to better manage and use big data has attracted much attention. In particular, with the development of the Internet of Things, how to process large volumes of real-time data has become a great challenge in research and applications. Recently, cloud computing technology has attracted much attention for its high performance, but how to use it for large-scale real-time data processing has not been thoroughly studied. This paper first examines the challenges of big data and distills them into six problems. To improve the performance of real-time processing of large data, this paper builds a real-time big data processing (RTDP) architecture based on cloud computing technology and then proposes a four-layer architecture and a hierarchical computing model. It also proposes a multi-level storage model and an LMA-based application deployment method to meet the real-time and heterogeneity requirements of an RTDP system. We use DSMS, CEP, batch-based MapReduce and other processing modes, together with FPGA, GPU, CPU and ASIC technologies, as appropriate to process the data at the data-collection terminals. The data are structured, uploaded to the cloud server, and processed with MapReduce, combined with the powerful computing capabilities of the cloud architecture. This paper presents a general framework and computation methods for future RTDP systems and a general approach to RTDP system design
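The terminal-side preprocessing followed by cloud-side MapReduce that this abstract describes can be sketched in miniature. All names and the threshold filter below are illustrative assumptions, standing in for the DSMS/CEP stage at the collection terminal and the MapReduce stage in the cloud:

```python
from collections import defaultdict

def terminal_preprocess(raw_readings, threshold=0.0):
    """Terminal-side stage (stands in for DSMS/CEP filtering): keep and
    structure only readings above an assumed significance threshold."""
    return [{"sensor": s, "value": v} for s, v in raw_readings if v > threshold]

def map_phase(records):
    """Cloud-side map: emit (sensor, value) pairs."""
    for r in records:
        yield r["sensor"], r["value"]

def reduce_phase(pairs):
    """Cloud-side reduce: average values per sensor."""
    acc = defaultdict(list)
    for k, v in pairs:
        acc[k].append(v)
    return {k: sum(vs) / len(vs) for k, vs in acc.items()}

raw = [("s1", 3.0), ("s2", -1.0), ("s1", 5.0)]
structured = terminal_preprocess(raw)       # s2's reading is filtered out
print(reduce_phase(map_phase(structured)))  # -> {'s1': 4.0}
```

In a real RTDP deployment the preprocessing would run on FPGA/GPU/ASIC hardware at the terminal and the reduce stage on a cloud MapReduce cluster; the split of work is the point of the sketch, not the toy aggregation.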

    A Process Model for Developing Semantic Web Systems

    Abstract: Before the Web era, various software development methodologies were proposed for building software applications in different domains. Their main objectives were to meet users' requirements, provide a systematic development process, and reduce the maintenance cost of the developed software. With the emergence of the Web, some existing methodologies were extended to develop web-based software systems. New approaches (or informal methodologies) were also introduced, because the development process for web-based systems is not simply an extension of classical software engineering, even though web-based and non-web-based development share the same basic objective: software development. The development of web-based systems therefore needs new kinds of methodologies that meet and capture their unique requirements. Currently available software development methodologies are unsuitable for web-based software systems, especially for the third-generation Web, the Semantic Web. In this paper, we present a brief review of existing software development methodologies for web-based systems. Some informal methodologies (or approaches) for the Semantic Web are also reviewed. Based on this analytical review, we propose a model for the development of Semantic Web systems. This model can serve as a benchmark for proposing formal methodologies for the development of Semantic Web systems

    An empirical investigation of performance challenges within context‐aware content sharing for vehicular ad hoc networks

    Connected vehicles are a leading use case within the Industrial Internet of Things (IIoT), aimed at automating a range of driving tasks such as navigation, accident avoidance, content sharing and auto-driving. Such systems leverage Vehicular Ad-hoc Networks (VANETs) and include vehicle-to-vehicle (V2V) and vehicle-to-roadside-infrastructure (V2I) communication along with remote systems such as traffic alerts and weather reports. However, the device endpoints in such networks are typically resource-constrained and therefore leverage edge computing, wireless communications and data analytics to improve the overall driving experience, influencing factors such as safety, reliability, comfort, responsiveness and economic efficiency. Our focus in this paper is to identify and highlight open challenges in achieving a secure and efficient convergence between constrained IoT devices and the high-performance capabilities offered by the cloud. We present a context-aware content sharing scenario for VANETs and identify specific requirements for achieving it. We also conduct a comparative study of simulation software for the edge computing paradigm to identify their strengths and weaknesses, especially within the context of VANETs. We use FogNetSim++ to simulate diverse VANET settings with respect to latency and data rate, highlighting challenges and opportunities for future research
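The latency figures such VANET simulations report decompose into a few standard terms. A minimal one-hop V2I offloading model is sketched below; all parameter values are illustrative assumptions, not FogNetSim++ results:

```python
# Toy end-to-end latency for offloading a payload from a vehicle to an
# edge node over one V2I hop. Values are illustrative, not measured.
def offload_latency(payload_bits, rate_bps, distance_m, proc_s):
    propagation = distance_m / 3e8          # radio propagation delay (~speed of light)
    transmission = payload_bits / rate_bps  # serialization onto the wireless link
    return propagation + transmission + proc_s  # plus edge processing time

# 1 MB payload over an assumed 27 Mbit/s link, 300 m away, 5 ms processing:
lat = offload_latency(payload_bits=8e6, rate_bps=27e6, distance_m=300, proc_s=0.005)
print(round(lat * 1000, 2), "ms")  # -> 301.3 ms
```

The dominance of the transmission term at these data rates is why the abstract's latency/data-rate trade-off matters for content sharing.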

    Security Techniques and Solutions for Preventing the Cross-Site Scripting Web Vulnerabilities: A General Approach

    The growth of social networking sites across the World Wide Web is directly proportional to the complexity of user-created HTML content, and this habit is rapidly becoming the norm rather than the exception. Complex user-created web content is a vector for cross-site scripting (XSS) attacks, which hit various websites and compromise confidential user data. Consequently, techniques that protect web applications from XSS attacks have been of recent interest to researchers. In an XSS attack, an attacker embeds a malicious script into the application's output; this contaminated server response is sent to a user's web browser, where it executes and transmits the user's sensitive data to a third party. XSS attacks are typically prevented on the server side by thoroughly examining, filtering and removing malicious content inserted by the attacker. For social networking sites the criticality of XSS attacks is even higher, because attackers can mount socially engineered attacks in which the target user is fooled into thinking that an attack link comes from a 'friend'. The presented solution focuses on prevention techniques for XSS attacks on both the server side and the client side by keeping track of all user requests and information. We also discuss various recent real-world XSS attacks and analyze why filtering mechanisms are often ineffective in defending against them
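The server-side defense the abstract describes, namely neutralizing malicious content before it reaches the browser, is a minimal sketch away. The example below uses Python's standard-library `html.escape`; the `render_comment` function and its markup are hypothetical:

```python
import html

def render_comment(user_input: str) -> str:
    """Server-side XSS defense: escape user-controlled content before
    embedding it in the HTML response, so an injected <script> tag is
    rendered as inert text instead of executing in the victim's browser."""
    return '<div class="comment">{}</div>'.format(html.escape(user_input))

malicious = "<script>steal(document.cookie)</script>"
print(render_comment(malicious))
# -> <div class="comment">&lt;script&gt;steal(document.cookie)&lt;/script&gt;</div>
```

Output escaping like this complements, rather than replaces, the request-tracking and filtering approaches the paper surveys; context-sensitive escaping (attributes, URLs, JavaScript contexts) needs more than one escaping function.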

    A STUDY OF INTERNET THREATS, AVOIDANCE AND BIOMETRIC SECURITY TECHNIQUES - COMPARISON OF BIOMETRIC TECHNIQUES

    In today’s IT world, most communication is done through networking, so the security of information is crucial. Many techniques have been developed for security, involving passwords, encryption, digital signatures, etc. But these techniques can contain vulnerabilities, and hackers can break their security algorithms. Researchers have therefore moved towards biometric security techniques, which identify people based on their physical characteristics or behavioral traits. The choice of biometric method depends on the level of security required and the goals of the system. Biometric identification is an excellent and secure way of authenticating people, but it can also suffer from security threats if proper design considerations are not taken into account. This work presents details of biometric techniques and a detailed comparison of the most prominent biometric techniques

    Real-time adaptive estimation of decoherence timescales for a single qubit

    Characterizing the time over which quantum coherence survives is critical for any implementation of quantum bits, memories, and sensors. The usual method for determining a quantum system's decoherence rate involves a suite of experiments probing the entire expected range of this parameter, with the resulting estimate extracted in postprocessing. Here we present an adaptive multiparameter Bayesian approach, based on a simple analytical update rule, to estimate the key decoherence timescales (T1, T2*, and T2) and the corresponding decay exponent of a quantum system in real time, using information gained in preceding experiments. This approach reduces the time required to reach a given uncertainty by up to an order of magnitude, depending on the specific experiment, compared to the standard protocol of curve fitting. A further speedup of approximately a factor of 2 can be realized by performing our optimization with respect to sensitivity rather than variance
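The core loop of such an adaptive Bayesian scheme can be illustrated with a toy grid-based estimator for a single decay timescale. This is a simplified stand-in, not the paper's analytical update rule: the decay model, grid, shot count, and probe-time heuristic below are all assumptions:

```python
import math, random

# Toy adaptive Bayesian estimation of a decay timescale T2: each simulated
# single-shot measurement updates the posterior, and the next probe time is
# chosen at the current estimate, where the decay curve is informative.
random.seed(1)
T2_true = 50.0                         # hypothetical ground truth (arb. units)
grid = [1.0 + i for i in range(200)]   # candidate T2 values
post = [1.0 / len(grid)] * len(grid)   # uniform prior

def estimate():
    """Posterior mean of T2."""
    return sum(t * p for t, p in zip(grid, post))

for _ in range(300):
    t_probe = estimate()                        # adaptive probe-time choice
    outcome = random.random() < math.exp(-t_probe / T2_true)  # simulated shot
    like = [math.exp(-t_probe / T2) if outcome else 1.0 - math.exp(-t_probe / T2)
            for T2 in grid]
    post = [l * p for l, p in zip(like, post)]  # Bayes rule, then normalize
    z = sum(post)
    post = [p / z for p in post]

print(round(estimate(), 1))  # converges near the true value of 50
```

The paper's contribution is doing this for multiple timescales and the decay exponent simultaneously, with a closed-form update cheap enough to run between shots; the sketch only conveys the measure-update-adapt cycle.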

    Cardiovascular Outcomes and Trends of Transcatheter vs. Surgical Aortic Valve Replacement Among Octogenarians With Heart Failure: A Propensity Matched National Cohort Analysis

    Background: Heart failure (HF) is a complex clinical syndrome with symptoms and signs resulting from any structural or functional impairment of ventricular filling or ejection of blood. Limited data are available regarding the in-hospital outcomes of TAVR compared to SAVR in the octogenarian population with HF. Methods: The National Inpatient Sample (NIS) database was used to compare TAVR versus SAVR among octogenarians with HF. The primary outcome was in-hospital mortality. The secondary outcomes included acute kidney injury (AKI), cerebrovascular accident (CVA), post-procedural stroke, major bleeding, blood transfusions, sudden cardiac arrest (SCA), cardiogenic shock (CS), and mechanical circulatory support (MCS). Results: A total of 74,995 octogenarian patients with HF (TAVR-HF n = 64,890 (86.5%); SAVR-HF n = 10,105 (13.5%)) were included. The median age of patients in TAVR-HF and SAVR-HF was 86 (83-89) and 82 (81-84) years, respectively. TAVR-HF had lower in-hospital mortality (1.8% vs. 6.9%; p < 0.001), CVA (2.5% vs. 3.6%; p = 0.009), SCA (9.9% vs. 20.2%; p < 0.001), AKI (17.4% vs. 40.8%; p < 0.001), major transfusion (26.4% vs. 67.3%; p < 0.001), CS (1.8% vs. 9.8%; p < 0.001), and MCS (0.8% vs. 7.3%; p < 0.001) when compared to SAVR-HF. Post-procedural stroke and major bleeding showed no significant difference. The median unmatched total charges for TAVR-HF and SAVR-HF were $194,561 and $246,100, respectively. Conclusion: In this nationwide observational analysis, TAVR is associated with an improved safety profile for octogenarians with heart failure (both preserved and reduced ejection fraction) compared to SAVR
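The "propensity matched" comparison in the title rests on pairing each treated patient with a control of similar propensity score. A toy 1:1 nearest-neighbour matcher with a caliper conveys the idea; the patient IDs and scores below are invented for illustration and bear no relation to the NIS data:

```python
# Toy 1:1 greedy nearest-neighbour propensity-score matching with a caliper.
# Treated (TAVR-like) and control (SAVR-like) scores are illustrative.
treated  = {"p1": 0.62, "p2": 0.35, "p3": 0.80}
controls = {"c1": 0.60, "c2": 0.33, "c3": 0.78, "c4": 0.10}

def match(treated, controls, caliper=0.05):
    pool = dict(controls)                 # controls still available
    pairs = {}
    # Match highest-score treated first (greedy, a common heuristic).
    for tid, ts in sorted(treated.items(), key=lambda kv: kv[1], reverse=True):
        cid = min(pool, key=lambda c: abs(pool[c] - ts), default=None)
        if cid is not None and abs(pool[cid] - ts) <= caliper:
            pairs[tid] = cid              # accept match within the caliper
            del pool[cid]                 # matching without replacement
    return pairs

print(match(treated, controls))  # -> {'p3': 'c3', 'p1': 'c1', 'p2': 'c2'}
```

Real analyses estimate the scores with a regression on baseline covariates and then compare outcomes within the matched cohort; the unmatched control c4 (score 0.10) is simply discarded, which is how matching trades sample size for comparability.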

    Synergistic effects of H₂O₂ and S₂O₈²⁻ in the gamma radiation induced degradation of congo-red dye: Kinetics and toxicities evaluation

    © 2019 Elsevier B.V. Gamma radiation has received increasing attention due to its high potential for degrading recalcitrant pollutants. In the present study, gamma radiation was used for degradation of congo-red (CR) dye, a highly toxic and carcinogenic pollutant, in the presence of H₂O₂ and S₂O₈²⁻. CR was significantly degraded by gamma radiation alone (i.e., 53%); however, the presence of H₂O₂ and S₂O₈²⁻ promoted degradation of CR to 98% and 87%, respectively, at an absorbed dose of 1184 Gy. Radical scavenger and electron spin resonance studies revealed that gamma radiation decomposes H₂O₂ and S₂O₈²⁻ into •OH and SO₄•⁻, and that both radicals cause degradation of CR. CR showed high reactivity, i.e., 3.25 × 10⁹ and 8.50 × 10⁸ M⁻¹ s⁻¹ with •OH and SO₄•⁻, respectively, and removal of CR was inhibited in the presence of •OH and SO₄•⁻ scavengers. Removal of CR was promoted by elevating the initial concentrations of H₂O₂ and S₂O₈²⁻ and by decreasing the initial concentration of CR. The pH of the aqueous solution also significantly influenced removal of the dye. The proposed degradation pathways of CR were established from the •OH-mediated degradation of CR and the nature of the identified degradation products. The greater mineralization of CR, the formation of small-molecular-mass degradation products, and the decline in acetate concentration after extended treatment suggest that the gamma-ray-mediated peroxide-based process is a promising alternative for degradation of CR
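The two second-order rate constants reported in the abstract already imply how the radicals share the work. Under the simplifying assumption of equal steady-state radical concentrations (an assumption, not a result from the paper), competition kinetics gives the branching fraction directly:

```python
# Branching of CR removal between the two radicals, from the reported
# second-order rate constants (competition kinetics; equal steady-state
# radical concentrations are an illustrative assumption).
k_OH  = 3.25e9   # M^-1 s^-1, CR + hydroxyl radical (from the abstract)
k_SO4 = 8.50e8   # M^-1 s^-1, CR + sulfate radical  (from the abstract)

f_OH = k_OH / (k_OH + k_SO4)   # fraction of CR consumed by the OH pathway
print(round(f_OH, 3))          # -> 0.793, i.e. OH dominates roughly 4:1
```

In the actual system the radical concentrations differ with oxidant dose and pH, which is why the observed 98% vs. 87% removal does not follow this ratio exactly.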