
    Genetic and epigenetic determinants of diffuse large B-cell lymphoma

    Diffuse large B-cell lymphoma (DLBCL) is the most common type of lymphoma and is notorious for its heterogeneity, aggressive nature, and the frequent development of resistance and/or relapse after treatment with standard chemotherapy. To address these problems, a strong emphasis has been placed on researching the molecular origins and mechanisms of DLBCL to develop effective treatments. One of the major insights produced by such research is that DLBCL almost always stems from genetic damage that occurs during the germinal center (GC) reaction, which is required for the production of high-affinity antibodies. Indeed, there is significant overlap between the mechanisms that govern the GC reaction and those that drive the progression of DLBCL. A second important insight is that some of the most frequent genetic mutations in DLBCL affect chromatin and epigenetics, especially proteins that "write" histone post-translational modifications (PTMs). Mutation or deletion of these epigenetic writers often renders cells unable to epigenetically "switch on" critical gene sets required to exit the GC reaction, differentiate, repair DNA, and perform other essential cellular functions. Failure to activate these genes locks cells into a genotoxic state that is conducive to oncogenesis and/or relapse.

    A Novel Feature Set for Application Identification

    Classifying Internet traffic into applications is vital to many areas, from quality of service (QoS) provisioning to network management and security. The task is challenging as network applications are rather dynamic in nature, tend to use a web front-end, and are typically encrypted, rendering traditional port-based and deep packet inspection (DPI) methods unusable. Recent classification studies proposed two alternatives: using the statistical properties of traffic or inferring the behavioural patterns of network applications, both aiming to describe the activity within and among network flows in order to understand application usage and behaviour. This paper proposes and investigates a novel feature set that defines application behaviour as seen through the generated network traffic, by considering the timing and pattern of user events during application sessions, leading to an extended traffic feature set based on burstiness. The selected features were further used to train and test a supervised C5.0 machine learning classifier and led to a better characterization of network applications, with traffic classification accuracy ranging between 90% and 98%.
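The abstract does not spell out the burstiness features themselves; a minimal, hypothetical sketch of the kind of inter-arrival-time statistics such a feature set could include (the metric names and the coefficient-of-variation choice are assumptions, not the paper's definitions):

```python
import statistics

def interarrival_features(timestamps):
    """Hypothetical burstiness features from packet arrival times (seconds).

    Returns the mean inter-arrival time and its coefficient of variation;
    a coefficient well above 1 indicates bursty (clumped) arrivals, while
    evenly spaced packets give a coefficient near 0.
    """
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean_gap = statistics.mean(gaps)
    cov = statistics.pstdev(gaps) / mean_gap if mean_gap > 0 else 0.0
    return {"mean_iat": mean_gap, "iat_cov": cov}

# A bursty flow: three packets back-to-back, a pause, three more.
bursty = interarrival_features([0.00, 0.01, 0.02, 5.00, 5.01, 5.02])
# A smooth flow: evenly spaced packets.
smooth = interarrival_features([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
```

Features of this shape would then be fed, per flow, to the supervised classifier alongside conventional flow statistics.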

    On Internet Traffic Classification: A Two-Phased Machine Learning Approach

    Traffic classification utilizing flow measurement enables operators to perform essential network management. Flow accounting methods such as NetFlow are, however, considered inadequate for classification, requiring additional packet-level information, host behaviour analysis, and specialized hardware, which limits their practical adoption. This paper aims to overcome these challenges by proposing a two-phased machine learning classification mechanism with NetFlow as input. The individual flow classes are derived per application through k-means and are further used to train a C5.0 decision tree classifier. As part of validation, the initial unsupervised phase used flow records of fifteen popular Internet applications that were collected and independently subjected to k-means clustering to determine the unique flow classes generated per application. The derived flow classes were afterwards used to train and test a supervised C5.0-based decision tree. The resulting classifier reported an average accuracy of 92.37% on approximately 3.4 million test cases, increasing to 96.67% with adaptive boosting. The classifier specificity factor, which accounted for differentiating content-specific from supplementary flows, ranged between 98.37% and 99.57%. Furthermore, the computational performance and accuracy of the proposed methodology in comparison with similar machine learning techniques leads us to recommend its extension to other applications in achieving highly granular real-time traffic classification.
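The two-phase idea — unsupervised discovery of flow classes, then a supervised tree trained on them — can be sketched in miniature. This is a toy illustration, not the paper's pipeline: a deterministic 1-D k-means (k=2) on a single hypothetical flow feature, followed by a one-split decision stump standing in for the C5.0 tree:

```python
import numpy as np

def kmeans_1d(x, iters=10):
    """Phase 1: tiny 1-D k-means (k=2). Centers start at the data extremes
    so the run is deterministic."""
    centers = np.array([x.min(), x.max()])
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() for j in (0, 1)])
    return labels, centers

# Hypothetical per-flow feature (e.g. mean packet size in bytes).
flows = np.array([40.0, 42.0, 39.0, 1450.0, 1460.0, 1440.0])
labels, centers = kmeans_1d(flows)

# Phase 2: a one-split decision stump trained on the cluster-derived
# classes, standing in for the supervised C5.0 tree.
threshold = centers.mean()
preds = [int(v > threshold) for v in flows]
```

In the paper the same structure operates on full NetFlow records per application, with a real decision tree rather than a single split.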

    Anomaly Detection in Encrypted Internet Traffic Using Hybrid Deep Learning

    An increasing number of Internet application services are relying on encrypted traffic to offer adequate consumer privacy. Anomaly detection in encrypted traffic to circumvent and mitigate cyber security threats is, however, an open and ongoing research challenge due to the limitations of existing traffic classification techniques. Deep learning is emerging as a promising paradigm, reducing the manual feature engineering needed to increase classification accuracy. The present work develops a deep learning-based model for the detection of anomalies in encrypted network traffic. Three different publicly available datasets, NSL-KDD, UNSW-NB15, and CIC-IDS-2017, are used to comprehensively analyze encrypted attacks targeting popular protocols. Instead of relying on a single deep learning model, multiple schemes using convolutional (CNN), long short-term memory (LSTM), and recurrent neural networks (RNNs) are investigated. Our results show that a hybrid combination of CNN and gated recurrent unit (GRU) models outperforms the others. The hybrid approach benefits from the low-latency feature derivation of the CNN and an overall improved fit to the training dataset. Additionally, the strong generalization offered by the GRU yields effective extraction of time-domain features, making the CNN-GRU hybrid the best-performing model.
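The published architecture is not given in the abstract; as a toy, scalar illustration of the hybrid idea — a convolution extracting local features that then feed a gated recurrent state — here is a minimal NumPy sketch (all weights and the input sequence are made-up values, not the trained model):

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def conv1d(x, kernel):
    """Valid 1-D convolution: cheap, local feature extraction (the CNN role)."""
    k = len(kernel)
    return np.array([float(np.dot(x[t:t + k], kernel))
                     for t in range(len(x) - k + 1)])

def gru(seq, wz=0.5, uz=0.5, wr=0.5, ur=0.5, wh=1.0, uh=1.0):
    """Scalar GRU over a sequence: captures time-domain dependencies."""
    h = 0.0
    for x in seq:
        z = sigmoid(wz * x + uz * h)            # update gate
        r = sigmoid(wr * x + ur * h)            # reset gate
        h_cand = np.tanh(wh * x + uh * r * h)   # candidate state
        h = (1 - z) * h + z * h_cand
    return float(h)

# Toy packet-size sequence -> conv features -> GRU summary state.
packets = np.array([1.0, 0.2, 0.9, 0.1, 0.8])
features = conv1d(packets, np.array([0.5, 0.5]))  # smooth adjacent packets
state = gru(features)
```

A real detector would stack many such filters and hidden units and end in a classification layer; the sketch only shows why the two components compose naturally.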

    Perspectives on Auditing and Regulatory Compliance in Blockchain Transactions

    The recent advent of blockchain technology is anticipated to revolutionize the operational processes of several industries, including banking, finance, real estate, and retail, and to benefit governmental as well as corporate information management structures. The underlying principles of information immutability, traceability, and verifiability built into blockchain transactions may lead to greater adoption of distributed crypto-ledger applications in auditing automation, compliance monitoring, and high-assurance guarantees. This chapter discusses the contemporary applications of blockchain technology in information auditing, exploring aspects such as data recording, accuracy, verification, transparency, and the overall value of a decentralized blockchain crypto-ledger for auditors. Opportunities for timeliness, completeness, and reconciliation in appraising the regulatory compliance of organizations employing blockchain-based contractual frameworks are also investigated. The chapter reviews the existing and anticipated challenges blockchain applications pose to traditional regulatory compliance models and the inherent risks for businesses and stakeholders. We highlight the impact of operational concerns such as decentralized transactions, network complexity, transaction reversals, credential management, software quality, and human resources. Finally, the chapter provides perspective on the assurance complexities involved in transforming from a proprietary to a blockchain-based framework while adhering to IT control obligations dictated by three major auditing standards: the Sarbanes-Oxley Act (SOX), Control Objectives for Information Technologies (COBIT), and International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC) 27001.
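The immutability and traceability the chapter attributes to blockchain transactions come from hash-linking: each block commits to its predecessor's hash, so altering any past record breaks every later link. A minimal sketch of that mechanism (a teaching toy, not any production ledger — no consensus, signatures, or networking):

```python
import hashlib
import json

def add_block(chain, record):
    """Append a record linked to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})
    return chain

def verify(chain):
    """Auditor's check: recompute every link; any tampering breaks the chain."""
    prev = "0" * 64
    for block in chain:
        body = {"record": block["record"], "prev": block["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != digest:
            return False
        prev = block["hash"]
    return True

ledger = []
add_block(ledger, {"tx": "A pays B 10"})
add_block(ledger, {"tx": "B pays C 4"})
```

This is the property that makes automated audit reconciliation attractive: verification is a pure recomputation over the shared ledger.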

    Information Security Risk Assessment

    Information security risk assessment is an important part of enterprise management practice that helps to identify, quantify, and prioritize risks against criteria for risk acceptance and objectives relevant to the organization. Risk management is the process of identifying, managing, and eliminating or reducing the likelihood of events that can negatively affect the resources of the information system. Its goal is to reduce the security risks that could affect the information system at an acceptable cost of protection. This involves risk analysis, cost-effectiveness analysis, the selection, construction, and testing of the security subsystem, and the study of all other aspects of security.
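The identify-quantify-prioritize loop described above is often operationalized as likelihood-times-impact scoring against an acceptance criterion. A minimal sketch, where the 1-5 scales, the threshold, and the example risk register are all hypothetical:

```python
def prioritize(risks, acceptance=4):
    """Score each risk as likelihood x impact (1-5 scales, hypothetical)
    and return those exceeding the acceptance criterion, highest first."""
    scored = [(r["name"], r["likelihood"] * r["impact"]) for r in risks]
    return sorted((s for s in scored if s[1] > acceptance),
                  key=lambda s: s[1], reverse=True)

register = [
    {"name": "phishing",     "likelihood": 4, "impact": 3},
    {"name": "disk failure", "likelihood": 2, "impact": 2},
    {"name": "ransomware",   "likelihood": 3, "impact": 5},
]
ranked = prioritize(register)
```

Risks scoring at or below the acceptance threshold are tolerated; those above it become candidates for treatment, which is where the cost-effectiveness analysis enters.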

    Using Burstiness for Network Applications Classification.

    Network traffic classification is a vital task for service operators, network engineers, and security specialists to manage network traffic, design networks, and detect threats. Identifying the type/name of the applications that generate traffic is a challenging task as encrypted traffic becomes the norm for Internet communication. Relying on conventional techniques such as deep packet inspection (DPI) or port numbers is therefore no longer efficient. This paper proposes a novel set of flow-statistical features that may be used to classify applications by leveraging machine learning algorithms, yielding high accuracy in identifying the type of application that generated the traffic. The proposed features compute different timings between packets and flows. This work utilises tcptrace to extract features based on traffic burstiness and periods of inactivity (idle time) for the analysed traffic, followed by the C5.0 algorithm to determine the applications that generated it. Evaluation tests performed on a set of real, uncontrolled traffic indicated that the method has an accuracy of 79% in identifying the correct network application.
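The abstract names burstiness and idle time as the feature basis but not the cutoffs used; a minimal sketch of segmenting a flow's packet times into bursts separated by idle gaps, with a hypothetical 1-second idle threshold:

```python
def bursts_and_idle(timestamps, idle_threshold=1.0):
    """Split a flow's packet arrival times (seconds) into bursts separated
    by idle gaps longer than idle_threshold (the threshold is hypothetical,
    not the paper's value). Returns the burst count and total idle time."""
    bursts, current = [], [timestamps[0]]
    idle_time = 0.0
    for prev, t in zip(timestamps, timestamps[1:]):
        gap = t - prev
        if gap > idle_threshold:
            idle_time += gap       # accumulate inactivity
            bursts.append(current) # close the current burst
            current = [t]
        else:
            current.append(t)
    bursts.append(current)
    return {"n_bursts": len(bursts), "idle_time": idle_time}

# Two quick bursts and a lone packet, separated by long pauses.
feats = bursts_and_idle([0.0, 0.1, 0.2, 3.0, 3.1, 9.0])
```

Per-flow summaries of this kind (burst counts, burst durations, idle totals) are the sort of statistics the C5.0 classifier would consume.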

    Different effects of aerobic exercise and diaphragmatic breathing on lower esophageal sphincter pressure and quality of life in patients with reflux: A comparative study

    BACKGROUND: Gastroesophageal reflux disease (GERD) is a worldwide disorder with an increasing prevalence. The quality of life (QOL) of patients may be influenced by reflux disease. Diaphragmatic breathing (DB), as well as aerobic exercise (AE), may improve the symptoms of reflux disease, although this remains controversial. The aim of this study was to compare the effects of AE and DB on QOL and lower esophageal sphincter (LES) pressure in patients with moderate to severe reflux. METHODS: This was a case-control study conducted for 8 weeks among patients with moderate to severe GERD. The block randomization method was used to randomize patients into three groups (AE, DB, and control) of equal size. The control group received omeprazole 20 mg once daily; the other groups received AE or DB in addition to omeprazole. QOL and LES pressure were measured before and after the study by questionnaire and manometry, respectively. RESULTS: 75 patients were enrolled in this study. A positive effect of DB on LES pressure was confirmed (p = 0.001). DB had a significantly greater effect on QOL than aerobic exercise (p = 0.003). AE significantly improved QOL (p = 0.02) but produced no significant change in LES pressure (p = 0.38). There was no change in the control group for either variable. CONCLUSION: AE had no effect on LES pressure but can improve patients' QOL. DB had a greater effect on QOL than AE, so injured or disabled patients with reflux who cannot perform AE can benefit from DB to improve their reflux symptoms. © 2021 The Author(s).