17,798 research outputs found

    Malware Detection using Machine Learning and Deep Learning

    Research shows that over the last decade, malware has grown exponentially, causing substantial financial losses to organizations. Anti-malware companies have proposed various solutions to defend against these malware attacks, but the velocity, volume, and complexity of malware pose new challenges to the anti-malware community. Current state-of-the-art research shows that researchers and anti-virus organizations have recently begun applying machine learning and deep learning methods to malware analysis and detection. We use opcode frequency as a feature vector and apply both supervised and unsupervised learning for malware classification. The focus of this tutorial is to present our work on detecting malware with (1) various machine learning algorithms and (2) deep learning models. Our results show that Random Forest outperforms a Deep Neural Network with opcode frequency as a feature. Also, for feature reduction, Deep Auto-Encoders are overkill for the dataset, and elementary methods such as Variance Threshold perform better.

    In addition to the proposed methodologies, we also discuss the additional issues and unique challenges in the domain, open research problems, limitations, and future directions.

    Comment: 11 pages and 3 figures
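    The pipeline described above can be sketched with scikit-learn. The opcode counts and labels below are synthetic stand-ins, not the study's dataset, so no particular accuracy should be expected; the sketch only shows the shape of the approach (opcode-frequency features, Variance Threshold reduction, Random Forest classification):

```python
# Hedged sketch: opcode-frequency features + Variance Threshold feature
# reduction + Random Forest, as in the workflow described in the abstract.
# All data here is synthetic; column j stands for the frequency of opcode j.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import VarianceThreshold
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_opcodes = 200, 50
X = rng.poisson(3.0, size=(n_samples, n_opcodes)).astype(float)  # opcode frequencies
y = rng.integers(0, 2, size=n_samples)  # 0 = benign, 1 = malware (synthetic labels)

# Drop near-constant opcode columns before classification.
selector = VarianceThreshold(threshold=0.1)
X_reduced = selector.fit_transform(X)

X_tr, X_te, y_tr, y_te = train_test_split(X_reduced, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(X_reduced.shape, clf.score(X_te, y_te))
```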

    On the Privacy Practices of Just Plain Sites

    In addition to visiting high-profile sites such as Facebook and Google, web users often visit more modest sites, such as those operated by bloggers or by local organizations such as schools. Such sites, which we call "Just Plain Sites" (JPSs), are likely to inadvertently present greater privacy risks than high-profile sites by virtue of being unable to afford privacy expertise. To assess the prevalence of the privacy risks to which JPSs may inadvertently be exposing their visitors, we analyzed a number of easily observed privacy practices of such sites. We found that many JPSs collect a great deal of information from their visitors, share a great deal of information about their visitors with third parties, permit a great deal of tracking of their visitors, and use deprecated or unsafe security practices. Our goal in this work is not to scold JPS operators but to raise awareness of these facts among both JPS operators and visitors, possibly encouraging the operators of such sites to take greater care in their implementations, and visitors to take greater care in how, when, and what they share.

    Comment: 10 pages, 7 figures, 6 tables, 5 authors, and a partridge in a pear tree
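    As an illustration of what "easily observed" practices can mean (this is not the authors' measurement tooling), a scan for two such signals on a single page, third-party resource hosts and resources loaded over plain HTTP, might look like the following; the sample HTML and hostnames are invented:

```python
# Hypothetical sketch: extract two easily observed signals from a page's
# markup -- which third-party hosts it pulls resources from, and whether
# any resource is fetched over insecure plain HTTP.
from html.parser import HTMLParser
from urllib.parse import urlparse

class ResourceScanner(HTMLParser):
    """Collect src/href hosts referenced by a page's tags."""
    def __init__(self, page_host):
        super().__init__()
        self.page_host = page_host
        self.third_party_hosts = set()
        self.insecure_urls = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name not in ("src", "href") or not value:
                continue
            parsed = urlparse(value)
            if parsed.scheme == "http":          # plain HTTP, no TLS
                self.insecure_urls.append(value)
            if parsed.hostname and parsed.hostname != self.page_host:
                self.third_party_hosts.add(parsed.hostname)

# Invented sample page for a fictitious "Just Plain Site".
sample = """
<html><head>
<script src="https://tracker.example.net/t.js"></script>
<link rel="stylesheet" href="http://cdn.example.org/site.css">
</head><body><img src="/logo.png"></body></html>
"""
scanner = ResourceScanner("blog.example.com")
scanner.feed(sample)
print(scanner.third_party_hosts, scanner.insecure_urls)
```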

    Quality of service assurance for the next generation Internet

    The provisioning of quality of service for multimedia applications has been of increasing interest among researchers and Internet Service Providers. Through the migration from resource-based to service-driven networks, it has become evident that the Internet model should be enhanced to support a variety of differentiated services that match application and customer requirements, rather than remaining limited to the flat best-effort service currently provided. In this paper, we describe and critically appraise the major achievements of the efforts to introduce Quality of Service (QoS) assurance and provisioning within the Internet model. We then propose a research path for the creation of a network services management architecture, through which we can move toward a QoS-enabled network environment offering support for a variety of different services based on traffic characteristics and user expectations.
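    As a small concrete instance of differentiated (rather than flat best-effort) service, a sending application can mark its packets with a DiffServ code point so that cooperating routers can prioritize them. This is a minimal sketch assuming a Linux host; the marking only has effect if the network is configured to honour it:

```python
# Minimal sender-side DiffServ marking sketch: tag a UDP socket's packets
# with the Expedited Forwarding code point (RFC 3246), commonly used for
# latency-sensitive multimedia traffic.
import socket

EF_DSCP = 46            # Expedited Forwarding
tos = EF_DSCP << 2      # the DSCP occupies the top six bits of the TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, tos)
applied = sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS)
print(applied)          # datagrams sent on this socket now carry the EF marking
sock.close()
```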

    A Complexity-Based Taxonomy of Systems Development Methodologies

    For the last two decades, systems developers and researchers have largely assumed that developing business information systems is a well-structured, project-oriented, once-in-a-lifetime undertaking. However, present business models that rely intensively on information technology are rendering this perception obsolete. There has been a growing propensity toward increasingly iterative, fast-paced, user-driven systems development methodologies such as Rapid Application Development, Unified Modeling Language, Joint Application Development, and the Relationship Management methodology (Hans-Werner, 1997; Isakowitz, 1995; Shapiro, 1997; Vessey, 1994). At the same time, the discipline of information systems has witnessed an increasing awareness of the importance of systems maintenance, with the general proposition that systems development is an ever-proceeding activity (rather than a project-based activity) becoming the norm (Howard, 1990). The majority of empirical research on systems development to date has tested the contributions of different types of knowledge to effective systems development. Much research has also examined how different systems development methodologies affect the quality, effectiveness, and efficiency of the resulting business information infrastructure. These studies suffer from one or more of the following limitations: (1) They largely perceive information systems evolution via systems development as a slow, linear, structured, and continuous process. Current events in the information systems and electronic commerce sectors indicate that systems development is highly dynamic, discontinuous, and adaptive in nature, the defining traits of a complex system.
(2) Though most past studies recognize that systems development is a knowledge-intensive activity, it is rarely seen as a primary mechanism by which a firm embeds knowledge into its business information infrastructure's technologies, databases, and automated operating procedures. Because almost all business functions and transactions within an electronic commerce enterprise are carried out via the firm's business information infrastructure, the enhancement of business knowledge and information within such a firm is expected to depend heavily on the enhancements made to that infrastructure through specific systems development approaches. In rapidly evolving environments such as present-day electronic commerce, methodologies become a primary means by which the firm continuously updates its knowledge resources, thereby sustaining or leveraging its competitive advantages. The theory of complexity may contribute to the perception and re-classification of systems development methodologies so as to provide a clearer understanding of which methodologies are best suited for directing the development and enhancement of business information systems in today's electronic commerce economy. By viewing business information systems as emergent complex adaptive systems, the methodologies employed to derive them can be seen as analogous to the natural rules that govern the behavior of natural phenomena. This view enables us to explain which methodologies best match specific systems development or enhancement tasks, allowing for the development of better-quality business information systems, especially for electronic commerce applications.