962 research outputs found

    A conceptual approach to gene expression analysis enhanced by visual analytics

    The analysis of gene expression data is a complex task for biologists seeking to understand the role of genes in the development of diseases such as cancer. Biologists need greater support when trying to discover and comprehend new relationships within their data. In this paper, we describe an approach to the analysis of gene expression data in which overlapping groupings are generated by Formal Concept Analysis and interactively analyzed in a tool called CUBIST. The CUBIST workflow involves querying a semantic database and converting the result into a formal context, which can be simplified to make it manageable, before it is visualized as a concept lattice and associated charts.
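
    The CUBIST code itself is not reproduced here. As a minimal sketch of the Formal Concept Analysis step the abstract describes, the snippet below derives all formal concepts (overlapping gene/property groupings) from a small, entirely hypothetical formal context; the gene and attribute names are invented for illustration.

```python
from itertools import chain, combinations

# Hypothetical formal context: which genes (objects) show which
# expression properties (attributes). Names are illustrative only.
context = {
    "geneA": {"up_in_tumour", "cell_cycle"},
    "geneB": {"up_in_tumour", "apoptosis"},
    "geneC": {"up_in_tumour", "cell_cycle", "apoptosis"},
    "geneD": {"cell_cycle"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """Objects (genes) that have every attribute in `attrs`."""
    return {g for g, props in context.items() if attrs <= props}

def intent(objs):
    """Attributes shared by every object in `objs`."""
    return set.intersection(*(context[g] for g in objs)) if objs else set(attributes)

def concepts():
    """Naive enumeration of all formal concepts as (extent, intent) pairs."""
    seen = []
    for attrs in chain.from_iterable(
        combinations(attributes, r) for r in range(len(attributes) + 1)
    ):
        objs = extent(set(attrs))
        closed = intent(objs)  # closure of the attribute set
        pair = (frozenset(objs), frozenset(closed))
        if pair not in seen:
            seen.append(pair)
    return seen

for objs, props in concepts():
    print(sorted(objs), "<->", sorted(props))
```

    Each printed pair is one overlapping grouping of genes and their shared properties; a lattice visualisation such as CUBIST's then orders these concepts by containment.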

    Tutorial and Critical Analysis of Phishing Websites Methods

    The Internet has become an essential component of our everyday social and financial activities. It is important not only for individual users but also for organizations, because organizations that offer online trading can achieve a competitive edge by serving worldwide clients. The Internet makes it possible to reach customers all over the globe without any marketplace restrictions and with effective use of e-commerce. As a result, the number of customers who rely on the Internet to make purchases is increasing dramatically. Hundreds of millions of dollars are transferred through the Internet every day, and this amount of money tempts fraudsters to carry out their fraudulent operations. Hence, Internet users may be vulnerable to different types of web threats, which may cause financial damage, identity theft, loss of private information, brand reputation damage and loss of customers’ confidence in e-commerce and online banking. The suitability of the Internet for commercial transactions therefore becomes doubtful. Phishing is a form of web threat defined as the art of impersonating the website of an honest enterprise in order to obtain users’ confidential credentials such as usernames, passwords and social security numbers. In this article, the phishing phenomenon is discussed in detail. In addition, we present a survey of state-of-the-art research on this attack. Moreover, we aim to identify up-to-date developments in phishing and its precautionary measures, and we provide a comprehensive study and evaluation of this research to expose the gaps that still remain in this area. This research focuses mostly on web-based phishing detection methods rather than email-based detection methods.
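
    The survey does not prescribe a single detection method. Purely as an illustration of the kind of web-based (rather than email-based) heuristics such work reviews, the sketch below extracts a few lexical URL features commonly fed to phishing classifiers; the feature set, scoring rule and threshold are assumptions for the example, not the paper's method.

```python
from urllib.parse import urlparse
import re

def url_features(url: str) -> dict:
    """Extract simple lexical features often used by phishing classifiers."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    return {
        "uses_https": parsed.scheme == "https",
        "has_ip_host": bool(re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", host)),
        "has_at_symbol": "@" in url,            # '@' can disguise the real destination
        "num_subdomains": max(host.count(".") - 1, 0),
        "url_length": len(url),
        "has_hyphen_in_host": "-" in host,
    }

def looks_suspicious(url: str) -> bool:
    """Toy rule of thumb: flag URLs with several risky features (illustrative threshold)."""
    f = url_features(url)
    score = sum([
        not f["uses_https"],
        f["has_ip_host"],
        f["has_at_symbol"],
        f["num_subdomains"] >= 3,
        f["url_length"] > 75,
    ])
    return score >= 2

# Example: an IP host, plain HTTP and an embedded '@' trip the heuristic.
print(looks_suspicious("http://192.168.0.1/secure-login@paypal.example.com/verify"))
```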

    Business models for the Web: an analysis of top successful web sites

    To investigate successful web business models, an original multidimensional framework is defined and applied to a large number of web sites. The framework, named BM*Web, combines issues already present in existing schemas describing business models with innovative aspects that have not previously been taken into account in those combinations or that are now viewed in a new light. Results of applying BM*Web to the Alexa top-500 list (at a specific time) highlight an articulated picture in which more than one success profile exists and not all of them include a web community, although under some conditions a strong relationship exists between community and success. The identification of features that characterize the most successful business models for the Web could be used to define guidelines for company management, once the appropriate profile for a company has been recognised.

    Events and Controversies: Influences of a Shocking News Event on Information Seeking

    It has been suggested that online search and retrieval contributes to the intellectual isolation of users within their preexisting ideologies, where people's prior views are strengthened and alternative viewpoints are infrequently encountered. This so-called "filter bubble" phenomenon has been called out as especially detrimental when it comes to dialog among people on controversial, emotionally charged topics, such as the labeling of genetically modified food, the right to bear arms, the death penalty, and online privacy. We seek to identify and study information-seeking behavior and access to alternative versus reinforcing viewpoints following shocking, emotional, and large-scale news events. As a case study, we analyze search and browsing on gun control/rights, a strongly polarizing topic for both citizens and leaders of the United States. We study the period of time preceding and following a mass shooting to understand how its occurrence, follow-on discussions, and debate may have been linked to changes in the patterns of searching and browsing. We employ information-theoretic measures to quantify the diversity of Web domains of interest to users and to understand their browsing patterns. We use these measures to characterize the influence of news events on these web search and browsing patterns.
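
    The paper's exact measures are not reproduced here. As a rough sketch of the information-theoretic idea, the snippet below computes the Shannon entropy of a user's distribution of visited domains, so a lower value indicates browsing concentrated in fewer (possibly like-minded) sources; the domain names and counts are invented placeholders.

```python
import math
from collections import Counter

def domain_entropy(visits):
    """Shannon entropy (in bits) of the distribution of visited domains."""
    counts = Counter(visits)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical browsing logs before and after a news event.
before = ["progun.example"] * 8 + ["news.example"] * 2
after = ["progun.example"] * 4 + ["news.example"] * 3 + ["guncontrol.example"] * 3

print(f"before: {domain_entropy(before):.2f} bits, after: {domain_entropy(after):.2f} bits")
```

    A rise in entropy after the event would suggest exposure to a more diverse set of domains; a fall would suggest further concentration within reinforcing sources.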

    Automatic execution of expressive music performance

    Defining computer models that represent the expressiveness of a musical performance is useful for understanding how, and in what way, expressive intentions can be conveyed in a music performance. CaRo 2.0 is a computer model, or software system, that allows the automatic, interactive rendering of expressive musical scores. Initially, the software ran exclusively in a Microsoft environment, which limited the interest of the product. This thesis concerns the porting and integration…
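
    CaRo 2.0's actual rendering model is not shown in this excerpt. As a toy illustration of what an expressive-performance renderer does, the sketch below applies simple, invented tempo and dynamics deviations to a nominal score; the deviation rules are placeholders, not the CaRo model.

```python
from dataclasses import dataclass

@dataclass
class Note:
    onset: float      # nominal onset time in beats
    duration: float   # nominal duration in beats
    velocity: int     # nominal MIDI velocity (0-127)

def render_expressive(notes, tempo_scale=0.95, accent=1.15):
    """Apply toy expressive deviations: slight holding back and bar-start accents."""
    rendered = []
    for i, n in enumerate(notes):
        stretch = tempo_scale if i % 4 else 1.0          # hold back inside the bar (illustrative)
        vel = min(127, int(n.velocity * (accent if i % 4 == 0 else 1.0)))
        rendered.append(Note(n.onset / stretch, n.duration / stretch, vel))
    return rendered

# Nominal (deadpan) score: eight even eighth notes at fixed velocity.
score = [Note(i * 0.5, 0.5, 80) for i in range(8)]
for n in render_expressive(score):
    print(n)
```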

    Extending information retrieval system model to improve interactive web searching.

    The research set out with the broad objective of developing new tools to support Web information searching. A survey showed that a substantial number of interactive search tools were being developed, but little work examined how these new developments fitted into the general aim of helping people find information. Because of this, it proved difficult to compare and analyse how tools help and affect users and where they belong in a general scheme of information search tools. A key reason for the lack of better information-searching tools was identified in the ill-suited nature of existing information retrieval system models. The traditional information retrieval model is extended by synthesising work in information retrieval and information seeking research. The purpose of this new holistic search model is to assist information system practitioners in identifying, hypothesising, designing and evaluating Web information searching tools. Using the model, a term relevance feedback tool called ‘Tag and Keyword’ (TKy) was developed in a Web browser, and it was hypothesised that it could improve query reformulation and reduce unnecessary browsing. The tool was evaluated in a laboratory experiment, and quantitative analysis showed statistically significant increases in query reformulation and reductions in Web browsing (per query). Subjects were interviewed after the experiment, and qualitative analysis revealed that they found the tool useful and felt it saved time. Interestingly, exploratory analysis of the collected data identified three different ways in which subjects had utilised the TKy tool. The research developed a holistic search model for Web searching and demonstrated that it can be used to hypothesise, design and evaluate information searching tools. Information system practitioners using it can better understand the context in which their search tools are developed and how these relate to users’ search processes and other search tools.
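
    TKy itself is a browser tool and its implementation is not given here. The sketch below only illustrates the underlying idea of term relevance feedback, where terms the user tags as relevant are folded into the next query; the function name, parameters and session data are hypothetical.

```python
def reformulate(query: str, tagged_terms: list[str], max_terms: int = 4) -> str:
    """Fold user-tagged relevant terms into the next query (simple term relevance feedback)."""
    existing = set(query.lower().split())
    new_terms = [t for t in tagged_terms if t.lower() not in existing]
    return " ".join([query] + new_terms[:max_terms])

# Hypothetical session: the user tags terms while reading results.
q1 = "holistic search model"
tags = ["information seeking", "query reformulation"]
print(reformulate(q1, tags))
# -> "holistic search model information seeking query reformulation"
```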

    NetIQ Evaluation Project

    The motivation behind the NetIQ Evaluation Project stems from the fact that Regis University does not currently utilize network management software across the various campus networks. Although this “state of the network” is in and of itself reason enough for such a project, other factors, such as an overall increase in the student population, an increase in the demand for online education, and a general need for a secure, reliable and efficient network that spans multiple Regis University campuses, have helped fuel the project initiative. The NetIQ Evaluation Project’s overall goal is to evaluate the network management software suite known as NetIQ AppManager. This evaluation will provide Regis University policy makers with unbiased information for selecting a network management software suite.

    Moi Helsinki. Personalised user interface solutions for generative data

    Today, online search stands out as the most popular way to access vast amounts of information. At the same time, browsing through too much data can lead to information overload. Helping users feel treated as individuals, as well as navigating them appropriately through the data, is an objective designers should pursue. In the theoretical background of this work, I bring attention to techniques for working with generative data and its contextualisation. I study historical and philosophical aspects of information perception, as well as the modern experience of working with online search engines such as Google. I refer to information architecture principles that can adapt user interface designs to generative content. In the age of big data and information pollution, a designer’s objective could be to employ technology to make data more human-centred. Along with the theoretical writing, this thesis also consists of project work. Moi Helsinki is a location-based event calendar for the Helsinki area. The calendar gathers information about events retrieved from a social media API and showcases the aggregated data in a single feed. Moi Helsinki reshapes the data output with the help of interface personalisation, showing the most relevant results at the top. It uses each visitor’s current geographical location to tailor search results by proximity. The options provided to website visitors within the UI extend to further customisation, allowing the data output to be adjusted beyond just the user’s location. Setting aside certain distinctive features of event calendars, Moi Helsinki chooses another path to explore. Being more of a mediator than a proprietor, Moi Helsinki offers a new way to reshape data and communicate human-centred values through the user interface.
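
    Moi Helsinki's own code is not shown in this excerpt. As a minimal sketch of the proximity-based personalisation it describes, the snippet below sorts hypothetical events by haversine distance from the visitor's current position; the event data, coordinates and field names are invented for the example.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS84 points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical events, as if aggregated from a social media API.
events = [
    {"name": "Gallery opening", "lat": 60.1699, "lon": 24.9384},
    {"name": "Harbour concert", "lat": 60.1600, "lon": 24.9520},
    {"name": "Suburb flea market", "lat": 60.2200, "lon": 25.0100},
]
user = (60.1708, 24.9415)  # visitor's current position

# Most relevant (nearest) results first, as in the personalised feed.
for e in sorted(events, key=lambda e: haversine_km(user[0], user[1], e["lat"], e["lon"])):
    print(e["name"])
```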

    Automated Analysis of Freeware Installers Promoted by Download Portals

    We present an analysis system for studying Windows application installers. The analysis system is fully automated, from installer download to execution and data collection. The system emulates the behavior of a lazy user who wants to finish the installation dialogs with the default options and with as few clicks as possible. The UI automation makes use of image recognition techniques and heuristics. During the installation, the system collects data about system modifications and network access. The analysis system is scalable and can run on bare-metal hosts as well as in a data center. We use the system to analyze 792 freeware application installers obtained from popular download portals. In particular, we measure how many of them drop potentially unwanted programs (PUP) such as browser plugins or make other unwanted system modifications. We discover that most installers that download executable files over the network are vulnerable to man-in-the-middle attacks. We also find that, while popular download portals are not used for blatant malware distribution, nearly 10% of the analyzed installers come with a third-party browser or a browser extension.
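
    The paper's analysis pipeline is not reproduced here. The sketch below only illustrates one of the checks it reports: flagging installers whose captured traffic fetches executable payloads over plain HTTP, which leaves them open to man-in-the-middle tampering. The log format, suffix list and URLs are assumptions for the example.

```python
from urllib.parse import urlparse

# File types treated as executable payloads in this toy check.
EXEC_SUFFIXES = (".exe", ".msi", ".dll", ".zip")

def insecure_downloads(request_urls):
    """Return URLs that fetch executable payloads over plain HTTP (MITM-tamperable)."""
    flagged = []
    for url in request_urls:
        parsed = urlparse(url)
        if parsed.scheme == "http" and parsed.path.lower().endswith(EXEC_SUFFIXES):
            flagged.append(url)
    return flagged

# Hypothetical URLs extracted from an installer's captured network traffic.
log = [
    "https://cdn.example.com/setup_helper.msi",
    "http://mirror.example.net/toolbar_offer.exe",
    "http://stats.example.org/ping",
]
print(insecure_downloads(log))
```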