
    Virtual Astronomy, Information Technology, and the New Scientific Methodology

    All sciences, including astronomy, are now entering the era of information abundance. The exponentially increasing volume and complexity of modern data sets promise to transform scientific practice, but also pose a number of common technological challenges. The Virtual Observatory concept is the astronomical community's response to these challenges: it aims to harness the progress in information technology in the service of astronomy, and at the same time provide a valuable testbed for information technology and applied computer science. The challenges broadly fall into two categories: data handling (or "data farming"), including issues such as archives, intelligent storage, databases, interoperability, fast networks, etc., and data mining, data understanding, and knowledge discovery, which include issues such as automated clustering and classification, multivariate correlation searches, pattern recognition, visualization in hyperdimensional parameter spaces, etc., as well as various applications of machine learning in these contexts. Such techniques are forming a methodological foundation for science with massive and complex data sets in general, and are likely to have a much broader impact on modern society, commerce, the information economy, security, etc. There is a powerful emerging synergy between computationally enabled science and science-driven computing, which will drive progress in science, scholarship, and many other venues in the 21st century.
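    The abstract names automated clustering and classification in high-dimensional parameter spaces as core data-mining tasks. Below is a minimal sketch of such a step using scikit-learn on a synthetic catalogue; the data, feature count, and cluster count are illustrative assumptions, not anything from a real Virtual Observatory archive.

```python
# Minimal sketch: automated clustering of catalogued objects in a
# high-dimensional parameter space. The catalogue is synthetic.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Fake catalogue: 10,000 objects, 8 measured parameters (e.g. magnitudes, colors).
catalogue = rng.normal(size=(10_000, 8))

# Standardize each parameter so no single axis dominates the distance metric.
features = StandardScaler().fit_transform(catalogue)

# Unsupervised clustering as a first pass at class discovery.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features)

# Cluster sizes give a quick sanity check before any physical interpretation.
print(np.bincount(labels))
```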

    Data mining technology for the evaluation of learning content interaction

    Interactivity is central to the success of learning. In e-learning and other educational multimedia environments, the evaluation of interaction and behaviour is particularly crucial. Data mining – a non-intrusive, objective analysis technology – is proposed as the central evaluation technology for analysing the usage of computer-based educational environments and, in particular, the interaction with educational content. Basic mining techniques are reviewed and their application in a Web-based third-level course environment is illustrated. Analytic models capturing interaction aspects from the application domain (learning) and the software infrastructure (interactive multimedia) are required for the meaningful interpretation of mining results.
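    As a rough illustration of mining interaction logs from a Web-based course, the sketch below builds per-learner behaviour profiles and groups similar learners. The log schema (learner, content, event) and the cluster count are hypothetical, not the format used in the paper.

```python
# Minimal sketch: mining interaction logs from a Web-based course.
# The log format and values are illustrative placeholders.
import pandas as pd
from sklearn.cluster import KMeans

log = pd.DataFrame({
    "learner": ["a", "a", "b", "b", "b", "c", "c"],
    "content": ["intro", "quiz1", "intro", "quiz1", "quiz1", "intro", "video2"],
    "event":   ["view", "submit", "view", "submit", "submit", "view", "view"],
})

# Behaviour profile: how often each learner triggers each event type.
profile = pd.crosstab(log["learner"], log["event"])

# Group learners with similar interaction behaviour (toy cluster count).
profile["cluster"] = KMeans(n_clusters=2, n_init=10,
                            random_state=0).fit_predict(profile)
print(profile)
```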

    Fourteenth Biennial Status Report: March 2017 - February 2019


    Utilizing Analytical Hierarchy Process for Pauper House Programme in Malaysia

    In Malaysia, the selection and evaluation of candidates for the Pauper House Programme (PHP) are done manually. In this paper, a technique based on the Analytical Hierarchy Process (AHP) is designed and developed to evaluate and select PHP applications. The aim is to ensure that the selection process is more precise and accurate and avoids bias. The technique is studied and designed based on the pauper assessment technique used by one of the district offices in Malaysia. Hierarchical indexes are designed based on the criteria used in the official PHP application form. Twenty-three (23) samples of data, which had been endorsed by the State Exco in Malaysia, are used to test the technique, and a comparison of the two methods is given in this paper. All calculations for the technique are done in the Expert Choice version 11.5 software. Comparing the manual and AHP results shows that three (3) samples are not qualified. The developed technique is also satisfactory in terms of accuracy and precision but needs further study due to some limitations, as explained in the recommendations of this paper.
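    To make the AHP step concrete, the sketch below derives priority weights from a pairwise comparison matrix and checks consistency, which is the core calculation Expert Choice performs. The criteria names and judgement values are illustrative assumptions, not the actual PHP assessment criteria from the district office form.

```python
# Minimal sketch of the Analytic Hierarchy Process: priority weights from a
# Saaty-scale pairwise comparison matrix, plus a consistency check.
# Criteria and judgements are illustrative only.
import numpy as np

criteria = ["income", "dependents", "housing condition"]

# A[i, j] = importance of criterion i relative to criterion j (reciprocal matrix).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Principal eigenvector gives the priority weights of the criteria.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (CR) against Saaty's random index for n = 3.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58          # RI(3) = 0.58
print(dict(zip(criteria, weights.round(3))), "CR =", round(cr, 3))
```

    A CR below roughly 0.1 is conventionally taken to mean the judgements are acceptably consistent; otherwise the pairwise comparisons would be revisited.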

    A traffic classification method using machine learning algorithm

    Applying concepts of attack investigation from the IT industry, this work develops a traffic classification method using data mining techniques together with a machine learning algorithm, which classifies normal and malicious traffic. This classification will help to learn about the unknown attacks faced by the IT industry. The notion of traffic classification is not a new concept; plenty of work has been done to classify network traffic for heterogeneous applications. Existing techniques (payload-based, port-based, and statistics-based) have their own pros and cons, which are discussed later in this work, but classification using machine learning techniques is still an open field to explore and has provided very promising results so far.
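    The sketch below shows the general statistics-based pattern the abstract points to: per-flow statistical features fed to a machine learning classifier that separates normal from malicious traffic. The flow features, their distributions, and the classifier choice (a random forest) are assumptions for illustration, not the method or data of the paper.

```python
# Minimal sketch: statistics-based traffic classification with a machine
# learning algorithm. Flow features and labels are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Toy flow features: packet count, mean packet size, duration (s), bytes/s.
normal = rng.normal([100, 600, 5.0, 12_000], [30, 150, 2.0, 4_000], (500, 4))
malicious = rng.normal([800, 90, 0.5, 140_000], [200, 30, 0.2, 30_000], (500, 4))

X = np.vstack([normal, malicious])
y = np.array([0] * 500 + [1] * 500)   # 0 = normal, 1 = malicious

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test),
                            target_names=["normal", "malicious"]))
```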

    Minors as miners: Modelling and evaluating ontological and linguistic learning

    Gold Coast

    Survey of data mining approaches to user modeling for adaptive hypermedia

    The ability of an adaptive hypermedia system to create tailored environments depends mainly on the amount and accuracy of information stored in each user model. Some of the difficulties that user modeling faces are the amount of data available to create user models, the adequacy of the data, the noise within that data, and the necessity of capturing the imprecise nature of human behavior. Data mining and machine learning techniques have the ability to handle large amounts of data and to process uncertainty. These characteristics make these techniques suitable for automatic generation of user models that simulate human decision making. This paper surveys different data mining techniques that can be used to efficiently and accurately capture user behavior. The paper also presents guidelines that show which techniques may be used more efficiently according to the task implemented by the application.
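    As a small illustration of learning a user model from observed behavior, the sketch below fits a decision tree that an adaptive system could query when tailoring content. The session features and the "prefers example-driven pages" target are hypothetical, not taken from the survey.

```python
# Minimal sketch: inferring a simple user model from interaction data.
# Features and labels are illustrative placeholders.
from sklearn.tree import DecisionTreeClassifier

# Per-session features: pages visited, mean time per page (s), searches run.
X = [
    [12, 40, 1],
    [30, 15, 6],
    [8,  95, 0],
    [25, 20, 4],
    [10, 70, 1],
    [28, 18, 5],
]
# Target: 1 if the user preferred example-driven pages, 0 otherwise.
y = [1, 0, 1, 0, 1, 0]

model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The learned model can drive adaptation for a new, unseen session.
print(model.predict([[27, 22, 5]]))
```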

    The Evolution of First Person Vision Methods: A Survey

    The emergence of new wearable technologies such as action cameras and smart glasses has increased the interest of computer vision scientists in the first person perspective. Nowadays, this field is attracting attention and investments from companies aiming to develop commercial devices with First Person Vision recording capabilities. Due to this interest, an increasing demand for methods to process these videos, possibly in real time, is expected. Current approaches present particular combinations of different image features and quantitative methods to accomplish specific objectives like object detection, activity recognition, user-machine interaction and so on. This paper summarizes the evolution of the state of the art in First Person Vision video analysis between 1997 and 2014, highlighting, among others, the most commonly used features, methods, challenges and opportunities within the field.
    Comment: First Person Vision, Egocentric Vision, Wearable Devices, Smart Glasses, Computer Vision, Video Analytics, Human-machine Interaction
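    To make the "image feature plus quantitative method" pattern concrete, the sketch below pairs a very simple per-frame feature (an intensity histogram, standing in for descriptors such as HOG or optical flow) with a classifier for activity recognition. The frames, labels, and feature choice are synthetic assumptions, not methods surveyed in the paper.

```python
# Minimal sketch: per-frame feature extraction + classifier, the generic
# pattern described in the survey. All data here is synthetic.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def frame_feature(frame, bins=16):
    """Illustrative feature: normalized intensity histogram of one frame."""
    hist, _ = np.histogram(frame, bins=bins, range=(0, 255))
    return hist / hist.sum()

# Fake egocentric frames for two "activities" with different brightness.
frames_a = rng.integers(0, 120, size=(50, 64, 64))    # e.g. indoor activity
frames_b = rng.integers(100, 255, size=(50, 64, 64))  # e.g. outdoor activity

X = np.array([frame_feature(f) for f in np.concatenate([frames_a, frames_b])])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))   # training accuracy on the toy data
```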