941 research outputs found

    Textbooks for the web programming course


    Usage Bibliometrics

    Scholarly usage data provides unique opportunities to address the known shortcomings of citation analysis. However, the collection, processing and analysis of usage data remain an area of active research. This article provides a review of the state of the art in usage-based informetrics, i.e. the use of usage data to study the scholarly process.
    Comment: Publisher's PDF (by permission). Publisher web site: books.infotoday.com/asist/arist44.shtm

    Money demand and macroeconomic uncertainty

    In this study we construct a measure of macroeconomic uncertainty from several observable economic indicators for the euro area. Indicator variables are based on financial market data, such as medium-term returns, loss and volatility measures, but also come from surveys that capture business and consumer sentiment. From these we estimate the path of underlying macroeconomic uncertainty using an unobserved components model. Employing cointegration analysis, it is demonstrated that the extracted measures of uncertainty help to explain the increase in euro area M3 over the period 2001 to 2004. Similar evidence can be found for US monetary aggregates.
    Keywords: money demand, macroeconomic uncertainty, excess liquidity
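    The abstract describes extracting a common "uncertainty" factor from several observed indicator series with an unobserved components model. As a rough, hypothetical illustration of the same idea (not the paper's actual model), the first principal component of the standardized indicators can serve as a crude stand-in for the common latent factor:

```python
import numpy as np

def common_factor(indicators):
    """Illustrative stand-in for a latent uncertainty factor.

    indicators: (T, k) array of k observed series over T periods.
    Returns a (T,) series: the projection of the standardized data
    onto its first principal component, rescaled to unit variance.
    """
    x = np.asarray(indicators, dtype=float)
    x = (x - x.mean(axis=0)) / x.std(axis=0)    # standardize each series
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    factor = x @ vt[0]                           # first principal component
    return factor / factor.std()
```

    An unobserved components model would additionally impose time-series dynamics on the latent factor; the PCA sketch only captures the cross-sectional co-movement.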

    Diverse Contributions to Implicit Human-Computer Interaction

    When people interact with computers, a great deal of information is provided unintentionally. By studying these implicit interactions it is possible to understand which characteristics of the user interface are beneficial (or not), thereby deriving implications for the design of future interactive systems. The main advantage of leveraging implicit user data in computer applications is that any interaction with the system can contribute to improving its usefulness. Moreover, such data remove the cost of having to interrupt users so that they explicitly submit information on a topic that, in principle, need not be related to their intention in using the system. On the other hand, implicit interactions do not always yield clear and concrete data, so special attention must be paid to how this source of information is managed. The purpose of this research is twofold: 1) to apply a new vision to both the design and the development of applications that can react accordingly to implicit user interactions, and 2) to provide a series of methodologies for the evaluation of such interactive systems. Five scenarios illustrate the feasibility and suitability of the framework of this thesis. Empirical results with real users demonstrate that leveraging implicit interaction is both an adequate and a convenient means of improving interactive systems in multiple ways.
    Leiva Torres, LA. (2012). Diverse Contributions to Implicit Human-Computer Interaction [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/17803

    Rethinking Consistency Management in Real-time Collaborative Editing Systems

    Networked computer systems offer much to support collaborative editing of shared documents among users. The goal of real-time collaborative editing systems (RTCES) is to increase concurrent access to shared documents by allowing multiple users to contribute to and/or track changes to them; yet in existing systems concurrent access is either limited by exclusive locking, or enabled by concurrency control algorithms such as operational transformation (OT). Unfortunately, such OT-based schemes are costly with respect to communication and computation. Further, existing systems are often specialized in their functionality and require users to adopt new, unfamiliar software to enable collaboration. This research discusses our work in improving consistency management in RTCES. We have developed a set of deadlock-free, multi-granular dynamic locking algorithms and data structures that maximize concurrent access to shared documents while minimizing communication cost. These algorithms provide a high level of service for concurrent access to the shared document and integrate merge-based or OT-based consistency maintenance policies locally, among a subset of the users within a subsection of the document, thus reducing the communication costs of maintaining consistency. Additionally, we have developed client-server and P2P implementations of our hierarchical document management algorithms. Simulation results indicate that our approach achieves significant communication and computation cost savings. We have also developed a hierarchical reduction algorithm that can minimize the space required by RTCES, and this algorithm may be pipelined through our document tree. Further, we have developed an architecture that allows a heterogeneous set of client editing software to connect with a heterogeneous set of server document repositories via Web services. This architecture supports our algorithms and does not require client or server technologies to be modified; it is thus able to accommodate existing, favored editing and repository tools. Finally, we have developed a prototype benchmark system of our architecture that is responsive to users' actions and minimizes communication costs.
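    The core idea of multi-granular locking on a document tree can be sketched as follows. This is a simplified, hypothetical illustration (node, owner, and lock-mode names are invented), not the dissertation's actual algorithm: an exclusive (X) lock on a subsection requires intention (IX) locks on all ancestors, acquired root-first so that lock ordering stays fixed and deadlock-free, and two users editing disjoint subsections proceed concurrently.

```python
# Sketch of multi-granular locking on a document tree (illustrative only).

class Node:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.locks = []            # list of (owner, mode), mode "IX" or "X"

def _compatible(held, requested):
    # Intention locks coexist; an exclusive lock conflicts with everything.
    return held == "IX" and requested == "IX"

def try_lock(node, owner, mode="X"):
    """Try to take `mode` on node plus IX locks on all its ancestors."""
    path = []
    n = node.parent
    while n is not None:
        path.append(n)
        n = n.parent
    path.reverse()                 # root first: a fixed order avoids deadlock
    # Check everything before acquiring anything (no partial lock state).
    for anc in path:
        if any(o != owner and not _compatible(m, "IX") for o, m in anc.locks):
            return False
    if any(o != owner and not _compatible(m, mode) for o, m in node.locks):
        return False
    for anc in path:
        anc.locks.append((owner, "IX"))
    node.locks.append((owner, mode))
    return True
```

    With this scheme, an editor holding a lock on one section blocks whole-document locks (via its IX marks on ancestors) but not locks on sibling sections, which is what maximizes concurrent access.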

    Cosmic velocity--gravity relation in redshift space

    We propose a simple way to estimate the parameter beta = Omega_m^(0.6)/b from three-dimensional galaxy surveys. Our method consists in measuring the relation between the cosmological velocity and gravity fields, and thus requires peculiar velocity measurements. The relation is measured *directly in redshift space*, so there is no need to reconstruct the density field in real space. In linear theory, the radial components of the gravity and velocity fields in redshift space are expected to be tightly correlated, with a slope given, in the distant observer approximation, by g / v = (1 + 6 beta / 5 + 3 beta^2 / 7)^(1/2) / beta. We test this relation extensively using controlled numerical experiments based on a cosmological N-body simulation. To perform the measurements, we propose a new and rather simple adaptive interpolation scheme to estimate the velocity and the gravity field on a grid. One of the most striking results is that nonlinear effects, including `fingers of God', affect mainly the tails of the joint probability distribution function (PDF) of the velocity and gravity fields: the 1--1.5 sigma region around the maximum of the PDF is *dominated by the linear theory regime*, both in real and redshift space. This is understood explicitly by using the spherical collapse model as a proxy for nonlinear dynamics. Applications of the method to real galaxy catalogs are discussed, including a preliminary investigation of homogeneous (volume-limited) `galaxy' samples extracted from the simulation with simple prescriptions based on halo and sub-structure identification, to quantify the effects of the bias between the galaxy and total matter distributions, and of shot noise (ABRIDGED).
    Comment: 24 pages, 10 figures. Matches the version accepted for publication in MNRAS. The definitive version is available at http://www.blackwell-synergy.co
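    The quoted linear-theory slope can be evaluated directly, and since it decreases monotonically with beta, a measured g/v slope determines beta by simple root finding. A small sketch (the bisection inversion is an illustration, not the paper's estimator):

```python
import math

def slope_g_over_v(beta):
    """Linear-theory g/v slope in redshift space (distant observer):
    g/v = (1 + 6*beta/5 + 3*beta**2/7)**(1/2) / beta."""
    return math.sqrt(1.0 + 6.0 * beta / 5.0 + 3.0 * beta**2 / 7.0) / beta

def beta_from_slope(slope, lo=1e-3, hi=10.0, tol=1e-10):
    """Invert the relation by bisection: the slope decreases with beta,
    diverging as beta -> 0 and tending to sqrt(3/7) for large beta."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if slope_g_over_v(mid) > slope:
            lo = mid          # measured slope smaller: beta must be larger
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    In practice the slope would be fitted from the joint velocity--gravity PDF measured on the grid, restricted to the 1--1.5 sigma region where the abstract reports the linear regime dominates.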

    An Automated Methodology for Validating Web Related Cyber Threat Intelligence by Implementing a Honeyclient

    This paper contributes to the open-source cybersecurity community by providing an alternative methodology for analyzing web-related cyber threat intelligence. Websites are commonly used as an attack vector to spread malicious content crafted by a malicious party. Once a website is classified as malicious, it is added to a threat intelligence database as a malicious indicator. Eventually such databases grow large, accumulate obsolete entries, and can lead to false-positive investigations in cyber incident response. The solution is to keep the threat indicator entries valid by verifying their content, and this process can be fully automated to save time. The proposed technical solution is a low-interaction honeyclient regularly tasked to verify the content of web-based threat indicators. Despite the huge number of database entries, most web-based threat indicators can in this way be validated automatically and kept relevant for monitoring purposes, ultimately helping to avoid false positives in incident response processes.
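    The revalidation loop the abstract describes can be sketched as follows. This is a hypothetical outline, not the thesis implementation: the fetcher is injected so the logic stays testable, and the "still malicious" check is a placeholder heuristic standing in for whatever detection the honeyclient actually performs.

```python
# Sketch of automated revalidation of web-based threat indicators
# with a low-interaction honeyclient (illustrative names throughout).

def classify(status, body):
    """Placeholder verdict: the indicator is still 'live' only if the
    page responds and matches a (toy) malicious-content marker."""
    if status != 200:
        return "stale"                 # site gone: candidate for removal
    return "live" if "eval(" in body else "stale"

def revalidate(indicators, fetch):
    """Return only the indicators the honeyclient still confirms.

    indicators: iterable of URLs from the threat intelligence database.
    fetch(url): returns (status_code, body); injectable for testing.
    """
    still_valid = []
    for url in indicators:
        try:
            status, body = fetch(url)
        except Exception:
            status, body = 0, ""       # unreachable counts as stale
        if classify(status, body) == "live":
            still_valid.append(url)
    return still_valid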

    An analysis into the exploitation of the post-attendee URL feature in Zoom webinar regarding malware transmission

    In response to the COVID-19 pandemic, large businesses and organizations relied on video conferencing applications such as Zoom to comply with public health guidelines, due in part to their robust set of features for facilitating productive group events while maintaining social distancing recommendations. While Zoom has many features that can be found in similar video conferencing applications, it also contains a plethora of unique and cutting-edge features to entice modern users. However, when new features are introduced, the inherent risk of vulnerability exploitation has the potential to overshadow their benefits. One such often-overlooked vulnerable feature of Zoom webinar is the post-attendee URL, which allows webinar hosts to set a URL that participants are redirected to after joining. This study aims to showcase the vulnerabilities of this feature by using URLs of malicious websites and direct download links to transmit malware to participants using the desktop application version of Zoom webinar. This study also provides an analysis of the residual digital artifacts left behind when this feature is exploited, giving digital forensic examiners the ability to create a comprehensive timeline of events for cases involving this type of attack.

    Specification of a partial replication protocol with TLA+

    Nowadays, the data available to and used by companies is growing very fast, creating the need to use and manage this data as efficiently as possible. To this end, data is replicated over multiple datacenters using different replication protocols according to their needs, such as higher availability or a stronger consistency level. The costs associated with full data replication can be very high, and most of the time full replication is not needed, since information can be logically partitioned. Another problem is that, by using datacenters to store and process information, clients become heavily dependent on them. We propose a partial replication protocol called ParTree, which replicates data to clients and organizes clients in a hierarchy, using communication between them to propagate information. This solution addresses some of these problems, namely by supporting partial data replication and an offline execution mode. Given the complexity of the protocol, formal verification is crucial to ensure two correctness properties of the protocol: causal consistency and preservation of data. The use of the TLA+ language and tools to formally specify and verify the proposed protocol is also described.
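    The partial-replication idea behind a client hierarchy can be illustrated with a toy model. This is a hypothetical sketch (class and method names are invented, and it ignores ParTree's consistency machinery): each client replicates only the partitions it subscribes to, and an update is forwarded down the tree only into subtrees that contain an interested replica.

```python
# Toy model of partition-pruned update propagation in a client hierarchy.

class Client:
    def __init__(self, name, partitions):
        self.name = name
        self.partitions = set(partitions)   # partitions replicated locally
        self.children = []
        self.store = {}                     # partition -> last value applied

    def interested(self, partition):
        # A subtree is interested if any node in it replicates the partition.
        return partition in self.partitions or any(
            c.interested(partition) for c in self.children)

    def propagate(self, partition, value):
        if partition in self.partitions:
            self.store[partition] = value   # apply locally (partial replica)
        for child in self.children:
            if child.interested(partition): # prune uninterested subtrees
                child.propagate(partition, value)
```

    Pruning is what makes the replication partial: updates to a partition never traverse subtrees with no replica of it, which reduces both communication and the data each client must hold.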