
    Methodological Tools of Assessment of the Taxable Capacity of Territories

    The purpose of the study is to develop a methodology for assessing the tax capacity of territorial entities. The role of taxes in the model of economic circulation is described, and the rationale for state tax policy, together with the essence and assessment of the taxable capacity of territories, is clarified. The proposed system of indicators characterizing taxable capacity makes it possible to objectively predict tax revenues to local budgets and to justify the amount of subsidies.
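    An indicator system of this kind is often operationalized as a composite index: each indicator is normalized across territories and combined with a weight. The sketch below is purely illustrative; the indicator names, weights, and data are hypothetical, not values from the study.

    ```python
    def normalize(values):
        """Min-max normalize raw indicator values across territories."""
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0
        return [(v - lo) / span for v in values]

    def capacity_index(territories, weights):
        """Weighted sum of normalized indicators per territory.

        territories: {name: {indicator: raw_value}}
        weights:     {indicator: weight}
        """
        names = list(territories)
        index = {n: 0.0 for n in names}
        for ind, w in weights.items():
            norm = normalize([territories[n][ind] for n in names])
            for n, x in zip(names, norm):
                index[n] += w * x
        return index

    # Hypothetical data for three territories (all figures are made up).
    data = {
        "A": {"income_per_capita": 30.0, "business_density": 12.0},
        "B": {"income_per_capita": 45.0, "business_density": 8.0},
        "C": {"income_per_capita": 20.0, "business_density": 20.0},
    }
    weights = {"income_per_capita": 0.6, "business_density": 0.4}
    idx = capacity_index(data, weights)
    ```

    Ranking territories by such an index gives a relative measure of taxable capacity that can anchor revenue forecasts and subsidy calculations.
    
    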

    Detecting semantic social engineering attacks with the weakest link: Implementation and empirical evaluation of a human-as-a-security-sensor framework

    The notion that the human user is the weakest link in information security has been strongly, and, we argue, rightly contested in recent years. Here, we take a step further, showing that the human user can in fact be the strongest link for detecting attacks that involve deception, such as application masquerading, spearphishing, WiFi evil twin and other types of semantic social engineering. Towards this direction, we have developed a human-as-a-security-sensor framework and a practical implementation in the form of Cogni-Sense, a Microsoft Windows prototype application, designed to allow and encourage users to actively detect and report semantic social engineering attacks against them. Experimental evaluation with 26 users of different profiles running Cogni-Sense on their personal computers for a period of 45 days has shown that human sensors can consistently outperform technical security systems. Making use of a machine-learning-based approach, we also show that the reliability of each report, and consequently the performance of each human sensor, can be predicted in a meaningful and practical manner. In an organisation that employs a human-as-a-security-sensor implementation, such as Cogni-Sense, an attack is considered to have been detected if at least one user has reported it. In our evaluation, a small organisation consisting only of the 26 participants of the experiment would have exhibited a missed detection rate below 10%, down from 81% if only technical security systems had been used. The results strongly point towards the need to actively involve the user not only in prevention through cyber hygiene and user-centric security design, but also in active cyber threat detection and reporting.
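    The "detected if at least one user reports it" rule has a simple probabilistic reading: if users miss an attack independently, the organizational miss rate is the product of the individual miss rates. The sketch below illustrates this aggregation; the independence assumption and the per-user detection probability are illustrative, not figures measured in the study.

    ```python
    def org_miss_rate(per_user_detect_probs):
        """Probability that an attack is missed by every user, under the
        rule that one report from any user counts as detection.
        Assumes users detect (or miss) independently."""
        miss = 1.0
        for p in per_user_detect_probs:
            miss *= (1.0 - p)
        return miss

    # Hypothetical: 26 users, each detecting a given attack with
    # probability 0.08 (an assumed value, not one from the evaluation).
    miss = org_miss_rate([0.08] * 26)
    ```

    Even with modest individual detection rates, pooling 26 independent human sensors drives the organizational miss rate down sharply, which is consistent with the direction of the reported result.
    
    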

    Crowdsourcing Methods for Data Collection in Geophysics: State of the Art, Issues, and Future Directions

    Data are essential in all areas of geophysics. They are used to better understand and manage systems, either directly or via models. Given the complexity and spatiotemporal variability of geophysical systems (e.g., precipitation), a lack of sufficient data is a perennial problem, which is exacerbated by various drivers, such as climate change and urbanization. In recent years, crowdsourcing has become increasingly prominent as a means of supplementing data obtained from more traditional sources, particularly due to its relatively low implementation cost and ability to increase the spatial and/or temporal resolution of data significantly. Given the proliferation of different crowdsourcing methods in geophysics and the promise they have shown, it is timely to assess the state-of-the-art in this field, to identify potential issues, and to map out a way forward. In this paper, crowdsourcing-based data acquisition methods that have been used in seven domains of geophysics, including weather, precipitation, air pollution, geography, ecology, surface water and natural hazard management, are discussed based on a review of 162 papers. In addition, a novel framework for categorizing these methods is introduced and applied to the methods used in the seven domains of geophysics considered in this review. This paper also features a review of 93 papers dealing with issues that are common to data acquisition methods in different domains of geophysics, including the management of crowdsourcing projects, data quality, data processing and data privacy. In each of these areas, the current status is discussed and challenges and future directions are outlined.
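    Data quality is one of the cross-cutting issues the review highlights: crowdsourced readings typically need a screening step before they can feed models. A common, robust approach (not a method prescribed by this paper) is outlier rejection via the modified z-score, which uses the median and median absolute deviation so that a single bad reading cannot skew the filter.

    ```python
    from statistics import median

    def filter_outliers(obs, z_max=3.5):
        """Discard observations whose modified z-score (median/MAD based)
        exceeds z_max; 3.5 is a conventional threshold."""
        med = median(obs)
        mad = median(abs(x - med) for x in obs)
        if mad == 0:
            return list(obs)  # degenerate case: no spread to judge against
        return [x for x in obs if 0.6745 * abs(x - med) / mad <= z_max]

    # Illustrative crowdsourced rain-gauge depths in mm (values made up);
    # the 9.8 reading is an obvious outlier against its neighbours.
    readings = [2.1, 2.3, 2.0, 9.8, 2.2]
    clean = filter_outliers(readings)
    ```

    Median-based screening is preferred over mean/standard-deviation screening here because, with only a handful of volunteer readings, one gross error can inflate the standard deviation enough to mask itself.
    
    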

    Ground Motion Prediction Equations for Malaysia due to Earthquakes in the Sumatran Region

    This research is part of recent initiatives in Malaysia to assess the hazard posed by earthquakes. Numerous seismic hazard studies have so far included Malaysian territories, so there was a need to assess how reliable those studies are. Two main potential sources of error were identified: 1) the seismic hazard analysis method and 2) the ground motion prediction equation (GMPE). Because a faulty GMPE can produce very large variations in predictions, this research concentrates on assessing the available GMPEs and deriving one specific to Malaysia.
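    GMPEs typically take a regression form in magnitude and distance, such as ln(Y) = c1 + c2·M + c3·ln(R + c4) + c5·R, with geometric-spreading and anelastic-attenuation terms. The sketch below shows this generic functional form only; the coefficients are placeholders, not the Malaysia-specific values derived in the research.

    ```python
    import math

    def gmpe_ln_pga(magnitude, distance_km, c=(-1.0, 1.2, -1.6, 10.0, -0.002)):
        """Generic GMPE functional form (placeholder coefficients):
            ln(PGA) = c1 + c2*M + c3*ln(R + c4) + c5*R
        where M is moment magnitude and R is source distance in km.
        c4 keeps the log term finite at short distances; c5 models
        anelastic attenuation along the path."""
        c1, c2, c3, c4, c5 = c
        return (c1 + c2 * magnitude
                + c3 * math.log(distance_km + c4)
                + c5 * distance_km)

    # Predicted log motion decays with distance and grows with magnitude.
    y_near = gmpe_ln_pga(7.0, 50.0)
    y_far = gmpe_ln_pga(7.0, 200.0)
    ```

    Deriving a region-specific GMPE amounts to fitting coefficients like these to recorded (or simulated) ground motions from the relevant source zone, here the Sumatran region as felt in Malaysia.
    
    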

    Retrieving Quantifiable Social Media Data From Human Sensor Networks For Disaster Modeling And Crisis Mapping

    This dissertation presents a novel approach that utilizes quantifiable social media data as a human-aware, near real-time observing system, coupled with geophysical predictive models for improved response to disasters and extreme events. It shows that social media data has the potential to significantly improve disaster management beyond informing the public, and emphasizes the importance of the different roles that social media can play in the management, monitoring, modeling and mitigation of natural and human-caused extreme disasters. In the proposed approach, social media users are viewed as "human sensors" that are "deployed" in the field, and their posts are considered to be "sensor observations"; different social media outlets thus together form a Human Sensor Network. We utilized the "human sensor" observations, as boundary value forcings, to show improved geophysical model forecasts of extreme disaster events when combined with other scientific data such as satellite observations and sensor measurements. Several recent extreme disasters are presented as use case scenarios. In the case of the Deepwater Horizon oil spill disaster of 2010 that devastated the Gulf of Mexico, the research demonstrates how social media data from Flickr can be used as a boundary forcing condition of the GNOME oil spill plume forecast model, resulting in an order of magnitude forecast improvement. In the case of the Hurricane Sandy NY/NJ landfall impact of 2012, we demonstrate how the model forecasts, when combined with social media data in a single framework, can be used for near real-time forecast validation, damage assessment and disaster management. Owing to inherent uncertainties in the weather forecasts, the NOAA operational surge model only forecasts the worst-case scenario for flooding from any given hurricane.
Geolocated and time-stamped Instagram photos and tweets allow near real-time assessment of the surge levels at different locations, which can validate model forecasts, give timely views of the actual levels of surge, and provide an upper bound beyond which the surge did not spread. Additionally, we developed AsonMaps, a crisis-mapping tool that combines dynamic model forecast outputs with social media observations and physical measurements to define the regions of event impacts.
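    Treating posts as sensor observations starts with a spatiotemporal selection: keep only geolocated, time-stamped items inside the event's bounding box and time window. The sketch below illustrates that step; the field names and sample data are assumptions for illustration, not the dissertation's actual pipeline.

    ```python
    from datetime import datetime

    def select_observations(posts, bbox, start, end):
        """Filter geolocated, time-stamped posts to a bounding box and time
        window, yielding candidate 'human sensor' observations.

        posts: iterable of dicts with 'lat', 'lon', 'time' (datetime) keys
        bbox:  (min_lat, min_lon, max_lat, max_lon)
        """
        min_lat, min_lon, max_lat, max_lon = bbox
        return [p for p in posts
                if min_lat <= p["lat"] <= max_lat
                and min_lon <= p["lon"] <= max_lon
                and start <= p["time"] <= end]

    # Hypothetical posts around the Sandy landfall window (data made up).
    posts = [
        {"lat": 40.7, "lon": -74.0, "time": datetime(2012, 10, 29, 20)},
        {"lat": 35.0, "lon": -80.0, "time": datetime(2012, 10, 29, 21)},
        {"lat": 40.6, "lon": -74.1, "time": datetime(2012, 11, 5, 0)},
    ]
    bbox = (40.0, -75.0, 41.5, -73.0)
    hits = select_observations(posts, bbox,
                               datetime(2012, 10, 28), datetime(2012, 10, 31))
    ```

    Observations selected this way can then be aggregated per location, for example to bound observed surge levels or to force a plume model at its boundary.
    
    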
