
    Public survey instruments for business administration using social network analysis and big data

    Purpose: The subject matter of this research is closely intertwined with the scientific discussion about the need to develop and implement practice-oriented means of measuring social well-being that take into account the intensity of contacts between individuals. The aim of the research is to test a toolkit for analyzing social networks and to develop a research algorithm that identifies sources of consolidation of public opinion and key agents of influence. The research methodology is based on postulates of sociology, graph theory, social network analysis and cluster analysis. Design/Methodology/Approach: The empirical basis of the research was data reflecting social media users' views on the existing image of Russia and its activities in the Arctic, chosen as a model case. Findings: The algorithm makes it possible to estimate the density and intensity of connections between actors, to trace the main channels through which public opinion forms and the key agents of influence, to identify implicit patterns and trends, and to relate information flows and events to current news triggers and stories for the subsequent formation of a "cleansed" image of the object under study and of the key actors with whom this object is associated. Practical Implications/Originality/Value: The work helps fill a gap in the scientific literature caused by the insufficient elaboration of how social network analysis can be applied to sociological problems.
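The graph measures this abstract names, density and intensity of connections plus key agents of influence, can be sketched in plain Python. The toy reply/mention network, the actor labels, and the choice of degree centrality as the "influence" measure are illustrative assumptions, not the paper's actual data or metric.

```python
from collections import defaultdict

def density(edges, n_nodes):
    """Density of an undirected graph: 2|E| / (n(n-1))."""
    if n_nodes < 2:
        return 0.0
    return 2 * len(edges) / (n_nodes * (n_nodes - 1))

def degree_centrality(edges, nodes):
    """Fraction of other actors each actor is directly connected to."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    n = len(nodes)
    return {node: deg[node] / (n - 1) for node in nodes}

# Hypothetical reply/mention network among five accounts
nodes = ["a", "b", "c", "d", "e"]
edges = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c")]

print(density(edges, len(nodes)))           # 0.4
centrality = degree_centrality(edges, nodes)
# The best-connected actor is a candidate "key agent of influence"
print(max(centrality, key=centrality.get))  # a
```

In a real study the edges would come from reply, repost, or mention relations harvested from the platform, and centrality measures beyond degree (betweenness, eigenvector) would likely be compared.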

    The Tornado Warning Process: A Review of Current Research, Challenges, and Opportunities

    With the unusually violent tornado season of 2011, there has been renewed national interest, through such programs as NOAA's Weather-Ready Nation initiative, in reevaluating and improving our tornado warning process. This literature review provides an interdisciplinary, end-to-end examination of the tornado warning process. Following the steps outlined by the Integrated Warning System, current research in tornado prediction and detection, the warning decision process, warning dissemination, and public response is reviewed, and some of the major challenges for improving each stage are highlighted. The progress and challenges in multi-day to short-term tornado prediction are discussed, followed by an examination of tornado detection, focused primarily on the contributions made by weather radar and storm spotters. Next is a review of the warning decision process and the challenges associated with dissemination of the warning, followed by a discussion of the complexities associated with understanding public response. Finally, several research opportunities are considered, with emphases on understanding acceptable risk, greater community and personal preparation, and personalization of the hazard risk.

    Disaggregation of net-metered advanced metering infrastructure data to estimate photovoltaic generation

    Advanced metering infrastructure (AMI) is a system of smart meters and data management systems that enables communication between a utility and a customer's premise and can provide real-time information about a solar array's production. Because residential solar systems are typically configured behind the meter, utilities often have very little information about their energy generation; in these instances, net-metered AMI data does not provide clear insight into PV system performance. This work presents a methodology for modeling individual-array and system-wide PV generation using only weather data, premise AMI data, and the approximate date of PV installation. Nearly 850 homes with installed solar in Fort Collins, Colorado, USA were modeled for up to 36 months. By matching comparable periods of time to factor out sources of variability in a building's electrical load, algorithms estimate the building's consumption, allowing the previously invisible solar generation to be calculated. These modeled outputs are then compared to previously developed white-box physical models. Using this new AMI method, individual premises can be modeled to within ±20% agreement with physical models. When modeling portfolio-wide aggregation, the AMI method operates most effectively in summer months, when solar generation is highest; over 75% of all days within the three modeled years are estimated to within ±20% of established methods. Advantages of the AMI model with regard to snow coverage, shading, and difficult-to-model factors are discussed, and next-day PV prediction using forecasted weather data is also explored. This work provides a foundation for disaggregating solar generation from AMI data without knowing specific physical parameters of the array or using known generation for computational training.
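The core disaggregation step described above reduces to subtracting the net-metered reading from an estimated baseline consumption. A minimal sketch, in which the hourly values and the zero floor are illustrative assumptions (the thesis's baseline estimation from matched comparable periods is far more involved):

```python
def estimate_pv_generation(net_load_kw, baseline_kw):
    """Estimate behind-the-meter PV output per interval.

    net_load_kw:  net-metered AMI readings (consumption minus PV
                  generation); negative values mean export to the grid.
    baseline_kw:  consumption estimated from comparable periods with
                  similar weather and occupancy (hypothetical here).
    Generation is floored at zero, since PV output cannot be negative.
    """
    return [max(b - n, 0.0) for n, b in zip(net_load_kw, baseline_kw)]

# Hypothetical hourly values for a single premise around midday
net = [1.2, -0.5, -1.8, 0.3]   # kW measured at the meter
base = [1.5, 2.0, 2.1, 1.4]    # kW estimated consumption
print([round(g, 2) for g in estimate_pv_generation(net, base)])
# [0.3, 2.5, 3.9, 1.1]
```

The hard part, which this sketch assumes away, is producing `baseline_kw` accurately enough that the subtraction stays within the ±20% band the abstract reports.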

    Seasonal prediction of lake inflows and rainfall in a hydro-electricity catchment, Waitaki river, New Zealand

    The Waitaki River is located in the centre of the South Island of New Zealand, and hydro-electricity generated on the river accounts for 35-40% of New Zealand's electricity. Low inflows in 1992 and 2001 resulted in the threat of power blackouts. Improved seasonal rainfall and inflow forecasts will enable better management of the water used in hydro-generation on a seasonal basis. Researchers have stated that two key directions in the fields of seasonal rainfall and streamflow forecasting are to (a) decrease the spatial scale of forecast products, and (b) tailor forecast products to end-user needs, so as to provide more relevant and targeted forecasts. Several season-ahead lake inflow and rainfall forecast models were calibrated for the Waitaki river catchment using statistical techniques to quantify relationships between land-ocean-atmosphere state variables and seasonally lagged inflows and rainfall. Techniques included principal components analysis and multiple linear regression, with cross-validation applied to estimate model error and randomization used to establish the significance of the models' skill. Many of the calibrated models predict rainfall and inflows better than random chance and better than the long-term mean as a predictor. When compared to the range of all probable seasonal inflow totals (based on the 80-year recorded history in the catchment), 95% confidence limits around most model predictions offer significant skill. These models explain up to 19% of the variance in season-ahead rainfall and inflows in this catchment. Seasonal rainfall and inflow forecasting on a single-catchment scale, focussed on end-user needs, is possible with some skill in the South Island of New Zealand.
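The statistical pipeline named here, principal components analysis feeding a linear regression, scored by cross-validation against the long-term mean, can be sketched with NumPy. The data below are synthetic stand-ins (the real predictors would be seasonally lagged land-ocean-atmosphere indices), and the leave-one-out scheme and two-component truncation are illustrative choices, not the thesis's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for lagged climate indices and an inflow anomaly
n, p = 40, 6
common = rng.normal(size=n)                 # shared climate signal
X = rng.normal(scale=0.5, size=(n, p))
X[:, 0] += 2.0 * common                     # one index carries the signal
y = common + rng.normal(scale=0.5, size=n)  # season-ahead "inflow" anomaly

def pcr_predict(X_tr, y_tr, x_new, k=2):
    """Principal-components regression: project predictors onto the
    leading k PCs, then fit ordinary least squares in PC space."""
    mu = X_tr.mean(axis=0)
    Xc = X_tr - mu
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    pcs = Xc @ Vt[:k].T
    coef, *_ = np.linalg.lstsq(np.c_[np.ones(len(pcs)), pcs], y_tr, rcond=None)
    return coef[0] + ((x_new - mu) @ Vt[:k].T) @ coef[1:]

# Leave-one-out cross-validation, scored against the long-term mean
sse_model = sse_clim = 0.0
for i in range(n):
    mask = np.arange(n) != i
    sse_model += (y[i] - pcr_predict(X[mask], y[mask], X[i])) ** 2
    sse_clim += (y[i] - y[mask].mean()) ** 2

skill = 1.0 - sse_model / sse_clim  # > 0: beats the long-term mean
print(round(skill, 2))
```

The abstract's randomization test would repeat this scoring on shuffled targets to check that the skill is not an artifact of chance.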

    An analysis of short haul air passenger demand, volume 2

    Several demand models for short haul air travel are proposed and calibrated on pooled data. The models are designed to predict demand and analyze some of the motivating phenomena behind demand generation. In particular, an attempt is made to include the effects of competing modes and of alternate destinations. The results support three conclusions: (1) the auto mode is the air mode's major competitor; (2) trip time is an overriding factor in intermodal competition, with air fare at its present level appearing unimportant to the typical short haul air traveler; and (3) distance appears to underlie several demand-generating phenomena and therefore must be considered very carefully in any intercity demand model; it may be the cause of the wide range of fare elasticities reported by researchers over the past 15 years. A behavioral demand model is proposed and calibrated. It combines the travel-generating effects of income and population, the effects of modal split, the sensitivity of travel to price and time, and the effect of alternative destinations satisfying the trip purpose.
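A model of this family is often written in multiplicative (log-linear) form. The sketch below uses invented coefficients, not the report's calibrated values, chosen only so that the time elasticity dominates the fare elasticity, mirroring conclusion (2):

```python
def trips(pop_i, pop_j, income, fare, time_hr,
          k=1e-9, a=0.8, b=0.7, e_fare=-0.4, e_time=-1.6):
    """Illustrative log-linear city-pair demand model: trips grow with
    population and income and fall with fare and, more steeply, with
    trip time. All coefficients are hypothetical."""
    return (k * (pop_i * pop_j) ** a * income ** b
            * fare ** e_fare * time_hr ** e_time)

base = trips(2e6, 3e6, 30e3, fare=80.0, time_hr=1.5)
cheaper = trips(2e6, 3e6, 30e3, fare=64.0, time_hr=1.5)  # 20% fare cut
faster = trips(2e6, 3e6, 30e3, fare=80.0, time_hr=1.2)   # 20% time saving
print(round(cheaper / base, 2), round(faster / base, 2))  # 1.09 1.43
```

With these (assumed) elasticities, a 20% time saving raises demand about 43% while an equal-percentage fare cut raises it only about 9%, which is the qualitative pattern the abstract attributes to short haul travelers.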

    Evaluating Process-Based Integrated Assessment Models of Climate Change Mitigation

    Process-based integrated assessment models (IAMs) analyse transformation pathways to mitigate climate change. Confidence in models is established by testing their structural assumptions and comparing their behaviour against observations as well as other models. Climate model evaluation is a concerted effort, prominently reported in a dedicated chapter of the IPCC WG1 assessments. By comparison, evaluation of process-based IAMs tends to be less visible and more dispersed among modelling teams, with the exception of model inter-comparison projects. We contribute the first comprehensive analysis of process-based IAM evaluation, drawing on a wide range of examples across eight different evaluation methods testing both structural and behavioural validity. For each evaluation method, we compare its application to process-based IAMs with its application to climate models, noting similarities and differences, and seeking useful insights for strengthening the evaluation of process-based IAMs. We find that each evaluation method has distinctive strengths and limitations, as well as constraints on its application. We develop a systematic evaluation framework combining multiple methods that should be embedded within the development and use of process-based IAMs.

    Travel Time Prediction Model for Urban Road Network based on Multi-source Data

    In view of the deficiencies of a single data source for travel time prediction, multi-source data are used to improve prediction precision. Floating cars and fixed detectors are commonly used in traffic data collection, and they have certain complementarities in data types and accuracy. Therefore, the real-time traffic data from these two detectors are used as input parameters of the prediction model, and Kalman filtering theory is used to establish a travel time prediction model for an urban road network. Finally, the model is simulated in Vissim 4.3; the simulation results show that the average absolute relative error of travel time based on multi-source data is 5.18%, an improvement of 13.4% compared with fixed detector data alone and of 7.2% compared with floating car data alone.
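The fusion idea can be illustrated with a one-dimensional Kalman filter that treats link travel time as a random walk and updates sequentially with each sensor per interval. The measurement variances, process noise, and travel-time values below are invented for illustration; the paper's actual state and measurement models may differ.

```python
def kalman_fuse(z_floating, z_fixed, r_floating=4.0, r_fixed=9.0,
                x0=60.0, p0=100.0, q=1.0):
    """1-D Kalman filter fusing two noisy travel-time measurements per
    interval (floating car and fixed detector). r_* are the assumed
    measurement variances; q is the random-walk process noise."""
    x, p = x0, p0
    estimates = []
    for zf, zd in zip(z_floating, z_fixed):
        p = p + q  # predict: state unchanged, uncertainty grows
        # Update with each sensor in turn (sequential fusion)
        for z, r in ((zf, r_floating), (zd, r_fixed)):
            k = p / (p + r)       # Kalman gain
            x = x + k * (z - x)   # blend prediction with measurement
            p = (1 - k) * p       # posterior variance shrinks
        estimates.append(x)
    return estimates

# Hypothetical travel times (seconds) on one link over four intervals
floating = [62.0, 65.0, 64.0, 70.0]
fixed = [58.0, 66.0, 61.0, 72.0]
est = kalman_fuse(floating, fixed)
print([round(x, 1) for x in est])  # [60.7, 63.4, 63.2, 66.6]
```

Because the floating-car variance is set lower than the fixed-detector variance here, the fused estimate leans toward the floating-car readings, which is one way the complementarity mentioned above can be encoded.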

    The Essential Role of Securities Regulation

    This Article posits that the essential role of securities regulation is to create a competitive market for sophisticated professional investors and analysts (information traders). The Article advances two related theses, one descriptive and the other normative. Descriptively, the Article demonstrates that securities regulation is specifically designed to facilitate and protect the work of information traders. Securities regulation may be divided into three broad categories: (i) disclosure duties; (ii) restrictions on fraud and manipulation; and (iii) restrictions on insider trading, each of which contributes to the creation of a vibrant market for information traders. Disclosure duties reduce information traders' costs of searching and gathering information. Restrictions on fraud and manipulation lower information traders' cost of verifying the credibility of information, and thus enhance information traders' ability to make accurate predictions. Finally, restrictions on insider trading protect information traders from competition from insiders that would undermine information traders' ability to recoup their investment in information. Normatively, the Article shows that information traders can best underwrite efficient and liquid capital markets, and, hence, it is this group that securities regulation should strive to protect. Our account has important implications for several policy debates. First, our account supports the system of mandatory disclosure. We show that, although market forces may provide management with an adequate incentive to disclose at the initial public offering (IPO) stage, they cannot be relied on to effect optimal disclosure thereafter. Second, our analysis categorically rejects calls to limit disclosure duties to hard information and self-dealing by management. Third, our analysis supports the use of the fraud-on-the-market presumption in all fraud cases, even when markets are inefficient. Fourth, our analysis suggests that in cases involving corporate misstatements, the appropriate standard of care should, in principle, be negligence, not fraud.