
    A Sensing Error Aware MAC Protocol for Cognitive Radio Networks

    Cognitive radios (CR) are intelligent radio devices that can sense the radio environment and adapt to changes in it. Spectrum sensing and spectrum access are the two key CR functions. In this paper, we present a spectrum sensing error aware MAC protocol for a CR network collocated with multiple primary networks. We explicitly consider both types of sensing errors in the CR MAC design, since such errors are inevitable for practical spectrum sensors and, more importantly, such errors can have a significant impact on the performance of the CR MAC protocol. Two spectrum sensing policies are presented, with which secondary users collaboratively sense the licensed channels. The sensing policies are then incorporated into p-Persistent CSMA to coordinate opportunistic spectrum access for CR network users. We present an analysis of the interference and throughput performance of the proposed CR MAC, and find the analysis highly accurate in our simulation studies. The proposed sensing error aware CR MAC protocol outperforms two existing approaches by considerable margins in our simulations, which justifies the importance of considering spectrum sensing errors in CR MAC design.
    Comment: 21 pages, technical report
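
    The interplay between imperfect sensing and p-persistent access described in this abstract can be illustrated with a small Monte Carlo sketch. This is not the authors' protocol or analysis; the channel-occupancy, false-alarm, missed-detection, and persistence parameters below are illustrative placeholders.

```python
import random

def simulate_slot(p_busy, p_fa, p_md, p_persist):
    """One slotted-access trial for a single secondary user.

    p_busy    : probability the primary user occupies the channel
    p_fa      : false-alarm probability (idle channel sensed as busy)
    p_md      : missed-detection probability (busy channel sensed as idle)
    p_persist : p-persistent CSMA transmission probability
    Returns (secondary_transmitted, interfered_with_primary).
    """
    primary_active = random.random() < p_busy
    # Imperfect sensing: the sensed state can differ from the true state.
    if primary_active:
        sensed_idle = random.random() < p_md       # missed detection
    else:
        sensed_idle = random.random() > p_fa       # idle correctly detected
    # p-persistent access: transmit on a sensed-idle channel with probability p.
    transmit = sensed_idle and random.random() < p_persist
    return transmit, transmit and primary_active

def estimate_metrics(n_slots=100_000, **kw):
    tx = collisions = 0
    for _ in range(n_slots):
        t, c = simulate_slot(**kw)
        tx += t
        collisions += c
    throughput = (tx - collisions) / n_slots   # collision-free transmissions per slot
    interference = collisions / max(tx, 1)     # fraction of SU transmissions hitting the PU
    return throughput, interference

print(estimate_metrics(p_busy=0.3, p_fa=0.1, p_md=0.05, p_persist=0.6))
```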

    Centering the Margins: Outlier-Based Identification of Harmed Populations in Toxicity Detection

    The impact of AI models on marginalized communities has traditionally been measured by identifying performance differences between specified demographic subgroups. Though this approach aims to center vulnerable groups, it risks obscuring patterns of harm faced by intersectional subgroups or shared across multiple groups. To address this, we draw on theories of marginalization from disability studies and related disciplines, which state that people farther from the norm face greater adversity, to consider the "margins" in the domain of toxicity detection. We operationalize the "margins" of a dataset by employing outlier detection to identify text about people with demographic attributes distant from the "norm". We find that model performance is consistently worse for demographic outliers, with mean squared error (MSE) up to 70.4% worse for outliers than for non-outliers across toxicity types. It is also worse for text outliers, with an MSE up to 68.4% higher for outliers than for non-outliers. We also find text and demographic outliers to be particularly susceptible to errors in the classification of severe toxicity and identity attacks. Compared to analysis of disparities using traditional demographic breakdowns, we find that our outlier analysis frequently surfaces greater harms faced by a larger, more intersectional group, which suggests that outlier analysis is particularly beneficial for identifying harms against those groups.
    Comment: EMNLP 202
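
    A minimal sketch of the outlier-based comparison described above, assuming per-example demographic-attribute vectors and gold/model toxicity scores. The toy data, the choice of IsolationForest as the outlier detector, and all names below are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.metrics import mean_squared_error

# Toy stand-ins: rows are comments, demog is each row's demographic-attribute
# vector, y_true/y_pred are gold and model toxicity scores (all illustrative).
rng = np.random.default_rng(0)
demog = rng.random((1000, 8))
y_true = rng.random(1000)
y_pred = y_true + rng.normal(0, 0.1, 1000)

# Flag rows whose demographic profile is far from the dataset "norm".
is_outlier = IsolationForest(random_state=0).fit_predict(demog) == -1

mse_out = mean_squared_error(y_true[is_outlier], y_pred[is_outlier])
mse_in = mean_squared_error(y_true[~is_outlier], y_pred[~is_outlier])
print(f"MSE gap for demographic outliers: {100 * (mse_out / mse_in - 1):.1f}%")
```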

    Governance matters IV : governance indicators for 1996-2004

    The authors present the latest update of their aggregate governance indicators, together with new analysis of several issues related to the use of these measures. The governance indicators measure the following six dimensions of governance: (1) voice and accountability; (2) political instability and violence; (3) government effectiveness; (4) regulatory quality; (5) rule of law; and (6) control of corruption. They cover 209 countries and territories for 1996, 1998, 2000, 2002, and 2004. They are based on several hundred individual variables measuring perceptions of governance, drawn from 37 separate data sources constructed by 31 organizations. The authors present estimates of the six dimensions of governance for each period, as well as margins of error capturing the range of likely values for each country. These margins of error are not unique to perceptions-based measures of governance, but are an important feature of all efforts to measure governance, including objective indicators. In fact, the authors give examples of how individual objective measures provide an incomplete picture of even the quite particular dimensions of governance that they are intended to measure. The authors also analyze in detail changes over time in their estimates of governance; provide a framework for assessing the statistical significance of changes in governance; and suggest a simple rule of thumb for identifying statistically significant changes in country governance over time. The ability to identify significant changes in governance over time is much higher for aggregate indicators than for any individual indicator. While the authors find that the quality of governance in a number of countries has changed significantly (in both directions), they also provide evidence suggesting that there are no trends, for better or worse, in global averages of governance. Finally, they interpret the strong observed correlation between income and governance, and argue against recent efforts to apply a discount to governance performance in low-income countries.
    Topics: Economic Policy, Institutions and Governance; National Governance; Corruption & Anticorruption Law; Public Sector Corruption & Anticorruption Measures; Governance Indicators
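
    A hedged sketch of the kind of rule of thumb the abstract describes for flagging statistically significant changes: treat a change as significant when confidence intervals built from the reported margins of error in the two periods do not overlap. The confidence level, estimates, and standard errors below are illustrative assumptions, not the paper's exact rule or values.

```python
# Non-overlapping-confidence-interval check for a change in a governance
# estimate between two periods (illustrative sketch, not the paper's rule).
Z90 = 1.64  # approximate two-sided 90% normal critical value (assumed level)

def change_is_significant(est_t1, se_t1, est_t2, se_t2, z=Z90):
    lo1, hi1 = est_t1 - z * se_t1, est_t1 + z * se_t1
    lo2, hi2 = est_t2 - z * se_t2, est_t2 + z * se_t2
    return hi1 < lo2 or hi2 < lo1  # intervals do not overlap

# Hypothetical country: estimate rose from -0.40 to 0.10 with ~0.15 std errors.
print(change_is_significant(-0.40, 0.15, 0.10, 0.14))
```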

    Design sensitivity analysis of a plate-finned air-cooled condenser for waste heat recovery ORCs

    The study concerns the design sensitivity analysis of a V-shaped, plate-finned tube bundle air-cooled condenser for a range of representative low-temperature waste heat recovery Organic Rankine Cycle (ORC) cases. An iterative design model is implemented that reveals the thermodynamic and geometric design error margins that arise when different in-tube prediction methods are used. Nineteen condensation heat transfer correlations are applied simultaneously within arrays of geometric and thermodynamic variables. Using the nineteen resulting convective coefficients, the sensitivity of the calculated overall heat transfer coefficient, total transferred heat, degree of subcooling, required tube and fin material, and air- and refrigerant-side pressure drops is reported.
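
    To illustrate how a spread across in-tube condensation correlations propagates into the design, the sketch below recomputes an overall heat transfer coefficient for a range of in-tube convective coefficients. The resistance model, area ratio, fin efficiency, air-side coefficient, and the list of coefficients are illustrative placeholders, not the paper's design model or values.

```python
# Minimal sketch: spread in the in-tube condensation coefficient h_in propagates
# to the overall heat transfer coefficient U (all numbers are assumptions).
def overall_U(h_in, h_air, area_ratio=20.0, fin_eff=0.85, wall_res=1e-5):
    """Overall coefficient referred to the air-side (finned) area, W/m^2K."""
    r_inside = area_ratio / h_in          # in-tube convective resistance
    r_outside = 1.0 / (fin_eff * h_air)   # air-side resistance with fin efficiency
    return 1.0 / (r_inside + wall_res * area_ratio + r_outside)

# One value per condensation correlation (illustrative, in W/m^2K).
h_in_values = [1500 + 150 * k for k in range(19)]
U = [overall_U(h, h_air=60.0) for h in h_in_values]
print(f"U range: {min(U):.1f}-{max(U):.1f} W/m^2K "
      f"({100 * (max(U) - min(U)) / min(U):.0f}% spread)")
```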

    The Effects of Privatization and International Competitive Pressure on Firms' Price-Cost Margins: Micro Evidence from Emerging Economies

    This paper uses representative firm-level panel data on 1,701 Bulgarian and 2,047 Romanian manufacturing firms to estimate price-cost margins and to analyze how these are affected by privatization and increased competitive pressure. The estimation method, based on Roeger (1995), deals with the potential endogeneity problems associated with estimating firm performance by making use of the properties of the primal and dual Solow residual. We find that privatization is associated with higher price-cost margins in both Bulgaria and Romania. Moreover, foreign-owned firms have higher markups than domestic privatized firms. Our results suggest that the sequencing of reforms, such as demonopolization prior to privatization and the establishment of competition policy, may be important. In addition, our results support the idea that opening to trade has a disciplining effect on firms' market power. We find that increased import penetration is associated with lower price-cost margins in sectors where product market concentration is relatively high. Our results may be of relevance for other emerging economies, such as China and Vietnam, which still have to undergo major privatization programs.
    http://deepblue.lib.umich.edu/bitstream/2027.42/39989/3/wp603.pd
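
    For readers unfamiliar with the Roeger (1995) approach mentioned above, the block below sketches the commonly used form of the estimating equation in assumed standard notation (not quoted from the paper): taking the difference between the primal and dual Solow residuals cancels the unobserved productivity term, so the price-cost margin can be estimated without instrumenting for productivity.

```latex
% Roeger-style estimating equation (standard notation, assumed): q, l, k are log
% output, labour and capital; p, w, r their prices; alpha the labour share in
% revenue; B the price-cost (Lerner) margin; u an error term.
\begin{equation*}
\underbrace{(\Delta q + \Delta p) - \alpha(\Delta l + \Delta w)
  - (1-\alpha)(\Delta k + \Delta r)}_{\text{nominal (primal minus dual) Solow residual}}
= B\,\bigl[(\Delta q + \Delta p) - (\Delta k + \Delta r)\bigr] + u .
\end{equation*}
```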

    Run-time power and performance scaling in 28 nm FPGAs

    Uncertainty quantification methods for neural networks pattern recognition

    On-line monitoring techniques have attracted increasing attention as a promising strategy for improving safety, maintaining availability, and reducing the cost of operation and maintenance. In particular, pattern recognition tools such as artificial neural networks are today widely adopted for sensor validation, plant component monitoring, system control, and fault diagnostics based on data acquired during operation. However, classic artificial neural networks do not provide an error context for the model response, whose robustness therefore remains difficult to estimate. Indeed, experimental data generally exhibit a time- and space-varying behaviour and are hence characterized by an intrinsic level of uncertainty that unavoidably affects the performance of the tools adopted and undermines the accuracy of the analysis. For this reason, the propagation of uncertainty and the quantification of the so-called margins of uncertainty in the output are crucial in making risk-informed decisions. The current study presents a comparison between two different approaches for the quantification of uncertainty in artificial neural networks. The first technique is based on error estimation by a series association scheme; the second couples a Bayesian model selection technique and model averaging into a unified framework. The efficiency of these two approaches is analysed in terms of their computational cost and predictive performance, through their application to a nuclear power plant fault diagnosis system.
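
    As a rough illustration of the second family of approaches, the sketch below weights a few candidate network architectures by an approximate (BIC-based) evidence term and reports a model-averaged prediction with a between-model spread. It is a toy stand-in for Bayesian model selection and averaging, not the paper's framework; the data, architectures, and weighting scheme are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy data standing in for plant sensor measurements (illustrative only).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (500, 3))
y = np.sin(X @ np.array([1.0, 0.5, -0.8])) + rng.normal(0, 0.05, 500)

# Candidate architectures; weight each by an approximate evidence (BIC) term,
# a rough stand-in for Bayesian model selection / averaging.
candidates = [(8,), (16,), (8, 8)]
models, bics = [], []
for hidden in candidates:
    m = MLPRegressor(hidden_layer_sizes=hidden, max_iter=2000,
                     random_state=0).fit(X, y)
    resid = y - m.predict(X)
    n = len(y)
    k = sum(c.size for c in m.coefs_) + sum(b.size for b in m.intercepts_)
    bics.append(n * np.log(resid.var()) + k * np.log(n))
    models.append(m)

w = np.exp(-0.5 * (np.array(bics) - min(bics)))
w /= w.sum()

x_new = np.array([[0.2, -0.1, 0.4]])
preds = np.array([m.predict(x_new)[0] for m in models])
mean = float(w @ preds)
spread = float(np.sqrt(w @ (preds - mean) ** 2))   # between-model uncertainty
print(f"prediction = {mean:.3f} +/- {spread:.3f}")
```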