
    What is Fair Pay for Executives? An Information Theoretic Analysis of Wage Distributions

    The high pay packages of U.S. CEOs have raised serious concerns about what would constitute fair pay.

    Fairness Is an Emergent Self-Organized Property of the Free Market for Labor

    The excessive compensation packages of CEOs of U.S. corporations in recent years have brought to the foreground the issue of fairness in economics. The conventional wisdom is that the free market for labor, which determines the pay packages, cares only about efficiency and not fairness. We present an alternative theory that shows that an ideal free market environment also promotes fairness, as an emergent property resulting from the self-organizing market dynamics. Even though an individual employee may care only about his or her salary and no one else's, the collective actions of all the employees, combined with the profit-maximizing actions of all the companies, in a free market environment under budgetary constraints, lead towards a more fair allocation of wages, guided by Adam Smith's invisible hand of self-organization. By exploring deep connections with statistical thermodynamics, we show that entropy is the appropriate measure of fairness in a free market environment, which is maximized at equilibrium to yield the lognormal distribution of salaries as the fairest inequality of pay in an organization under ideal conditions.
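The entropy measure of fairness described above can be illustrated numerically. The sketch below is my own minimal construction, not the paper's method: the binning scheme, sample salaries, and function name are all assumptions made for illustration.

```python
import math

def shannon_entropy(salaries, bins=10):
    """Shannon entropy (in nats) of a binned salary distribution.

    In the paper's framing, higher entropy corresponds to a fairer
    wage allocation; the abstract states the maximum at equilibrium
    is attained by the lognormal distribution. The fixed bin count
    here is an illustrative choice, not part of the paper.
    """
    lo, hi = min(salaries), max(salaries)
    width = (hi - lo) / bins or 1.0  # avoid zero width when all equal
    counts = [0] * bins
    for s in salaries:
        counts[min(int((s - lo) / width), bins - 1)] += 1
    n = len(salaries)
    return -sum(c / n * math.log(c / n) for c in counts if c)

# Two toy distributions: everyone paid the same (zero binned entropy)
# versus an even spread across the range (entropy near log(10)).
flat = [50_000] * 100
spread = [40_000 + 200 * i for i in range(100)]
```

Whether such a sample is "fair" in the paper's sense would also require the budgetary and equilibrium conditions it derives; this only shows the entropy computation itself.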

    Models and metrics for software management and engineering

    This paper attempts to characterize and present a state-of-the-art view of several quantitative models and metrics of the software life cycle. These models and metrics can be used to aid in managing and engineering software projects. They deal with various aspects of the software process and product, including resource allocation and estimation, changes and errors, size, complexity, and reliability. Some indication is given of the extent to which the various models have been used and the success they have achieved.

    Income Distributional Effects of Using Market-Based Instruments for Managing Common Property Resources

    In the face of growing management problems and conflicts over increasing demands and dwindling or increasingly variable supplies of surface water and groundwater, natural resource managers and policy makers have increasingly felt the need to revise conventional water resource allocation methods. For the past 30 years, economists have advocated the application of various types of market-based instruments (MBIs) as an efficient means of re-allocating water resources among competing uses. While MBIs have been implemented in several countries, they continue to encounter strong socio-political opposition, due to the impacts imposed on third parties during transfers and re-allocations, as well as the distributional effects across different types of water users. Despite the demonstrable efficiency gains of MBIs, the resulting equity or distributional effects of MBI-driven re-allocations can be of equal or greater importance to policy makers and the constituents they serve. At the same time, the realized gains in economic efficiency from the application of MBIs depend heavily on the heterogeneity of the agents they target, as well as the degree of information asymmetry the regulator faces. In this paper, we use a simple theoretical framework to show the trade-offs between efficiency and equity that can arise from applying MBIs to a heterogeneous population of agents drawing non-cooperatively from a natural resource pool. Using the idealized centralized planner as a benchmark of dynamic, allocative efficiency, we compare the efficiency gains realized by alternative policy instruments and the resulting impacts on distributional equity, in terms of cumulative net benefits over time.
    Using the specific example of groundwater and the empirical setting of Southern California, we highlight the trade-offs between efficiency and equity that may exist among alternative policy instruments, and how MBIs perform with respect to those dual criteria. We find that under agent heterogeneity, there are asymmetric gains in efficiency when the centralized planner's allocations are constrained by equity considerations. Through such results, this paper demonstrates the importance of considering both efficiency gains and the minimization of distributional inequities when designing policy instruments that create winners and losers with potentially serious socio-political ramifications.

    Dynamic Matching Algorithm of Human Resource Allocation Based on Big Data Mining

    In order to ensure the dynamic matching effect of human resource allocation and improve its accuracy and efficiency, a dynamic matching algorithm of human resource allocation based on big data mining is studied. We analyze the meaning and function of big data mining and explain its common analysis principles. Information entropy is selected as the basis for measuring human resource allocation; allocations are extracted, and their similarity is calculated using the Hausdorff similarity method based on time interpolation. Human resource allocations are then classified and mined with the Apriori and FP-Growth classification algorithms, and the K-Means clustering algorithm is used to realize the dynamic matching. The experimental results show that the proposed algorithm achieves a better dynamic matching effect and can effectively improve the accuracy and efficiency of the dynamic matching of human resource allocation.
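The Hausdorff similarity step described above can be sketched on two allocation profiles that are assumed to be already resampled onto a common time grid (the paper's time-interpolation step). The data, function names, and (time, value) encoding are illustrative assumptions, not taken from the paper.

```python
def hausdorff(a, b):
    """Symmetric Hausdorff distance between two point sets,
    here (time, value) pairs of a resource-allocation profile.
    A small distance means the two profiles are similar."""
    def d(p, q):
        # Euclidean distance between two (time, value) points.
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    def directed(xs, ys):
        # Worst-case distance from a point of xs to its nearest point of ys.
        return max(min(d(x, y) for y in ys) for x in xs)
    return max(directed(a, b), directed(b, a))

# Two toy profiles differing only at t = 1.
profile_a = [(0, 1.0), (1, 2.0), (2, 3.0)]
profile_b = [(0, 1.0), (1, 2.5), (2, 3.0)]
dist = hausdorff(profile_a, profile_b)  # 0.5, the gap at t = 1
```

In the paper's pipeline this distance would feed a similarity score used before the classification and K-Means matching stages.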

    Models, Techniques, and Metrics for Managing Risk in Software Engineering

    The field of Software Engineering (SE) is the study of systematic and quantifiable approaches to software development, operation, and maintenance. This thesis presents a set of scalable and easily implemented techniques for quantifying and mitigating risks associated with the SE process. The thesis comprises six papers corresponding to SE knowledge areas such as software requirements, testing, and management. The techniques for risk management are drawn from stochastic modeling and operational research. The first two papers relate to software testing and maintenance. The first paper describes and validates a novel iterative-unfolding technique for filtering a set of execution traces relevant to a specific task. The second paper analyzes and validates the applicability of several entropy measures to the trace classification described in the first paper. The techniques in these two papers can speed up problem determination of defects encountered by customers, leading to improved organizational response, increased customer satisfaction, and eased resource constraints. The third and fourth papers are applicable to maintenance, overall software quality, and SE management. The third paper uses tools from Extreme Value Theory and Queuing Theory to derive and validate metrics based on defect rediscovery data. The metrics can aid the allocation of resources to service and maintenance teams, highlight gaps in quality assurance processes, and help assess the risk of using a given software product. The fourth paper characterizes and validates a technique for automatic selection and prioritization of a minimal set of customers for profiling. The minimal set is obtained using Binary Integer Programming and prioritized using a greedy heuristic.
    Profiling the resulting customer set leads to enhanced comprehension of user behaviour, which in turn yields improved test specifications and clearer quality assurance policies, reducing the risks associated with unsatisfactory product quality. The fifth and sixth papers pertain to software requirements. The fifth paper models the relation between requirements and their underlying assumptions and measures the risk associated with failure of those assumptions, using Boolean networks and stochastic modeling. The sixth paper models the risk associated with injection of requirements late in the development cycle, with the help of stochastic processes.
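The customer-selection step in the fourth paper solves the minimal-set problem exactly with Binary Integer Programming; as an illustration only, a greedy set-cover heuristic is sketched below, whose pick order doubles as a prioritization. All data, names, and the substitution of greedy cover for BIP are my own assumptions.

```python
def greedy_cover(customers, universe):
    """Greedily pick the customer whose usage profile covers the most
    not-yet-covered scenarios, until the universe is covered.
    Returns the customers in pick order (highest coverage gain first)."""
    uncovered = set(universe)
    order = []
    while uncovered:
        best = max(customers, key=lambda c: len(customers[c] & uncovered))
        gained = customers[best] & uncovered
        if not gained:
            break  # remaining scenarios are covered by no customer
        order.append(best)
        uncovered -= gained
    return order

# Hypothetical usage profiles: which scenarios each customer exercises.
usage = {
    "cust_a": {"s1", "s2", "s3"},
    "cust_b": {"s3", "s4"},
    "cust_c": {"s4", "s5"},
}
order = greedy_cover(usage, {"s1", "s2", "s3", "s4", "s5"})
# cust_a (3 new scenarios) is picked first, then cust_c (2 new);
# cust_b adds nothing new and is never selected.
```

The exact BIP formulation would instead minimize the number of selected customers subject to every scenario being covered; greedy gives an approximation with a known logarithmic bound.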

    Integration of software reliability into systems reliability optimization

    Reliability optimization, originally developed for hardware systems, is extended to incorporate software into an integrated systems reliability optimization. This hardware-software reliability optimization problem is formulated as a mixed-integer programming problem: the integer variables are the numbers of redundancies, while the real variables are the component reliabilities. To find a common framework under which hardware systems and software systems can be combined, a review and classification of existing software reliability models is conducted. A software redundancy model with common-cause failure is developed to represent the objective function; this model includes hardware redundancy with independent failure as a special case. A software reliability-cost function is then derived, based on a binomial-type software reliability model, to represent the constraint function. Two techniques, the combination of a heuristic redundancy method with a sequential search method, and the Lagrange multiplier method with the branch-and-bound method, are proposed to solve this mixed-integer reliability optimization problem. The relative merits of four major heuristic redundancy methods and two sequential search methods are investigated through a simulation study. The results indicate that the sequential search method is the dominating factor of the combination method. The two proposed mixed-integer programming techniques are also compared by solving two numerical problems, a series system with linear constraints and a bridge system with nonlinear constraints. The Lagrange multiplier method with the branch-and-bound method is shown to be superior to all other existing methods in obtaining the optimal solution. Finally, an illustration is given of integrating a software reliability model into systems reliability optimization.
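The heuristic-redundancy side of this problem can be sketched for the hardware special case (independent failures, series system), which the abstract says the common-cause model generalizes. The greedy rule, numbers, and function name below are illustrative assumptions, not the thesis's specific algorithms.

```python
def greedy_redundancy(r, cost, budget):
    """Heuristic redundancy allocation for a series system with
    independent failures. Stage i with n_i parallel copies has
    reliability 1 - (1 - r_i)**n_i; system reliability is the product
    over stages. Greedily add one copy to whichever stage yields the
    best reliability gain per unit cost, while the budget allows."""
    n = [1] * len(r)          # start with one copy of each component
    spent = sum(cost)         # cost of the initial configuration
    def stage(i):
        return 1 - (1 - r[i]) ** n[i]
    while True:
        best, best_ratio = None, 0.0
        for i in range(len(r)):
            if spent + cost[i] > budget:
                continue      # cannot afford another copy of stage i
            gain = (1 - (1 - r[i]) ** (n[i] + 1)) - stage(i)
            if gain / cost[i] > best_ratio:
                best, best_ratio = i, gain / cost[i]
        if best is None:
            return n          # no affordable improvement remains
        n[best] += 1
        spent += cost[best]

# Two-stage series system: the less reliable stage gets priority
# until marginal gains favor the other; both end with 2 copies here.
alloc = greedy_redundancy(r=[0.9, 0.8], cost=[2.0, 3.0], budget=11.0)
```

The thesis pairs such heuristics with a sequential search, and alternatively solves the mixed-integer problem exactly via Lagrange multipliers with branch-and-bound; the sketch covers only the greedy building block.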

    Robust optimization method of emergency resource allocation for risk management in inland waterways

    This study proposes a robust optimization method for waterborne emergency resource allocation in inland waterways that addresses the uncertainties and mismatches between supply and demand. To accomplish this, we integrate maritime risk evaluation with a robust optimization model and employ the Entropy Weight Method (EWM)-Technique for Order Preference by Similarity to Ideal Solution (TOPSIS)-Analytic Hierarchy Process (AHP) method to evaluate the risk of various areas. The approach enables exploration of the relationship between maritime risk and emergency resource allocation strategy. The robust optimization method is used to deal with uncertainty and to derive the robust counterpart of the proposed model. We establish an emergency resource allocation model that considers both the economy and the timeliness of emergency resource allocation, construct an optimization model, and transform it into an easily solvable robust counterpart. The results demonstrate that the proposed method can adapt to real-world scenarios and effectively optimize the resource configuration while improving rescue efficiency under a reasonable resource allocation. Specifically, the proportion of rescue time saved ranges from 28.52% to 92.60%, and the proportion of total cost saved is 95.82%. Our approach has significant potential to provide a valuable reference for decision-making related to emergency resource allocation in maritime management.
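Of the EWM-TOPSIS-AHP chain above, the entropy-weighting step is the most mechanical and can be sketched on its own. The decision matrix below is invented for illustration, and the sketch assumes scores are already positive and benefit-oriented; the paper's TOPSIS ranking and AHP combination are not shown.

```python
import math

def entropy_weights(matrix):
    """Entropy Weight Method (EWM): criteria whose values vary more
    across alternatives carry more information and receive larger
    weights. matrix[i][j] is the score of area i on criterion j."""
    m = len(matrix)
    weights = []
    for j in range(len(matrix[0])):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]                      # column shares
        e = -sum(q * math.log(q) for q in p if q > 0) / math.log(m)
        weights.append(1 - e)                             # divergence degree
    s = sum(weights)
    return [w / s for w in weights]

# Three areas scored on two risk criteria: the first criterion is
# identical everywhere (no discriminating information), so nearly
# all the weight goes to the second criterion.
w = entropy_weights([[5, 1], [5, 4], [5, 9]])
```

The resulting weights would then scale the normalized decision matrix before computing TOPSIS closeness to the ideal solution.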