
    People in the E-Business: New Challenges, New Solutions

    [Excerpt] The Human Resource Planning Society’s (HRPS) annual State of the Art/Practice (SOTA/P) study has become an integral contributor to HRPS’s mission of providing leading-edge thinking to its members. Past efforts conducted in 1995, 1996, 1997, 1998, and 1999 focused on identifying the issues on the horizon that will have a significant impact on the field of Human Resources (HR). This year, in a divergence from past practice, the SOTA/P effort aimed at developing a deeper understanding of one critical issue having a profound impact on organizations and HR: the rise of e-business. That rise has been both rapid and dramatic. One estimate puts the rate of adoption of the internet at 4,000 new users each hour (eMarketer, 1999), resulting in the expectation of 250 million people online by the end of 2000 and 350 million by 2005 (Nua, 1999). E-commerce is expected to reach $1.3 trillion by 2003, of which 87 percent will go to the business-to-business (B2B) segment and 13 percent to the business-to-consumer (B2C) segment (Plumely, 2000).

    Garside families in Artin-Tits monoids and low elements in Coxeter groups

    We show that every finitely generated Artin-Tits group admits a finite Garside family, by introducing the notion of a low element in a Coxeter group and proving that the family of all low elements in a Coxeter system (W, S) with S finite includes S and is finite and closed under suffix and join with respect to the right weak order.

    Execution: the Critical “What’s Next?” in Strategic Human Resource Management

    The Human Resource Planning Society’s 1999 State of the Art/Practice (SOTA/P) study was conducted by a virtual team of researchers who interviewed and surveyed 232 human resource and line executives, consultants, and academics worldwide. Looking three to five years ahead, the study probed four basic topics: (1) major emerging trends in external environments, (2) essential organizational capabilities, (3) critical people issues, and (4) the evolving role of the human resource function. This article briefly reports some of the study’s major findings, along with an implied action agenda – the “gotta do’s” for the leading edge. Cutting through the complexity, the general tone is one of urgency emanating from the intersection of several underlying themes: the increasing fierceness of competition, the rapid and unrelenting pace of change, the imperatives of marketplace and thus organizational agility, and the corresponding need to buck prevailing trends by attracting and, especially, retaining and capturing the commitment of world-class talent. While it all adds up to a golden opportunity for human resource functions, there is a clear need to get on with it – to get better, faster, and smarter – or run the risk of being left in the proverbial dust. Execute or be executed.

    A note on the transitive Hurwitz action on decompositions of parabolic Coxeter elements

    In this note, we provide a short and self-contained proof that the braid group on n strands acts transitively on the set of reduced factorizations of a Coxeter element in a Coxeter group of finite rank n into products of reflections. We moreover use the same argument to show that all factorizations of an element of a parabolic subgroup of W lie in this parabolic subgroup as well. (Comment: 5 pages)
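The Hurwitz action mentioned in this abstract can be made concrete in the smallest interesting case. The sketch below (not from the paper; the encoding of permutations as tuples is an illustrative choice) enumerates the reduced reflection factorizations of a Coxeter element of S_3 and checks that the braid generator's move, (t1, t2) → (t1·t2·t1⁻¹, t1), reaches all of them from any starting point:

```python
from itertools import permutations

def compose(p, q):
    """(p ∘ q)(i) = p(q(i)); permutations are tuples of images."""
    return tuple(p[i] for i in q)

n = 3
# All transpositions (reflections) of S_3: permutations moving exactly 2 points.
reflections = [p for p in permutations(range(n))
               if sum(p[i] != i for i in range(n)) == 2]
c = (1, 2, 0)  # a Coxeter element: the 3-cycle 0 -> 1 -> 2 -> 0

# Reduced reflection factorizations c = t1 * t2.
facts = {(a, b) for a in reflections for b in reflections
         if compose(a, b) == c}

def sigma(f):
    """Hurwitz move of the braid generator: (t1, t2) -> (t1 t2 t1^{-1}, t1).
    Transpositions are involutions, so t1^{-1} = t1."""
    a, b = f
    return (compose(compose(a, b), a), a)

# Forward orbit of one factorization under the braid generator.
start = next(iter(facts))
orbit = {start}
f = sigma(start)
while f not in orbit:
    orbit.add(f)
    f = sigma(f)

print(len(facts), orbit == facts)  # transitivity: the orbit is everything
```

Each Hurwitz move preserves the product t1·t2·t1⁻¹·t1 = t1·t2, so the orbit stays inside the set of factorizations; transitivity is the non-trivial part that the note proves in general.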

    BCFA: Bespoke Control Flow Analysis for CFA at Scale

    Many data-driven software engineering tasks, such as discovering programming patterns and mining API specifications, perform source code analysis over control flow graphs (CFGs) at scale. Analyzing millions of CFGs can be expensive, and performance of the analysis heavily depends on the underlying CFG traversal strategy. State-of-the-art analysis frameworks use a fixed traversal strategy. We argue that a single traversal strategy does not fit all kinds of analyses and CFGs and propose bespoke control flow analysis (BCFA). Given a control flow analysis (CFA) and a large number of CFGs, BCFA selects the most efficient traversal strategy for each CFG. BCFA extracts a set of properties of the CFA by analyzing the code of the CFA and combines it with properties of the CFG, such as branching factor and cyclicity, to select the optimal traversal strategy. We have implemented BCFA in Boa, and evaluated BCFA using a set of representative static analyses that mainly involve traversing CFGs and two large datasets containing 287 thousand and 162 million CFGs. Our results show that BCFA can speed up the large-scale analyses by 1%-28%. Further, BCFA has low overhead (less than 0.2%) and a low misprediction rate (less than 0.01%). (Comment: 12 pages)
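The abstract's core idea, picking a traversal strategy from properties of the analysis and of each CFG, can be illustrated independently of Boa. The selector below is a hypothetical sketch (the strategy names, thresholds, and `analysis_needs_fixpoint` flag are invented for illustration, not BCFA's actual decision rules):

```python
def has_cycle(cfg):
    """Detect a cycle with iterative DFS (colors: 0=new, 1=active, 2=done)."""
    color = {n: 0 for n in cfg}
    for root in cfg:
        if color[root]:
            continue
        color[root] = 1
        stack = [(root, iter(cfg[root]))]
        while stack:
            node, succs = stack[-1]
            for s in succs:
                if color[s] == 1:      # back edge to an active node
                    return True
                if color[s] == 0:
                    color[s] = 1
                    stack.append((s, iter(cfg[s])))
                    break
            else:
                color[node] = 2
                stack.pop()
    return False

def pick_strategy(cfg, analysis_needs_fixpoint):
    """Hypothetical selector in the spirit of BCFA: combine a property of
    the analysis with structural properties of the CFG."""
    branching = max((len(v) for v in cfg.values()), default=0)
    cyclic = has_cycle(cfg)
    if not analysis_needs_fixpoint and not cyclic:
        return "single-pass post-order"       # one sweep suffices
    if cyclic or branching > 2:
        return "worklist"                     # revisit only changed nodes
    return "reverse post-order to fixpoint"

acyclic = {"entry": ["a", "b"], "a": ["exit"], "b": ["exit"], "exit": []}
loop    = {"entry": ["head"], "head": ["body", "exit"],
           "body": ["head"], "exit": []}
print(pick_strategy(acyclic, analysis_needs_fixpoint=False))
print(pick_strategy(loop, analysis_needs_fixpoint=True))
```

The point of the paper is that this choice is made per CFG at scale; here, the acyclic diamond gets a cheap single pass while the loop forces a worklist.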

    Amortised likelihood-free inference for expensive time-series simulators with signatured ratio estimation

    Simulation models of complex dynamics in the natural and social sciences commonly lack a tractable likelihood function, rendering traditional likelihood-based statistical inference impossible. Recent advances in machine learning have introduced novel algorithms for estimating otherwise intractable likelihood functions using a likelihood ratio trick based on binary classifiers. Consequently, efficient likelihood approximations can be obtained whenever good probabilistic classifiers can be constructed. We propose a kernel classifier for sequential data using path signatures based on the recently introduced signature kernel. We demonstrate that the representative power of signatures yields a highly performant classifier, even in the crucially important case where sample numbers are low. In such scenarios, our approach can outperform sophisticated neural networks for common posterior inference tasks.
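The likelihood ratio trick the abstract builds on is independent of the signature kernel itself: a classifier d(x) trained to separate model samples from reference samples yields the ratio via d(x)/(1 − d(x)). A minimal sketch with two known Gaussian densities, where the Bayes-optimal classifier is available in closed form and so no training is needed (the densities and names are illustrative, not from the paper):

```python
import math

def gauss_pdf(x, mu, sigma):
    """Density of a normal distribution N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Two densities standing in for p(x | theta) and a reference p_ref(x).
p_model = lambda x: gauss_pdf(x, mu=1.0, sigma=1.0)
p_ref   = lambda x: gauss_pdf(x, mu=0.0, sigma=1.0)

def bayes_classifier(x):
    """Optimal d(x) = p_model / (p_model + p_ref) for balanced classes."""
    a, b = p_model(x), p_ref(x)
    return a / (a + b)

def ratio_from_classifier(x):
    """Likelihood ratio trick: r(x) = d(x) / (1 - d(x))."""
    d = bayes_classifier(x)
    return d / (1.0 - d)

x = 0.7
print(ratio_from_classifier(x), p_model(x) / p_ref(x))  # the two agree
```

In the paper's setting, d would instead be a signature-kernel classifier fitted to simulated time series, and the recovered ratio plugs into likelihood-based posterior inference.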

    Black-box Bayesian inference for agent-based models

    Simulation models, in particular agent-based models, are gaining popularity in economics and the social sciences. The considerable flexibility they offer, as well as their capacity to reproduce a variety of empirically observed behaviours of complex systems, give them broad appeal, and the increasing availability of cheap computing power has made their use feasible. Yet a widespread adoption in real-world modelling and decision-making scenarios has been hindered by the difficulty of performing parameter estimation for such models. In general, simulation models lack a tractable likelihood function, which precludes a straightforward application of standard statistical inference techniques. A number of recent works have sought to address this problem through the application of likelihood-free inference techniques, in which parameter estimates are determined by performing some form of comparison between the observed data and simulation output. However, these approaches are (a) founded on restrictive assumptions, and/or (b) typically require many hundreds of thousands of simulations. These qualities make them unsuitable for large-scale simulations in economics and the social sciences, and can cast doubt on the validity of these inference methods in such scenarios. In this paper, we investigate the efficacy of two classes of simulation-efficient black-box approximate Bayesian inference methods that have recently drawn significant attention within the probabilistic machine learning community: neural posterior estimation and neural density ratio estimation. We present a number of benchmarking experiments in which we demonstrate that neural network-based black-box methods provide state-of-the-art parameter inference for economic simulation models, and crucially are compatible with generic multivariate or even non-Euclidean time-series data.
In addition, we suggest appropriate assessment criteria for use in future benchmarking of approximate Bayesian inference procedures for simulation models in economics and the social sciences.
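The "comparison between the observed data and simulation output" that this abstract describes is easiest to see in the simulation-hungry baseline the paper's neural methods aim to improve upon: rejection ABC. A toy sketch, with an invented one-parameter simulator and a flat prior (everything here is illustrative, not the paper's setup):

```python
import random
import statistics

random.seed(0)

def simulate(theta, n=50):
    """Toy simulator: i.i.d. Gaussian draws with unknown mean theta."""
    return [random.gauss(theta, 1.0) for _ in range(n)]

observed = simulate(2.0)                  # pretend this is the real data
obs_mean = statistics.fmean(observed)

# Rejection ABC: draw theta from the prior, keep it when the simulated
# summary statistic lands close to the observed one.
posterior = []
while len(posterior) < 200:
    theta = random.uniform(-5.0, 5.0)     # flat prior over [-5, 5]
    if abs(statistics.fmean(simulate(theta)) - obs_mean) < 0.1:
        posterior.append(theta)

print(statistics.fmean(posterior))        # concentrates near the true mean 2.0
```

Each accepted sample costs many rejected simulations, which is exactly the inefficiency that amortised neural posterior and density ratio estimation address.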