
    Working Capital Requirement and the Unemployment Volatility Puzzle

    Shimer (2005) argues that a search and matching model of the labor market in which the wage is determined by Nash bargaining cannot generate the observed volatility of unemployment and vacancies in response to labor productivity shocks of plausible size. This paper examines how incorporating monopolistically competitive firms subject to a working capital requirement (firms borrow funds to pay their wage bills) improves the ability of search models to match the empirical fluctuations in unemployment and vacancies without resorting to an alternative wage-setting mechanism. The monetary authority in the model follows an interest rate rule. A positive labor productivity shock lowers the real marginal cost of production and lowers inflation. In response to lower inflation, the monetary authority reduces the nominal interest rate. The lower interest rate reduces the cost of financing and partially offsets the increase in labor cost that accompanies higher productivity. The reduced labor cost means that firms retain a larger share of the gain from a productivity shock, giving them a stronger incentive to create vacancies. Simulations show that a working capital requirement does indeed improve the ability of search models to generate fluctuations in key labor market variables that better match the U.S. data.
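
    To make the amplification channel concrete, the following is a minimal sketch of a job creation condition with a working capital requirement, written in generic search-and-matching notation that is not taken from the paper (κ: vacancy posting cost, q(θ): vacancy-filling probability, x: real marginal revenue product of labor under monopolistic competition, a: labor productivity, w: the Nash-bargained wage, R: the gross nominal rate at which the wage bill is financed, β: discount factor, s: separation rate):

    \[
      \frac{\kappa}{q(\theta_t)} \;=\; \mathbb{E}_t\,\beta\!\left[\, x_{t+1}\,a_{t+1} \;-\; R_{t+1}\,w_{t+1} \;+\; (1-s)\,\frac{\kappa}{q(\theta_{t+1})} \,\right].
    \]

    In this sketch, a positive productivity shock raises a and the bargained wage w, but because it also lowers inflation, the interest rate rule cuts R; the financing cost Rw therefore rises by less than the wage itself, so the per-worker surplus xa − Rw, and with it vacancy creation, responds more strongly than in the model without working capital.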

    LMODEL: A satellite precipitation methodology using cloud development modeling. Part I: Algorithm construction and calibration

    The Lagrangian Model (LMODEL) is a new multisensor satellite rainfall monitoring methodology based on the use of a conceptual cloud-development model that is driven by geostationary satellite imagery and is locally updated using microwave-based rainfall measurements from low earth-orbiting platforms. This paper describes the cloud development model and updating procedures; the companion paper presents model validation results. The model uses single-band thermal infrared geostationary satellite imagery to characterize cloud motion, growth, and dispersal at high spatial resolution (~4 km). These inputs drive a simple, linear, semi-Lagrangian, conceptual cloud mass balance model, incorporating separate representations of convective and stratiform processes. The model is locally updated against microwave satellite data using a two-stage process that scales precipitable water fluxes into the model and then updates model states using a Kalman filter. Model calibration and updating employ an empirical rainfall collocation methodology designed to compensate for the effects of measurement time difference, geolocation error, cloud parallax, and rainfall shear.
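
    As a rough illustration of the state-updating step, here is a minimal, self-contained sketch of a standard Kalman filter measurement update applied to a toy two-state cloud water vector; the state layout, matrices, and numbers are invented for illustration and are not LMODEL's actual formulation.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman filter measurement update.

    x : prior state estimate (e.g., convective and stratiform water states)
    P : prior state error covariance
    z : observation vector (e.g., collocated microwave rain rates)
    H : observation operator mapping states to observations
    R : observation error covariance
    """
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_post = x + K @ y                   # updated state estimate
    P_post = (np.eye(len(x)) - K @ H) @ P
    return x_post, P_post

# Illustrative two-state example; all values are invented.
x = np.array([1.2, 0.4])                 # prior states (mm)
P = np.diag([0.5, 0.2])
H = np.array([[1.0, 1.0]])               # observation sees the total rain rate
R = np.array([[0.3]])
z = np.array([2.0])                      # microwave-derived rain rate (mm/h)
print(kalman_update(x, P, z, H, R))
```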

    Designing Network Protocols for Good Equilibria

    Designing and deploying a network protocol determines the rules by which end users interact with each other and with the network. We consider the problem of designing a protocol to optimize the equilibrium behavior of a network with selfish users. We study network cost-sharing games, where the set of Nash equilibria depends fundamentally on the choice of an edge cost-sharing protocol. Previous research has focused on the Shapley protocol, in which the cost of each edge is shared equally among its users. We systematically study the design of optimal cost-sharing protocols for undirected and directed graphs, single-sink and multicommodity networks, and different measures of the inefficiency of equilibria. Our primary technical tool is a precise characterization of the cost-sharing protocols that induce only network games with pure-strategy Nash equilibria. We use this characterization to prove, among other results, that the Shapley protocol is optimal in directed graphs and that simple priority protocols are essentially optimal in undirected graphs.
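
    To illustrate what equal-split (Shapley) cost sharing means and how it shapes equilibria, below is a minimal sketch of a two-player network cost-sharing game; the topology and edge costs are invented for illustration and are not instances from the paper.

```python
import itertools

# Two players each pick one of two paths to the sink; the "shared" edge costs 1.5
# and its cost is split equally among its users, the "private" edges cost 1.0 each.
PATHS = {"shared": 1.5, "private": 1.0}

def player_cost(choice, other_choice):
    """Cost paid by a player under equal-split (Shapley) cost shares."""
    if choice == "private":
        return PATHS["private"]
    users = 1 + (other_choice == "shared")
    return PATHS["shared"] / users

def is_pure_nash(profile):
    """True if no player can lower their own cost by switching paths."""
    for i in range(2):
        other = profile[1 - i]
        current = player_cost(profile[i], other)
        best = min(player_cost(alt, other) for alt in PATHS)
        if current > best + 1e-12:
            return False
    return True

for profile in itertools.product(PATHS, repeat=2):
    total = sum(player_cost(profile[i], profile[1 - i]) for i in range(2))
    print(profile, "total cost:", total, "pure NE:", is_pure_nash(profile))
```

    In this toy instance both the all-shared and all-private profiles are pure Nash equilibria, with total costs 1.5 and 2.0 respectively; the gap between the best and worst equilibrium is the kind of inefficiency the paper's measures quantify.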

    The first automated negotiating agents competition (ANAC 2010)

    Motivated by the challenges of bilateral negotiations between people and automated agents, we organized the first automated negotiating agents competition (ANAC 2010). The purpose of the competition is to facilitate research in the area of bilateral multi-issue closed negotiation. The competition was based on the Genius environment, which is a General Environment for Negotiation with Intelligent multi-purpose Usage Simulation. The first competition was held in conjunction with the Ninth International Conference on Autonomous Agents and Multiagent Systems (AAMAS-10) and comprised seven teams. This paper presents an overview of the competition, as well as the general and contrasting approaches towards negotiation strategies adopted by the participants. Based on analysis of post-tournament experiments, the paper also attempts to provide some insights into effective approaches to the design of negotiation strategies.
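
    As a hypothetical illustration of one common family of strategies in bilateral, closed, multi-issue negotiation, here is a minimal sketch of a time-dependent (Boulware-style) concession strategy; the function names and parameter values are invented and do not describe any particular ANAC 2010 participant.

```python
def target_utility(t, u_min=0.6, u_max=1.0, beta=0.2):
    """Time-dependent concession curve (Boulware-style for beta < 1).

    t is normalized negotiation time in [0, 1]; the agent concedes from
    u_max toward u_min as the deadline approaches. Values are illustrative.
    """
    return u_min + (u_max - u_min) * (1.0 - t ** (1.0 / beta))

def choose_bid(own_bids_by_utility, t):
    """Offer the lowest-utility own bid that still meets the current target.

    own_bids_by_utility: list of (utility, bid) pairs sorted by descending utility.
    """
    target = target_utility(t)
    for utility, bid in reversed(own_bids_by_utility):
        if utility >= target:
            return bid
    return own_bids_by_utility[0][1]  # nothing meets the target: offer the best bid

def should_accept(opponent_offer_utility, t):
    """Accept an incoming offer when it meets the current target utility."""
    return opponent_offer_utility >= target_utility(t)

# Example: near the deadline (t = 0.95) the agent has conceded only slightly.
bids = [(0.95, "bid A"), (0.8, "bid B"), (0.65, "bid C")]
print(target_utility(0.95), choose_bid(bids, 0.95))
```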

    Path deviations outperform approximate stability in heterogeneous congestion games

    We consider non-atomic network congestion games with heterogeneous players, where the latencies of the paths are subject to some bounded deviations. This model encompasses several well-studied extensions of the classical Wardrop model which incorporate, for example, risk-aversion, altruism or travel time delays. Our main goal is to analyze the worst-case deterioration in social cost of a perturbed Nash flow (i.e., for the perturbed latencies) with respect to an original Nash flow. We show that for homogeneous players perturbed Nash flows coincide with approximate Nash flows and derive tight bounds on their inefficiency. In contrast, we show that for heterogeneous populations this equivalence does not hold. We derive tight bounds on the inefficiency of both perturbed and approximate Nash flows for arbitrary player sensitivity distributions. Intuitively, our results suggest that the negative impact of path deviations (e.g., caused by risk-averse behavior or latency perturbations) is less severe than approximate stability (e.g., caused by limited responsiveness or bounded rationality). We also obtain a tight bound on the inefficiency of perturbed Nash flows for matroid congestion games and homogeneous populations if the path deviations can be decomposed into edge deviations. In particular, this provides a tight bound on the Price of Risk-Aversion for matroid congestion games.
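
    To make the notion of a perturbed Nash flow concrete, here is a minimal sketch on a toy Pigou-style instance (two parallel links, unit demand); the latencies and the deviation value are invented and not taken from the paper, and in this particular instance the deviation happens to lower social cost, whereas the paper bounds the worst-case deterioration.

```python
# Toy Pigou-style instance: unit demand over two parallel links with original
# latencies l1(x) = x and l2(x) = 1. All values are invented for illustration.

def social_cost(x1):
    """Total travel time under the ORIGINAL latencies, with x1 flow on link 1."""
    x2 = 1.0 - x1
    return x1 * x1 + x2 * 1.0

def nash_flow(delta1=0.0, delta2=0.0):
    """Wardrop equilibrium flow on link 1 for perturbed latencies
    (x + delta1) and (1 + delta2): equalize them, then clamp to [0, 1]."""
    return min(1.0, max(0.0, 1.0 + delta2 - delta1))

x_orig = nash_flow()             # original Nash flow: all traffic on link 1
x_pert = nash_flow(delta1=0.25)  # e.g. risk-averse users inflate link 1's latency

print("original Nash flow cost :", social_cost(x_orig))   # 1.0
print("perturbed Nash flow cost:", social_cost(x_pert))   # 0.8125
```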

    Extension of Public Warning System

    This disclosure describes techniques to relay public warning system (PWS) messages from a host device to user devices that cannot receive broadcast messages from a cellular network. A PWS message broadcast by a cellular network is received by the host device and relayed to user devices within a personal area network (PAN). Fields of the incoming PWS message are compared against previously received PWS messages to avoid duplication of messages. Relay of the PWS message by the router enables reception of the message by user devices in the PAN that are not connected to the cellular network. User-configurable settings allow users to enable or disable reception of PWS messages via the PAN.
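
    A minimal sketch of the relay-with-deduplication idea follows; the field names (message identifier, serial number) mirror standard cell broadcast warning fields, but the class and callback names are hypothetical and the code is not the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PwsMessage:
    # Field names are illustrative; real PWS broadcasts carry comparable identifiers.
    message_identifier: int   # warning type (e.g., earthquake, extreme alert)
    serial_number: int        # distinguishes re-broadcasts of the same alert
    body: str

class PwsRelay:
    """Relays PWS messages from the cellular modem to PAN devices, skipping duplicates."""

    def __init__(self, pan_send, relay_enabled=True):
        self._seen = set()                   # (message_identifier, serial_number) pairs
        self._pan_send = pan_send            # callable that broadcasts to PAN devices
        self.relay_enabled = relay_enabled   # user-configurable setting

    def on_pws_broadcast(self, msg: PwsMessage):
        if not self.relay_enabled:
            return
        key = (msg.message_identifier, msg.serial_number)
        if key in self._seen:
            return                           # duplicate of an already-relayed warning
        self._seen.add(key)
        self._pan_send(msg)

# Usage: relay = PwsRelay(pan_send=lambda m: print("PAN broadcast:", m.body))
```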

    DYNAMIC TRANSMISSION CONTROL FOR IMPROVING MOBILE DATA USAGE

    This paper describes techniques for dynamically calculating the values of one or more Transmission Control Protocol (TCP) memory buffer size variables (e.g., the "tcp_mem," "tcp_rmem," and/or "tcp_wmem" variables) based on real-time network conditions, to achieve improved data usage on a mobile computing device. By dynamically determining and/or adjusting the values of the TCP memory buffer size variables, the described techniques enable a mobile computing device (e.g., a mobile phone, tablet computer, wearable, and/or headset device) to avoid sending more in-flight packets than the network can carry, thereby reducing packet loss and the need for data retransmission from the mobile computing device. In some cases, the described techniques introduce and utilize a machine-learning model to predict suitable values for the dynamically determined TCP memory buffer size variables. The machine-learning model accepts a number of different features as inputs in order to produce a predicted output value of a memory buffer size variable. These features may include, for example, a specified time frame, real-time network allocated bandwidth, a geographic region (e.g., cell tower identifier or Global Positioning System (GPS) location), and/or a packet loss rate, to name only a few examples.
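
    The sketch below shows one way such dynamic sizing could be wired together; the bandwidth-delay-product fallback heuristic and all function names are assumptions for illustration, with a stand-in hook where the disclosure's machine-learning model (fed the listed features) would plug in.

```python
def predict_buffer_bytes(bandwidth_bps, rtt_s, loss_rate, model=None, features=None):
    """Estimate a TCP buffer size (bytes) for the "max" field of tcp_rmem/tcp_wmem.

    If a trained model is supplied, it predicts the size from features such as
    time frame, allocated bandwidth, region, and loss rate; otherwise a
    bandwidth-delay-product heuristic, shrunk when loss is high, is used as a
    stand-in. The heuristic is an assumption, not the disclosed algorithm.
    """
    if model is not None and features is not None:
        return int(model.predict([features])[0])
    bdp = bandwidth_bps / 8.0 * rtt_s                 # bandwidth-delay product in bytes
    return int(bdp * max(0.25, 1.0 - 10.0 * loss_rate))

def buffer_triple(max_bytes, page=4096):
    """Format (min, default, max) the way tcp_rmem/tcp_wmem expect."""
    return (page, min(87380, max_bytes), max_bytes)

# Example: 20 Mbit/s allocated bandwidth, 60 ms RTT, 1% packet loss.
print(buffer_triple(predict_buffer_bytes(20e6, 0.060, 0.01)))
```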

    Preferences for cancer investigation: a vignette-based study of primary-care attendees

    Background: The UK lags behind many European countries in terms of cancer survival. Initiatives to address this disparity have focused on barriers to presentation, symptom recognition, and referral for specialist investigation. Selection of patients for further investigation has come under particular scrutiny, although preferences for referral thresholds in the UK population have not been studied. We investigated preferences for diagnostic testing for colorectal, lung, and pancreatic cancers in primary-care attendees.
    Methods: In a vignette-based study, researchers recruited individuals aged at least 40 years attending 26 general practices in three areas of England between Dec 6, 2011, and Aug 1, 2012. Participants completed up to three of 12 vignettes (four for each of lung, pancreatic, and colorectal cancers), which were randomly assigned. The vignettes outlined a set of symptoms, the risk that these symptoms might indicate cancer (1%, 2%, 5%, or 10%), the relevant testing process, probable treatment, possible alternative diagnoses, and prognosis if cancer were identified. Participants were asked whether they would opt for diagnostic testing on the basis of the information in the vignette.
    Findings: 3469 participants completed 6930 vignettes. 3052 individuals (88%) opted for investigation in their first vignette. We recorded no strong evidence that participants were more likely to opt for investigation with a 1% increase in risk of cancer (odds ratio [OR] 1·02, 95% CI 0·99–1·06; p=0·189), although the association between risk and opting for investigation was strong when colorectal cancer was analysed alone (1·08, 1·03–1·13; p=0·0001). In multivariable analysis, age had an effect in all three cancer models: participants aged 60–69 years were significantly more likely to opt for investigation than were those aged 40–59 years, and those aged 70 years or older were less likely. Other variables associated with increased likelihood of opting for investigation were shorter travel times to testing centre (colorectal and lung cancers), a family history of cancer (colorectal and lung cancers), and higher household income (colorectal and pancreatic cancers).
    Interpretation: Participants in our sample expressed a clear preference for diagnostic testing at all risk levels, and individuals want to be tested at risk levels well below those stipulated by UK guidelines. This willingness should be considered during the design of cancer pathways, particularly in primary care. The public engagement with our study should encourage general practitioners to involve patients in referral decision making.
    Funding: The National Institute for Health Research Programme Grants for Applied Research programme.

    I'll Be Back: On the Multiple Lives of Users of a Mobile Activity Tracking Application

    Mobile health applications that track activities, such as exercise, sleep, and diet, are becoming widely used. While these activity tracking applications have the potential to improve our health, user engagement and retention are critical factors for their success. However, long-term user engagement patterns in real-world activity tracking applications are not yet well understood. Here we study user engagement patterns within a mobile physical activity tracking application consisting of 115 million logged activities taken by over a million users over 31 months. Specifically, we show that over 75% of users return and re-engage with the application after prolonged periods of inactivity, regardless of the duration of the inactivity. We find a surprising result that the re-engagement usage patterns resemble those of the start of the initial engagement period, rather than being a simple continuation of its end. This evidence points to a conceptual model of multiple lives of user engagement, extending the prevalent single-life view of user activity. We demonstrate that these multiple lives occur because users have a variety of different primary intents or goals for using the app. We find evidence that users are more likely to stop using the app once they have achieved their primary intent or goal (e.g., weight loss). However, these users might return once their original intent resurfaces (e.g., wanting to lose newly gained weight). Based on insights developed in this work, including a marker of improved primary intent performance, our prediction models achieve 71% ROC AUC. Overall, our research has implications for modeling user re-engagement in health activity tracking applications and has consequences for how notifications, recommendations, and gamification can be used to increase engagement.
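
    As an illustration of what segmenting a user's history into "lives" might look like, here is a minimal sketch that splits an activity log at long inactivity gaps; the 30-day threshold and the function name are assumptions, not the paper's definitions.

```python
from datetime import datetime, timedelta

def split_into_lives(timestamps, gap=timedelta(days=30)):
    """Segment one user's activity log into 'lives' separated by long inactivity.

    The 30-day gap threshold is an illustrative assumption.
    timestamps: iterable of datetime objects, in any order.
    """
    ts = sorted(timestamps)
    lives, current = [], []
    for t in ts:
        if current and t - current[-1] > gap:
            lives.append(current)          # the previous life ends at the long gap
            current = []
        current.append(t)
    if current:
        lives.append(current)
    return lives

# Example: two bursts of logging separated by ~3 months of inactivity -> two lives.
log = [datetime(2017, 1, d) for d in (1, 3, 7)] + [datetime(2017, 4, d) for d in (2, 5)]
print(len(split_into_lives(log)))          # 2
```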