
    Cost-Effective Incentive Allocation via Structured Counterfactual Inference

    We address a practical problem ubiquitous in modern marketing campaigns: a central agent must learn a policy for allocating strategic financial incentives to customers while observing only bandit feedback. In contrast to traditional policy optimization frameworks, we take into account the additional reward structure and budget constraints common in this setting, and we develop a new two-step method for solving the resulting constrained counterfactual policy optimization problem. Our method first casts reward estimation as a domain adaptation problem with supplementary structure, and then uses the resulting estimators to optimize the policy under constraints. We also establish theoretical error bounds for our estimation procedure and show empirically that the approach leads to significant improvements on both synthetic and real datasets.
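    The two-step structure can be sketched as follows: given per-customer reward (uplift) estimates from the first step, the second step allocates incentives under a total budget. The greedy knapsack below is a deliberately simplified stand-in for the paper's constrained optimization step, and all numbers are illustrative.

```python
def allocate_incentives(uplifts, costs, budget):
    """Greedy budget-constrained allocation: treat customers in
    decreasing order of estimated uplift per unit cost until the
    budget is exhausted. A toy stand-in for the constrained
    counterfactual policy optimization step."""
    order = sorted(range(len(uplifts)),
                   key=lambda i: uplifts[i] / costs[i], reverse=True)
    chosen, spent = [], 0.0
    for i in order:
        # Only incentivize customers with positive estimated uplift
        # and only while the budget allows it.
        if uplifts[i] > 0 and spent + costs[i] <= budget:
            chosen.append(i)
            spent += costs[i]
    return chosen, spent

# Hypothetical outputs of the first-step reward estimators:
# estimated incremental reward (uplift) and incentive cost per customer.
uplifts = [4.0, 1.0, 2.5, -0.5]
costs = [2.0, 1.0, 1.0, 1.0]
picked, spent = allocate_incentives(uplifts, costs, budget=3.0)
```

    In practice the second step would be a proper constrained program rather than a greedy heuristic, but the interface is the same: estimated rewards in, a feasible allocation out.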

    Incorporating The Home Address of Road Users Involved in Traffic Crashes in Road Safety Analysis

    Traditionally, road safety metrics are measured at the location of the crash and its surrounding area. For example, if a crash occurs at an intersection, depending on the scope of the study, researchers or practitioners may count crashes at the intersection level, the corridor level, or at a coarser geographic level such as the Traffic Analysis Zone (TAZ), city, or county. Attributing a crash to its location helps us learn about the relationship between road, environment, traffic, and weather conditions and road safety. Based on this practice, several countermeasures have been developed to prevent crashes or reduce their severity. As a result, a large body of the road safety literature has been devoted to road and geometry design and their effect on traffic crashes. In my dissertation, I set out to take a more epidemiological approach to road safety analysis, looking at factors such as the social geography and travel behavior surrounding the home addresses of the road users involved in traffic crashes – i.e., a Home-Based Approach (HBA). Knowing more about the role of road users' origins, and specifically the sociodemographic characteristics and travel behavior around them, could help us understand road safety from a different perspective, one that enables researchers and practitioners to target individuals with appropriate countermeasures and interventions intended to reduce crash risk or eliminate aberrant road-user behaviors. My dissertation consists of five chapters. I explored different applications of HBA methods in the economic cost of traffic crashes, seat belt use analysis, and the negative externalities of the tourism industry.

    A Neural Network Approach to Identify Hyperspectral Image Content

    Hyperspectral imaging is a technique that produces very high-dimensional data with hundreds of channels. Because Hyperspectral Images (HSIs) deliver such complete information about a scene, applying a classification algorithm is an important tool for practical use. HSIs typically contain a large number of correlated and redundant features, which decreases classification accuracy; moreover, this redundancy adds computational burden without contributing any useful information to the classification. In this study, an unsupervised Band Selection Algorithm (BSA) is considered together with a Linear Projection (LP) that depends on inter-band similarity metrics. Afterwards, the Monogenic Binary Feature (MBF) is used to perform texture analysis of the HSI, where three components represent the monogenic signal: phase, amplitude, and orientation. In the post-processing classification stage, a feature-mapping function can provide important information that helps adapt a Kernel-based Neural Network (KNN) to optimize generalization ability. An alternative multiclass formulation can also be adopted with the KNN by using multiple output nodes instead of a single output node.
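    The core idea of similarity-based band selection — keep a small subset of spectrally dissimilar bands and drop near-duplicates — can be sketched with a simple correlation-based greedy selector. The Pearson-correlation criterion and threshold here are illustrative stand-ins for the paper's linear-projection similarity metric.

```python
def correlation(a, b):
    """Pearson correlation between two flattened band vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def select_bands(bands, max_corr=0.95):
    """Greedily keep a band only if it is not too correlated with
    any band already selected -- a toy stand-in for unsupervised,
    similarity-based band selection."""
    selected = []
    for i, band in enumerate(bands):
        if all(abs(correlation(band, bands[j])) < max_corr for j in selected):
            selected.append(i)
    return selected

# Toy "image": each band is a flattened list of pixel intensities.
bands = [
    [1.0, 2.0, 3.0, 4.0],   # band 0
    [1.1, 2.1, 3.0, 4.2],   # nearly identical to band 0 -> dropped
    [4.0, 1.0, 3.0, 2.0],   # spectrally different -> kept
]
kept = select_bands(bands)
```

    Real band selection would operate on hundreds of bands and a more discriminating similarity measure, but the redundancy-pruning logic is the same.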

    An integrated model for warehouse and inventory planning

    We propose a tactical model which integrates the replenishment decision in inventory management, the allocation of products to warehousing systems, and the assignment of products to storage locations in warehouse management. The purpose of this article is to analyse the value of integrating warehouse and inventory decisions. This is achieved by proposing two methods for solving this tactical integrated model, which differ in the level of integration of the inventory and warehousing decisions. A computational analysis is performed on a real-world database, using multiple scenarios that differ in their warehouse capacity limits. Our observation is that the total cost of the inventory and warehousing systems can be reduced drastically by taking the warehouse capacity restrictions into account in the inventory planning decisions, in an aggregate way. Moreover, additional inventory and warehouse savings can be achieved by using more sophisticated integration methods for inventory and warehousing decisions.

    RECURSIVE MULTI-MODEL UPDATING OF BUILDING STRUCTURE: A NEW SENSITIVITY BASED FINITE ELEMENT APPROACH

    An invaluable tool for structural health monitoring and damage detection, parametric system identification through model updating is an inverse problem, affected by several kinds of modelling assumptions and measurement errors. By minimizing the discrepancy between the measured data and the simulated response, traditional model-updating techniques identify one single optimal model that behaves similarly to the real structure. Due to several sources of error, this mathematical optimum may be far from the true solution and lead to misleading conclusions about the structural state. Rather than the mere location of the global minimum, the generation of several alternatives should therefore be preferred: models capable of expressing near-optimal solutions while being as different as possible from each other in physical terms. The present paper accomplishes this goal through a new recursive, direct-search, multi-model updating technique, in which multiple models are first created and separately solved for their respective minima, and a selection of quasi-optimal alternatives is then retained and classified through data mining and clustering algorithms. The main novelty of the approach consists in the recursive strategy adopted for minimizing the objective function, where convergence towards optimality is sped up by sequentially changing only selected subsets of parameters, depending on their respective influence on the error function. This approach consists of two steps. First, a sensitivity analysis is performed: each input parameter is allowed to vary within a small interval of fractional variation around its nominal value in order to compute the partial derivatives numerically. Then, for each parameter, the sensitivities to all the responses are summed and used as an indicator of that parameter's overall sensitivity. According to these sensitivity indicators, the parameters are divided into a user-specified number of subsets, and each subset is then updated recursively, in an order determined by the sensitivity indicator.
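    The two-step loop described above — rank parameters by finite-difference sensitivity, partition them into subsets, then update subset by subset — can be sketched as follows. The objective function, step rule, and subset count are illustrative placeholders, not the paper's actual formulation.

```python
def sensitivity(f, params, i, h=1e-4):
    """Finite-difference sensitivity of the error function f to
    parameter i, around the current nominal values."""
    bumped = list(params)
    bumped[i] += h
    return abs(f(bumped) - f(params)) / h

def recursive_subset_update(f, params, n_subsets=2, sweeps=20, step=0.1):
    """Rank parameters by sensitivity, split them into subsets, and
    repeatedly update each subset in turn by simple coordinate descent.
    A toy stand-in for the recursive multi-model updating strategy."""
    params = list(params)
    # Step 1: sensitivity analysis and ranking (most sensitive first).
    order = sorted(range(len(params)),
                   key=lambda i: sensitivity(f, params, i), reverse=True)
    size = max(1, len(order) // n_subsets)
    subsets = [order[k:k + size] for k in range(0, len(order), size)]
    # Step 2: update each subset recursively, in sensitivity order.
    for _ in range(sweeps):
        for subset in subsets:
            for i in subset:
                for delta in (step, -step):
                    trial = list(params)
                    trial[i] += delta
                    if f(trial) < f(params):
                        params = trial
                        break
    return params

# Toy error function with its minimum at (1, 2): stands in for the
# discrepancy between measured and simulated structural responses.
f = lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2
updated = recursive_subset_update(f, [0.0, 0.0])
```

    Updating the most sensitive subset first is what speeds up convergence: the parameters with the largest influence on the error function absorb most of the discrepancy before the less influential ones are touched.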