    Optimization under Uncertainty for E-retail Distribution: From Suppliers to the Last Mile

    This thesis examines problems faced in the distribution management of e-retailers at different stages of the supply chain, while accounting for sources of uncertainty. The first problem studies distribution planning, under stochastic customer demand, in a transshipment network. To decide on a transportation schedule that minimizes transportation, inventory, and outsourcing costs, the problem is formulated as a two-stage stochastic programming model with recourse. Computational experiments demonstrate the cost-effectiveness of distribution plans generated while considering uncertainty, and provide insights on the conditions under which the proposed model achieves significant cost savings. We then focus our attention on a later phase in the supply chain: last-mile same-day delivery. We specifically study crowdsourced delivery, a new delivery system in which freelance drivers deliver packages to customers using their own cars. We provide a comprehensive review of this system in terms of academic literature and industry practice. We present a classification of industry platforms based on their matching mechanisms, target markets, and compensation schemes. We also identify the new challenges that this delivery system brings about and highlight open research questions. We then investigate two important research questions faced by crowdsourced delivery platforms. The second problem in this thesis examines the question of balancing driver capacity and demand in crowdsourced delivery systems when there is randomness in both supply and demand. We propose models for, and test, the use of heatmaps as a balancing tool for directing drivers to regions with a driver shortage, with an increased likelihood, but not a guarantee, of a revenue-producing order match. We develop a Markov decision process (MDP) model to sequentially select matching and heatmap decisions that maximize demand fulfillment. The model is solved using a stochastic look-ahead policy based on approximate dynamic programming.
Computational experiments on a real-world dataset demonstrate the value of heatmaps and the factors that affect their effectiveness in improving demand fulfillment. The third problem studies the integration of driver welfare considerations within a platform's dynamic matching decisions. This addresses a common criticism of the sharing economy, the lack of protection for workers, by proposing compensation guarantees for drivers while maintaining the work-hour flexibility of the sharing economy. We propose and model three types of compensation guarantees, which are either utilization-based or wage-based. We formulate an MDP model, then utilize value function approximation to solve the problem efficiently. Computational experiments are presented to assess the proposed solution approach and to evaluate the impact of the different types of guarantees on both the platform and the drivers.
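The two-stage structure described above can be sketched in miniature: a first-stage capacity is reserved before demand is known, and a second-stage recourse outsources any shortfall at a premium once a demand scenario is realized. The costs and scenarios below are invented for illustration and are not taken from the thesis.

```python
# Toy two-stage stochastic program with recourse (illustrative only):
# stage 1 reserves transport capacity x at unit cost c; stage 2 outsources
# any shortfall max(d - x, 0) at unit cost q under demand scenarios d.
c, q = 1.0, 3.0
scenarios = [(5, 1/3), (10, 1/3), (15, 1/3)]  # (demand, probability)

def expected_cost(x):
    recourse = sum(p * q * max(d - x, 0) for d, p in scenarios)
    return c * x + recourse

# Solve by enumeration over candidate reservation levels.
best_x = min(range(0, 21), key=expected_cost)
print(best_x, expected_cost(best_x))
```

With a finite scenario set the expectation is a plain sum, so the value of planning under uncertainty can be checked directly: reserving for the mean scenario alone would ignore the asymmetric outsourcing premium.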

    Distribution Planning with Consolidation - A Two-Stage Stochastic Programming Approach

    The distribution planning problem with consolidation center(s) addresses the coordination of distribution activities between a set of suppliers and a set of customers through intermediate facilities, in order to achieve savings in transportation cost. We study the problem from the perspective of a third-party logistics provider (3PL) that coordinates shipments between suppliers and customers. Given customer demand for products from different suppliers, the goal is to consolidate the shipments into fewer, higher-volume loads, from suppliers to the consolidation center(s) and from the consolidation center(s) to customers. We assume that suppliers have a finite set of transportation options, each with a given capacity and time of arrival at the consolidation center(s). Similarly, customers have a set of transportation options, each with a given capacity and dispatch time from the consolidation center(s). The 3PL wants to determine the optimal transportation options, or shipment schedule, and the allocation of shipments to transportation options from suppliers to consolidation center(s), and from consolidation center(s) to customers, that minimize the total transportation cost and holding cost at the consolidation center. The literature studies many variations of this problem, all of which assume deterministic demand. This thesis extends the problem to stochastic demand and formulates it as a two-stage stochastic programming model. We model the case where the choice of transportation options is a contractual decision, and a 3PL needs to decide which options to reserve for a given planning period subject to stochastic customer demand. Therefore, the choices of transportation options are the first-stage variables in the two-stage stochastic program.
The second-stage variables, which are decisions made after the uncertain conditions become known, represent the allocation of orders to reserved transportation options as well as the shipping of orders through a spot-market carrier at a greater transportation cost. Because of the high computational demand of the model, the integer L-shaped method is applied to decompose the problem. To increase the efficiency of the algorithm, we experiment with three valid cuts with the goal of generating stronger cuts than the L-cut. We also apply three algorithm enhancement techniques to speed up the convergence of the algorithm. Numerical results show that the performance of our proposed methodology and valid cuts is comparable to that of CPLEX. We suggest promising areas for future work to further improve the computational efficiency of our decomposition algorithm.
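The L-shaped method alternates between a master problem over the first-stage decision and recourse subproblems evaluated per scenario, adding optimality cuts until the bounds meet. A toy sketch of that cutting-plane loop on invented data follows; the thesis solves an integer program with a far richer structure, so this only illustrates the decomposition idea.

```python
# Toy L-shaped (Benders) decomposition: min_x c*x + E[Q(x, d)], where the
# recourse Q(x, d) = q * max(d - x, 0) prices unmet demand at a spot-market
# rate. All data are invented for illustration.
c, q = 1.0, 3.0
scenarios = [(5, 1/3), (10, 1/3), (15, 1/3)]  # (demand, probability)
X = range(0, 21)  # candidate first-stage reservation levels

def recourse(x):
    """Expected recourse cost at x and a subgradient with respect to x."""
    val = sum(p * q * max(d - x, 0) for d, p in scenarios)
    sub = sum(-p * q for d, p in scenarios if d > x)
    return val, sub

cuts, x, ub, best_x = [], 0, float("inf"), 0
for _ in range(50):
    val, sub = recourse(x)                    # solve scenario subproblems
    if c * x + val < ub:                      # update incumbent
        ub, best_x = c * x + val, x
    cuts.append(lambda y, v=val, g=sub, x0=x: v + g * (y - x0))  # optimality cut

    def master_obj(y):                        # master: c*y + theta, theta >= cuts
        theta = max([cut(y) for cut in cuts] + [0.0])
        return c * y + theta

    x = min(X, key=master_obj)
    if ub - master_obj(x) < 1e-9:             # bounds have met: stop
        break
print(best_x, ub)
```

Each cut is a linear under-estimator of the convex expected recourse function, so the master value is a valid lower bound and the loop terminates once it matches the incumbent upper bound.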

    Modeling Time-Dependent Behavior of Concrete Affected by Alkali Silica Reaction in Variable Environmental Conditions

    Alkali Silica Reaction (ASR) is known to be a serious problem for concrete worldwide, especially in high-humidity and high-temperature regions. ASR is a slow process that develops over years to decades, and it is influenced by changes in the environmental and loading conditions of the structure. The problem becomes even more complicated if one recognizes that other phenomena, such as creep and shrinkage, are coupled with ASR. This results in synergistic mechanisms that cannot be easily understood without a comprehensive computational model. In this paper, the coupling between creep, shrinkage, and ASR is modeled within the Lattice Discrete Particle Model (LDPM) framework. To achieve this, a multi-physics formulation is used to compute the evolution of temperature, humidity, cement hydration, and ASR in both space and time, which is then used within physics-based formulations of cracking, creep, and shrinkage. The overall model is calibrated and validated on the basis of experimental data available in the literature. Results show that even during free expansion (zero macroscopic stress), a significant degree of coupling exists, because ASR-induced expansions are relaxed by meso-scale creep driven by self-equilibrated stresses at the meso-scale. This explains and highlights the importance of considering ASR and other time-dependent aging and deterioration phenomena at an appropriate length scale in coupled modeling approaches.
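The dependence of ASR evolution on temperature and humidity can be illustrated with a deliberately simple kinetic sketch: first-order reaction-extent kinetics with an Arrhenius temperature factor and a humidity threshold. The functional forms and every parameter value below are assumptions made for illustration, not the paper's formulation.

```python
import math

# Minimal sketch: ASR reaction extent xi in [0, 1] evolving with first-order
# kinetics whose rate depends on temperature (Arrhenius) and humidity.
# All parameter values are illustrative assumptions, not from the paper.
k_ref = 0.05       # reference rate at T_ref [1/day]
E_over_R = 5000.0  # activation energy / gas constant [K]
T_ref = 293.15     # reference temperature [K]

def rate(T, h):
    """Rate factor: Arrhenius in T, and a simple humidity threshold ramp."""
    arrhenius = math.exp(E_over_R * (1.0 / T_ref - 1.0 / T))
    moisture = max(0.0, (h - 0.6) / 0.4)  # negligible reaction below ~60% RH
    return k_ref * arrhenius * moisture

def evolve(T, h, days, dt=1.0):
    """Explicit Euler integration of d(xi)/dt = rate(T, h) * (1 - xi)."""
    xi = 0.0
    for _ in range(int(days / dt)):
        xi += dt * rate(T, h) * (1.0 - xi)
    return xi

hot_humid = evolve(T=308.15, h=0.95, days=365)
cool_dry = evolve(T=283.15, h=0.55, days=365)
print(hot_humid, cool_dry)
```

The sketch reproduces the qualitative observation in the abstract that ASR advances quickly in hot, humid conditions and essentially stalls in cool, dry ones; the paper's coupled LDPM model resolves this in space and time rather than as a single scalar.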

    Detection of oxLDL levels in platelets and its relation to CXCR4 and CXCR7 receptors surface expression in patients with coronary artery disease

    Coronary artery disease (CAD) is the most common cause of death in industrialized nations worldwide. In this study, we examined platelet oxLDL, CXCR4, and CXCR7 in 152 patients with symptomatic CAD and 15 healthy controls. Platelet oxLDL surface expression was investigated in healthy subjects, in patients with stable CAD, and in patients with acute coronary syndrome (ACS). Platelet oxLDL levels were elevated in patients with CAD compared to healthy controls. The study showed a significant positive correlation between platelet oxLDL surface expression and that of CXCR7, and we further found an inverse correlation with CXCR4. From this we infer a potential influence of the CXCL12/CXCR4/CXCR7 axis on the regulation of platelet lipid uptake and on the thromboembolic potential of platelets. This is also supported by earlier study results concerning the relevance of this axis in CAD patients and its relation to the plasma lipidome. Furthermore, associations between platelet oxLDL levels and plasma lipids were examined: on the one hand, a positive correlation was found between oxLDL and HDL as well as triglyceride levels; on the other hand, we showed an inverse correlation between oxLDL and plasma LDL concentrations. Further studies are needed to better understand the mechanisms governing this process. At present, the influence of intraplatelet oxidized lipid metabolites on the severity of CAD is unclear, and the clinical relevance of platelet oxLDL requires further investigation. The results of this study suggest a pro-thrombotic and pro-oxidative role of platelet chemokines under hyperlipidemic metabolic conditions. The CXCL12/CXCR4/CXCR7 axis could serve as a potential therapeutic target for modulating platelet lipids, with potentially beneficial effects on the progression of CAD.

    Modeling of aging effects on concrete creep/shrinkage behavior: a lattice discrete particle modeling approach

    The currently aging and deteriorating infrastructure, both in the US and around the world, has been a major impetus for extending the current design provisions for concrete structures to a 100-year design lifetime. Over such a long period, concrete exhibits a well-known time-dependent behavior that is a function of multiple factors, including the rheological aspects of the concrete mix as well as the environmental conditions that contribute to its time-dependent aging. While initial conditions (e.g. concrete mix design parameters) can be well controlled, much less knowledge is available on the type and extent of the environmental conditions that will affect the structure.

    Shame, Identity, and Socio-Emotional Behavioral Regulation

    This research aimed to study the effects of ‘anti-sexist’ shaming on men and how they would, as a result, regulate their behavior in future cross-sex interactions. The research hypothesized that each of the behaviors listed in the literature on shame (hostility, pro-sociality, and passing) is linked to different sexist attitudes, as per Glick and Fiske’s (1996) Ambivalent Sexism Theory. More specifically, it was hypothesized that hostile sexists would be likely to select aggressive responses in the questionnaire, benevolent sexists would be likely to employ passing, and non-sexists would be the most likely to engage in prosocial behavior. A double-blind, post-test-only control-group experimental design in the form of a survey experiment was employed. The questionnaire was administered to a randomly recruited all-male sample on The American University in Cairo’s (AUC) New Cairo campus. Independent t-tests and reliability, correlational, and linear regression analyses were performed on the collected data. The hypotheses could not be accepted within the context of the data collected. This research nevertheless managed, first, to introduce a causal link between passing and shame in the literature as a viable response to shame, and second, to bolster Gausel et al.’s (2012) conceptualization and differentiation of shame from felt inferiority. Adjusting the methodological flaws of this study and expanding it might prove to bridge the gaps and mend some of the contradictions in the psychological literature on shame.
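The independent t-tests mentioned compare group means without assuming equal variances when run in Welch's form. A minimal pure-Python sketch on invented data (not the study's dataset) shows the statistic being computed:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two independent samples."""
    ma, mb = mean(a), mean(b)
    va, vb = variance(a) / len(a), variance(b) / len(b)  # squared standard errors
    t = (ma - mb) / (va + vb) ** 0.5
    df = (va + vb) ** 2 / (va**2 / (len(a) - 1) + vb**2 / (len(b) - 1))
    return t, df

t, df = welch_t([1, 2, 3, 4, 5], [2, 3, 4, 5, 6])
print(t, df)
```

The t statistic would then be referred to a t-distribution with the computed degrees of freedom to obtain a p-value, which is what statistical packages report directly.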

    Towards a conceptual framework to manage BIM/COBie asset data using a standard project management methodology

    Purpose: The purpose of this paper is to investigate a systematic methodology to manage asset data flow between building stakeholders throughout the building life cycle using the Construction Operations Building Information Exchange (COBie) standard. / Design/methodology/approach: A literature review of the relevant building information modelling (BIM) for facilities management (FM) studies, including the gaps and challenges of producing COBie data, is analysed. Then a standard project management methodology by the Project Management Institute (PMI) is introduced as a theoretical framework to map the different areas of managing COBie data as a project, in coordination with the Royal Institute of British Architects (RIBA) Plan of Work. This theoretical background is coupled with an inductive approach through a placement within a construction company (Bouygues, UK) on the UCLH construction project, to produce a conceptual framework that is aligned with industry needs. / Findings: The lack of a well-structured approach to managing COBie data throughout the building life cycle causes many problems and much confusion about the roles and responsibilities of different stakeholders in creating and managing asset data. This confusion in turn results in incomplete and low-quality COBie data at the handover phase, which hinders the ability of facility managers to use these data effectively in the operations phase. The proposed conceptual framework provides a standard project management process to systemise the data flow among all stakeholders. / Practical implications: The proposed framework is developed in liaison with a large construction company, so it is well aligned with an actual industry approach to managing COBie data. Furthermore, it provides a systematic step-by-step approach to managing COBie as a project that could be easily implemented in actual construction projects.
/ Originality/value: The paper introduces a novel approach to managing COBie data using a standard project management methodology, based on the perspective of an actual live construction project coupled with project management theory.
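The handover-quality problem the framework targets can be sketched as a completeness check over COBie rows. The required fields below are a simplified assumption for illustration; the real COBie schema defines many more columns across multiple worksheets.

```python
# Simplified completeness check for COBie Component rows (illustrative subset
# of fields only; not the full COBie schema).
REQUIRED = ("Name", "CreatedBy", "TypeName", "Space", "SerialNumber")

def missing_fields(component):
    """Return which required fields are absent or blank in one component row."""
    return [f for f in REQUIRED if not str(component.get(f, "")).strip()]

rows = [
    {"Name": "AHU-01", "CreatedBy": "a@example.com", "TypeName": "AHU",
     "Space": "L1-Plant", "SerialNumber": "SN-1001"},
    {"Name": "FCU-07", "CreatedBy": "a@example.com", "TypeName": "FCU",
     "Space": ""},  # incomplete handover data
]
report = {r["Name"]: missing_fields(r) for r in rows}
print(report)
```

Running such checks at each RIBA work stage, rather than only at handover, is the kind of systemised data flow the proposed framework argues for.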

    Automated Identification and Localization of Brain Tumor in MRI Using U-Net Segmentation and CNN-LSTM Classification

    Nowadays, the automated evaluation of medical images by computers is a critical part of clinical practice. Today's treatment methods rely heavily on early diagnosis and accurate disease identification, which were formerly difficult for medical research to achieve. Brain Magnetic Resonance Imaging (MRI) is essential to the detection and treatment of brain tumors (BT). Brain tumors are the result of brain cell division that has gone awry or is otherwise out of control. Manual MRI segmentation of BT is a difficult and time-consuming process, and the most critical factor in the effective treatment and identification of BT is the ability to accurately locate the tumor. The detection of BT is regarded as a difficult task in medical image processing. For analysing and interpreting MRI, there are semi-automatic and fully automated systems that require large-scale professional input and evaluation, with varying degrees of effectiveness. This paper proposes automated identification and localization of tumors from brain MRI. To achieve this goal, data collected from Kaggle are preprocessed. Then U-Net is employed to segment the tumor region from the MRI. Next, the MRI is classified using deep learning models: a Convolutional Neural Network (CNN) and a hybrid Convolutional Neural Network and Long Short-Term Memory network (CNN-LSTM). Both the segmentation and the classification are evaluated using standard metrics. From the evaluation, it is identified that the CNN-LSTM outperforms the CNN model.
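A standard overlap metric for evaluating segmentations like the U-Net output is the Dice coefficient. A minimal sketch on toy binary masks follows; the paper does not specify its exact metric set, so this is an illustrative assumption.

```python
def dice(pred, truth):
    """Dice coefficient between two flat binary masks (lists of 0/1)."""
    inter = sum(p * t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 2.0 * inter / total if total else 1.0

# Toy 2x2 masks flattened row-major: predicted vs ground-truth tumor pixels.
pred = [1, 1, 0, 0]
truth = [1, 0, 1, 0]
print(dice(pred, truth))  # → 0.5
```

Dice weights true positives twice relative to the mask sizes, which makes it less sensitive than plain pixel accuracy to the large tumor-free background typical of brain MRI.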