
    Pricing Decisions and Member Participation Conditions in the Platform Supply Chain with Seasonal Demand: A Game Theory-Based Analysis

    In a platform supply chain containing a sharing platform, a retailer, and two upstream factories, we consider the sales of two substitutable products and the constraints on the participation of non-platform members in the game. We construct a dynamic game model of the platform supply chain with demand fluctuations caused by seasonal changes, use backward induction to derive the thresholds for the participation of the retailer and the factories as well as the optimal price for each member, and analyze the pricing decisions and the participation game among members. Our findings are as follows. (1) The threshold for the retailer to participate in the game during the peak season is greater than in the off-season, while the factories' threshold shows the opposite pattern. (2) The platform should reduce commissions during the peak season, the factories should reduce wholesale prices, and the retailer should increase the sales price. (3) Increases in the price sensitivity coefficient, inventory cost, and production cost are all detrimental to the platform, whereas an increase in the cross-price sensitivity coefficient is usually beneficial. (4) In both the peak season and the off-season, an increase in demand fluctuations is bad for all chain members.
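
    A minimal numerical sketch of the backward-induction logic described above, assuming a linear demand form q_i = a - b*p_i + d*p_j for the two substitutable products; all parameter values, the grid search, and the profit expressions are illustrative assumptions rather than the paper's model.

        import numpy as np

        # Assumed parameters: base demand, own-price sensitivity, cross-price sensitivity,
        # production cost, and the platform's commission rate.
        a, b, d = 100.0, 2.0, 0.5
        c, commission = 10.0, 0.10

        def retailer_best_prices(w1, w2):
            """Inner stage: the retailer picks retail prices given the wholesale prices."""
            p1, p2 = np.meshgrid(np.linspace(0, 100, 201), np.linspace(0, 100, 201), indexing="ij")
            q1 = np.clip(a - b * p1 + d * p2, 0, None)
            q2 = np.clip(a - b * p2 + d * p1, 0, None)
            profit = (1 - commission) * (p1 * q1 + p2 * q2) - w1 * q1 - w2 * q2
            i, j = np.unravel_index(np.argmax(profit), profit.shape)
            return (p1[i, j], p2[i, j]), profit[i, j]

        def factory1_profit(w1, w2):
            """Outer stage: factory 1 anticipates the retailer's response (backward induction)."""
            (p1, p2), _ = retailer_best_prices(w1, w2)
            return (w1 - c) * max(a - b * p1 + d * p2, 0.0)

        # Best-response iteration between the two symmetric factories.
        wholesale_grid = np.linspace(c, 60.0, 51)
        w1 = w2 = c + 5.0
        for _ in range(20):
            w1 = max(wholesale_grid, key=lambda w: factory1_profit(w, w2))
            w2 = max(wholesale_grid, key=lambda w: factory1_profit(w, w1))  # symmetric problem
        print("wholesale prices:", (w1, w2), "retail prices:", retailer_best_prices(w1, w2)[0])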

    The effects of design thinking on students’ career self-efficacy in career guidance courses

    The present study focuses on integrating design thinking into career guidance courses to test whether students’ career self-efficacy is increased, by comparing an experimental group (taught with the design thinking method) and a comparison group (taught with a traditional teacher-centered method). The basic theoretical framework is Bandura’s self-efficacy theory (Bandura, 1977). Students achieve career self-efficacy after repeatedly experiencing success (Bandura, 1977) in career activities through the design thinking method, and then have more confidence to make more appropriate career choices in their employment environment. This study used AMOS and path analysis to analyze a just-identified model. The model included five endogenous variables as well as six exogenous variables to control for age, sex, and GPA. The data met all statistical assumptions of path modeling. In sum, all five paths between design thinking and the five endogenous variables were significantly positive (p < .001), which indicates that using the design thinking method to teach career courses can improve students’ goal selection, problem solving, occupational information, planning, and self-appraisal scores.
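
    As an illustration of the path-analytic setup, the sketch below fits one structural equation per endogenous subscale by least squares on simulated data; the variable names, group coding, and data-generating process are assumptions for illustration, not the study's AMOS model or data.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 300
        design_thinking = rng.integers(0, 2, n).astype(float)  # 1 = design thinking group, 0 = comparison group
        age = rng.normal(20, 1.5, n)
        sex = rng.integers(0, 2, n).astype(float)
        gpa = rng.normal(3.0, 0.4, n)

        def path_coefficients(outcome, predictors):
            """Estimate one structural equation of the path model by ordinary least squares."""
            X = np.column_stack([np.ones(len(outcome))] + predictors)
            beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
            return beta

        # One equation per endogenous career self-efficacy subscale.
        for name in ["goal_selection", "problem_solving", "occupational_information", "planning", "self_appraisal"]:
            y = 0.6 * design_thinking + 0.1 * gpa + rng.normal(0, 1, n)  # assumed data-generating process
            beta = path_coefficients(y, [design_thinking, age, sex, gpa])
            print(name, "path from design thinking:", round(beta[1], 3))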

    Dynamic Emission Reduction Strategy for Shipping Company Considering Shipper's Cancellation of Cabin Space

    As a major carbon-emitting industry, the shipping industry urgently needs to actively reduce emissions. Considering the shipper's low-carbon preference and cabin cancellation behavior, this article introduces parameters such as the shipper's low-carbon preference coefficient, the cabin cancellation rate, and the compensation rate. Based on the state changes of shipping emission reduction, an optimal control method is used to construct a dynamic decision-making model for shipping companies' emission reduction. The shipping companies' optimal emission reduction effort is solved to characterize the optimal dynamic trajectories of shipping emission reduction, shipping volume, and the companies' expected discounted profit, and to reveal the impact of the shipper's low-carbon preference and cabin cancellation on the companies' emission reduction strategies. An important finding is that the cancellation of cargo space by shippers reduces shipping companies' enthusiasm for emission reduction, while an increase in the shippers' low-carbon preference coefficient helps improve that enthusiasm and increases the accumulated shipping emission reduction. Moreover, the dynamic emission reduction strategies of shipping companies change over time. Finally, the effectiveness of the model is validated through numerical analysis.
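
    A discretized sketch of the kind of dynamic decision problem described above: an emission-reduction stock evolves under an effort level, demand responds to the shipper's low-carbon preference, cancelled cabins are compensated, and the discounted profit of a constant effort is evaluated by forward simulation. All symbols, functional forms, and parameter values are assumptions for illustration, not the paper's optimal control model.

        import numpy as np

        # Assumed parameters: effort efficiency, decay of the reduction stock, low-carbon preference,
        # cancellation rate, compensation rate, freight rate, effort cost, discount rate, horizon, step.
        alpha, delta, theta = 0.8, 0.1, 0.5
        gamma, phi = 0.2, 0.6
        price, cost_coef = 10.0, 2.0
        rho, T, dt = 0.05, 20.0, 0.01

        def discounted_profit(u):
            """Forward-simulate the reduction stock E(t) under a constant effort u."""
            E, profit, t = 0.0, 0.0, 0.0
            while t < T:
                demand = 100.0 + theta * E                       # demand rises with accumulated reduction
                revenue = price * demand * (1 - gamma)           # cancelled cabins ship nothing
                compensation = phi * price * gamma * demand      # compensation paid on cancellations
                profit += np.exp(-rho * t) * (revenue - compensation - cost_coef * u ** 2) * dt
                E += (alpha * u - delta * E) * dt                # state equation of the reduction stock
                t += dt
            return profit

        best_u = max(np.linspace(0.0, 5.0, 51), key=discounted_profit)
        print("best constant emission-reduction effort (sketch):", round(float(best_u), 2))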

    Comparison of value-added models for school ranking and classification: a Monte Carlo study

    A “Value-Added” definition of school effectiveness calls for evaluating schools based on their unique contribution to individual student academic growth. Estimates of value-added school effectiveness are commonly used for ranking and classifying schools. The current simulation study examined and compared the validity of school effectiveness estimates from four statistical models for school ranking and classification. The simulation was conducted under two sample-size conditions and under situations typical of school effectiveness research. The Conditional Cross-Classified Model (CCCM) was used to simulate data. The findings indicated that the gain score model adjusting for students’ test scores at the end of kindergarten (i.e., prior to entering elementary school) (Gain_kindergarten) could validly rank and classify schools. The other models, including the gain score model adjusting for students’ test scores at the end of Grade 4 (i.e., one year before school effectiveness is estimated in Grade 5) (Gain_grade4), the Unconditional Cross-Classified Model (UCCM), and the Layered Mixed Effect Model (LMEM), could not validly rank or classify schools. The failure of the UCCM in school ranking and classification indicated that ignoring covariates distorts school rankings and classifications if no other analytical remedies are applied. The failure of the LMEM indicated that estimating correlations among repeated measures could not alleviate the damage caused by the omitted covariates. The failure of the Gain_grade4 model cautioned against adjusting with the previous year's test scores. The success of the Gain_kindergarten model indicated that, under some circumstances, valid school rankings and classifications can be achieved with only two time points of data.
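
    A toy Monte Carlo sketch of the gain-score idea discussed above: schools with known true effects are simulated, a Gain_kindergarten-style estimate is computed, and the recovered ranking is compared with the true one. The data-generating process and all numbers are assumptions for illustration, not the CCCM used in the study.

        import numpy as np

        rng = np.random.default_rng(1)
        n_schools, n_students = 50, 40

        true_effect = rng.normal(0, 1, n_schools)               # each school's unique contribution to growth
        ability = rng.normal(0, 1, (n_schools, n_students))     # student ability, an omitted covariate

        score_k = 50 + 5 * ability + rng.normal(0, 2, (n_schools, n_students))   # end of kindergarten
        score_g5 = (score_k + 10 + true_effect[:, None]
                    + 2 * ability + rng.normal(0, 2, (n_schools, n_students)))   # end of Grade 5

        # Gain-score estimate adjusting for the kindergarten baseline (the Gain_kindergarten idea).
        estimated_effect = (score_g5 - score_k).mean(axis=1)

        def ranks(x):
            return np.argsort(np.argsort(-x))   # 0 = highest-ranked school

        agreement = np.corrcoef(ranks(true_effect), ranks(estimated_effect))[0, 1]
        print("rank correlation between true and estimated school effects:", round(agreement, 3))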

    Cure: Strong semantics meets high availability and low latency

    Developers of cloud-scale applications face a difficult decision about which kind of storage to use, summarised by the CAP theorem. Currently the choice is between classical CP databases, which provide strong guarantees but are slow, expensive, and unavailable under partition, and NoSQL-style AP databases, which are fast and available but too hard to program against. We present an alternative: Cure provides the highest level of guarantees that remains compatible with availability. These guarantees include causal consistency (no ordering anomalies), atomicity (consistent multi-key updates), and support for high-level data types (a developer-friendly API) with safe resolution of concurrent updates (guaranteeing convergence). These guarantees minimise the anomalies caused by parallelism and distribution, thus facilitating the development of applications. This paper presents the protocols for highly available transactions and an experimental evaluation showing that Cure achieves scalability similar to eventually consistent NoSQL databases while providing stronger guarantees.
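
    A minimal sketch of the causal-consistency guarantee mentioned above (no ordering anomalies), using per-replica vector clocks to buffer a remote update until its causal dependencies have been applied. This illustrates the property only; it is not Cure's actual protocol or API, and all class and method names are invented for illustration.

        class Replica:
            def __init__(self, rid, n_replicas):
                self.rid = rid
                self.clock = [0] * n_replicas
                self.store = {}        # key -> (value, vector clock of the write)
                self.pending = []      # remote updates waiting for their causal dependencies

            def put(self, key, value):
                self.clock[self.rid] += 1
                self.store[key] = (value, list(self.clock))
                return (key, value, list(self.clock), self.rid)   # update to replicate

            def _ready(self, dep, sender):
                # Deliver only when every causal dependency of the write has been seen locally.
                return (dep[sender] == self.clock[sender] + 1 and
                        all(dep[i] <= self.clock[i] for i in range(len(dep)) if i != sender))

            def receive(self, update):
                self.pending.append(update)
                progress = True
                while progress:
                    progress = False
                    for key, value, dep, sender in list(self.pending):
                        if self._ready(dep, sender):
                            self.store[key] = (value, dep)
                            self.clock[sender] = dep[sender]
                            self.pending.remove((key, value, dep, sender))
                            progress = True

        # Replica 0 writes x then y; replica 1 receives them out of order but applies them causally.
        r0, r1 = Replica(0, 2), Replica(1, 2)
        update_x = r0.put("x", 1)
        update_y = r0.put("y", 2)
        r1.receive(update_y)   # buffered: the causally earlier write of x has not arrived yet
        r1.receive(update_x)   # both now apply, in causal order
        print(r1.store["x"][0], r1.store["y"][0])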

    Light-Reinforced Key Intermediate for Anticoking To Boost Highly Durable Methane Dry Reforming over Single Atom Ni Active Sites on CeO2.

    Dry reforming of methane (DRM) has been investigated for more than a century; the paramount stumbling block to its industrial application is the inevitable sintering of catalysts and excessive carbon emissions at high temperatures. However, the low-temperature DRM process still suffers from poor reactivity and severe catalyst deactivation from coking. Herein, we propose that highly durable DRM can be achieved at low temperatures by coupling tailored active sites with light irradiation. Active sites with Ni-O coordination (NiSA/CeO2) and Ni-Ni coordination (NiNP/CeO2) on CeO2 were constructed to obtain two targeted reaction paths that produce the key intermediate (CH3O*) for anticoking during DRM. In particular, operando diffuse reflectance infrared Fourier transform spectroscopy coupled with steady-state isotopic transient kinetic analysis (operando DRIFTS-SSITKA) was used to successfully track the anticoking paths during the DRM process. It was found that the path from CH3* to CH3O* over NiSA/CeO2 was the key path for anticoking, and this targeted path was reinforced by light irradiation during the DRM process. Hence, the NiSA/CeO2 catalyst exhibits excellent stability with negligible carbon deposition for 230 h under thermo-photo catalytic DRM at a low temperature of 472 °C, while NiNP/CeO2 shows apparent coke deposition after 0.5 h in solely thermally driven DRM. These findings are vital as they provide critical insights into simultaneously achieving a low-temperature and anticoking DRM process by distinguishing and directionally regulating the key intermediate species.

    Speculation in partially-replicated transactional data stores

    The last few decades have witnessed the unprecedented growth of large-scale online services. Distributed data storage systems, the fundamental building blocks of such services, face a number of challenging, and often antagonistic, requirements. On the one hand, many distributed data storage systems have shifted away from weak consistency and embraced strong, transactional semantics in order to tame the ever-growing complexity of modern applications. On the other hand, the need to store sheer amounts of data and to serve geo-dispersed clients with low latency has driven modern data storage systems to adopt partial replication techniques, often applied to geo-distributed infrastructures. Unfortunately, when employed in geo-distributed and/or partially replicated settings, state-of-the-art approaches to enforcing transactional consistency suffer from severe bottlenecks that strongly hinder their efficiency. This dissertation investigates the use of speculative techniques to enhance the performance of partially replicated transactional data stores, with a focus on geo-distributed platforms. By speculation, in this dissertation, we mean exposing the updates produced by uncommitted transactions to other transactions and/or to external clients in order to enhance performance. We apply speculative techniques to two fundamental approaches to building replicated transactional data stores, namely Deferred Update Replication (DUR) and State Machine Replication (SMR). In DUR-based systems, transactions are first executed on a node and then propagated to other nodes for a global verification phase, during which pre-commit locks must be held on the data items updated by transactions. The global verification phase can throttle system throughput, especially under high contention. We tackle this problem by introducing Speculative Transaction Replication (STR), a DUR protocol that exploits speculative reads to enhance the performance of geo-distributed, partially replicated transactional data stores. The use of speculative reads greatly reduces the ‘effective duration’ of pre-commit locks, thus removing one of the key bottlenecks of DUR-based protocols. However, the indiscriminate use of speculative reads can expose applications to concurrency anomalies that compromise their correctness in subtle ways. We tackle this issue by introducing Speculative Snapshot Isolation (SPSI), an extension of Snapshot Isolation (SI) that specifies the atomicity and isolation guarantees that must hold when speculative processing techniques are used. In a nutshell, SPSI guarantees that applications designed to operate under SI can safely execute atop STR, sheltering programmers from complex concurrency anomalies and from source code modifications. Our experimental study shows that STR, thanks to the use of speculative reads, yields up to 11× throughput improvements over state-of-the-art approaches that do not adopt speculative techniques. In SMR-based systems, transactions first undergo an ordering phase, and replicas must then guarantee that the result of transaction execution is equivalent to a serial execution that follows the order produced by the ordering phase. To ensure this, existing approaches use a single thread to execute or serialize transactions, which severely limits throughput, especially given the current architectural trend towards massively parallel multi-core processors.
This limitation is tackled through the introduction of SPARKLE, an innovative deterministic concurrency control scheme designed for Partially-Replicated State Machines (PRSMs). SPARKLE taps the potential parallelism of modern multi-core systems through the use of speculative techniques and by avoiding inherently non-scalable designs that rely on a single thread for either executing or scheduling transactions. The key contribution of SPARKLE is a set of techniques that greatly minimize the frequency of misspeculations and the cost associated with correcting them. Our evaluation shows that SPARKLE achieves up to one order of magnitude throughput gains when compared to state-of-the-art systems. (FSA - Sciences de l'ingénieur) -- UCL, 202
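
    A tiny sketch of the core speculation idea described above: a transaction may read another transaction's uncommitted write instead of blocking on its pre-commit lock, recording a dependency so that a later abort cascades. This illustrates the mechanism only; it is not the STR or SPARKLE protocol, and all class and method names are invented for illustration.

        class Item:
            def __init__(self, value):
                self.committed = value
                self.speculative = None   # uncommitted write exposed to speculative readers
                self.writer = None

        class Txn:
            def __init__(self, name):
                self.name, self.deps, self.writes, self.aborted = name, set(), [], False

            def write(self, item, value):
                item.speculative, item.writer = value, self
                self.writes.append((item, value))

            def read(self, item, speculative=True):
                if item.writer is self:
                    return item.speculative                  # read-your-own-write
                if speculative and item.writer is not None:
                    # A speculative read returns the uncommitted version and records a dependency
                    # on its writer, instead of waiting for the pre-commit lock to be released.
                    self.deps.add(item.writer)
                    return item.speculative
                return item.committed

            def commit(self):
                if any(dep.aborted for dep in self.deps):
                    self.aborted = True                      # cascading abort: a dependency mis-speculated
                    return False
                for item, value in self.writes:
                    item.committed, item.speculative, item.writer = value, None, None
                return True

        # t2 reads t1's uncommitted write without blocking and commits only after t1 does.
        x = Item(0)
        t1, t2 = Txn("t1"), Txn("t2")
        t1.write(x, 42)
        print("t2 speculatively reads:", t2.read(x))   # 42
        print("t1 commits:", t1.commit(), "t2 commits:", t2.commit())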

    A Domain Specific Search Engine With Explicit Document Relations

    The current web consists of documents that are highly heterogeneous and hard for machines to understand. The Semantic Web is a progressive movement of the World Wide Web, aiming to convert the current web of unstructured documents into a web of data. In the Semantic Web, web documents are annotated with metadata using a standardized ontology language. These annotated documents are directly processable by machines, which greatly improves their usability and usefulness. At Ericsson, similar problems occur. Massive numbers of documents with well-defined structures are being created. Although these documents concern domain-specific knowledge and can have rich relations, they are currently managed by a traditional search engine, which ignores the rich domain-specific information and presents little of it to users. Motivated by the Semantic Web, we aim to find standard ways to process these documents, extract rich domain-specific information, and annotate the documents with these data using formal markup languages. We propose this project to develop a domain-specific search engine that processes different documents and builds explicit relations between them. The project has three main focuses: examining different domain-specific documents and finding ways to extract their metadata; integrating a text search engine with an ontology server; and exploring novel ways to build relations between documents. We implement this system and demonstrate its functions. As a prototype, the system provides the required features and will be extended in the future.
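
    A small sketch of the kind of system outlined above: documents are indexed together with extracted metadata, explicit typed relations are added between them, and search combines term matching with relation expansion. The class, field names, and sample documents are assumptions for illustration, not the actual Ericsson system or its ontology server.

        from collections import defaultdict

        class DomainIndex:
            def __init__(self):
                self.docs = {}                      # doc id -> {"text": ..., "metadata": {...}}
                self.inverted = defaultdict(set)    # term -> set of doc ids
                self.relations = defaultdict(set)   # doc id -> set of (relation kind, doc id)

            def add(self, doc_id, text, metadata=None):
                self.docs[doc_id] = {"text": text, "metadata": metadata or {}}
                for term in text.lower().split():
                    self.inverted[term].add(doc_id)

            def relate(self, src, dst, kind="references"):
                # Explicit, typed relation extracted from document structure or metadata.
                self.relations[src].add((kind, dst))

            def search(self, query, expand=True):
                terms = query.lower().split()
                hits = set.intersection(*(self.inverted.get(t, set()) for t in terms)) if terms else set()
                if expand:
                    # Follow explicit relations so related documents surface alongside direct hits.
                    hits = hits | {dst for h in hits for _, dst in self.relations[h]}
                return sorted(hits)

        idx = DomainIndex()
        idx.add("spec-1", "radio interface specification", {"type": "specification"})
        idx.add("guide-1", "implementation guide for the radio interface", {"type": "guide"})
        idx.relate("spec-1", "guide-1")
        print(idx.search("radio interface"))   # ['guide-1', 'spec-1']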

    Testing the effect of varying environments on the speed of evolution

    One of the most important tasks in computer science and artificial intelligence is optimization. Computer scientists use simulations of natural evolution to create algorithms and data structures for solving complex optimization problems; this field of study is called evolutionary computation. In evolutionary computation, the speed of evolution is defined as the number of generations needed for an initially random population to achieve a given goal. Recent studies have shown that varying environments might significantly speed up evolution and have suggested that modularly varying goals can accelerate optimization algorithms. In this thesis, we study the effect of varying goals on the speed of evolution. Two test models, the NK model and the midunitation model, are used for this study, and three different evolutionary algorithms are used to test the hypothesis. Statistical analyses of the results showed that under the NK model, evolution with a fixed goal is faster than evolution with switching goals. Under the midunitation model, different algorithms lead to different results: with some string lengths using hill climbing, switching goals sped up evolution; with other string lengths using hill climbing, and with the other evolutionary algorithms, either evolution with a fixed goal was faster or the results were inconclusive.
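
    A minimal sketch of the experimental idea described above: a single-bit-flip hill climber evolves a bit string toward a target, either with a fixed goal or with a goal that switches every few generations, and the number of generations needed to reach the fixed goal is recorded. The fitness function, goal strings, and parameters are simplified assumptions and do not reproduce the thesis's NK or midunitation setups.

        import random

        random.seed(0)
        L, SWITCH_EVERY, MAX_GENS = 40, 20, 10000

        def fitness(bits, goal):
            # Count positions that match the current goal string.
            return sum(b == g for b, g in zip(bits, goal))

        def hill_climb(switching):
            goal_a = [1] * L                              # the fixed target
            goal_b = [1] * (L // 2) + [0] * (L // 2)      # alternate target used when goals switch
            bits = [random.randint(0, 1) for _ in range(L)]
            for gen in range(1, MAX_GENS + 1):
                goal = goal_b if switching and (gen // SWITCH_EVERY) % 2 == 1 else goal_a
                candidate = bits[:]
                candidate[random.randrange(L)] ^= 1       # single-bit mutation
                if fitness(candidate, goal) >= fitness(bits, goal):
                    bits = candidate
                if bits == goal_a:                        # speed of evolution: generations to the fixed goal
                    return gen
            return None                                   # goal not reached within the budget

        print("fixed goal:     ", hill_climb(switching=False))
        print("switching goals:", hill_climb(switching=True))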