    Supply chain collaboration

    In the past, research in operations management focused on single-firm analysis. Its goal was to provide managers with suitable tools to improve the performance of their firm, for example by calculating optimal inventory quantities. Nowadays, business decisions are dominated by the globalization of markets and increased competition among firms. Moreover, more and more products reach the customer through supply chains composed of independent firms. Following these trends, research in operations management has shifted its focus from single-firm analysis to multi-firm analysis, in particular to improving the efficiency and performance of supply chains under decentralized control. The main characteristics of such chains are that the firms in the chain are independent actors who try to optimize their individual objectives, and that the decisions taken by one firm also affect the performance of the other parties in the supply chain. These interactions among firms' decisions call for alignment and coordination of actions. Game theory, the study of situations of cooperation and conflict among heterogeneous actors, is therefore well suited to deal with these interactions. This has been recognized by researchers in the field, as an ever-increasing number of papers applies tools, methods and models from game theory to supply chain problems.
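The coordination problem described here can be illustrated with the simplest textbook supply chain game: a minimal sketch (the linear-demand wholesale-price game, chosen for illustration and not taken from any surveyed paper; all parameter values are assumptions) showing why independent optimization by each firm loses total profit relative to an integrated chain.

```python
# Toy wholesale-price (Stackelberg) game with linear demand p(q) = a - b*q
# and unit production cost c. Illustrative sketch only.

def integrated_profit(a, b, c):
    """Centralized chain: choose q to maximize (a - b*q - c) * q."""
    q = (a - c) / (2 * b)
    return (a - b * q - c) * q

def decentralized_profits(a, b, c):
    """Manufacturer sets wholesale price w first; the retailer then
    orders q = argmax_q (a - b*q - w)*q = (a - w) / (2b)."""
    w = (a + c) / 2            # manufacturer's optimal wholesale price
    q = (a - w) / (2 * b)      # retailer's best response
    retail_price = a - b * q
    manufacturer = (w - c) * q
    retailer = (retail_price - w) * q
    return manufacturer, retailer

a, b, c = 10.0, 1.0, 2.0
m, r = decentralized_profits(a, b, c)
# Decentralized total profit (12) falls short of the integrated chain (16):
print(integrated_profit(a, b, c), m + r)
```

The gap between the two totals is the classic double-marginalization loss that coordination mechanisms (and the game-theoretic contracts studied in this literature) aim to close.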

    Multiplicative noise: A mechanism leading to nonextensive statistical mechanics

    A large variety of microscopic or mesoscopic models lead to generic results that accommodate naturally within Boltzmann-Gibbs statistical mechanics (based on $S_1 \equiv -k \int du\, p(u) \ln p(u)$). Similarly, other classes of models point toward nonextensive statistical mechanics (based on $S_q \equiv k [1-\int du\, [p(u)]^q]/[q-1]$, where the value of the entropic index $q \in \Re$ depends on the specific model). We show here a family of models, with multiplicative noise, which belongs to the nonextensive class. More specifically, we consider Langevin equations of the type $\dot{u} = f(u) + g(u)\xi(t) + \eta(t)$, where $\xi(t)$ and $\eta(t)$ are independent zero-mean Gaussian white noises with respective amplitudes $M$ and $A$. This leads to the Fokker-Planck equation $\partial_t P(u,t) = -\partial_u[f(u) P(u,t)] + M\partial_u\{g(u)\partial_u[g(u)P(u,t)]\} + A\partial_{uu}P(u,t)$. Whenever the deterministic drift is proportional to the noise-induced one, i.e., $f(u) = -\tau g(u) g'(u)$, the stationary solution is shown to be $P(u,\infty) \propto \bigl\{1-(1-q)\beta [g(u)]^2\bigr\}^{\frac{1}{1-q}}$ (with $q \equiv \frac{\tau+3M}{\tau+M}$ and $\beta = \frac{\tau+M}{2A}$). This distribution is precisely the one optimizing $S_q$ with the constraint $\langle [g(u)]^2 \rangle_q \equiv \{\int du\, [g(u)]^2 [P(u)]^q\}/\{\int du\, [P(u)]^q\} =$ constant. We also introduce and discuss various characterizations of the width of the distributions.
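The stated stationary solution can be checked numerically. The sketch below (with the illustrative choices $g(u)=u$, hence $f(u)=-\tau u$, and assumed parameter values) evaluates the stationary probability flux of the Fokker-Planck equation at the claimed $P(u,\infty)$ and confirms it vanishes to discretization accuracy.

```python
import numpy as np

# Illustrative check of the claimed stationary solution for g(u) = u,
# so f(u) = -tau * g(u) * g'(u) = -tau * u. Parameter values are assumptions.
tau, M, A = 1.0, 1.0, 0.5
q = (tau + 3 * M) / (tau + M)        # entropic index, here q = 2
beta = (tau + M) / (2 * A)

u = np.linspace(-5.0, 5.0, 20001)
# Claimed stationary density (unnormalized q-Gaussian):
P = (1.0 + (q - 1.0) * beta * u**2) ** (1.0 / (1.0 - q))

# Stationarity <=> probability flux J(u) = f*P - M*g*d(g*P)/du - A*dP/du = 0
f = -tau * u
dgP = np.gradient(u * P, u)          # d[g(u) P(u)]/du with g(u) = u
dP = np.gradient(P, u)
J = f * P - M * u * dgP - A * dP
print(q, np.max(np.abs(J)))          # flux is ~0 everywhere on the grid
```

For these parameters $q = (\tau+3M)/(\tau+M) = 2$, a Cauchy-like distribution with power-law tails, consistent with the nonextensive class.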

    Development and Validation of a Rule-based Time Series Complexity Scoring Technique to Support Design of Adaptive Forecasting DSS

    Evidence from forecasting research gives reason to believe that understanding time series complexity can enable design of adaptive forecasting decision support systems (FDSSs) that positively support forecasting behaviors and the accuracy of outcomes. Yet such FDSS design capabilities have not been formally explored because there exists no systematic approach to identifying series complexity. This study describes the development and validation of a rule-based complexity scoring technique (CST) that generates a complexity score for time series using 12 rules that rely on 14 features of series. The rule-based schema was developed on 74 series and validated on 52 holdback series using well-accepted forecasting methods as benchmarks. A supporting experimental validation was conducted with 14 participants who generated 336 structured judgmental forecasts for sets of series classified as simple or complex by the CST. Benchmark comparisons validated the CST by confirming, as hypothesized, that forecasting accuracy was lower for series scored by the technique as complex than for those scored as simple. The study concludes with a comprehensive framework for the design of FDSSs that can integrate the CST to adaptively support forecasters under varied conditions of series complexity. The framework is founded on the concepts of restrictiveness and guidance and offers specific recommendations on how these elements can be built into an FDSS to support forecasting under complexity.
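The paper's 12 rules and 14 features are not reproduced here, but the rule-based scoring idea can be sketched with a few toy rules of the same flavor (the rules, features, and thresholds below are hypothetical illustrations, not the CST itself):

```python
import statistics

# Hypothetical rule-based complexity scorer: each rule inspects one
# series feature and contributes a point; higher score = more complex.

def lag1_autocorr(y):
    """Lag-1 autocorrelation (a crude measure of serial dependence)."""
    m = statistics.fmean(y)
    num = sum((a - m) * (b - m) for a, b in zip(y, y[1:]))
    den = sum((v - m) ** 2 for v in y)
    return num / den if den else 0.0

def complexity_score(y):
    score = 0
    m = statistics.fmean(y)
    cv = statistics.pstdev(y) / abs(m) if m else float("inf")
    if cv > 0.3:                        # Rule 1: high relative variability
        score += 1
    if lag1_autocorr(y) < 0.5:          # Rule 2: weak serial dependence
        score += 1
    diffs = [b - a for a, b in zip(y, y[1:])]
    sign_changes = sum(1 for a, b in zip(diffs, diffs[1:]) if a * b < 0)
    if sign_changes > len(diffs) / 2:   # Rule 3: frequent direction changes
        score += 1
    return score

smooth = [10, 11, 12, 13, 14, 15, 16, 17]   # steady trend -> score 0
noisy = [10, 3, 14, 2, 18, 1, 16, 4]        # erratic -> score 3
print(complexity_score(smooth), complexity_score(noisy))
```

A score like this could then drive the restrictiveness/guidance elements of an adaptive FDSS, e.g. offering more guidance for high-scoring series.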

    "Slimming" of power law tails by increasing market returns

    We introduce a simple generalization of rational bubble models which removes the fundamental problem discovered by [Lux and Sornette, 1999] that the distribution of returns is a power law with exponent less than 1, in contradiction with empirical data. The idea is that the price fluctuations associated with bubbles must on average grow with the mean market return r. When r is larger than the discount rate r_delta, the distribution of returns of the observable price, the sum of the bubble component and of the fundamental price, exhibits an intermediate tail with an exponent which can be larger than 1. This regime r > r_delta corresponds to a generalization of the rational bubble model in which the fundamental price is no longer given by the discounted value of future dividends. We explain how this is possible. Our model predicts that the higher the market remuneration r is above the discount rate, the larger the power-law exponent and thus the thinner the tail of the distribution of price returns.
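The generic mechanism behind power-law tails in rational bubble models can be sketched with a Kesten-type recursion B_{t+1} = a_t * B_t + b_t, whose tail exponent mu (with P(B > x) ~ x^{-mu}) solves E[a^mu] = 1. This is an illustration of that standard result, not the paper's specific model; the lognormal choice for a_t and its parameters m, s are assumptions made so the root is also known in closed form for checking.

```python
import math

# For a Kesten process B_{t+1} = a_t * B_t + b_t with E[ln a] < 0, the
# tail exponent mu solves E[a^mu] = 1. With a_t lognormal(m, s^2),
# E[a^mu] = exp(mu*m + mu^2 * s^2 / 2), so the root is mu = -2*m/s^2,
# which lets us verify the generic bisection solver below.

def tail_exponent(moment, lo=1e-9, hi=50.0, tol=1e-12):
    """Bisect for mu > 0 such that moment(mu) = E[a^mu] equals 1."""
    f = lambda mu: moment(mu) - 1.0
    assert f(lo) < 0 < f(hi)         # root must be bracketed
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

m, s = -0.05, 0.3                    # assumed log-mean / log-std of a_t
moment = lambda mu: math.exp(mu * m + 0.5 * (mu * s) ** 2)
mu = tail_exponent(moment)
print(mu)                            # closed form: -2*m/s**2 = 1.111...
```

With these parameters the exponent exceeds 1, the empirically sensible regime that the paper's generalization recovers when r > r_delta.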

    SOME GUIDING PRINCIPLES FOR EMPIRICAL PRODUCTION RESEARCH IN AGRICULTURE

    Constraints on production economic research are examined in three dimensions: problem focus, methodology, and data availability. Data availability has played a large role in the choice of problem focus and explains some misdirected focus. A proposal is made to address the data availability constraint. The greatest self-imposed constraints are methodological. Production economics has focused on flexible representations of technology at the expense of specificity in preferences. Yet some of the major problems faced by decision makers relate to long-term problems, e.g., the commodity boom and ensuing debt crisis of the 1970s and 1980s, where standard short-term profit maximization models are unlikely to capture the essence of decision maker concerns.