
    Real-Time Decentralized Information Processing and Returns to Scale

    We study the properties of real-time decentralized information processing, as a model of human information processing in organizations, and use the model to understand how constraints on human information processing affect the returns to scale of firms. With real-time processing, decentralization does not unambiguously reduce delay, because processing a subordinate's report precludes processing current data. Because decision rules are endogenous, delay does not inexorably lead to eventually decreasing returns to scale; however, returns are more likely to be decreasing when computation constraints, rather than sampling costs, limit the information upon which decisions are conditioned. The results illustrate that the requirement of informational integration causes a breakdown of the replication arguments that are often used to establish non-decreasing returns.
    Information Systems Working Papers Series

    Berge's Maximum Theorem With Two Topologies On The Action Set (Now published in Economics Letters, vol. 61 (1999), pp. 285-291.)

    We give variants on Berge's Maximum Theorem in which the lower and the upper semicontinuities of the preference relation are assumed for two different topologies on the action set, i.e., the set of actions available a priori to the decision-maker (e.g. a household with its consumption set). We point to two new uses. One result, stated here without a detailed proof, is the norm-to-weak* continuity of consumer demand as a function of prices (a property pointed to in the existing literature but without proof or precise formulation). This improves significantly upon an earlier demand continuity result which, with the extremely strong 'finite' topology on the price space, is of limited interest other than as a vehicle for an equilibrium existence proof. With the norm topology on the price space, our demand continuity result acquires an independent significance, particularly for practical implementations of the equilibrium solution. The second application establishes the continuity of the optimal plan as a function of the decision-maker's information (represented by a field of events in a probability space of states).
    Berge's Maximum Theorem, demand continuity
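As a rough numerical sketch of the demand-continuity property the abstract describes: for a Cobb-Douglas consumer, demand is an explicit continuous function of prices, so a small norm perturbation of the price vector moves the demanded bundle only slightly. The Cobb-Douglas specification and all names below are illustrative assumptions, not constructions from the paper.

```python
import numpy as np

def cobb_douglas_demand(prices, wealth, shares):
    """Utility-maximizing bundle for u(x) = prod_i x_i^a_i on the budget set
    {x >= 0 : p.x <= wealth}: each good i receives the share a_i of wealth."""
    prices = np.asarray(prices, dtype=float)
    shares = np.asarray(shares, dtype=float)
    return shares * wealth / prices

p = np.array([2.0, 1.0, 4.0])
a = np.array([0.5, 0.3, 0.2])
x = cobb_douglas_demand(p, 100.0, a)

# A small (norm) perturbation of prices moves demand only slightly,
# consistent with continuity of demand as a function of prices.
x_eps = cobb_douglas_demand(p + 1e-6, 100.0, a)
print(x)                                  # [25. 30.  5.]
print(np.linalg.norm(x - x_eps) < 1e-3)   # True
```

The theorem's content is much stronger (it covers preferences where no closed-form demand exists), but the closed-form case shows concretely what norm-continuity in prices means for a consumer's chosen bundle.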

    Real-Time Decentralized Information Processing and Returns to Scale

    We use a model of real-time decentralized information processing to understand how constraints on human information processing affect the returns to scale of organizations. We identify three informational (dis)economies of scale: diversification of heterogeneous risks (positive), sharing of information and of costs (positive), and crowding out of recent information due to information processing delay (negative). Because decision rules are endogenous, delay does not inexorably lead to decreasing returns to scale. However, returns are more likely to be decreasing when computation constraints, rather than sampling costs, limit the information upon which decisions are conditioned. The results illustrate how information processing constraints together with the requirement of informational integration cause a breakdown of the replication arguments that have been used to establish nondecreasing technological returns to scale.
    Information Systems Working Papers Series

    Preface


    Real-Time Hierarchical Resource Allocation

    This paper presents a model that distinguishes between decentralized information processing and decentralized decision making in organizations; it shows that decentralized decision making can be advantageous due to computational delay, even in the absence of communication costs. The key feature of the model, which makes this result possible, is that decisions in a stochastic control problem are calculated in real time by boundedly rational members of an administrative staff. The decision problem is to allocate resources in a changing environment. We consider a class of hierarchical procedures in which information about cost functions flows down and is disaggregated by the hierarchy. Nodes of the hierarchy correspond not to a single person but to decision-making units within which there may be decentralized information processing. The lower tiers of multitier hierarchies can allocate resources quickly within small groups, while higher tiers are still able to exploit gains from trade between the groups (although on the basis of older information).
    decentralization, hierarchies, bounded rationality, real-time control
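The two-tier structure the abstract describes can be sketched in miniature: a top tier splits the total resource between groups using older, aggregated cost information, and each lower tier then reallocates its share among members using current costs. The inverse-cost allocation rule and all names below are illustrative assumptions, not the paper's actual decision procedure.

```python
import numpy as np

def allocate_within(budget, costs):
    """Lower tier: split a group's budget across members in inverse
    proportion to their current costs -- cheaper members receive more."""
    inv = 1.0 / np.asarray(costs, dtype=float)
    return budget * inv / inv.sum()

def two_tier_allocate(total, stale_group_costs, current_member_costs):
    """Top tier divides `total` between groups using older, aggregated
    cost information; each group then reallocates its share internally
    using fresh member-level costs."""
    inv = 1.0 / np.asarray(stale_group_costs, dtype=float)
    group_budgets = total * inv / inv.sum()
    return [allocate_within(b, c)
            for b, c in zip(group_budgets, current_member_costs)]

# The top tier last saw equal aggregate costs [2, 2]; by now the first
# group's member costs have drifted apart, and only its lower tier
# responds to that newer information.
alloc = two_tier_allocate(100.0,
                          stale_group_costs=[2.0, 2.0],
                          current_member_costs=[[1.0, 3.0], [2.0, 2.0]])
print(alloc)   # [array([37.5, 12.5]), array([25., 25.])]
```

The sketch shows the trade-off in the abstract: the within-group split reacts to fresh data, while the between-group split, which exploits gains from trade across groups, is conditioned on older information.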

    Variable response of three Trifolium repens ecotypes to soil flooding by seawater.

    BACKGROUND AND AIMS: Despite concerns about the impact of rising sea levels and storm surge events on coastal ecosystems, there is remarkably little information on the response of terrestrial coastal plant species to seawater inundation. The aim of this study was to elucidate responses of a glycophyte (white clover, Trifolium repens) to short-duration soil flooding by seawater and recovery following leaching of salts. METHODS: Using plants cultivated from parent ecotypes collected from a natural soil salinity gradient, the impact of short-duration seawater soil flooding (8 or 24 h) on short-term changes in leaf salt ion and organic solute concentrations was examined, together with longer term impacts on plant growth (stolon elongation) and flowering. KEY RESULTS: There was substantial Cl⁻ and Na⁺ accumulation in leaves, especially for plants subjected to 24 h soil flooding with seawater, but no consistent variation linked to parent plant provenance. Proline and sucrose concentrations also increased in plants following seawater flooding of the soil. Plant growth and flowering were reduced by longer soil immersion times (seawater flooding followed by drainage and freshwater inputs), but plants originating from more saline soil responded less negatively than those from lower salinity soil. CONCLUSIONS: The accumulation of proline and sucrose indicates a potential for solute accumulation as a response to the osmotic imbalance caused by salt ions, while variation in growth and flowering responses between ecotypes points to a natural adaptive capacity for tolerance of short-duration seawater soil flooding in T. repens. Consequently, it is suggested that selection for tolerant ecotypes is possible should the predicted increase in frequency of storm surge flooding events occur.

    [Comment] Redefine statistical significance

    The lack of reproducibility of scientific studies has caused growing concern over the credibility of claims of new discoveries based on “statistically significant” findings. There has been much progress toward documenting and addressing several causes of this lack of reproducibility (e.g., multiple testing, P-hacking, publication bias, and under-powered studies). However, we believe that a leading cause of non-reproducibility has not yet been adequately addressed: Statistical standards of evidence for claiming discoveries in many fields of science are simply too low. Associating “statistically significant” findings with P < 0.05 results in a high rate of false positives even in the absence of other experimental, procedural and reporting problems. For fields where the threshold for defining statistical significance is P < 0.05, we propose a change to P < 0.005. This simple step would immediately improve the reproducibility of scientific research in many fields. Results that would currently be called “significant” but do not meet the new threshold should instead be called “suggestive.” While statisticians have known the relative weakness of using P ≈ 0.05 as a threshold for discovery and the proposal to lower it to 0.005 is not new (1, 2), a critical mass of researchers now endorse this change. We restrict our recommendation to claims of discovery of new effects. We do not address the appropriate threshold for confirmatory or contradictory replications of existing claims. We also do not advocate changes to discovery thresholds in fields that have already adopted more stringent standards (e.g., genomics and high-energy physics research; see Potential Objections below). We also restrict our recommendation to studies that conduct null hypothesis significance tests.
We have diverse views about how best to improve reproducibility, and many of us believe that other ways of summarizing the data, such as Bayes factors or other posterior summaries based on clearly articulated model assumptions, are preferable to P-values. However, changing the P-value threshold is simple and might quickly achieve broad acceptance.
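The abstract's claim that P < 0.05 corresponds to weak evidence can be illustrated with a standard result: the Sellke-Bayarri-Berger upper bound on the Bayes factor implied by a p-value, 1 / (-e · p · ln p) for p < 1/e. A minimal sketch (the function name is ours, and this bound is one standard calibration among several, not the only analysis in the comment):

```python
import math

def max_bayes_factor(p):
    """Sellke-Bayarri-Berger upper bound on the Bayes factor in favor of
    the alternative hypothesis implied by a p-value p (valid for p < 1/e)."""
    assert 0 < p < 1 / math.e
    return 1.0 / (-math.e * p * math.log(p))

# At the conventional threshold the evidence against the null is at best
# about 2.5:1; at the proposed 0.005 threshold it is roughly 14:1 at best.
print(round(max_bayes_factor(0.05), 2))   # 2.46
print(round(max_bayes_factor(0.005), 2))  # 13.89
```

Under this calibration, moving the threshold from 0.05 to 0.005 raises the maximum attainable evidence against the null by roughly a factor of five, which is the quantitative intuition behind the proposal.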