
    The effects of contract detail and prior ties on contract change: a learning perspective

    Despite the large literature on alliance contract design, we know little about how transacting parties change and amend their underlying contracts during the execution of strategic alliances. Drawing on existing research in the alliance contracting literature, we develop the empirical question of how contract detail and prior ties influence the amount, direction, and type of change in such agreements during the collaboration. We generated a sample of 115 joint ventures (JVs) by distributing a survey to JV board members or top managers and found that the amount of contract change is negatively associated with the level of detail in the initial contract but positively associated with the number of prior ties between alliance partners. With respect to the direction of contract change, we find that the level of detail in the initial agreement correlates negatively with the likelihood of removing or weakening existing provisions, and that prior collaborative experience correlates positively with the likelihood of strengthening existing provisions or adding new ones. We also find that prior ties affect the type of change: JV parents prefer to change enforcement provisions more than coordination provisions in the contract. Our paper generates new insights into the complementarities between the relational governance and transaction cost economics perspectives on alliance contracting.

    Multi-dimensional modelling for the national mapping agency: a discussion of initial ideas, considerations, and challenges

    The Ordnance Survey, the National Mapping Agency (NMA) for Great Britain, has recently begun to research the possible extension of its 2-dimensional geographic information into a multi-dimensional environment. Such a move creates a number of data creation and storage issues which the NMA must consider. Many of these issues are highly relevant to all NMAs and their customers alike, and are presented and explored here. This paper offers a discussion of initial considerations which NMAs face in the creation of multi-dimensional datasets. Such issues include assessing which objects should be mapped in 3 dimensions by a National Mapping Agency, what should sensibly be represented dynamically, and whether the resolution of multi-dimensional models should change over space. This paper also offers some preliminary suggestions for the optimal creation method for any future enhanced national height model for the Ordnance Survey. This discussion includes examples of problem areas and issues both in the extraction of 3-D data and in the topological reconstruction of such data. 3-D feature extraction is not a new problem; however, the degree of automation that may be achieved and the suitability of current techniques for NMAs remain a largely uncharted research area, which this research aims to tackle. The issues presented in this paper require immediate research and, if solved adequately, would mark a cartographic paradigm shift in the communication of geographic information, and could signify the beginning of the way in which NMAs both present and interact with their customers in the future.

    "Last-Mile" preparation for a potential disaster

    Extreme natural events, such as tsunamis or earthquakes, regularly lead to catastrophes with dramatic consequences. In recent years natural disasters have caused hundreds of thousands of deaths, destruction of infrastructure, disruption of economic activity and loss of billions of dollars worth of property, and have thus revealed considerable deficits hindering their effective management. Stakeholders, decision-makers and affected persons need systematic risk identification and evaluation, a way to assess countermeasures, awareness raising, and decision support systems to be employed before, during and after crisis situations. The overall goal of this study is the interdisciplinary integration of various scientific disciplines to contribute to a tsunami early warning information system. In comparison to most studies, our focus is on high-end geometric and thematic analysis to meet the requirements of small-scale, heterogeneous and complex coastal urban systems. Data, methods and results from engineering, remote sensing and the social sciences are interlinked and provide comprehensive information for disaster risk assessment, management and reduction. In detail, we combine inundation modeling, urban morphology analysis, population assessment, socio-economic analysis of the population and evacuation modeling. The interdisciplinary results eventually lead to recommendations for mitigation strategies in the fields of spatial planning and coping capacity.
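
    As an illustration only of one step in such a risk assessment, the minimal sketch below overlays a modeled inundation depth grid with a population grid to estimate the exposed population. The function name, array values and the 0.5 m hazard threshold are assumptions made for the example, not figures from the study.

        import numpy as np

        def exposed_population(inundation_depth_m: np.ndarray,
                               population_per_cell: np.ndarray,
                               hazard_threshold_m: float = 0.5) -> float:
            """Sum the population in cells whose modeled flow depth exceeds a threshold.

            Both rasters are assumed to be co-registered (same shape and cell size).
            """
            if inundation_depth_m.shape != population_per_cell.shape:
                raise ValueError("rasters must be co-registered")
            exposed_cells = inundation_depth_m >= hazard_threshold_m
            return float(population_per_cell[exposed_cells].sum())

        # Synthetic 3x3 example: two cells exceed the 0.5 m threshold.
        depth = np.array([[0.0, 0.2, 0.8],
                          [0.1, 0.6, 0.3],
                          [0.0, 0.0, 0.1]])
        people = np.full_like(depth, 100.0)       # 100 residents per cell
        print(exposed_population(depth, people))  # -> 200.0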

    Marine Heritage Monitoring with High Resolution Survey Tools: ScapaMAP 2001-2006

    Archaeologically, marine sites can be just as significant as those on land. Until recently, however, they were not protected in the UK to the same degree, leading to degradation of sites; the difficulty of investigating such sites still makes it problematic and expensive to properly describe, schedule and monitor them. Use of conventional high-resolution survey tools in an archaeological context is changing the economic structure of such investigations, however, and it is now possible to remotely but routinely monitor the state of submerged cultural artifacts. Use of such data to optimize expenditure of expensive and rare assets (e.g., divers and on-bottom dive time) is an added bonus. We present here the results of an investigation into methods for monitoring of marine heritage sites, using the remains of the Imperial German Navy (scuttled 1919) in Scapa Flow, Orkney as a case study. Using a baseline bathymetric survey in 2001 and a repeat bathymetric and volumetric survey in 2006, we illustrate the requirements for such surveys over and above normal hydrographic protocols and outline strategies for effective imaging of large wrecks. Suggested methods for manipulation of such data (including processing and visualization) are outlined, and we draw the distinction between products for scientific investigation and those for outreach and education, which have very different requirements. We then describe the use of backscatter and volumetric acoustic data in the investigation of wrecks, focusing on the extra information to be gained from them that is not evident in traditional bathymetric DTM models or sounding point-cloud representations of the data. Finally, we consider the utility of high-resolution survey as part of an integrated site management policy, with particular reference to the economics of marine heritage monitoring and preservation.
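
    As a rough sketch of how a baseline and a repeat bathymetric survey can be compared quantitatively (not the ScapaMAP processing chain itself), the example below differences two co-registered seabed elevation grids and reports the net volume change; the function name, cell size and noise threshold are illustrative assumptions.

        import numpy as np

        def dtm_change(baseline_elev_m: np.ndarray,
                       repeat_elev_m: np.ndarray,
                       cell_area_m2: float,
                       noise_threshold_m: float = 0.15):
            """Difference two co-registered seabed elevation DTMs (heights, positive up).

            Returns the difference surface with sub-threshold cells masked as NaN,
            and the net volume change in cubic metres. Positive differences mean the
            surface has risen (e.g. collapsed debris); negative means material loss
            or settling of the structure.
            """
            diff = repeat_elev_m - baseline_elev_m
            significant = np.abs(diff) >= noise_threshold_m   # ignore survey noise
            net_volume_m3 = float(diff[significant].sum() * cell_area_m2)
            return np.where(significant, diff, np.nan), net_volume_m3

    On real multibeam grids such differencing is typically preceded by careful co-registration and uncertainty estimation, which dominate the reliability of any change estimate.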

    Probing the Universe with Weak Lensing

    Gravitational lenses can provide crucial information on the geometry of the Universe, on the cosmological scenario of formation of its structures, as well as on the history of its components with look-back time. In this review, I focus on the most recent results obtained during the last five years from the analysis of the weak lensing regime. The interest of weak lensing as a probe of dark matter and for the study of the coupling between light and mass on the scales of clusters of galaxies, large-scale structures and galaxies is discussed first. Then I present the impact of weak lensing for the study of distant galaxies and of the population of lensed sources as a function of redshift. Finally, I discuss the potential interest of weak lensing to constrain the cosmological parameters, either from pure geometrical effects observed in peculiar lenses, or from the coupling of weak lensing with the CMB. Comment: To appear in Annual Review of Astronomy and Astrophysics, Vol. 37. LaTeX and psfig.sty. Version without figures, 54 pages, 73 Kb. Complete version including 13 figures (60 pages) available on ftp.iap.fr, anonymous account, in /pub/from_users/mellier/AnnualReview; file ARAAmellier.ps.gz, 1.6 M

    Finding faint HI structure in and around galaxies: scraping the barrel

    Soon-to-be-operational HI survey instruments such as APERTIF and ASKAP will produce large datasets. These surveys will provide information about the HI in and around hundreds of galaxies with a typical signal-to-noise ratio of ~10 in the inner regions and ~1 in the outer regions. In addition, such surveys will make it possible to probe faint HI structures, typically located in the vicinity of galaxies, such as extra-planar gas, tails and filaments. These structures are crucial for understanding galaxy evolution, particularly when they are studied in relation to the local environment. Our aim is to find optimized kernels for the discovery of faint and morphologically complex HI structures. Therefore, using HI data from a variety of galaxies, we explore state-of-the-art filtering algorithms. We show that the intensity-driven gradient filter, due to its adaptive characteristics, is the optimal choice. In fact, this filter requires only minimal tuning of the input parameters to enhance the signal-to-noise ratio of faint components. In addition, it does not degrade the resolution of the high signal-to-noise component of a source. The filtering process must be fast and be embedded in an interactive visualization tool in order to support fast inspection of a large number of sources. To achieve such interactive exploration, we implemented a multi-core CPU (OpenMP) and a GPU (OpenGL) version of this filter in a 3D visualization environment (SlicerAstro). Comment: 17 pages, 9 figures, 4 tables. Astronomy and Computing, accepted
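
    To illustrate the general idea of intensity-driven filtering (a simplified stand-in, not the intensity-driven gradient filter or the SlicerAstro implementation referred to above), the sketch below blends a wide and a narrow Gaussian smoothing of an HI cube, weighting towards the wide kernel where the local signal-to-noise ratio is low; the kernel widths and the blending rule are assumptions made for the example.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def intensity_weighted_smooth(cube: np.ndarray,
                                      noise_rms: float,
                                      sigma_faint: float = 3.0,
                                      sigma_bright: float = 0.5) -> np.ndarray:
            """Adaptively smooth an HI data cube according to local signal-to-noise.

            Faint voxels are taken mostly from a wide Gaussian smoothing (boosting
            the S/N of diffuse emission); bright voxels are taken mostly from a
            narrow smoothing (preserving the resolution of the main body).
            """
            wide = gaussian_filter(cube, sigma=sigma_faint)
            narrow = gaussian_filter(cube, sigma=sigma_bright)
            local_snr = gaussian_filter(cube, sigma=1.0) / noise_rms
            weight = np.clip(local_snr / 5.0, 0.0, 1.0)   # 0 = faint, 1 = bright
            return weight * narrow + (1.0 - weight) * wide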