
    International business: past, present and futures

    This article provides the context for futures thinking in the field of international business (IB). The article begins by considering the nature of IB. Its historical development is then elaborated, before its current significance and trends are considered. Building on the review of past and present, we speculate briefly on the possible futures of IB. In so doing, we provide a basis from which the contributions to this Special Issue on the Futures of IB can be understood and situated in a broader context.

    MusA: Using Indoor Positioning and Navigation to Enhance Cultural Experiences in a museum

    In recent years there has been growing interest in the use of multimedia mobile guides in museum environments. Mobile devices have the capability to detect the user's context and to provide pieces of information suitable to help visitors discover and follow the logical and emotional connections that develop during the visit. In this scenario, location-based services (LBS) currently represent an asset, and the choice of the technology to determine users' position, combined with the definition of methods that can effectively convey information, become key issues in the design process. In this work, we present MusA (Museum Assistant), a general framework for the development of multimedia interactive guides for mobile devices. Its main feature is a vision-based indoor positioning system that allows the provision of several LBS, from way-finding to the contextualized communication of cultural contents, aimed at providing a meaningful exploration of exhibits according to visitors' personal interests and curiosity. Starting from a thorough description of the system architecture, the article presents the implementation of two mobile guides, developed to address adults and children respectively, and discusses the evaluation of the user experience and the visitors' appreciation of these applications.

    Synergies for Improving Oil Palm Production and Forest Conservation in Floodplain Landscapes

    Lowland tropical forests are increasingly threatened with conversion to oil palm as global demand and high profits drive crop expansion throughout the world’s tropical regions. Yet landscapes are not homogeneous, and regional constraints dictate land suitability for this crop. We conducted a regional study to investigate spatial and economic components of forest conversion to oil palm within a tropical floodplain in the Lower Kinabatangan, Sabah, Malaysian Borneo. The Kinabatangan ecosystem harbours significant biodiversity with globally threatened species, but has suffered forest loss and fragmentation. We mapped the oil palm and forested landscapes (using object-based image analysis, classification and regression tree analysis and on-screen digitising of high-resolution imagery) and undertook economic modelling. Within the study region (520,269 ha), 250,617 ha is cultivated with oil palm, with 77% having high Net Present Value (NPV) estimates (413–637/ha/yr); but 20.5% is under-producing. In fact, 6.3% (15,810 ha) of oil palm is commercially redundant (with negative NPV of −299 to −65/ha/yr) due to palm mortality from flood inundation. These areas would have been important riparian or flooded forest types. Moreover, 30,173 ha of unprotected forest remain, and despite its value for connectivity and biodiversity, 64% is allocated for future oil palm. However, we estimate that at minimum 54% of these forests are unsuitable for this crop due to inundation events. If conversion to oil palm occurs, we predict a further 16,207 ha will become commercially redundant. This means that over 32,000 ha of forest within the floodplain would have been converted for little or no financial gain, yet with significant cost to the ecosystem. Our findings have globally relevant implications for similar floodplain landscapes undergoing forest transformation to agriculture such as oil palm. Understanding landscape-level constraints to this crop, and transferring these into policy and practice, may provide conservation and economic opportunities within these seemingly high opportunity cost landscapes.
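The per-hectare NPV figures in the abstract come from standard discounted-cash-flow arithmetic: future net revenues are discounted back to the present and summed against up-front costs, so a plot whose palms die from flooding (little or no revenue stream) ends up with a negative NPV. A minimal sketch of that calculation, with an illustrative discount rate and cash flows that are assumptions for demonstration only (not the study's data):

```python
def npv(rate, cashflows):
    """Net present value of a series of annual cash flows.

    cashflows[0] occurs at t=0 (undiscounted); later entries are
    discounted by (1 + rate) ** t.
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical plot: an up-front planting cost, then ten years of
# net annual revenue. All figures are illustrative, not the study's.
viable = [-3000] + [650] * 10        # palms survive and produce
flooded = [-3000] + [50] * 10        # palm mortality from inundation

print(npv(0.10, viable) > 0)         # productive land: positive NPV
print(npv(0.10, flooded) < 0)        # flood-prone land: negative NPV
```

With the same planting cost, the flood-prone cash-flow stream never recovers the initial outlay, which is exactly the "commercially redundant" situation the study quantifies.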

    Launching the Grand Challenges for Ocean Conservation

    The ten most pressing Grand Challenges in Oceans Conservation were identified at the Oceans Big Think and described in a detailed working document:
    1. A Blue Revolution for Oceans: Reengineering Aquaculture for Sustainability
    2. Ending and Recovering from Marine Debris
    3. Transparency and Traceability from Sea to Shore: Ending Overfishing
    4. Protecting Critical Ocean Habitats: New Tools for Marine Protection
    5. Engineering Ecological Resilience in Near Shore and Coastal Areas
    6. Reducing the Ecological Footprint of Fishing through Smarter Gear
    7. Arresting the Alien Invasion: Combating Invasive Species
    8. Combatting the Effects of Ocean Acidification
    9. Ending Marine Wildlife Trafficking
    10. Reviving Dead Zones: Combating Ocean Deoxygenation and Nutrient Runoff

    Advanced Knowledge Technologies at the Midterm: Tools and Methods for the Semantic Web

    The University of Edinburgh and research sponsors are authorised to reproduce and distribute reprints and on-line copies for their purposes notwithstanding any copyright annotation hereon. The views and conclusions contained herein are the author’s and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of other parties.

In a celebrated essay on the new electronic media, Marshall McLuhan wrote in 1962: “Our private senses are not closed systems but are endlessly translated into each other in that experience which we call consciousness. Our extended senses, tools, technologies, through the ages, have been closed systems incapable of interplay or collective awareness. Now, in the electric age, the very instantaneous nature of co-existence among our technological instruments has created a crisis quite new in human history. Our extended faculties and senses now constitute a single field of experience which demands that they become collectively conscious. Our technologies, like our private senses, now demand an interplay and ratio that makes rational co-existence possible. As long as our technologies were as slow as the wheel or the alphabet or money, the fact that they were separate, closed systems was socially and psychically supportable. This is not true now when sight and sound and movement are simultaneous and global in extent.” (McLuhan 1962, p. 5, emphasis in original)

Over forty years later, the seamless interplay that McLuhan demanded between our technologies is still barely visible. McLuhan’s predictions of the spread, and increased importance, of electronic media have of course been borne out, and the worlds of business, science and knowledge storage and transfer have been revolutionised. Yet the integration of electronic systems as open systems remains in its infancy.

Advanced Knowledge Technologies (AKT) aims to address this problem: to create a view of knowledge and its management across its lifecycle, and to research and create the services and technologies that such unification will require. Half way through its six-year span, the results are beginning to come through, and this paper will explore some of the services, technologies and methodologies that have been developed. We hope to give a sense in this paper of the potential for the next three years, to discuss the insights and lessons learnt in the first phase of the project, and to articulate the challenges and issues that remain.

The WWW provided the original context that made the AKT approach to knowledge management (KM) possible. When AKT was initially proposed in 1999, it brought together an interdisciplinary consortium with the technological breadth and complementarity to create the conditions for a unified approach to knowledge across its lifecycle. The combination of this expertise, and the time and space afforded the consortium by the IRC structure, suggested the opportunity for a concerted effort to develop an approach to advanced knowledge technologies, based on the WWW as a basic infrastructure.

The technological context of AKT altered for the better in the short period between the development of the proposal and the beginning of the project itself, with the development of the semantic web (SW), which foresaw much more intelligent manipulation and querying of knowledge. The opportunities that the SW provided for, e.g., more intelligent retrieval put AKT in the centre of information technology innovation and knowledge management services; the AKT skill set would clearly be central for the exploitation of those opportunities.

The SW, as an extension of the WWW, provides an interesting set of constraints on the knowledge management services AKT tries to provide. As a medium for the semantically-informed coordination of information, it has suggested a number of ways in which the objectives of AKT can be achieved, most obviously through the provision of knowledge management services delivered over the web, as opposed to the creation and provision of technologies to manage knowledge.

AKT is working on the assumption that many web services will be developed and provided for users. The KM problem in the near future will be one of deciding which services are needed and of coordinating them. Many of these services will be largely or entirely legacies of the WWW, and so the capabilities of the services will vary. As well as providing useful KM services in their own right, AKT will be aiming to exploit this opportunity by reasoning over services, brokering between them, and providing essential meta-services for SW knowledge service management.

Ontologies will be a crucial tool for the SW. The AKT consortium brings a lot of expertise on ontologies together, and ontologies were always going to be a key part of the strategy. All kinds of knowledge sharing and transfer activities will be mediated by ontologies, and ontology management will be an important enabling task. Different applications will need to cope with inconsistent ontologies, or with the problems that will follow the automatic creation of ontologies (e.g. the merging of pre-existing ontologies to create a third). Ontology mapping, and the elimination of conflicts of reference, will be important tasks. All of these issues are discussed along with our proposed technologies.

Similarly, specifications of tasks will be used for the deployment of knowledge services over the SW, but in general it cannot be expected that in the medium term there will be standards for task (or service) specifications. The brokering meta-services that are envisaged will have to deal with this heterogeneity.

The emerging picture of the SW is one of great opportunity, but it will not be a well-ordered, certain or consistent environment. It will comprise many repositories of legacy data, outdated and inconsistent stores, and requirements for common understandings across divergent formalisms. There is clearly a role for standards to play in bringing much of this context together, and AKT is playing a significant role in these efforts. But standards take time to emerge, they take political power to enforce, and they have been known to stifle innovation (in the short term). AKT is keen to understand the balance between principled inference and statistical processing of web content. Logical inference on the Web is tough. Complex queries using traditional AI inference methods bring most distributed computer systems to their knees. Do we set up semantically well-behaved areas of the Web? Is any part of the Web in which semantic hygiene prevails interesting enough to reason in? These and many other questions need to be addressed if we are to provide effective knowledge technologies for our content on the web.
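The ontology-merging problem raised above — detecting "conflicts of reference" when two pre-existing ontologies are combined into a third — can be illustrated with a toy sketch. This is a hypothetical illustration, not AKT's actual tooling: ontologies are reduced to simple term-to-superclass maps, and a conflict is recorded whenever the two sources classify the same term differently.

```python
def merge_ontologies(a, b):
    """Merge two toy ontologies given as {term: superclass} maps.

    Returns (merged, conflicts), where conflicts lists each term the
    two sources classify differently, with both classifications.
    The first source's classification is kept on conflict.
    """
    merged, conflicts = dict(a), []
    for term, parent in b.items():
        if term in merged and merged[term] != parent:
            conflicts.append((term, merged[term], parent))
        else:
            merged[term] = parent
    return merged, conflicts

# Two tiny ontologies that disagree on what a "Professor" is.
onto_a = {"Professor": "Academic", "Academic": "Person"}
onto_b = {"Professor": "Staff", "Student": "Person"}

merged, conflicts = merge_ontologies(onto_a, onto_b)
print(conflicts)   # the disputed term, with both classifications
```

Real ontology mapping works over far richer structures (class hierarchies, properties, axioms), but the core task is the same: align shared terms, detect disagreements, and decide a resolution policy rather than silently overwriting one source with the other.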

    The Jatropha Biofuels Sector in Tanzania 2005-9: Evolution Towards Sustainability?

    Biofuel production has recently attracted a great deal of attention. Some anticipate substantial social and environmental benefits, while at the same time expecting sound profitability for investors. Others are more doubtful, envisaging large trade-offs between the pursuit of social, environmental and economic objectives, particularly in poor countries in the tropics. The paper explores these issues in Tanzania, which is a forerunner in Africa in the cultivation of a bio-oil shrub called Jatropha curcas L. We trace how isolated Jatropha biofuel experiments developed, since their inception in early 2005, towards a fully fledged sectoral production and innovation system, and investigate to what extent that system has been capable of developing and maintaining sustainable practices and producing sustainable outcomes. The application of evolutionary economic theory allows us to view the development processes in the sector as a result of evolutionary variation and selection on the one hand, and revolutionary contestation between different coalitions of stakeholders on the other. Both these processes constitute significant engines of change in the sector. While variation and selection are driven predominantly by localised learning, the conflict-driven dynamics are highly globalised. The sector is found to have moved some way towards a full sectoral innovation and production system, but it is impossible to predict whether a viable sector with a strong “triple bottom line” orientation will ultimately emerge, since many issues surrounding social, environmental and financial sustainability remain unresolved.
    Keywords: biofuels, evolutionary theory, innovation systems, sustainability, stakeholder conflict, learning, Tanzania.

    An informatics model for guiding assembly of telemicrobiology workstations for malaria collaborative diagnostics using commodity products and open-source software

    Background: Deficits in clinical microbiology infrastructure exacerbate global infectious disease burdens. This paper examines how commodity computation, communication, and measurement products combined with open-source analysis and communication applications can be incorporated into laboratory medicine microbiology protocols. Those commodity components are all now sourceable globally. An informatics model is presented for guiding the use of low-cost commodity components and free software in the assembly of clinically useful and usable telemicrobiology workstations.

Methods: The model incorporates two general principles: 1) collaborative diagnostics, where free and open communication and networking applications are used to link distributed collaborators for reciprocal assistance in organizing and interpreting digital diagnostic data; and 2) commodity engineering, which leverages globally available consumer electronics and open-source informatics applications to build generic open systems that measure needed information in ways substantially equivalent to more complex proprietary systems. Routine microscopic examination of Giemsa and fluorescently stained blood smears for diagnosing malaria is used as an example to validate the model.

Results: The model is used as a constraint-based guide for the design, assembly, and testing of a functioning, open, and commoditized telemicroscopy system that supports distributed acquisition, exploration, analysis, interpretation, and reporting of digital microscopy images of stained malarial blood smears, while also supporting remote diagnostic tracking, quality assessment and diagnostic process development.

Conclusion: The open telemicroscopy workstation design and use-process described here can address clinical microbiology infrastructure deficits in an economically sound and sustainable manner. It can boost capacity to deal with comprehensive measurement of disease and care outcomes in individuals and groups in a distributed and collaborative fashion. The workstation enables local control over the creation and use of diagnostic data, while allowing for remote collaborative support of diagnostic data interpretation and tracking. It can enable global pooling of malaria disease information and the development of open, participatory, and adaptable laboratory medicine practices. The informatics model highlights how the larger issue of access to generic commoditized measurement, information processing, and communication technology in both high- and low-income countries can enable diagnostic services that are much less expensive, but substantially equivalent to those currently in use in high-income countries.