23 research outputs found

    Participatory development : myths and dilemmas

    The recent evolution of development thinking has highlighted popular involvement in decision making. Yet policy gridlock and stop-and-go implementation have been associated with excessive responsiveness to interest groups. This paper aims to pull together seemingly disparate strands of development thinking and experience. After debunking some popular myths, the development antecedents of participation are identified and a definition of participation is offered. Next, a stylized theory is presented at the micro level. Some implications are then drawn for organizational design and for development policy planning. The focus on participatory development signifies an opening of development economics to disciplines other than macroeconomics. In particular, microeconomics and business administration must join forces under the umbrella of institutional economics and political economy, and development practice should be shaped by all the social science disciplines.

    Food Aid: A Cause, or Symptom of Development Failure or an Instrument for Success?


    Applications integration for manufacturing control systems with particular reference to software interoperability issues

    The introduction and adoption of contemporary computer aided manufacturing control systems (MCS) can help rationalise and improve the productivity of manufacturing related activities. Such activities include product design, process planning and production management with CAD, CAPP and CAPM. However, these systems tend to be domain specific and have generally been designed as stand-alone systems, with little consideration for integration with other manufacturing activities outside the area of immediate concern. As a result, "islands of computerisation" exist which exhibit deficiencies and constraints that inhibit or complicate interoperation among typical MCS components. Because of these interoperability constraints, contemporary forms of MCS typically yield sub-optimal benefits and do not promote synergy on an enterprise-wide basis.

    The move towards more integrated manufacturing systems, which requires advances in software interoperability, is becoming a strategic issue. Here the primary aim is to realise greater functional synergy between software components which span engineering, production and management activities and systems; hence information of global interest needs to be shared across conventional functional boundaries between enterprise functions. The main thrust of this research study is to derive a new generation of MCS in which software components can "functionally interact" and share common information through accessing distributed data repositories in an efficient, highly flexible and standardised manner. It addresses problems of information fragmentation and the lack of formalism, as well as issues relating to flexibly structuring interactions between threads of functionality embedded within the various components. The emphasis is on the:

    • definition of generic information models which underpin the sharing of common data among production planning, product design, finite capacity scheduling and cell control systems;
    • development of an effective framework to manage functional interaction between MCS components, thereby coordinating their combined activities;
    • "soft" or flexible integration of MCS activities over an integrating infrastructure in order to (i) help simplify typical integration problems found when using contemporary interconnection methods for applications integration, and (ii) enable their reconfiguration and incremental development.

    In order to facilitate adaptability in response to changing needs, these systems must also be engineered to enable reconfigurability over their life cycle. Thus within the scope of this research study a new methodology and software toolset have been developed to formally structure and support implementation, run-time and change processes. The toolset combines the use of IDEF0 (for activity based or functional modelling), IDEF1X (for entity-attribute relationship modelling), and EXPRESS (for information modelling). The research also includes a pragmatic but effective means of dealing with legacy software, which may be a vital source of readily available information supporting the operation of the manufacturing enterprise. The pragmatism and medium term relevance of the study have attracted particular interest and collaboration from software manufacturers and industrial practitioners. Proof of concept studies have been carried out to implement and evaluate the developed mechanisms and software toolset.
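
    The core architectural idea, components publishing to and reading from a shared, neutral information model instead of exchanging tool-specific files, can be sketched in a few lines. The sketch below is illustrative only; it is not the thesis toolset, and all names such as PartRecord and SharedRepository are invented for the example.

```python
# Minimal sketch of a shared, neutral information model: all names here
# (PartRecord, SharedRepository) are invented for illustration and are not
# part of the thesis toolset.
from dataclasses import dataclass, field


@dataclass
class PartRecord:
    """Neutral product definition shared by design, planning and control."""
    part_id: str
    revision: int
    attributes: dict = field(default_factory=dict)


class SharedRepository:
    """Toy stand-in for a distributed data repository."""

    def __init__(self) -> None:
        self._store: dict[str, PartRecord] = {}

    def publish(self, record: PartRecord) -> None:
        # A later revision supersedes an earlier one for the same part.
        current = self._store.get(record.part_id)
        if current is None or record.revision > current.revision:
            self._store[record.part_id] = record

    def fetch(self, part_id: str) -> PartRecord:
        return self._store[part_id]


# A design (CAD) component publishes; a scheduling component reads the same
# neutral record instead of parsing a tool-specific file format.
repo = SharedRepository()
repo.publish(PartRecord("P-100", 1, {"material": "Al-6061"}))
print(repo.fetch("P-100").attributes["material"])  # -> Al-6061
```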

    Maintaining the Status Quo: Protecting established water uses in the Pacific Northwest, despite the rules of prior appropriation

    Water law in the Northwest states has long been based on the well-established rules of the Prior Appropriation Doctrine. In recent years, however, the four Northwest states often have not applied these rules against existing water users. State legislatures, courts, and water resource agencies have routinely changed the rules, or refused to implement them, if doing so might curtail current uses. This Article examines the ways in which the Northwest states have maintained the water use status quo despite the traditional rules. The Article then evaluates the economic and environmental implications of state efforts to protect existing water uses, and assesses how these efforts may affect other water users.

    Towards formalisation of situation-specific computations in pervasive computing environments

    We have categorised the characteristics and the content of pervasive computing environments (PCEs), and demonstrated why a non-dynamic approach to knowledge conceptualisation in PCEs does not fulfil the expectations we may have of them. Consequently, we have proposed a formalised computational model, the FCM, for knowledge representation and reasoning in PCEs, which secures the delivery of situation- and domain-specific services to their users. The proposed model is a user-centric model, materialised as a software engineering solution, which uses the computations generated from the FCM and stores them within software architectural components, which in turn can be deployed using modern software technologies. The model has also been inspired by the Semantic Web (SW) vision and the provision of SW technologies. The FCM therefore creates a semantically rich situation-specific PCE based on SWRL-enabled OWL ontologies, which allows reasoning about the situation in a PCE and delivers situation-specific services. The proposed FCM model is illustrated through the example of remote patient monitoring in the healthcare domain. Numerous software applications generated from the FCM have been deployed using Integrated Development Environments and the OWL-API.
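
    As a rough illustration of the load-reason-query cycle such a model relies on, the sketch below uses the Python owlready2 library rather than the Java OWL-API named in the abstract; the ontology file and the AtRiskPatient class are hypothetical placeholders.

```python
# Sketch of the load-reason-query cycle, using owlready2 (not the Java
# OWL-API used in the thesis). The ontology file and the AtRiskPatient
# class are hypothetical placeholders.
from owlready2 import get_ontology, sync_reasoner

# Load a (hypothetical) SWRL-enabled patient-monitoring ontology.
onto = get_ontology("file://patient_monitoring.owl").load()

with onto:
    # Run the reasoner; SWRL rules stored in the ontology are applied, so
    # individuals whose sensor readings match a rule can be reclassified
    # into situation-specific classes such as AtRiskPatient.
    sync_reasoner()

# Query the inferred situation and deliver the matching service.
for patient in onto.search(type=onto.AtRiskPatient):
    print("notify care team for:", patient.name)
```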

    Board of Directors Meeting Minutes (November 10, 2008)

    This file contains the minutes from the Des Moines Area Community College Board meeting held on November 10, 2008.

    End-to-End Trust Fulfillment of Big Data Workflow Provisioning over Competing Clouds

    Cloud Computing has emerged as a promising and powerful paradigm for delivering data-intensive, high-performance computation, applications and services over the Internet. Cloud Computing has enabled the implementation and success of Big Data, a relatively recent phenomenon consisting of the generation and analysis of abundant data from various sources. Accordingly, to satisfy the growing demands of Big Data storage, processing, and analytics, a large market has emerged for Cloud Service Providers, offering a myriad of resources, platforms, and infrastructures. The proliferation of these services often makes it difficult for consumers to select the most suitable and trustworthy provider to fulfill the requirements of building complex workflows and applications in a relatively short time. In this thesis, we first propose a quality specification model to support dual pre- and post-cloud workflow provisioning, consisting of service provider selection and workflow quality enforcement and adaptation. This model captures key properties of the quality of work at different stages of the Big Data value chain, enabling standardized quality specification, monitoring, and adaptation. Subsequently, we propose a two-dimensional trust-enabled framework to facilitate end-to-end Quality of Service (QoS) enforcement that: 1) automates cloud service provider selection for Big Data workflow processing, and 2) maintains the required QoS levels of Big Data workflows during runtime through dynamic orchestration using multi-model architecture-driven workflow monitoring, prediction, and adaptation. The trust-based automatic service provider selection scheme we propose in this thesis is comprehensive and adaptive, as it relies on a dynamic trust model to evaluate the QoS of a cloud provider before taking any selection decisions. It is a multi-dimensional trust model for Big Data workflows over competing clouds that assesses the trustworthiness of cloud providers based on three trust levels: (1) presence of the most up-to-date, verified cloud resource capabilities; (2) reputational evidence measured by neighboring users; and (3) a recorded personal history of experiences with the cloud provider. The trust-based workflow orchestration scheme we propose aims to avoid performance degradation or cloud service interruption. Our workflow orchestration approach is based not only on automatic adaptation and reconfiguration supported by monitoring, but also on predicting cloud resource shortages, thus preventing performance degradation. We formalize the cloud resource orchestration process using a state machine that efficiently captures different dynamic properties of the cloud execution environment. In addition, we use a model checker to validate our monitoring model in terms of reachability, liveness, and safety properties. We evaluate both our automated service provider selection scheme and our cloud workflow orchestration, monitoring and adaptation schemes on a workflow-enabled Big Data application. A set of scenarios was carefully chosen to evaluate the performance of the service provider selection, workflow monitoring and adaptation schemes we have implemented. The results demonstrate that our service selection outperforms other selection strategies and ensures trustworthy service provider selection. The results of evaluating automated workflow orchestration further show that our model is self-adapting and self-configuring, reacts efficiently to changes, and adapts accordingly while enforcing the QoS of workflows.
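
    The three trust levels lend themselves to a simple weighted-aggregation illustration. The sketch below is a deliberately minimal stand-in for the thesis's dynamic, multi-dimensional model; the weights and provider scores are hypothetical.

```python
# Deliberately minimal stand-in for the thesis's dynamic trust model:
# the three trust levels are combined as a weighted sum. Weights and
# provider scores below are hypothetical.
def trust_score(capabilities: float, reputation: float, history: float,
                weights: tuple = (0.4, 0.3, 0.3)) -> float:
    """Each input is a normalised score in [0, 1]:
    capabilities -- verified, up-to-date cloud resource capabilities
    reputation   -- evidence measured by neighbouring users
    history      -- recorded personal experience with this provider
    """
    w_cap, w_rep, w_hist = weights
    return w_cap * capabilities + w_rep * reputation + w_hist * history


# Rank competing providers and select the most trustworthy one.
providers = {"cloudA": (0.9, 0.7, 0.8), "cloudB": (0.6, 0.9, 0.9)}
best = max(providers, key=lambda p: trust_score(*providers[p]))
print(best, round(trust_score(*providers[best]), 3))  # cloudA 0.81
```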

    Fast GO/PO RCS calculation: A GO/PO parallel algorithm implemented on GPU and accelerated using a BVH data structure and the Type 3 Non-Uniform FFT

    The purpose of this PhD research was to develop and optimize a fast numerical algorithm able to compute monostatic and bistatic RCS predictions with an accuracy comparable to what is commercially available from well-known electromagnetic CAD tools, but requiring unprecedentedly short computational times. This was realized by employing asymptotic approximate methods to solve the scattering problem, namely the Geometrical Optics (GO) and Physical Optics (PO) theories, and by exploiting advanced algorithmic concepts and cutting-edge computing technology to drastically speed up the computation. The First Chapter focuses on a historical and operational overview of the concept of Radar Cross Section (RCS), with specific reference to aeronautical and maritime platforms. How geometries and materials influence RCS is also described. The Second Chapter is dedicated to the first phase of the algorithm: the electromagnetic field transport phase, where the GO theory is applied to implement the "ray tracing". This Chapter describes the first advanced algorithmic concept adopted: the Bounding Volume Hierarchy (BVH) data structure. Two different BVH approaches and their combination are described and compared. The Third Chapter is dedicated to the second phase of the calculation: the radiation integral, based on the PO theory, and its numerical optimization. First, the Type-3 Non-Uniform Fast Fourier Transform (NUFFT) is presented as the second advanced algorithmic tool used; it is the foundation of the calculation of the radiation integral. Then, to improve performance and to make the approach feasible for electrically large objects, the NUFFT was further optimized using a "pruning" technique, a stratagem that saves memory and computational time by avoiding the calculation of points of the transformed domain that are not of interest. To validate the algorithm, a preliminary measurement campaign was held at the headquarters of the Ingegneria Dei Sistemi (IDS) company, located in Pisa. The measurements, performed on canonical scatterers using a Synthetic Aperture Radar (SAR) imaging system set up on a planar scanner inside a semi-anechoic chamber, are discussed.
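
    For readers unfamiliar with the Type-3 NUFFT, the quantity it accelerates is a Fourier sum between nonuniform points in both domains. The sketch below evaluates that sum directly with NumPy in 1-D; this direct sum is the reference answer that a fast (and optionally pruned) implementation, such as FINUFFT's nufft1d3, must reproduce to within a set tolerance. Sizes and point distributions are arbitrary.

```python
# What the Type-3 NUFFT computes: f_k = sum_j c_j * exp(-1j * s_k * x_j)
# for nonuniform sources x_j and nonuniform targets s_k. The direct double
# sum below is O(M*N) and serves as the reference a fast (and optionally
# "pruned") NUFFT must match; sizes and point distributions are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
M, N = 200, 300
x = rng.uniform(-np.pi, np.pi, M)                  # nonuniform source points
c = rng.normal(size=M) + 1j * rng.normal(size=M)   # source strengths
s = rng.uniform(-50.0, 50.0, N)                    # nonuniform target freqs

# Direct evaluation of the Type-3 sum (1-D for brevity).
f_direct = np.exp(-1j * np.outer(s, x)) @ c
print(f_direct.shape)  # (300,)
```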

    Religion, social structure and economic development in Yi dynasty Korea.

    Thesis (Ph.D.)--Boston University. The purpose of this dissertation is to investigate the key variables related to the economy of pre-modern Korean society. Similar studies have been made on China and Japan, but no comparable study has been made on Korea. Our hypothesis is that pre-modern Korea failed to develop the social and cultural bases for a modern industrial society largely because of its weak polity and its lack of favorable religious elements, and because of the formidable resistance to change of the Korean family and class system, with their stress upon their own integration and solidarity. [TRUNCATED] Korean religion reinforced the primacy of the integrative value through the yangban class. It reinforced the particularistic relations and the family, and emphasized less the universalistic relations of the state. Conversely, religion failed to help the efforts of the chungin to emphasize political-economic values. These are among the essential factors responsible for the failure of Yi dynasty Korea to develop the bases of a modern industrial society.

    Studies on quality assurance in haemocytometry

    The objectives of this thesis are:
        a. to review the state of the art, the concepts, the problems and the perspectives of comprehensive QA in haemocytometry (Chapters II and III);
        b. to study the basic principles and problems of cell counting and sizing and the recent progress in approaching these problems, and to describe the fluidic and electronic improvements in modern instruments, which enable the simultaneous measurement of wbc, rbc and plt and their characteristics, with special reference to the Coulter Counter Model S Plus II (Ch. III);
        c. to develop IQC materials of medium- and/or long-term stability, with special emphasis on ease of preparation and low cost, thus contributing to the optimization of IQC (Ch. IV).
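
    As a hypothetical illustration of how such IQC materials are used day to day (not taken from the thesis), each control measurement can be checked against Levey-Jennings style limits derived from the material's assigned mean and SD:

```python
# Hypothetical illustration (not from the thesis): checking one control
# measurement against Levey-Jennings style limits derived from the IQC
# material's assigned mean and SD.
def iqc_flag(value: float, mean: float, sd: float) -> str:
    """Classify a control result against +/-2SD and +/-3SD limits."""
    z = (value - mean) / sd
    if abs(z) > 3:
        return "reject"    # beyond 3SD: reject the run
    if abs(z) > 2:
        return "warning"   # beyond 2SD: inspect before accepting
    return "accept"


# Example: a stabilised control with an assigned WBC target (x10^9/L).
mean, sd = 7.0, 0.2
for reading in (7.1, 7.5, 6.3):
    print(reading, iqc_flag(reading, mean, sd))  # accept, warning, reject
```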