Aggregating Impact: A Funder's Guide to Mission Investment Intermediaries
This report provides a guide to mission investment intermediaries: organizations that collect capital from multiple sources and reinvest it in people and enterprises, whether nonprofit or for-profit, that deliver both social impact and financial returns. A growing number of foundations and other funders are beginning to use such intermediaries rather than making mission investments directly, because of the advantages intermediaries can provide: ease of investment, reduced risk, lower transaction costs, specialized expertise, performance reporting, and expanded deal flow. Yet research shows that many funders are unaware of the wide range of mission investment intermediaries available and of the advantages they can offer. The authors provide an overview of mission investment intermediaries and how foundations use them, the benefits and challenges of investing in intermediaries, and an analysis of available intermediaries that address economic development, housing, and the environment.
The role of community partners in urban investments
Institutional investors seeking to deploy capital to underserved areas have neither the time nor the expertise to actively manage these specialized investments. Investment vehicles intervene by using their financial expertise to pool assets and lower transaction costs. Community partners, in turn, link the investment vehicle to the neighborhood. This paper develops a typology of community partners and of the characteristics that enable them to overcome information asymmetries in certain markets. It also discusses the business models that establish the relationship between the investment vehicle and the community partner, highlighting the strengths of the different models for delivering community transformation.
A UML/OCL framework for the analysis of graph transformation rules
In this paper we present an approach for the analysis of graph transformation rules based on an intermediate OCL representation. We translate different rule semantics into OCL, together with the properties of interest (like rule applicability, conflicts or independence). The intermediate representation serves three purposes: (i) it allows the seamless integration of graph transformation rules with the MOF and OCL standards, and enables taking the meta-model and its OCL constraints (i.e. well-formedness rules) into account when verifying the correctness of the rules; (ii) it permits the interoperability of graph transformation concepts with a number of standards-based model-driven development tools; and (iii) it makes available a plethora of OCL tools to actually perform the rule analysis. This approach is especially useful to analyse the operational semantics of Domain-Specific Visual Languages. We have automated these ideas by providing designers with tools for the graphical specification and analysis of graph transformation rules, including a back-annotation mechanism that presents the analysis results in terms of the original language notation.
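To make the "rule applicability" property concrete, here is a minimal sketch of what it means operationally. The paper encodes this property in OCL; the Python below is purely an illustrative stand-in, and the graph encoding (node lists plus directed edge pairs) is an assumption, not the authors' representation. A rule is applicable to a host graph iff its left-hand-side pattern has an injective match:

```python
from itertools import permutations

def applicable(lhs_edges, lhs_nodes, host_edges, host_nodes):
    """Brute-force applicability check: is there an injective mapping of
    LHS nodes into host nodes that sends every LHS edge onto a host edge?
    (Illustrative only; real tools use constraint solving, not enumeration.)"""
    host_set = set(host_edges)
    for image in permutations(host_nodes, len(lhs_nodes)):
        m = dict(zip(lhs_nodes, image))          # candidate injective match
        if all((m[s], m[t]) in host_set for s, t in lhs_edges):
            return True
    return False
```

A meta-model's well-formedness constraints would further restrict which matches count as valid, which is exactly what translating both the rule and the OCL invariants into one representation buys.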
CATS: linearizability and partition tolerance in scalable and self-organizing key-value stores
Distributed key-value stores provide scalable, fault-tolerant, and self-organizing storage services, but fall short of guaranteeing linearizable consistency in partially synchronous, lossy, partitionable, and dynamic networks, when data is distributed and replicated automatically by the principle of consistent hashing. This paper introduces consistent quorums as a solution for achieving atomic consistency. We present the design and implementation of CATS, a distributed key-value store which uses consistent quorums to guarantee linearizability and partition tolerance in such adverse and dynamic network conditions. CATS is scalable, elastic, and self-organizing; these are key properties for modern cloud storage middleware. Our system shows that consistency can be achieved with practical performance and modest throughput overhead (5%) for read-intensive workloads.
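A rough sketch of the two ingredients the abstract combines: consistent hashing to place a key's replicas, and a majority quorum per key. This is not the CATS implementation (which additionally handles node churn, reconfiguration, and partial synchrony); all names and the in-process "RPC" are illustrative assumptions.

```python
import hashlib
from bisect import bisect_right

def ring_hash(s):
    """Place a node or key on the hash ring."""
    return int(hashlib.sha1(s.encode()).hexdigest(), 16)

class Ring:
    def __init__(self, nodes, replicas=3):
        self.replicas = replicas
        self.points = sorted((ring_hash(n), n) for n in nodes)

    def replica_group(self, key):
        """The `replicas` successor nodes of the key's ring position."""
        i = bisect_right(self.points, (ring_hash(key), chr(0x10FFFF)))
        return [self.points[(i + j) % len(self.points)][1]
                for j in range(self.replicas)]

def quorum_write(ring, stores, key, value):
    """Succeed only if a majority of the key's replica group acknowledges."""
    group = ring.replica_group(key)
    acks = 0
    for node in group:
        stores[node][key] = value      # in reality an RPC that may fail
        acks += 1
    return acks >= len(group) // 2 + 1  # majority quorum
```

The hard part the paper addresses is keeping such quorums *consistent* while the replica groups themselves change under churn, which plain consistent hashing does not guarantee.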
An evaluation of e-learning standards
The aim of this investigation is to perform an independent study of the various emerging e-learning standards. This paper presents a summary of these standards in order to make them more accessible and understandable, and provides preliminary evidence as to their utility and adoption by UK higher and further education institutions. There have recently been efforts to define standards for e-learning content and e-learning components, such as IEEE LOM, UK LOM, IMS, SCORM and OKI. Since it was not possible to cover all the standards in detail within the time available, our independent study focuses on eight of them. Although the results of the preliminary study suggest that the eight standards considered may help the interoperability, accessibility and reusability of e-learning content and components, it remains to be seen how many of them are actually followed at UK higher education institutions.
Tolerance analysis approach based on the classification of uncertainty (aleatory / epistemic)
Uncertainty is ubiquitous in tolerance analysis problems. This paper deals with the formulation of tolerance analysis and, more particularly, with the uncertainty that must be taken into account in the foundation of this formulation. It presents: a brief view of the classification of uncertainty (aleatory uncertainty comes from the inherently random nature of phenomena, while epistemic uncertainty comes from a lack of knowledge); a formulation of the tolerance analysis problem based on this classification; and its development: aleatory uncertainty is modeled by probability distributions and treated by Monte Carlo simulation, while epistemic uncertainty is modeled by intervals and treated by nonlinear optimization. "AHTOLA" project (ANR-11-MONU-013).
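The two-level treatment the abstract describes can be sketched as a double loop: sample the aleatory variables by Monte Carlo, and bound the result over the epistemic intervals. Everything concrete below (the 1-D stack-up, the distribution, the interval widths) is invented for illustration, and enumerating interval corners stands in for the paper's nonlinear optimization.

```python
import random
from itertools import product

def gap(a, b, c):
    """Assembly response: clearance of a simple 1-D stack-up (illustrative)."""
    return a - b - c

def conformance_bounds(n=20000, seed=0):
    """Bounds on P(gap > 0): aleatory dimension A is sampled (Monte Carlo),
    epistemic dimensions B and C are only known to lie in intervals."""
    rng = random.Random(seed)
    b_interval = (4.95, 5.05)           # epistemic: interval, no distribution
    c_interval = (2.97, 3.03)
    probs = []
    for b, c in product(b_interval, c_interval):   # enumerate interval corners
        ok = sum(gap(rng.gauss(8.1, 0.02), b, c) > 0   # aleatory: A ~ Normal
                 for _ in range(n))
        probs.append(ok / n)
    return min(probs), max(probs)       # interval of conformance probabilities
```

The output is not a single probability but an interval of probabilities, which is precisely what distinguishes the mixed aleatory/epistemic formulation from a purely probabilistic one.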
Natural Resources, Development Models and Sustainable Development
This paper starts out from the optimistic assumption that the basic policies for environmental economic development are known, but that uncertainties surround the speed of their adoption. In many developing countries the key obstacle is poor governance: consequently, renewable resources continue to be mined, non-renewable resources are depleted irresponsibly, and reductions in pollution intensity lag. Recent research identifies resource abundance as an important cause of policy failure. This is because the primary sector remains large in relation to GDP, so that differences in the scale of natural resource rents (and in their socio-economic linkages) condition macro policy in important ways. Most developing countries are resource-rich, a condition that engenders predatory political states that deploy resource rents in ways that cumulatively distort the economy so that it falls into a staple trap, which undermines economic growth and environmentally sustainable policies. Sound macroeconomic policy is critical to the success of microeconomic measures like much of environmental policy, a fact often neglected by environmental reformers. There are two implications of this. First, in the long term, improved governance will enhance environmentally sustainable management of: renewable resources (by taking account of the total economic value of resources); finite resources (guided by the need to maintain genuine saving); and the global pollution sinks (by flattening the environmental Kuznets curve). Second, until such improvements occur, environmental policies are likely to underperform unless they are adapted to take account of flawed macro policies. Environmental reformers therefore need to support efforts by the international financial institutions to improve macroeconomic management.