5 research outputs found

    "Mapping" Nonprofit Infrastructure Organizations in Texas

    The stability of the nonprofit sector and its ability to meet our nation's needs in an era of unprecedented challenges requires a solid nonprofit infrastructure (Brown et al., 2008). The organizations that comprise this infrastructure system work behind the scenes to provide nonprofit organizations with capacity-building support. However, little is known about the actual infrastructure system, especially at the state and local levels. To better understand this system, student researchers from the Bush School of Government and Public Service at Texas A&M University were asked to replicate Dr. David O. Renz's 2008 study, "The U.S. Nonprofit Infrastructure Mapped." The Bush School study focused specifically on the nonprofit infrastructure in Texas by categorizing and mapping selected nonprofit organizations using the 11 roles and functions identified by Renz (2008). This report provides a literature review of nonprofit capacity building and organizational infrastructure. In addition, the data collection and classification using Renz's 11 roles and functions are detailed and the mapping methodology is described. Finally, the researchers offer findings, questions to consider, and recommendations for further research.

    Findings from this study include:

    o Urban areas had the largest concentration of infrastructure organizations. Of the 389 nonprofit infrastructure organizations, the largest concentrations were located near Dallas, Houston, Austin, and San Antonio. Several non-metropolitan regions of the state lack similar concentrations, even after accounting for the size of the nonprofit sector or the general population in those regions.

    o Many organizations performed multiple roles and functions. In one case, a single organization performed 10 of the 11 functions; many other organizations studied performed more than one function.

    o A large number of infrastructure organizations provide financial support to nonprofits. More than half of the organizations analyzed were categorized under Renz's Function Three (Financial Intermediaries) because they facilitated the collection and distribution of financial resources to nonprofit organizations. Additionally, 40.4% of the organizations were categorized under Renz's Function Four (Funding Organizations) because they provided financial resources to nonprofit operating organizations through the distribution of funds from asset pools that they own, manage, and allocate. Future research is needed, however, to determine what proportion of funding is devoted to the other nine Renz categories versus funding for nonprofits providing direct services. It would be useful to consider and respond to categories lacking such funding, relative to the infrastructure needs of Texas nonprofits generally and in particular regions of Texas or nonprofit subfields.

    o Some infrastructure functions were less apparent. Researchers found that two of Renz's functions (Function One, Accountability and Regulation, and Function Ten, Research) were performed by fewer than 5% of the organizations analyzed.

    Recommendations that emerged from this study were:

    o Regular updates of nonprofit information are important for future research. Nonprofit managers need to be educated about the importance of updating their organization's publicly available information. If their website or GuideStar reports are not current, researchers, practitioners, and other constituents cannot accurately analyze the organization.

    o Nonprofits need to clarify their roles using Renz's 11 roles and functions. Organizations with a mission to support the nonprofit sector should clarify their focus based on the definitions of capacity building and infrastructure developed by Renz (2008). Do these organizations intend to support the entire nonprofit infrastructure in Texas, or only Function Nine (Capacity Development and Technical Assistance)?

    o Strengthen associations of nonprofit infrastructure organizations throughout Texas. This will benefit nonprofit organizations through improved communication among infrastructure organizations, as well as economies of scale and scope.

    o Facilitate the creation of a network of representatives from each Council of Governments (COG). Such a network can serve as a point of contact for matters concerning the nonprofit infrastructure of that COG.

    The study was prepared for OneStar Foundation, with additional funding from The Meadows Foundation.

    History of concepts: comparative perspectives

    Although vastly influential in German-speaking Europe, conceptual history (Begriffsgeschichte) has until now received little attention in English. This genre of intellectual history differs from both the French history of mentalités and the Anglophone history of discourses by positing the concept - the key occupier of significant syntactical space - as the object of historical investigation. Contributions by distinguished practitioners and critics of conceptual history from Europe and America illustrate both the distinctiveness and diversity of the genre. The first part of the book is devoted to the origins and identity of the field, as well as methodological issues. Part two presents exemplary studies focusing either on a particular concept (such as Maurizio Viroli's 'Reason of the State') or on a particular approach to conceptual history (e.g. Bernard Scholz for literary criticism and Terence Ball for political science). The final, most innovative section of the book looks at concepts and art - high, bourgeois and demotic. Here Bram Kempers discusses the conceptual history of Raphael's frescos in the Stanza della Segnatura of the Vatican; Eddy de Jongh examines the linguistic character of much Dutch genre painting; and Rolf Reichardt considers the conceptual structure implicit in card games of the French Revolution, used to induct those on the margins of literacy into the new revolutionary world-view.

    Ontological approach for database integration

    Database integration is a research area that has gained considerable attention from researchers. Its goal is to represent data from different database sources in one unified form. Database integration faces two obstacles: the distribution of data and its heterogeneity. The Web addresses the distribution problem; for heterogeneity, several approaches can be used to solve the database integration problem, such as data warehouses and federated databases. The problem with these two approaches is their lack of semantics, so our approach exploits Semantic Web methodology. The hybrid ontology method can be applied to the database integration problem. In this method two elements are available, the source (database) and the domain ontology, but the local ontology is missing; for the method to succeed, the local ontologies must be produced. Our approach obtains semantics from the logical model of the database to generate a local ontology, and then acquires validation and enrichment from the semantics of the database's conceptual model. The approach therefore operates in two phases: generation and validation-enrichment.

    In the generation phase, we use reverse engineering techniques to capture the semantics hidden in the SQL definitions and reproduce the logical model of the database. Our transformation system is then applied to generate an ontology, producing all the concepts: classes, relationships, and axioms. The class-creation process comprises many rules that work together to produce classes. Our rules solve problems such as fragmentation and hierarchy, eliminate the superfluous classes arising from multi-valued attribute relations, and handle neglected cases such as relationships with additional attributes; the final class-creation rule covers generic relation cases. The rules for relationships between concepts eliminate relationships between integrated concepts. Finally, many rules transform relationship and attribute constraints into axioms in the ontological model. The formal rules of our approach are domain independent, and they produce a generic ontology that is not restricted to a specific ontology language. The rules also account for the gap between the database model and the ontological model; some database constructs therefore have no equivalent in the ontological model.

    The second phase consists of validation and enrichment. The best way to validate the transformation result is to use the semantics obtained from the conceptual model of the database. In the validation phase, the domain expert captures missing or superfluous concepts (classes or relationships). In the enrichment phase, generalisation can be applied to classes that share common attributes, and complex or composite attributes can be represented as classes. We implemented the transformation system in a tool called SQL2OWL to demonstrate the correctness and functionality of our approach. The evaluation showed the success of the proposed approach and used several techniques: first, a comparative study between the results produced by our approach and those of similar approaches; second, a weighting score system that specifies the criteria affecting the transformation system; and finally, a score scheme. We assess the quality of the transformation system by applying a compliance measure to show the strength of our approach relative to existing approaches. Finally, the measures of success our approach considers are system scalability and completeness.

    Building Publics: The Early History of the New York Shakespeare Festival

    This dissertation explores the New York Shakespeare Festival/Public Theater's earliest history, with a special focus on the company's evolving use of the rhetoric and concept of "public." As founder Joseph Papp noted early in the theater's history, the company struggled to function as a "private organization engaged in public work." To mitigate the challenges of this struggle, the company pursued potential audiences and publics for its theatrical and cultural offerings in a variety of spaces on the cityscape, from Central Park to neighborhood parks and common spaces to a 19th-century historic landmark. In documenting and exploring the festival's development and perambulations, this dissertation suggests that the festival's position as both a private and public-minded organization presented as many opportunities as it did challenges. In this way, company rhetoric surrounding "public-ness" emerged as a powerful strategy for the company's survival and growth, embodied most apparently by its current moniker, The Public Theater.

    Dynamic exploitation of production data to improve DFM methods in the microelectronics industry

    [Translated from French] Design for Manufacturing (DFM) is now a standard method for ensuring, at design time, the feasibility, quality, and yield of production simultaneously. In the microelectronics industry, the Design Rule Manual (DRM) worked well up to the 250nm technology node by capturing systematic variations in rules and/or models based on root cause analysis, but beyond that node its limits were reached because of the inability to capture correlations between spatial variations. Moreover, the rapid evolution of products and technologies requires dynamic updating of the DRM based on improvements found in the fabs. In this context, the contributions of this thesis are (i) an interdisciplinary definition of FMEA and risk analysis to address the challenges of dynamic DFM, (ii) a mapping and alignment model (MAM) for the spatial localisation of test data, (iii) a data referential based on an ontology, ROMMII (referential ontology meta-model for information integration), to map heterogeneous data from varied sources, and (iv) a spatial positioning model (SPM) that integrates spatial factors into microelectronics DFM methods, enabling accurate analysis and modelling of spatial variations through the dynamic exploitation of very large volumes of manufacturing data.

    DFM (design for manufacturing) methods are used during technology alignment and adoption processes in the semiconductor industry (SI) for manufacturability and yield assessments. These methods worked well until the 250nm technology node for the transformation of systematic variations into rules and/or models based on single-source data analyses, but beyond this node they have turned into ineffective R&D efforts. The reason is our inability to capture newly emerging spatial variations, which has led to an exponential increase in technology lead times and costs that must be addressed; hence, the objective of this thesis is to identify and remove the causes of DFM ineffectiveness. The fabless, foundry, and traditional integrated device manufacturer (IDM) business models are first analyzed for coherence against a recent shift in business objectives from time-to-market (T2M) and time-to-volume (T2V) towards ramp-up rate. Increasing technology lead times and costs are identified as a major challenge in achieving quick ramp-up rates; hence, an extended IDM (e-IDM) business model is proposed to support quick ramp-up rates, based on remedying the DFM ineffectiveness followed by its smooth integration.

    We found (i) single-source analyses and (ii) the inability to exploit huge manufacturing data volumes to be the core limiting factors (failure modes) behind DFM ineffectiveness during technology alignment and adoption efforts within an IDM. The causes of single-source root cause analysis are identified as (i) varying metrology reference frames and (ii) test structure orientations that require wafer rotation prior to measurement, resulting in varying metrology coordinates (die/site level mismatches). A generic coordinates mapping and alignment model (MAM) is proposed to remove these die/site level mismatches; to accurately capture the emerging spatial variations, we further propose a spatial positioning model (SPM) that performs multi-source parametric correlation based on the shortest distance between the respective test structures used to measure the parameters. The (i) unstructured model evolution, (ii) ontology issues, and (iii) missing links among production databases are found to cause our inability to exploit huge manufacturing data volumes.

    The ROMMII (referential ontology meta-model for information integration) framework is then proposed to remove these issues and enable dynamic and efficient multi-source root cause analyses. An interdisciplinary failure mode effect analysis (i-FMEA) methodology is also proposed to find cyclic failure modes and causes across business functions, which require generic solutions rather than operational fixes. Together, the proposed e-IDM, MAM, SPM, and ROMMII framework enable accurate analysis and modeling of emerging spatial variations based on dynamic exploitation of huge manufacturing data volumes.
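The MAM/SPM idea described above can be illustrated with a small, hedged sketch (the function names, rotation convention, and data are illustrative assumptions, not the thesis's actual models): sites measured on a rotated wafer are mapped back to a common reference frame, and each test structure for one parameter is then paired with the nearest structure for another parameter so their values can be correlated.

```python
# Hedged sketch of coordinate mapping + shortest-distance pairing.
# Rotation convention and names are illustrative, not the thesis's MAM/SPM.
import math

def unrotate_90(site):
    """Map an (x, y) site recorded on a 90-degree-rotated wafer back to the
    reference frame: (x, y) -> (y, -x)."""
    x, y = site
    return (y, -x)

def pair_nearest(sites_a, sites_b):
    """Pair each site measuring parameter A with the nearest site measuring
    parameter B (shortest Euclidean distance), as in the multi-source
    correlation step."""
    pairs = []
    for a in sites_a:
        b = min(sites_b, key=lambda s: math.dist(a, s))
        pairs.append((a, b))
    return pairs

# Parameter A measured in the reference frame; B recorded on a rotated wafer.
a_sites = [(0, 0), (10, 0)]
b_rotated = [(0, 1), (1, -10)]                  # coordinates as recorded
b_sites = [unrotate_90(s) for s in b_rotated]   # mapped to reference frame
print(pair_nearest(a_sites, b_sites))
```

In practice die/site mismatches involve more than a single rotation (offsets, reference-frame conventions per metrology tool), which is why the thesis proposes a generic mapping and alignment model rather than a fixed transform like this one.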