    Review on Quality Models for Open Source Software and its reflection on Social Coding

    Social Coding Sites (SCSs) are social media services for sharing software development projects on the Web, and many open source projects are currently developed on them. Assessing quality is crucial for selecting the project that best serves a given set of requirements or needs. In this paper, we review traditional quality models that predate open source software as well as dedicated open source quality models, and we evaluate the selected models by how well they reflect the social coding project success factors of sociality, popularity, activity, and supportability. Eight models were included in our review; we selected only models that define explicit, measurable metrics rather than a process or a generic methodology. Our main finding is that the existing models do not fully cover social factors in open source software evaluation, so a model is needed that measures the maturity and quality of open source projects from a social perspective. We also applied the existing models to an open source project hosted on the social coding site GitHub to assess each model's applicability; some measurements from the existing models turned out not to be applicable.
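
    The abstract stops short of a measurement procedure, but a minimal sketch of how the raw signals behind those four success factors could be collected for a GitHub-hosted project via the public GitHub REST API might look like the following; the example repository, the chosen API fields, and their mapping to sociality, popularity, activity, and supportability are illustrative assumptions of this sketch, not the authors' metrics.

        # Minimal sketch (assumptions noted above): pull raw signals for one
        # repository from the public GitHub REST API. Unauthenticated calls
        # are rate-limited; pass an auth token for real use.
        import requests

        def social_coding_signals(owner: str, repo: str) -> dict:
            base = f"https://api.github.com/repos/{owner}/{repo}"
            info = requests.get(base, timeout=10).json()
            contributors = requests.get(f"{base}/contributors?per_page=100", timeout=10).json()
            return {
                # popularity: how widely the project is starred and forked
                "stars": info.get("stargazers_count", 0),
                "forks": info.get("forks_count", 0),
                # sociality: size of the contributing community (first page only)
                "contributors": len(contributors) if isinstance(contributors, list) else 0,
                # activity: recency of development work
                "last_push": info.get("pushed_at"),
                # supportability: open issue load as a rough proxy
                "open_issues": info.get("open_issues_count", 0),
            }

        print(social_coding_signals("psf", "requests"))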

    Evaluating Innovation

    In their pursuit of the public good, foundations face two competing forces -- the pressure to do something new and the pressure to do something proven. The epigraph to this paper, "Give me something new and prove that it works," is my own summary of what foundations often seek. These pressures come from within the foundations -- their staff or boards demand them, not the public. The aspiration to fund things that work can be traced to the desire to be careful, effective stewards of resources, while foundations' recognition of the growing complexity of our shared challenges drives the increased emphasis on innovation. Issues such as climate change, political corruption, and digital learning and work environments have enticed new players into the social problem-solving sphere and have convinced more funders of the need to find new solutions. The seemingly mutually exclusive desires to do something new and to do something proven are not new, but as foundations have grown in number and size, the visibility of the paradox has risen accordingly. Even as foundations seek to fund innovation, they are also seeking measurements of those investments' success. Many people's first response to the challenge of measuring innovation is to declare the intention oxymoronic: innovation is by definition amorphous, full of unintended consequences, and a creative, unpredictable process -- much like art -- while measurement, assessment, and evaluation are, by most definitions, about quantifying activities and products. There is always the danger of counting what you can count, even if what you can count doesn't matter. For all our awareness of the inherent irony of trying to measure something that we intend to be unpredictable, many foundations (and others) continue to try to evaluate their innovation efforts. They are, as Frances Westley, Brenda Zimmerman, and Michael Quinn Patton put it in "Getting to Maybe," grappling with "...intentionality and complexity -- (which) meet in tension." It is important to see these struggles to measure for what they are -- attempts to evaluate the success of the process of innovation, not necessarily the success of the individual innovations themselves. This is not a semantic difference: what foundations are trying to understand is how to go about funding innovation so that more of it can happen. The examples in this report were chosen because they offer a look at innovation within the broader scope of a foundation's work. This paper is the fifth in a series focused on field building, and in this context I am interested in where evaluation fits within an innovation strategy and where these strategies fit within a foundation's broader funding goals. I present a typology of innovation drawn from the OECD that can be useful in other areas, lay the decisions about evaluation made by Knight, MacArthur, and the Jewish New Media Innovation Funders against their programmatic goals, and, finally, consider how evaluating innovation may improve our overall use of evaluation methods in philanthropy.

    Assessing technical candidates on the social web

    This is the pre-print version of the article (Copyright 2012 IEEE). The Social Web provides comprehensive and publicly available information about software developers: they can be identified as contributors to open source projects, as experts at maintaining weak ties on social network sites, or as active participants in knowledge-sharing sites. These signals, when aggregated and summarized, could be used to define individual profiles of potential candidates: job seekers, even those lacking a formal degree or changing their career path, could be qualitatively evaluated by potential employers through their online contributions. At the same time, developers are aware of the Web's public nature and of the possible uses of published information when they decide what to share with the world; some might even try to manipulate public signals of technical qualifications, soft skills, and reputation in their favor. Assessing candidates on the Web for technical positions therefore presents challenges to recruiters and to traditional selection procedures, the most serious being the interpretation of the available signals. Through an in-depth discussion, we propose guidelines that help software engineers and recruiters interpret the value of, and the problems with, the signals and metrics they use to assess a candidate's characteristics and skills.
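
    The article argues at the level of signals and their interpretation rather than prescribing a scoring procedure; as a purely illustrative sketch, the aggregation and summarization it describes could be approximated as below, where every signal name, cap, and weight is an assumption of this example rather than a guideline from the paper.

        # Purely illustrative: combine hypothetical public signals into a
        # summary profile. Names, caps, and weights are assumptions.
        from dataclasses import dataclass

        @dataclass
        class CandidateSignals:
            oss_commits: int       # contributions to open source projects
            followers: int         # ties maintained on social network sites
            accepted_answers: int  # participation in knowledge-sharing sites

        def summarize(s: CandidateSignals) -> dict:
            # Cap each raw count so a single inflated (or manipulated) signal
            # cannot dominate the profile, then normalize to [0, 1] and weight.
            def capped(value: int, cap: int) -> float:
                return min(value, cap) / cap
            parts = {
                "open_source": capped(s.oss_commits, 500),
                "network": capped(s.followers, 1000),
                "knowledge_sharing": capped(s.accepted_answers, 200),
            }
            parts["overall"] = round(
                0.5 * parts["open_source"]
                + 0.2 * parts["network"]
                + 0.3 * parts["knowledge_sharing"],
                3,
            )
            return parts

        print(summarize(CandidateSignals(oss_commits=120, followers=340, accepted_answers=25)))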

    New Voices: What Works

    Reviews grantees' accomplishments in building community news sites, keys to sustainability, and lessons learned about engagement, staffing, business models, social media, technology, partnerships, and limitations of university, youth, and radio projects