Business Process Management Education in Academia: Status, Challenges, and Recommendations
In response to the growing proliferation of Business Process Management (BPM) in industry and the demand this creates for BPM expertise, universities across the globe are at various stages of incorporating BPM knowledge and skills into their teaching offerings. However, only a handful of institutions offer specialized BPM education in a systematic and in-depth manner. This article is based on a global educators' panel discussion held at the 2009 European Conference on Information Systems in Verona, Italy. It presents the BPM programs of five universities from Australia, Europe, Africa, and North America, describing the BPM content covered, program and course structures, and challenges and lessons learned. The article also provides a comparative content analysis of the BPM education programs, illustrating a heterogeneous view of BPM. The examples presented demonstrate how different courses and programs can be developed to meet the educational goals of a university department, program, or school. This article contributes insights on how to continuously sustain and reshape BPM education so that it remains dynamic, responsive, and sustainable in light of ever-changing marketplace demands for BPM expertise.
Practitioner Track Proceedings of the 6th International Learning Analytics & Knowledge Conference (LAK16)
Practitioners spearhead a significant portion of learning analytics, relying on implementation and experimentation rather than on traditional academic research. Both approaches help to improve the state of the art. The LAK conference has created a practitioner track for submissions, which first ran in 2015 as an alternative to the researcher track.
The primary goal of the practitioner track is to share thoughts and findings that stem from learning analytics project implementations. While both large and small implementations are considered, all practitioner track submissions are required to relate to initiatives that are designed for large-scale and/or long-term use (as opposed to research-focused initiatives). Other guidelines include:
• Implementation track record: The project should have been used by an institution or have been deployed on a learning site. There are no hard guidelines about user numbers or how long the project has been running.
• Learning/education related: Submissions have to describe work that addresses learning/academic analytics, either at an educational institution or in an area (such as corporate training, health care, or informal learning) where the goal is to improve the learning environment or learning outcomes.
• Institutional involvement: Neither submissions nor presentations have to include a named person from an academic institution. However, all submissions have to include information collected from people who have used the tool or initiative in a learning environment (such as faculty, students, administrators, and trainees).
• No sales pitches: While submissions from commercial suppliers are welcome, reviewers do not accept overt (or covert) sales pitches. Reviewers look for evidence that a presentation will take into account challenges faced, problems that have arisen, and/or user feedback that needs to be addressed.
Submissions are limited to 1,200 words, including an abstract, a summary of deployment with end users, and a full description. Most papers in the proceedings are therefore short and often informal, although some authors chose to extend their papers once they had been accepted.
Papers accepted in 2016 fell into two categories.
• Practitioner Presentations: Presentation sessions are designed to focus on the deployment of a single learning analytics tool or initiative.
• Technology Showcase: The Technology Showcase event enables practitioners to demonstrate new and emerging learning analytics technologies that they are piloting or deploying.
Both types of paper are included in these proceedings.
Technology-enhanced Personalised Learning: Untangling the Evidence
Technology-enhanced personalised learning is not yet common in Germany, which is why we tasked scientists with summarising the current status of international research on the matter. This study demonstrates the great potential of technology for implementing effective personalised learning. Nevertheless, it has not yet been assessed whether the practical implementation actually works: even in countries such as the U.S., which lead the way in using technology in classroom settings, hardly any evaluation studies have been done to prove the effectiveness of technology-enhanced personalised learning. In light of the above, the authors make recommendations for actions to be taken in Germany to make best use of technology's potential to provide individual support and guidance to students.
Scaling Success: Lessons from Adaptation Pilots in the Rainfed Regions of India
"Scaling Success" examines how agricultural communities are adapting to the challenges posed by climate change through the lens of India's rainfed agriculture regions. Rainfed agriculture currently occupies 58 percent of India's cultivated land and accounts for up to 40 percent of its total food production. However, these regions face potential production losses of more than US$200 billion in rice, wheat, and maize by 2050 due to the effects of climate change. Unless action is taken soon at a large scale, farmers will see sharp decreases in revenue and yields. Rainfed regions across the globe have been an important focus for the first generation of adaptation projects, but to date, few have achieved a scale that can be truly transformational. Drawing on lessons learnt from 21 case studies of rainfed agriculture interventions, the report provides guidance on how to design, fund, and support adaptation projects that can achieve scale.
Evaluation of the Valley Transportation Authority's DO IT! Program: A "Ladders of Opportunity Initiative" Program Funded by the Federal Transit Administration
The U.S. Department of Transportation has increasingly demonstrated interest in developing programs that will enhance the workforce capacity of future transportation systems. To that end, the Department sponsored the 2015 Innovative Public Transportation Workforce Development program, directed by the Federal Transit Administration. This program sought to enhance the availability of skilled workers by drawing targeted groups, including underserved, underemployed, and/or minority groups, into possible transit-oriented career paths. One of the 19 programs selected for funding was the "DO IT!" project proposed and developed by the Valley Transportation Authority (VTA). VTA proposed to create an innovative education and training program focused on attracting and ultimately hiring underserved, underemployed, and/or minority groups in its service area of Santa Clara County. The program was created with two major goals in mind: (1) to enable VTA to work with local youth who may not otherwise have the opportunity to be exposed to a career in public transportation, specifically in the area of transportation planning; and (2) to provide a ladder of opportunity into the middle class, which will help strengthen the workforce and intercity communities by building the critical skill set needed to maintain a competitive and efficient public transportation service. This report provides an evaluation of the "DO IT!" program and presents recommendations for program replication or improvement.
Scaling up a learning technology strategy: Supporting student/faculty teams in learner-centred design
Many post-secondary institutions are experiencing the challenge of scaling up their learning technology initiatives without a matching increase in staff resources. This mismatch is particularly acute at the design stage of projects, where both domain knowledge and instructional design expertise are needed. To address this, we are developing structures and tools that allow a small cadre of instructional design experts to support a growing number of learning technology projects developed by student/faculty teams. One of these tools, the Learner-Centred Design (LCD) Idea Kit, is an interactive WWW-based resource now in its fourth iteration of use in an undergraduate course, Designing Learning Activities with Interactive Multimedia. The course and the LCD Idea Kit which supports it are part of a larger institutional strategy to introduce technology-enabled change in the learning process, working "bottom-up" with individual faculty and using the LCD Idea Kit to scale up the course across multiple university departments. In this paper, we describe the course and the Kit in detail and provide an overview of our current status and lessons learned.
Bridging the Gap
School districts across the country are increasingly seeking out digital tools to support the work of educators, in the hopes of improving students' academic achievement. With the rapid emergence of this new market, many districts have been challenged by the task of identifying and procuring educational technology (ed-tech) products that match the needs of their educators and students. The NYC Department of Education's "Innovate NYC Schools" division, supported by a U.S. DOE Investing in Innovation (i3) grant, aims to address this problem, in part by promoting "user-centered design," an approach that puts the needs and preferences of products' intended users (in this case, teachers, students, and parents) front and center in the development and procurement of new technology. Bridging the Gap describes the design and implementation of three Innovate NYC Schools initiatives grounded in user-centered design theory: the School Choice Design Challenge (SCDC), an effort to develop apps that would help students explore and narrow down their choices of high school; #SharkTankEDU events, during which ed-tech developers present a product to a panel of educators who provide feedback on the tool; and Short-Cycle Evaluation Challenges (SCEC), a classroom-based, semester-long pilot of ed-tech tools intended to inform product development, as well as the ultimate procurement decisions of school staff. The report focuses on four phases of work involved in bringing ed-tech companies and the users of their products together: defining a problem; selecting users and ed-tech companies; implementing pilot-based initiatives; and evaluating products. It describes strategies used and challenges faced, and offers practical lessons gleaned from the experiences of the individuals who designed and participated in these efforts.
Early Learning Innovation Fund Evaluation Final Report
This is a formative evaluation of the Hewlett Foundation's Early Learning Innovation Fund, which began in 2011 as part of the Quality Education in Developing Countries (QEDC) initiative. The Fund has four overarching objectives, which are to: promote promising approaches to improve children's learning; strengthen the capacity of organizations implementing those approaches; strengthen those organizations' networks and ownership; and grow 20 percent of implementing organizations into significant players in the education sector. The Fund's original design was to create a "pipeline" of innovative approaches to improve learning outcomes, with the assumption that donors and partners would adopt the most successful ones. A defining feature of the Fund was that it delivered assistance through two intermediary support organizations (ISOs), rather than providing funds directly to implementing organizations. Through an open solicitation process, the Hewlett Foundation selected Firelight Foundation and TrustAfrica to manage the Fund. Firelight Foundation, based in California, was founded in 1999 with a mission to channel resources to community-based organizations (CBOs) working to improve the lives of vulnerable children and families in Africa. It supports 12 implementing organizations in Tanzania for the Fund. TrustAfrica, based in Dakar, Senegal, is a convener that seeks to strengthen African-led initiatives addressing some of the continent's most difficult challenges. The Fund was its first experience working specifically with early learning and childhood development organizations. Under the Fund, it supported 16 such organizations: one in Mali and five each in Senegal, Uganda, and Kenya. At the end of 2014, the Hewlett Foundation commissioned Management Systems International (MSI) to conduct a mid-term evaluation assessing the implementation of the Fund, exploring the extent to which it achieved its intended outcomes, and identifying any factors that had limited or enabled its achievements.
It analyzed the support that the ISOs provided to their implementing organizations, with specific focus on monitoring and evaluation (M&E). The evaluation included an audit of the implementing organizations' M&E systems and a review of the feasibility of compiling data collected to support an impact evaluation. Finally, the Foundation and the ISOs hoped that this evaluation would reveal the most promising innovations and inform planning for Phase II of the Fund. The evaluation findings sought to inform the Hewlett Foundation and other donors interested in supporting intermediary grant-makers, early learning innovations and the expansion of innovations. TrustAfrica and Firelight Foundation provided input to the evaluation's scope of work. Mid-term evaluation reports for each ISO provided findings about their management of the Fund's Phase I and recommendations for Phase II. This final evaluation report will inform donors, ISOs and other implementing organizations about the best approaches to support promising early learning innovations and their expansion. The full report outlines findings common across both ISOs' experience and includes recommendations in four key areas: adequate time; appropriate capacity building; advocacy and scaling up; and evaluating and documenting innovations. Overall, both Firelight Foundation and TrustAfrica supported a number of effective innovations working through committed and largely competent implementing organizations. The program's open-ended nature avoided being prescriptive in its approach, but based on the lessons learned in this evaluation and the broader literature, the Hewlett Foundation and other donors could have offered more guidance to ISOs to avoid the need to continually relearn some lessons. 
For example, over the evaluation period, it became increasingly evident that the current context demands more focused advance planning to measure impact on beneficiaries and other stakeholders, as well as a more concrete approach to promoting and resourcing potential scale-up. The main findings and recommendations from the evaluation are summarized here.
Evaluating Foundation-Supported Capacity Building: Lessons Learned
This study of lessons learned from evaluations of philanthropic capacity building programs used a national database of 473 programs, together with a survey of and interviews with 87 funders (82 foundations or foundation collaboratives, and five foundation-supported intermediaries), to answer two questions: (1) How do foundations that support nonprofit capacity building evaluate their grantmaking and direct service activities? (2) What lessons can be learned from evaluation, both to improve these programs and to justify the investments made in them?