
    Making the Connection Explicit: Service Learning, the SDGs, and the University’s Third Mission

    Service learning (or community-engaged learning) is a rapidly growing form of experiential education that builds mutually beneficial partnerships between students, institutions, and local communities. SL (or CEL) is well developed at European universities, and a variety of networks exist to further SL as a pedagogy. We examined how a range of universities consider SL part of their institutional missions, and whether explicit connections were made between SL, the United Nations' Sustainable Development Goals (a universal framework of 17 goals to be achieved by 2030), and the university's third mission (an orientation toward the public good). This study will be presented at the 5th annual European Conference for Service Learning in Higher Education (ECSLHE), organized by one of the aforementioned European networks of universities. As the conference is upcoming, the attached research output is as yet only the conference proposal submitted to the ECSLHE.

    Communication of higher education institutions: Historical developments and changes over the past decade

    Higher education institutions (HEIs) are pivotal organizations in modern societies. Over the past decades, the higher education sector has expanded considerably in countries across the world, with many newly founded colleges and universities and rapid increases in student enrollment and research output. In addition, new public management reforms and a growing need for societal legitimation have led many HEIs to establish or enlarge their communication departments, pursue branding and reputation management, and professionalize their communication efforts across various channels. Although a growing body of literature has shed light on how HEIs engage in public relations (PR) and science communication, we know little about how their communication has developed over time and in relation to the fundamental transformations in higher education systems and the media landscape in recent years, decades, and even centuries. Most existing sketches of such historical developments have focused on one country – as is typical for histories of PR in general – and have been dedicated to the second half of the 20th century. In contrast, the early beginnings of university communication since the late 19th century and recent trends in the past decade have been little researched. This guest editorial and the contributions of this Thematic Section on Changing Communication of Higher Education Institutions address these gaps in research and together shed light on developments in different European countries, as well as in the U.S.

    The Intercultural Skills Graduates and Businesses in Europe Need Today

    This ERASMUS+-funded project, “Developing the cross-cultural skills of graduates in response to the needs of European enterprise”, was developed in response to recent research highlighting the importance of intercultural competencies for graduates wanting to work in Europe, the needs of employers, and the intercultural competencies and skills higher education institutions provide. The project aims to develop the intercultural competencies of graduates in the EU by enhancing the quality and relevance of their knowledge and skills, enabling them to be active professionals in the European working environment. Five higher education institutions participated in this study: University of Worcester (project lead, UK), London South Bank University (UK), UC Leuven-Limburg (Belgium), Halmstad University (Sweden), and Bursa Uludağ University (Turkey). The diversity of these partners, their respective regional and national contexts, and their experience in working with regional businesses are central to achieving the project aims. As the first output of the project, this report presents results based on two types of analysis of data collected from four European countries (UK, Sweden, Belgium, and Turkey): first, a quantitative analysis of data from two surveys, with 585 student survey responses and 403 employer survey responses; and second, an analysis of qualitative data collected through 50 interviews with employees in European organizations and 50 interviews with students studying at European universities.

    Is higher education more important for firms than research? Disentangling university spillovers

    This paper is the first attempt to integrate microdata on universities and firms across most European countries in order to disentangle the impact of knowledge spillovers from human capital (graduates) and intellectual capital (codified research output) on the performance of firms. Data cover all Higher Education Institutions (HEIs) registered in the official European Tertiary Education Register (ETER). Data on firm performance are from ORBIS and refer to changes over the 2011–2015 period in turnover, total assets, intangible assets, and employment. Firms are georeferenced, and the spillovers from all HEIs located at a given distance are summed and integrated. The findings suggest that, among knowledge spillovers, the creation of human capital via the education of students has a larger impact than the circulation of research knowledge. Moreover, the two factors appear to be complements rather than substitutes. Spatial proximity is important for embodied knowledge spillovers (i.e., educated people), while for codified and disembodied spillovers (citations to publications) the spatial dimension is less relevant. The findings have important managerial and policy-making consequences.
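    The aggregation step described here – summing the spillovers from all HEIs within reach of each georeferenced firm – can be sketched as a distance-weighted sum. A minimal Python illustration, assuming an exponential distance decay and invented coordinates and outputs (the paper's actual functional form and data are not reproduced here):

```python
import numpy as np

def aggregate_spillovers(firm_xy, hei_xy, hei_output, decay=0.1):
    """Illustrative distance-weighted sum of HEI knowledge output for one firm.

    firm_xy    : (2,) array, firm coordinates (e.g. on a km grid)
    hei_xy     : (n, 2) array, HEI coordinates
    hei_output : (n,) array, e.g. graduates or citation-weighted publications
    decay      : hypothetical exponential distance-decay parameter
    """
    dist = np.linalg.norm(hei_xy - firm_xy, axis=1)   # distance to every HEI
    weights = np.exp(-decay * dist)                   # nearer HEIs count more
    return float(np.sum(weights * hei_output))        # firm-level spillover stock

# Example: one firm, three HEIs (all values invented)
firm = np.array([10.0, 20.0])
heis = np.array([[12.0, 21.0], [50.0, 60.0], [11.0, 19.0]])
graduates = np.array([3000.0, 12000.0, 800.0])
print(aggregate_spillovers(firm, heis, graduates))
```

    Embodied spillovers (graduates) would use a steep decay, while disembodied spillovers (citations) would use a flat one, mirroring the abstract's finding that spatial proximity matters more for the former.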

    Unequal Representation and Morality Issues: The Effect of Education on Opinion-Policy Congruence

    For a democracy to thrive, a government must be responsive to all its citizens, meaning that all individuals must be politically equal. This entails that their preferences are equally influential on the government's decisions, regardless of personal background. The thesis examines (un)equal representation by analysing opinion-policy congruence for different educational groups on issues related to morality. Much research has been done on congruence between opinions and policy output, but in-depth analyses of morality policy are lacking. Research has demonstrated political inequality between groups with different levels of education, wherein government policy is more responsive to higher-educated citizens. The combination of higher-educated citizens increasingly dominating the political arena and their preferences on morality issues differing substantively from those of citizens with lower levels of education makes unequal representation in morality policy likely. Applying pooled OLS regression analyses to time-series cross-sectional data on European and OECD countries, I find that higher-educated citizens are indeed better represented, both generally and when their opinions differ from those of lower-educated citizens. Social scientists have long argued that public opinion shapes the government's policy output, but also that this influence varies with the presence of certain political institutions and the characteristics of the relevant issues. The influence of institutions and partisan actors may condition representation by creating differences in access to policymaking for subgroups of the population. This is investigated by applying a political veto points and players framework. It is argued that an increased number of veto points and players decreases unequal representation for citizens with different educational backgrounds, as a higher number of veto points and players both allows for more avenues of influence and creates a bias toward the status quo. Both benefit lower-educated citizens, whose preferences are generally more conservative. The analyses of interaction effects show that institutions do not affect whose preferences are reflected in policy; however, they indicate that the presence of more partisan veto players does.
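    As a sketch of the estimation design described above – pooled OLS on time-series cross-sectional data with an interaction for partisan veto players – the following uses synthetic data and hypothetical variable names; it is not the thesis's actual specification:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for a TSCS dataset: one row per country-year,
# with hypothetical variable names.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "country": rng.integers(0, 20, n),         # 20 countries
    "year": rng.integers(2000, 2016, n),
    "opinion_divergence": rng.normal(size=n),  # gap between education groups' opinions
    "veto_players": rng.integers(1, 6, n),     # number of partisan veto players
})
# Outcome: how much better policy matches higher-educated opinion (synthetic).
df["congruence_gap"] = (0.5 * df["opinion_divergence"]
                        - 0.1 * df["opinion_divergence"] * df["veto_players"]
                        + rng.normal(scale=0.5, size=n))

# Pooled OLS with an interaction term; standard errors clustered by country,
# as is common for TSCS designs.
model = smf.ols("congruence_gap ~ opinion_divergence * veto_players",
                data=df).fit(cov_type="cluster",
                             cov_kwds={"groups": df["country"]})
print(model.summary().tables[1])
```

    A negative interaction coefficient would correspond to the thesis's claim that more partisan veto players reduce the representation gap between education groups.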

    Understanding Productivity Changes in Public Universities: Evidence from Spain

    This paper describes the dynamic changes in productivity in Spanish public universities (SPU) in the period 1994 to 2008. The Malmquist index is used to illustrate the contribution of efficiency and technological change to changes in the productivity of university activities. The results indicate that annual productivity growth is attributable more to efficiency improvements than to technological progress. Gains in scale efficiency appear to play only a minor role in productivity gains. The fact that technical efficiency contributes more than technological progress suggests that most universities are not operating close to the best-practice frontier.
    García-Aracil, A. (2013). Understanding Productivity Changes in Public Universities: Evidence from Spain. Research Evaluation, 22(5), 351–368. doi:10.1093/reseval/rvt009
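    For reference, the decomposition the abstract relies on is standard; in the common Färe–Grosskopf output-distance-function form (the paper's exact notation may differ), the Malmquist index between periods t and t+1 factors into catch-up and frontier shift:

```latex
% Malmquist index between t and t+1: efficiency change (catch-up)
% times technical change (frontier shift); D^t is the period-t
% output distance function, and M > 1 indicates productivity growth.
\[
M^{t,t+1}
  = \underbrace{\frac{D^{t+1}(x^{t+1},y^{t+1})}{D^{t}(x^{t},y^{t})}}_{\text{efficiency change}}
    \times
    \underbrace{\left[
      \frac{D^{t}(x^{t+1},y^{t+1})}{D^{t+1}(x^{t+1},y^{t+1})} \cdot
      \frac{D^{t}(x^{t},y^{t})}{D^{t+1}(x^{t},y^{t})}
    \right]^{1/2}}_{\text{technical change}}
\]
```

    The paper's finding that efficiency change dominates technical change corresponds to the first factor driving most of the measured growth.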

    Needs Assessment Report: Gap Analysis

    The Research Output Management through Open Access Institutional Repositories in Palestinian Higher Education Institutions (ROMOR) project aims to improve the management, visibility, and accessibility of scientific research outputs in Palestinian HEIs by establishing new or enhancing existing Open Access Institutional Repositories (OAIRs), improving institutional capacity for managing and sharing the research outputs held within the repositories, and developing and/or refining curricula to ensure that emerging researchers are better able to manage their work across the entire research lifecycle. Planning for OAIRs requires identifying key stakeholders who can support the objectives of the project. Managing institutional repositories involves many layers of work: university libraries, research offices, information technology departments, academic departments, and university administration work side by side to ensure that the OAIR process proceeds smoothly and sustainably over time. They share the responsibility of capturing and organizing the research output, ensuring its long-term availability and preservation, educating scholars and researchers about their privileges and rights as authors, and helping them understand the broader information policy and copyright issues that touch their works. In the first four months of the project, the Palestinian partners conducted two surveys: the first assessed researchers' current practices, and the second explored institutional support staff capacity. The four participating Palestinian institutions are the Islamic University of Gaza (IUG), Al-Quds Open University (QOU), Birzeit University (BZU), and Palestine Technical University-Kadoori (KAD). The EU partners conducted a survey, based on one carried out by the DCC in 2015, to assess current Open Access and Research Data Management (RDM) practice in the UK, Europe, Australia, and the USA. The four participating EU institutions are Vienna University of Technology (TUWIEN), the University of Parma (PARMA), the University of Brighton (BU), and the University of Glasgow (GLA). The main objectives of this workshop were: to present the findings of a needs assessment survey study carried out with researchers and support staff in four Palestinian Higher Education Institutions (PS HEIs) between December 2016 and February 2017; to present good practice in establishing RDM and OA services from the European partner universities; to engage stakeholders in discussions to identify their requirements, interests, and concerns regarding the OAIRs; and to set up requirements for the gap analysis and the road map for Research Data Management in the PS partners through OAIRs. This report discusses a number of gaps that were identified and presented at the Needs Assessment workshop.

    Development of academic ranking model in Croatia

    Quality education and contribution to the global body of knowledge are the main strategic goals of the education and science system of the Republic of Croatia. If quality education is the mission of the Croatian educational system, all stakeholders need to agree on a definition of quality and a model for measuring it, so that all higher education institutions can align their activities with that common goal. On the other hand, since "national" science and higher education are not isolated systems, it is important to view quality in a global light as well. Quality evaluation systems – global and national rankings, as well as the evaluations in the Croatian system of higher education and science – are, in essence, combinations of selected indicators that measure input, process, and output (Westerheijden, 1991) – one matrix – or teaching and learning, research, and societal role – a second matrix. Using these two matrices, evaluation systems at the global and national levels can be described and compared according to the share of indicators in each category. Analysing the share of each of the six variables within these two matrices for each of the six evaluation typologies examined yields the concept of quality that each typology "rewards" as desirable at the national and global levels. The Croatian higher education system is one of the few in Europe that conducts neither national nor global ranking of higher education institutions and/or programmes, so the effort to harmonize the definition and evaluation of quality at the national level must also encompass its alignment with the international perspective. As this research has confirmed, that alignment is for now declarative in nature, moving from education based on input, process, and teaching and learning towards education and science based on output and research. If Croatia wishes to foster the visibility and quality of teaching and research outcomes – the fundamental category of evaluation at the international level (in rankings) – then indicators of output and research quality should be given greater capacity (in terms of their number and weights) in systems for evaluating and comparing quality. As a balanced set of indicators and weights, the proposed model for ranking universities in Croatia is one of the first systematically argued proposals; its application requires consensus among all stakeholders in the higher education system and, of course, more detailed elaboration for implementation.
    The quality and relevance of higher education are the focus of European policy documents (European Commission, 2013). As part of that environment, Croatia also faces challenges of adjustment, dealing with and reviewing the status and role of its higher education institutions. Due to increasing restrictions on state funding, the transformation of the education system into a student-oriented service, the need to demonstrate the accountability and quality of the services offered by higher education institutions, increasing competition among higher education institutions, and the massification and internationalization of higher education (Avralev and Efimova, 2014), academic ranking systems are becoming an increasingly common phenomenon.
    Information regarding the quality of study programmes and/or institutions and their status in comparison with other universities and programmes (Altbach, 2010) becomes essential when choosing among a large number of higher education institutions and study programmes (Blanco-Ramirez and Berger, 2014). In recent times the results of external quality assessment serve not only to inform students and their parents about the quality of higher education institutions, but also the geopolitical positioning of each country (EUA, 2014). The ranking of national universities, and the number of universities occupying high places in those rankings, is increasingly considered an important element of a country's economic competitiveness (Kishkovsky, 2012). Given that education does not have the characteristics of a "product" (Majumdar, 1983; Winch, 2010), its quality is considered a multifaceted and multidimensional category that is difficult to measure precisely (Chattopadhyay, 2012). For this reason, the concept of quality should be analysed and defined in relation to the stakeholders in higher education and science: prospective students, students, institutions, staff at higher education institutions, employers, and financiers. Westerheijden (2007) gives priority to achieving a broad consensus on what the term quality encompasses in a particular education system over attempts to define this ambiguous, contextual concept. Each approach to defining quality has implications for the quality assurance system and the policies adopted in each higher education system. Behind the context of quality in a higher education system usually lies an idea of higher education, and given that the meaning and context of higher education change over time, the notion of quality refers to the values, goals, desired actions, experiences, and results of higher education (Boyle and Bowden, 1997). In most cases, quality is approached as fitness for purpose, which allows higher education institutions to evaluate and define quality in line with their missions and goals (Woodhouse, 1999; Nicholson, 2011). The methodological approach to quality assessment, and the choice of qualitative or quantitative procedures, depends on how quality requirements are defined. Some types of assessment require quantitative methods (e.g., ranking, thematic evaluations), while for others qualitative methods are much more appropriate (accreditation). The selection of indicators will depend on the choice of evaluation methods (qualitative or quantitative) and their possible complementary use in the same evaluation. Performance indicators are considered to provide clear, objective, and measurable information that may serve as a solid basis for (political) decisions (Van Vught and Westerheijden, 1994). This clearly links quality assurance procedures with the intended results of quality assurance policy. The problem with performance indicators is their applicability in higher education, and education in general, given the complexity and the comparability of the data collected; in fact, it is sometimes difficult to obtain mutually comparable data.
    Assessment models are a combination of indicators that measure input, process, and output, while the choice of a quality assessment model is based on the particularities of the national system, the current quality assessment models at the international level and, finally, the availability of information. The latter can have negative consequences when the emphasis is placed on measurable rather than relevant indicators. External quality assessment procedures vary within national higher education systems in both scope and focus. In today's systems of external quality assessment of universities there are basically four models: accreditation, auditing, benchmarking, and classification (including ranking). These further vary according to the subject and emphasis of the evaluation. The focus of the evaluation also changes, sometimes emphasizing management and regulation or financial sustainability, and sometimes student experiences, learning, curriculum development, curriculum design, and the competence of teachers. In the context of this research, a systematic review includes all assessment models existing in Croatia, while the analytical part focuses on the contents of two university quality assessment procedures: re-accreditation and university rankings. Due to their importance, the university rankings are divided into national systems and ranking leagues, with particular emphasis placed on their contents. In purpose and methodology, accreditation and ranking are different quality assessment tools. Both systems allow a certain way of describing and comparing the quality of an institution and/or programme, based on the indicators used for quality measurement and assessment. The main problems to which this research seeks to provide answers relate to an ambivalent understanding of the concept of quality, which in the Croatian system of higher education is not clearly defined and varies depending on the stakeholders, and to the poor visibility of Croatian higher education institutions in the world rankings (Jokić et al., 2012). Among other things, the reasons can be found in the specific features and large mutual differences of national institutions with respect to size and profile (public/private universities, non-integrated/integrated universities; public/private universities of applied sciences and colleges, public/private research institutes). In addition to the specific structure of Croatian higher education, the reason also lies in the increasing number and scope of external quality assessment procedures, which still lack a comparative element, and finally in the absence of a ranking system at the national level that would enable a comparison of Croatian higher education institutions. Using a matrix analysis of the performance indicators of higher education institutions, the research seeks to identify the relevant indicators in ranking systems according to their share in the categories input, process, and output, as well as teaching and learning, research, and the university third mission, and to determine the differences between those indicators and the ones used as quality measures in the national framework.
    The objective of this paper is to propose a model for ranking higher education institutions in Croatia, on the basis of the methodology and results used in the re-accreditation of Croatian universities and of an analysis of the methodological approaches of the most commonly used global and national rankings. The dissertation analyses the differences and specificities of international and national evaluation models and their understanding of quality as a basis for modelling. To determine the areas in which Croatian universities succeed in different scientific fields, data and quality grades of public universities obtained in the existing national model of quality assessment – re-accreditation – were used. A comparison with international models of quality assessment is based on the five most influential global rankings. In order to compare different models of quality assessment of programmes and/or universities that have different purposes but use the same instruments – adequate indicators and the documents that describe them – they are divided in this research into six analytical sections:
    1. World academic rankings: ARWU (Academic Ranking of World Universities), THE (Times Higher Education World University Rankings), QS (Quacquarelli Symonds Top Universities), Webometrics Ranking of World Universities, CWTS Leiden Ranking, and SJR (SCImago Institutions Rankings)
    2. National academic rankings: Macedonian University Rankings (Macedonia), Bulgarian University Ranking System (Bulgaria), Perspektywy Ranking (Poland), and Akademická rankingová a ratingová agentúra (ARRA, Slovakia)
    3. Policy documents in the Croatian system of higher education and science: Strategy for Science, Education and Technology; three-year funding agreements between the state and public HE institutions; criteria for academic promotion
    4. Principles and criteria for the quality assessment of institutions and/or programmes in external assessment procedures: Ordinance on Conditions for Issuing Licence for Scientific Activity, Conditions for Re-accreditation of Scientific Organisations and Content of Licence; thematic evaluation criteria; higher education audit criteria; principles and criteria for the evaluation of scientific institutions in Croatia; re-accreditation of doctoral study programmes
    5. Criteria for the quality assessment of higher education institutions within universities
    6. Quality grades of higher education institutions within public universities (technology, science, and humanities)
    The quality indicators in the samples divided in this manner can be analysed with regard to their content (following Neuendorf, 2002) in two matrices – input-process-output, and teaching and learning-research-university third mission – to determine towards which group the results within these matrices deflect. For each sample, the process of aligning it with the other samples, as well as the preparations for the most relevant comparison, is also described. The methodological limitations of this approach relate mainly to the size of the sample which, although comprising the entire population, is small. On such a sample the correspondence analysis can only be conducted using descriptive statistical methods; due to this limitation, the results cannot be confirmed by inferential statistics.
    The analytical elaboration and comparison of these six units establish the differences between the Croatian and global models of quality assessment, while their common features form the starting point for drafting a model for ranking higher education institutions in Croatia. The results of this research have confirmed all three hypotheses:
    1. Hypothesis: Global systems of academic ranking include indicators which are divided into three groups – input, process, and output – and are focused mainly on the output.
    -- The analysis of the world's best-known university rankings shows an obvious deflection towards the measurement of output (at least 66.7% of the overall assessment). Input is measured in only three world rankings – Webometrics, the QS ranking, and SCImago – and its share does not exceed one third of the overall assessment (max. 33.3% in the SCImago ranking). Among the analysed rankings, the THE ranking places the most emphasis on process (32.5%). The CWTS Leiden Ranking evaluates output exclusively.
    -- According to the second matrix, most of the indicators and weights in the academic rankings refer to research (Chart 2). Another interesting fact is that in the QS and Webometrics rankings half of the overall assessment refers to the university third mission. The CWTS Leiden and SCImago Institutions rankings do not include any indicators of the quality of the teaching component, and the lowest share of indicators in the world rankings refers to teaching and learning (10–30%).
    -- The national rankings included in the analysis, viewed through the input-process-output matrix, in most cases rank institutions on the basis of output. Unlike the world rankings, the national rankings take input into account in the range of 11–23%, with only the Bulgarian University Ranking System not considering input indicators. Furthermore, all national rankings also include process indicators in the overall ranking, but their share in the overall assessment does not exceed 35% (Bulgarian University Ranking System). The output indicators in all analysed national rankings have a significant impact on the overall assessment, and their share does not drop below 43%.
    -- In the teaching and learning-research-university third mission matrix, the overall assessment in most of the national rankings is based on indicators of the quality of teaching and learning. Indicators of the quality of the research component carry the most weight in the ARRA ranking (50%) and Perspektywy (44%), and the least in Bulgaria's ranking system. Interestingly, in the overall assessments that form the national rankings, the indicators of the university third mission have the least weight (Chart 4).
    -- The comparison of world and selected national rankings according to the input-process-output and teaching and learning-research-university third mission matrices showed similarities in two categories: the share of indicators of output quality, which comprises on average more than two-thirds of all indicators, and the share of indicators of the quality of the university research component. An analysis of individual ranking methodologies shows that, among the world rankings, the CWTS Leiden Ranking places the highest emphasis on research and output, while the QS ranking is focused on the quality of teaching and learning, and on input.
    On average, the national rankings place slightly more value on input and on the quality of teaching and learning, but in this typology too, as in the world rankings, the highest shares belong to the quality of research and output, the categories primarily used for measuring the scientific component.
    2. Hypothesis: The concept of quality in the Croatian higher education system is not clearly defined, varies depending on stakeholders, and is focused on measuring input and process.
    -- Observing the policy documents cumulatively, the distribution of categories and indicators of quality assessment between the two matrices is uneven. Although some of the documents tend towards input (Ordinance on the Content of Licence and Conditions for Issuing Licence for Performing Higher Education Activity, Carrying out a Study Programme and Re-accreditation of Higher Education Institutions) or output (three-year funding agreements between the state and public HE institutions (scientific activity), principles and criteria for the evaluation of scientific institutions in Croatia, and criteria for academic promotion), each of the analysed documents includes the evaluation of process. Process is evaluated (in more than half of the indicators) by the Strategy for Science, Education and Technology, the three-year funding agreements between the state and public HE institutions (higher education), and the criteria for the assessment of quality of higher education institutions within universities. These results point to the inconsistency of defining and measuring quality in the evaluations carried out in the national system of higher education and science.
    -- The analysis of indicators according to the teaching and learning-research-university third mission matrix shows that the elements of higher education institutions related to teaching and learning and to research have similar weights, while the university third mission is included in each of the analysed documents except the three-year funding agreements between the state and public HE institutions (higher education).
    -- In the process of re-accreditation, input is measured in a slightly larger ratio than output, which can be explained by the fact that quality assessment in re-accreditation includes the evaluation of minimum conditions, in essence a check of teaching and physical resources (input). The average share of the quality of teaching and learning, together with research, in the overall quality assessment was 62.4%. The share of indicators and quality assessment of research in the re-accreditation process is less than a third (28.1%).
    -- The results of the comparative analysis of existing systems of external assessment of universities in Croatia and selected world ranking systems show significant differences in the shares of quality indicators across the input-process-output and teaching and learning-research-university third mission matrices. While the world and national ranking models do not differ in placing the emphasis on output, Croatian systems of university quality assessment are based on the quality of process, i.e., the activities taking place at higher education institutions. This indicates that the national assessment systems differ from the models relevant at the global level.
    -- The comparative analysis of data from the teaching and learning-research-university third mission matrix shows that in both ranking typologies, national and world, research is valued most, while Croatian assessment systems are focused on the third mission and on teaching and learning. The category of the university third mission is represented in the world rankings with an average share of 18.46%, while in the selected national university rankings it has a significantly higher share of 25.25%.
    3. Hypothesis: Croatian institutions of higher education have elements of excellence at the level of individual segments of their activities.
    -- This research includes Croatian universities from the fields of technology, science, and humanities, a total of 39. Although the sample is not large enough to draw statistically validated conclusions, descriptive statistical analysis makes it possible to characterize the quality grades for the fields of technology, science, and humanities. Process and input, as well as the quality of teaching and learning, are the best-rated elements at all analysed universities. In the category of research, which has a significant share in academic rankings, the higher education institutions in the field of technology are the highest rated (55.40% rated as "mostly implemented") and science (the indicators
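    The matrix analysis underlying these results reduces to summing indicator weights per category and comparing the shares across rankings. A minimal Python sketch with invented weights (no real ranking's methodology is reproduced here):

```python
# Illustrative recomputation of the dissertation's matrix analysis:
# each ranking is a list of (weight, category) pairs, and the "concept
# of quality" a ranking rewards is read off the category shares.
from collections import defaultdict

def category_shares(indicators):
    """indicators: list of (weight_percent, category) tuples."""
    shares = defaultdict(float)
    for weight, category in indicators:
        shares[category] += weight
    total = sum(shares.values())
    return {cat: round(100 * w / total, 1) for cat, w in shares.items()}

# Hypothetical ranking classified by the first matrix (input-process-output)
hypothetical_ranking = [(20, "input"), (10, "process"), (70, "output")]
print(category_shares(hypothetical_ranking))
# {'input': 20.0, 'process': 10.0, 'output': 70.0} -> output-oriented
```

    The same function applied with the second matrix's categories (teaching and learning, research, third mission) yields the other set of shares compared in the hypotheses above.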

    DECriS Project IO2 Report

    With the support of DECriS Project (https://decris.ffos.hr/) team members from:
    • University of Osijek: Kristina Feldvari, Milijana Mičunović, Tatjana Aparac-Jelušić, and Boris Badurina
    • University of Barcelona: Gema Santos and Aurora Vall
    • Hildesheim University: Thomas Mandl and Lea Wöbbekind
    • University of Library Studies and Information Technologies, Sofia: Tania Todorova, Daniela Pavlova, and Hristina Bogova
    • Sveučilište u Zagrebu Sveučilišni računski centar (SRCE): Sandra Kučina Softić and Anja Đurđević
    This research forms part of the Erasmus+ project Digital Education for Crisis Situations: Times when there is no alternative (DECriS). The project focuses on innovative digital practices implemented in Higher Education Institutions (HEIs) in the field of Library and Information Science (LIS), their relationship to Digital Education (DE) in general, and the adoption of Open Educational Resources (OER) in any learning situation, but especially in crisis situations such as the COVID-19 pandemic. The present report belongs to the project's Intellectual Output 2: “Digital Education appraisal and quality perception by students, teachers and trainers at the partner HEI during the COVID-19 crisis”. The aim of this output is to gain insight into students' and teachers' attitudes towards DE and educational resources in general, and towards canonical OER in particular, mainly during the COVID-19 crisis and in contrast with their pre-pandemic experiences. A total of 39 interviews with teachers and 10 focus groups with students were conducted in the LIS centres of the five partner institutions. For each partner, a report is presented with the results of the analysis of the teachers' and students' transcripts separately, structured in seven blocks: context, attitudes and expectations; adaptations; problems; advantages; lessons learned; good practices; and improvements. This report is an outcome of the ERASMUS+ Project DECriS “Digital Education for Crisis Situations: Times When There is no Alternative”, Contract Number: 2020-1-HR01-KA226-HE-094685. The European Commission's support for producing this publication does not constitute an endorsement of the contents, which reflect only the views of the authors, and the Commission cannot be held responsible for any use that may be made of the information contained in this report.

    Entrepreneurial University Transformation in Indonesia: A Comprehensive Assessment of IPB

    This article explores university entrepreneurial transformation in Indonesia through the case of Bogor Agricultural University (IPB). Data and information were collected through a content analysis of university policy and educational documents, a structured survey with 331 respondents (in particular staff and students), and 21 in-depth interviews and 5 focus group discussions with 77 people comprising university top management, faculty, students, and external stakeholders. The European Commission/OECD entrepreneurial university framework was applied for the data analysis. In addition, quantitative indicators were compared with those of 76 Indonesian and 15 Asian universities. The findings indicate that IPB is an entrepreneurial university from the perspective of research-based technology transfer and innovation. The qualitative information indicates that the entrepreneurial development of learning and teaching processes needs more attention; however, when assessed quantitatively, the student entrepreneurship output is high relative to many other universities. The results are relevant for the higher education community in terms of understanding the complexity of transforming knowledge institutions into more entrepreneurial organizations. The authors demonstrate a holistic assessment methodology and subsequently propose objective measurements for assessing the entrepreneurial status of a university.