
    Extending the Global Sensitivity Analysis of the SimSphere model in the Context of its Future Exploitation by the Scientific Community

    In today’s changing climate, the development of robust, accurate and globally applicable models is imperative for a wider understanding of Earth’s terrestrial biosphere. Moreover, an understanding of the representation, sensitivity and coherence of such models is vital for the operationalisation of any physically based model. A Global Sensitivity Analysis (GSA) was conducted on the SimSphere land biosphere model using a meta-modelling method based on Bayesian theory. First, the effects of assuming uniform probability distribution functions (PDFs) for the model inputs on the sensitivity of key quantities simulated by SimSphere at different output times were examined. Topographic model input parameters (e.g., slope, aspect, and elevation) were derived within a Geographic Information System (GIS) before implementation within the model. The effect of simulation time on the sensitivity of previously examined outputs was also analysed. Results showed that the simulated outputs were significantly influenced by changes in topographic input parameters, fractional vegetation cover, vegetation height and surface moisture availability, in agreement with previous studies. The time of model output simulation had a significant influence on the absolute values of the output variance decomposition, but it did not appear to change the relative importance of each input parameter. Sensitivity Analysis (SA) results for the newly modelled outputs allowed identification of the most responsive model inputs and interactions. Our study presents an important step forward in SimSphere verification, given the increasing interest in its use both as an independent modelling tool and as an educational tool. Furthermore, this study is very timely given on-going efforts towards the development of operational products based on the synergy of SimSphere with Earth Observation (EO) data. In this context, the results also provide additional support for the potential of assimilating spatial analysis data derived from GIS and EO data into an accurate modelling framework.
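    The variance decomposition behind a GSA of this kind can be made concrete with a short, self-contained sketch. The first-order and total-effect estimators below follow Saltelli et al. (2010); since SimSphere itself cannot be run from this listing, a hypothetical toy response (a stand-in for a SimSphere output such as latent heat flux) is used, with uniform input PDFs as in the study.

import numpy as np

rng = np.random.default_rng(0)

def toy_model(x):
    # Hypothetical stand-in for a SimSphere output; NOT the real model.
    # Columns: slope, fractional vegetation cover, vegetation height,
    # surface moisture availability.
    slope, fvc, vh, mo = x.T
    return 0.5 * slope + 10 * mo * fvc + 2 * vh * (1 - fvc) + 3 * mo * vh

k, N = 4, 2 ** 14                       # number of inputs, base sample size
A = rng.uniform(0.0, 1.0, (N, k))       # uniform input PDFs, as in the study
B = rng.uniform(0.0, 1.0, (N, k))
fA, fB = toy_model(A), toy_model(B)
var = np.var(np.concatenate([fA, fB]))  # total output variance

for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                 # A with column i taken from B
    fABi = toy_model(ABi)
    Si = np.mean(fB * (fABi - fA)) / var          # first-order index
    STi = 0.5 * np.mean((fA - fABi) ** 2) / var   # total effect (Jansen)
    print(f"input {i}: S1 = {Si:.3f}, ST = {STi:.3f}")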

    Partial order investigation of multiple indicator systems using variance based sensitivity analysis

    Partial order tools can be used in multiple criteria analysis to prioritize and rank a set of objects. In this setting the starting point is generally a matrix $\mathbf{M}_{n \times k}$ of $k$ observed indicators on $n$ objects. Given that the indicators are measured at least at the ordinal level, the partially ordered set (poset) corresponding to $\mathbf{M}$ is set up to form the basis of multi-criteria ranking. The partial order may be very complex even when the number of objects to be compared is relatively small. The reason for such complexity is often the intrinsic nature of partial order, which is based exclusively on the ordinal properties of the data matrix. Incomparabilities between objects can arise even from very small differences in the observed values of indicators, thus causing 'irrelevant' incomparabilities. Conversely, objects may have been characterized with a redundant set of variables, so that changes in the values of some indicators should not seriously affect the structure of the poset. These two opposite conditions are directly linked to an indicator's level of influence, and call for an indicator-value-related sensitivity analysis for testing the robustness of posets. In this work we propose a method to carry out a sensitivity analysis for posets by using variance-based sensitivity indices to detect main effects and interactions between indicators. To this aim, we characterize the poset structure with scalar measures and compute variance-based sensitivity indices according to the most recent practice for a fully exploratory sensitivity analysis. These indices allow the least and most influential indicators to be detected.
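    A minimal sketch of this idea, under assumed synthetic data: a small indicator matrix is perturbed column by column, the poset is characterized by a scalar measure (here, the number of incomparable object pairs), and variance-based first-order and total-effect indices are estimated over the per-indicator perturbation scales. The matrix sizes and the perturbation scheme are illustrative, not the authors' exact design.

import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n, k = 5, 3
M = rng.uniform(0, 1, (n, k))          # observed indicator matrix M (n x k)
E = rng.normal(0, 1, (n, k))           # fixed perturbation pattern

def incomparable_pairs(z):
    # Scalar poset measure after perturbing indicator j by scale z[j]:
    # objects a, b are comparable iff one dominates the other on all indicators.
    Mp = M + E * z                      # column j of E scaled by z[j]
    count = 0
    for a, b in combinations(range(n), 2):
        if not (np.all(Mp[a] <= Mp[b]) or np.all(Mp[b] <= Mp[a])):
            count += 1
    return count

N = 4096
A = rng.uniform(0, 0.3, (N, k))        # perturbation scales are the SA inputs
B = rng.uniform(0, 0.3, (N, k))
fA = np.array([incomparable_pairs(z) for z in A])
fB = np.array([incomparable_pairs(z) for z in B])
var = np.var(np.concatenate([fA, fB]))
for j in range(k):
    ABj = A.copy()
    ABj[:, j] = B[:, j]
    fABj = np.array([incomparable_pairs(z) for z in ABj])
    print(f"indicator {j}: S1 = {np.mean(fB * (fABj - fA)) / var:.3f}, "
          f"ST = {0.5 * np.mean((fA - fABj) ** 2) / var:.3f}")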

    Sensitivity Analysis: A Necessary Ingredient for Measuring the Quality of a Teaching Activity Index

    In recent years, and following the introduction of the European Higher Education Area, universities have developed measurement mechanisms to ensure improvement in the quality of their teaching and teaching staff. One of the measurement tools increasingly used in Higher Education to implement continuous improvement policies for university teaching is composite indicators, which are mathematical aggregations of a selected set of suitably weighted indicators. Composite indicator building should be accompanied by sensitivity analysis to ensure good practice. However, this is rarely done. Sensitivity analysis helps to improve the understanding and, ultimately, the soundness of the composite. In most cases, sensitivity analysis shows that the weights assigned to indicators do not reflect the actual importance of those indicators in the aggregation to the composite, because of the heteroskedasticity of, and correlation between, the underlying indicators. This paper proposes a composite indicator for the teaching activity of academic staff in a Spanish university. As we show in the paper, the weights stated by developers rarely represent the effective importance of the components. Hence, we propose sensitivity analysis as a necessary tool for readjusting weights in order to achieve the desired level of importance for each component indicator.

    This research has been supported by the Valencian Regional Government with a BFPI grant. The authors are grateful for the useful comments and helpful suggestions made by the following researchers at the Joint Research Centre: Paola Annoni, Andrea Saltelli, Marco Ratto and Michaela Saisana.

    Bas Cerdá, M. D. C., Tarantola, S., Carot Sierra, J. M., & Conchado Peiró, A. (2016). Sensitivity Analysis: A Necessary Ingredient for Measuring the Quality of a Teaching Activity Index. Social Indicators Research, 1–16. doi:10.1007/s11205-016-1297-2
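    The mismatch between nominal weights and effective importance is easy to reproduce. In the sketch below, under assumed synthetic data, three correlated, heteroskedastic indicators are aggregated with equal nominal weights; each indicator's effective importance is estimated as its squared correlation with the composite, which coincides with the first-order sensitivity index for a linear aggregation of jointly normal indicators.

import numpy as np

rng = np.random.default_rng(2)
n = 10_000
# Correlated indicators with very different variances (heteroskedasticity).
cov = np.array([[1.0, 0.6, 0.2],
                [0.6, 4.0, 0.5],
                [0.2, 0.5, 0.25]])
X = rng.multivariate_normal(np.zeros(3), cov, size=n)

w = np.array([1 / 3, 1 / 3, 1 / 3])    # developers' nominal weights
C = X @ w                              # composite indicator

for i in range(3):
    S_i = np.corrcoef(X[:, i], C)[0, 1] ** 2   # effective importance
    print(f"indicator {i}: nominal weight {w[i]:.2f}, S1 ~ {S_i:.2f}")

    Despite equal nominal weights, the high-variance indicator dominates the composite, which is exactly the kind of discrepancy the paper proposes to correct by readjusting weights.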