97 research outputs found

    Sofia City Tourist Image In Selected Social Media

    Get PDF
    The pandemic greatly changed the entire tourism sector and increased the importance of a destination's digital image and presence in social media. Moreover, virtual trips and galleries and museums opened for virtual tours are expected to generate future interest and boost tourist arrivals in the post-pandemic era. At the same time, identifying recognizable image-making attractions at the destination level is a complex process that is difficult to measure and manage. The purpose of the study is to identify and evaluate the presence of major tourist attractions in different digital channels – social media and platforms – as recognizable image-making sites of the capital city. The methodology is primary analysis of freely shared digital content, based on keywords and hashtags used in social media such as Instagram, Pinterest, and Flickr. The data was gathered manually for the period 15 April 2015 – 15 May 2021, and 17,000 posts in total were analysed for this study. The impact of the content was evaluated by type of picture, major attraction, and the number of likes, comments, and shares on each post. The results showed a limited amount of digital content related to the selected keywords and no clear image of Sofia in the analysed social channels.
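    The engagement evaluation described above amounts to aggregating likes, comments, and shares per attraction. A minimal pandas sketch of that aggregation is shown below; the file posts.csv and its column names are hypothetical stand-ins for the manually gathered data, not part of the study.

        # Minimal sketch: rank attractions by total engagement (likes + comments + shares).
        # Hypothetical posts.csv with columns: attraction, picture_type, likes, comments, shares.
        import pandas as pd

        posts = pd.read_csv("posts.csv")
        posts["engagement"] = posts[["likes", "comments", "shares"]].sum(axis=1)

        by_attraction = (
            posts.groupby("attraction")["engagement"]
            .agg(posts="size", total_engagement="sum", mean_engagement="mean")
            .sort_values("total_engagement", ascending=False)
        )
        print(by_attraction.head(10))  # the most visible image-making sites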

    Measurements of the Most Significant Software Security Weaknesses

    Full text link
    In this work, we provide a metric to identify the most significant software security weaknesses, defined by an aggregate of the frequency, exploitability, and impact of related vulnerabilities. The Common Weakness Enumeration (CWE) is a well-known and widely used list of software security weaknesses. The CWE community publishes such an aggregate metric to calculate the 'Most Dangerous Software Errors'. However, we find that the published equation is heavily biased toward frequency and almost ignores exploitability and impact when generating top lists of varying sizes. This is due to differences in the distributions of the component metric values. To mitigate this, we linearize the frequency distribution using a double log function. We then propose a variety of other improvements, provide top lists of the most significant CWEs for 2019, provide an analysis of the identified software security weaknesses, and compare them against previously published top lists.
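    The aggregation idea can be sketched in a few lines of Python. Note that the weights, the normalization, and the exact double-log transform below are illustrative assumptions, not the equation published by the CWE community or the one proposed in the paper.

        # Illustrative only: rank CWEs by an aggregate of frequency, exploitability, and
        # impact, flattening the heavy-tailed frequency counts with a double log so they
        # no longer dominate the other two components. Data and weights are made up.
        import math

        cwes = {  # hypothetical per-CWE values: (frequency count, exploitability, impact)
            "CWE-79":  (25000, 2.9, 2.7),
            "CWE-89":  (6000, 3.1, 5.9),
            "CWE-416": (800, 1.8, 5.9),
        }

        def normalize(values):
            lo, hi = min(values), max(values)
            return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

        freq = normalize([math.log(math.log(f) + 1.0) for f, _, _ in cwes.values()])  # double log
        expl = normalize([e for _, e, _ in cwes.values()])
        impact = normalize([i for _, _, i in cwes.values()])

        scores = {name: fr + ex + im  # equal weights, purely for illustration
                  for name, fr, ex, im in zip(cwes, freq, expl, impact)}
        for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{name}: {score:.2f}")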

    Structural and mechanistic insights into a Bacteroides vulgatus retaining N-acetyl-β-galactosaminidase that uses neighbouring group participation

    Get PDF
    Bacteroides vulgatus is a member of the human microbiota whose abundance is increased in patients with Crohn's disease. We show that a B. vulgatus glycoside hydrolase from carbohydrate-active enzyme family GH123, BvGH123, is an N-acetyl-β-galactosaminidase that acts with retention of stereochemistry, and, through a 3-D structure in complex with Gal-thiazoline, we provide evidence in support of a neighbouring-group participation mechanism.

    Beyond Crowd Judgments: Data-driven Estimation of Market Value in Association Football

    Get PDF
    Association football is a popular sport, but it is also a big business. From a managerial perspective, the most important decisions that team managers make concern player transfers, so issues related to player valuation, especially the determination of transfer fees and market values, are of major concern. Market values can be understood as estimates of transfer fees – that is, prices that could be paid for a player on the football market – so they play an important role in transfer negotiations. These values have traditionally been estimated by football experts, but crowdsourcing has emerged as an increasingly popular approach to estimating market value. While researchers have found high correlations between crowdsourced market values and actual transfer fees, the process behind crowd judgments is not transparent, crowd estimates are not replicable, and they are updated infrequently because they require the participation of many users. Data analytics may thus provide a sound alternative or a complementary approach to crowd-based estimations of market value. Based on a unique data set comprising 4217 players from the top five European leagues over six playing seasons, we estimate players' market values using multilevel regression analysis. The regression results suggest that data-driven estimates of market value can overcome several of the crowd's practical limitations while producing comparably accurate numbers. Our results have important implications for football managers and scouts, as data analytics facilitates precise, objective, and reliable estimates of market value that can be updated at any time.
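    The multilevel approach can be sketched as a mixed-effects regression with random intercepts for a grouping structure such as league. The sketch below uses statsmodels; the file players.csv, the predictors, and the grouping variable are illustrative assumptions, since the abstract does not list the exact model specification.

        # Sketch of a multilevel (mixed-effects) market-value model; not the paper's
        # exact specification. Assumes a hypothetical players.csv with the columns used below.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        players = pd.read_csv("players.csv")
        players["log_value"] = np.log(players["market_value"])  # market values are heavily skewed

        # Random intercepts per league capture between-league differences in price levels.
        model = smf.mixedlm(
            "log_value ~ age + I(age**2) + goals + assists + minutes + C(position)",
            data=players,
            groups=players["league"],
        )
        result = model.fit()
        print(result.summary())

        # Data-driven market-value estimates, back-transformed from the log scale.
        players["predicted_value"] = np.exp(result.fittedvalues)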

    Living at the Extremes: Extremophiles and the Limits of Life in a Planetary Context

    Get PDF
    Prokaryotic life has dominated most of the evolutionary history of our planet, evolving to occupy virtually all available environmental niches. Extremophiles, especially those thriving under multiple extremes, represent a key area of research for multiple disciplines, spanning from the study of adaptations to harsh conditions to the biogeochemical cycling of elements. Extremophile research also has implications for origin-of-life studies and the search for life on other planetary and celestial bodies. In this article, we review the current state of knowledge of the biospace in which life operates on Earth and discuss it in a planetary context, highlighting knowledge gaps and areas of opportunity.

    Development of a reliable, valid, multi-dimensional measure of student engagement in group projects

    No full text
    The Study. Purpose: assess SGPEQ reliability and internal consistency, and establish criterion-related and construct validity, so that the measure can be used with confidence for quantitative analysis. Data: all students from 25 ITEC 620 sections and 15 TLMN sections from two semesters in 2012 were invited to participate; data was gathered from 260 students, of whom four declined to fill in the survey and 24 did not answer the questions. Method: (1) a study on SGPEQ initial data reliability and validity; (2) initial item reduction and factor identification through exploratory factor analysis; (3) further verification of validity through a study of the relationships between the factors and students' self-reported engagement and endorsement of self-theories.

    Development of a Reliable, Valid, Multi-Dimensional Measure of Student Engagement in Group Projects. Irena Bojanova, Ph.D., The Graduate School, University of Maryland University College.

    Main Result: a reliable, valid, multi-dimensional measure of student engagement in group projects was created. Empirical evidence of its usefulness shows that the measure can be easily administered while providing a comprehensive snapshot of students' engagement in group projects.

    Agenda: 1. Introduction and Background. 2. The Study. 3. Initial Data Reliability and Internal Consistency. 4. Exploratory Factor Analysis and Reliability Estimates. 5. Further Verification of Validity. 6. Conclusion.

    Introduction and Background
    • Graduate programs have to emphasize group projects.
    • How can students' learning experiences and project outcomes be optimized?
    • "Student engagement in group projects" was identified as a distinguishable construct, and the Student Group Project Engagement Questionnaire (SGPEQ) instrument was developed and face and content validated.

    Immersive vs. Traditional Project (immersive % / traditional %):
    Survey Question                                   Definitely Not   No        Maybe      Yes        Definitely Yes   All Yes
    Communication/presentation skills                 3 / 3            9 / 7     15 / 22    52 / 43    21 / 26          73 / 69
    Technical skills                                  3 / 4            3 / 12    15 / 19    55 / 42    24 / 23          79 / 65
    Team-building skills                              3 / 3            12 / 7    6 / 11     42 / 46    37 / 34          79 / 80
    Leadership skills                                 3 / 3            9 / 8     30 / 22    33 / 50    25 / 18          58 / 68
    Understand better course material                 3 / 3            6 / 3     27 / 16    46 / 52    18 / 26          64 / 78
    Academically challenging                          0 / 1            12 / 5    18 / 16    49 / 54    21 / 23          70 / 77
    Develop critical thinking and problem solving     0 / 1            9 / 3     12 / 11    46 / 57    34 / 28          79 / 85
    Provoke curiosity and sense of discovery          0 / 1            0 / 1     3 / 19     30 / 51    67 / 27          97 / 78
    Engaging and fun experience                       0 / 1            0 / 8     0 / 32     39 / 39    61 / 19          100 / 58
    Would like similar in other classes               0 / 4            3 / 16    15 / 30    36 / 31    46 / 19          82 / 50

    Participant demographics: Gender: Female 24.8%, Male 75.2%. Age: 20-30 years 16.2%, 31-40 years 34.7%, 41-50 years 36%, 51-60 years 10.4%, over 60 years 2.7%.
    Initial Data Reliability and Internal Consistency
    • Reliability refers to the ability of the measurement instrument to give similar results for similar inputs.
    • Reliability analysis was conducted using two models: Cronbach's alpha (a model of internal consistency based on the average inter-item correlation) and Split-Half (which splits the scale into two parts and examines the correlation between the parts).
    • The Cronbach's alpha (0.940) and Split-Half (0.894 vs 0.884) values for the analyzed survey instrument confirm a high level of reliability (Appendix C).
    • The inter-item correlations (783 of the 1225 are larger than 0.30) also confirm the suitability of the data for factor analysis (Appendix D).

    Exploratory Factor Analysis and Reliability Estimates
    • Item reduction and factor identification: principal axis factoring with Varimax rotation was performed on the 35 student-engagement-in-group-projects items.
    • The guiding questions were: "How many components (factors) are needed to represent the variables?" and "What do these components represent?"
    • After analysis of a six-factor (Appendix E) and a four-factor (Appendix F) solution and inspection of a scree plot (Appendix G), four factors were retained: interpretability was difficult beyond four factors, and the scree plot slope flattens after four factors.
    • Only one question, "Stepped in when a teammate was not performing", was dropped, due to low communality and low loadings on all factors.
    • The four factors accounted for 59.15% of the variance (19.69%, 14.04%, 11.69%, and 8.23% respectively); Appendix I shows the factor solution and the items for each factor.
    • The factors were titled: Effort, Teamwork, Motivation, and Organization.
    • The four factors show reasonable reliability (.926, .838, .889, and .766 respectively), and the low correlations among the factors (the highest is .088) support the discriminant validity of the SGPEQ measure.

    Further Verification of Validity. SGPEQ validity was further verified through a study of the relationships between the identified factors and (a) students' self-reported engagement and (b) endorsement of self-theories (Appendix B and Appendix K).

    Self-Reported Engagement. Factors predicting absolute engagement (in this group project) and relative engagement (compared with other projects) were determined by two analyses that regressed each of the two dependent variables on the four SGPEQ factors.
    • Absolute engagement ("How engaged were you in this group project?"): the factors account for 63.1% of the variance in absolute engagement (R square from the model summary), and the model is significant, F(4, 166) = 70.81, p < .001. Factor 1 (β = .611), Factor 2 (β = .174), Factor 3 (β = .325), and Factor 4 (β = .236) are all positive predictors of absolute engagement.
    • Relative engagement ("How engaged were you in this group project compared to other group projects you worked on during the same semester?"): the factors account for 44% of the variance in relative engagement, F(4, 166) = 32.54, p < .05. Factor 1 (β = .387), Factor 2 (β = .259), Factor 3 (β = .326), and Factor 4 (β = .233) are all positive predictors of relative engagement.
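    The regressions above follow a standard pattern: standardize the variables, regress the engagement rating on the four factor scores, and read off R square, the F test, and the standardized betas (the slides reference SPSS-style Model Summary and ANOVA tables). The Python sketch below is only a schematic reconstruction of that workflow with a hypothetical sgpeq_scores.csv; it is not the authors' analysis script.

        # Schematic reconstruction of the engagement regressions; data file and column
        # names (four factor scores plus a 1-6 engagement rating) are hypothetical.
        import pandas as pd
        import statsmodels.formula.api as smf

        data = pd.read_csv("sgpeq_scores.csv")
        cols = ["effort", "teamwork", "motivation", "organization", "engagement"]

        # Standardize predictors and outcome so the coefficients are standardized betas.
        z = (data[cols] - data[cols].mean()) / data[cols].std()

        model = smf.ols("engagement ~ effort + teamwork + motivation + organization", data=z).fit()
        print(model.rsquared)                 # share of variance explained (R square)
        print(model.fvalue, model.f_pvalue)   # overall model F test
        print(model.params)                   # standardized betas for the four factors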
    Incremental Self-Theories. Incremental and entity self-theories were assessed with the item "You have a certain amount of intelligence and you cannot do much to change it." Regression of belief in incremental theory on the four factors shows that the factors account for only a small proportion of the variance in belief in incremental theory, F(4, 166) = 2.6. Factor 1 (β = .024), Factor 2 (β = .141), and Factor 4 (β = .171) are positive predictors of belief in incremental theory.

    The Validated SGPEQ
    Effort: Did good work on my part; Communicated clearly and effectively; Was creative and productive; Was organized and prepared; Fulfilled the assigned role; Presented the final research product clearly and effectively; Contributed to discussions with ideas and opinions; Applied critical thinking and problem solving; Completed all assigned tasks on time; Attended all group meetings; Put forth effort; Developed leadership skills.
    Motivation: Thought about project activities between meetings; Was inspired to learn and contribute; Found ways to make project interesting to me; Found project activities relevant to my life; Found project academically challenging; Was motivated and enthusiastic.
    Teamwork: Had fun during team activities; Trusted teammates will do well on their project parts; Felt presence of team members during meetings/presentations (as if in person); Got to know teammates' strengths; Preferred teamwork to working on my own; Would like to have similar projects in other classes; Experienced sense of discovery and accomplishment; Was confident that we can learn and do well on the project; Incorporated teammates' ideas and opinions; Preferred to work on my own.
    Organization: Helped/tutored teammates during project activities; Found ways to involve non-participating team members; Took detailed notes during discussion meetings; Rehearsed for project presentation; Wished my teammates were working harder than me; Worked on the project on a regular basis.

    References
    [1] AERA, APA, & NCME (1999). Standards for Educational and Psychological Testing. http://teststandards.org
    [3] Carini, R., Kuh, G., & Klein, S. (2006). Student Engagement and Student Learning: Testing the Linkages. Research in Higher Education, 47(1). http://gov.alaska.edu/faculty/StudentSuccess/TintoReview-Carini-Kuh-Klein.pdf
    [4] Clark, L. & Watson, D. (1995). Constructing validity: Basic issues in objective scale development. Psychological Assessment, 7, 309-319.
    [5] Community College Survey of Student Engagement. University of Texas at Austin. www.ccsse.org
    [6] Cooperative Institutional Research Program (CIRP) Surveys. Higher Education Research Institute (HERI). http://heri.ucla.edu/herisurveys.php
    [7] DeVellis, R. (2003). Scale development: Theory and applications. Thousand Oaks, CA: Sage Publications.
    [8] Dweck, C. (1999). Self-Theories: Their role in motivation, personality, and development. Philadelphia: The Psychology Press.
    [9] Handelsman, M., Briggs, W., Sullivan, N., & Towler, A. (2005). A Measure of College Student Engagement. Journal of Educational Research, 98(3), 184-191. http://www.stanford.edu/dept/SUSE/projects/ireport/articles/self-regulation/self-course%20engement%20measure.pdf
    [10] Hinkin, T. (1998). A brief tutorial on the development of measures for use in survey questionnaires. Organizational Research Methods, 1, 104-121.
    [12] Kerlinger, F. N. (1986). Foundations of Behavioral Research. Holt, Rinehart and Winston.
    [13] Likert, R. (1931). A technique for measurement of attitudes. Archives of Psychology. New York: Columbia University Press.
    [14] Litwin, M. S. (2003). How to assess and interpret survey psychometrics, 2nd edition. Thousand Oaks, CA: Sage Publications.
    [15] McMillan, J. & Schumacher, S. (2001). Research in education: A conceptual introduction. New York: Longman.
    [16] Molinari, J. & Huonker, J. (2010). Diagnosing student engagement in the business school classroom. Journal of the Academy of Business Education.
    [17] National Survey of Student Engagement (NSSE). Center for Postsecondary Research, Indiana University Bloomington. http://nsse.iub.edu/
    [18] O'Donnell, A., Reeve, J., & Smith, J. (2009). Educational Psychology: Reflection for Action, Chapter 11. Wiley.
    [19] Pike, G. & Kuh, G. (2005). A Typology of Student Engagement for American Colleges and Universities. Research in Higher Education, 46(2). http://cpr.iub.edu/uploads/Pike,%20Kuh%20(2005)%20A%20Typology%20of%20Student%20Engagement%20for%20American%20Colleges%20and%20Universities.pdf
    [20] Robinson, C. & Hullinger, H. (2008). New Benchmarks in Higher Education: Student Engagement in Online Learning. Journal of Education for Business. http://cyber.law.harvard.edu/communia2010/sites/communia2010/images/Robinson_et_al_2008_New_Benchmarks_in_Higher_Education_Student_Engagement_in_Online_
    [21] Schoenfeldt, L. F. (1984). Psychometric properties of organizational research instruments. In T. S. Bateman & G. R. Ferris (Eds.), Method & analysis in organizational research (pp. 68-80). Reston, VA: Reston.
    [22] Skinner, E. A. & Belmont, M. J. (1993). Motivation in the classroom: Reciprocal effects of teacher behavior and student engagement across the school year. Journal of Educational Psychology, 85(4), 572.

    Conducting immersive group projects for core graduate level IT courses

    No full text
    Abstract: Academia has to emphasize developing virtual teamwork and leadership skills, as they are increasingly essential in the new era of globalization and information technology. An innovative approach to conducting immersive group projects has been developed and implemented to increase students' engagement and satisfaction and to develop team-building, presentation, and collaboration skills. The instructor and the teams utilize virtual worlds and cloud computing technologies as enhancements to the standard online class environment. Second Life, as a representative of virtual worlds, is used for avatar-based real-time kick-off and regular meetings, team-building simulation games and scavenger hunts, research on virtual demonstrations and simulations related to the course objectives, and final project presentations. BatchGeo and Google Docs, as representatives of cloud computing, are used for collaboration on team formation, meeting scheduling, and preparation of current project deliverables and final project presentations.

    Conducting Immersive Group Projects for Core Graduate Level IT Courses. Irena Bojanova, Ph.D., Graduate School of Management and Technology, University of Maryland University College.

    Main Finding: a study on student experiences from immersive and traditional online group projects revealed that there is a need for a reliable and valid measure of student engagement in group projects.

    Agenda: 1. Introduction and Background. 2. Study 1: student experiences from immersive and traditional group projects. 3. Study 2: conducted research and available measures of student engagement. 4. Further work.

    Introduction and Background
    • Graduate programs have to emphasize group projects.
    • ITEC620 offers immersive and traditional group projects.

    Immersive Group Project
    • Virtual team-building and research activities.
    • Avatar-based meetings and presentations, virtual tours, simulation games, and scavenger hunts.
    • Focus on enhancing course objectives.

    Research Objectives
    • Examine student experiences in immersive vs. traditional group projects and identify a distinguishable construct.
    • Review conducted research on the identified construct and the available measurement instruments.
    • Develop a preliminary scale for a reliable and valid measure of the identified construct.

    Study 1: Examination of student experiences in immersive vs. traditional group projects.
    Purpose: analyze student experiences from immersive and traditional online group projects, and identify a distinguishable construct worth studying further via a reliable, valid, multi-dimensional measure.
    Data: invited all students from 7 ITEC 620 sections; collected responses from 107 students (33 immersive, 74 traditional): more experts in immersive (33% vs. 3%) but also more newbies (6% vs. 3%), more students not working in immersive (12% vs. 3%), almost the same percentages of female vs. male (30% / 27%), and more students over 35 in immersive (60% vs. 47%).
    Method: develop a multiple-item instrument with Likert-type scaling; administer the questionnaire during the last week of the semester; measure instrument reliability and internal consistency with Cronbach's α coefficient of reliability and the Pearson correlation coefficient, r.

    Cronbach's α definition: Cronbach's α is a scale reliability coefficient in [0.0, +1.0] used for testing a survey's internal consistency: α = (K / (K − 1)) × (1 − Σ σ_i² / σ_T²), where K is the number of testlets, σ_i² is the variance of the i-th testlet, and σ_T² is the variance of the total test scores.
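    A minimal numeric sketch of the coefficient defined above (the item-response matrix here is made-up example data, not the study's survey responses):

        # Cronbach's alpha for an item-response matrix (rows = respondents, columns = items).
        import numpy as np

        def cronbach_alpha(responses: np.ndarray) -> float:
            """alpha = K/(K-1) * (1 - sum of item variances / variance of total scores)."""
            k = responses.shape[1]                         # number of items (testlets)
            item_vars = responses.var(axis=0, ddof=1)      # variance of each item
            total_var = responses.sum(axis=1).var(ddof=1)  # variance of the total scores
            return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

        # Made-up 4-item Likert responses from six students.
        responses = np.array([
            [4, 5, 4, 4],
            [3, 3, 4, 3],
            [5, 5, 5, 4],
            [2, 3, 2, 3],
            [4, 4, 5, 5],
            [3, 4, 3, 3],
        ])
        print(round(cronbach_alpha(responses), 3))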
    For the Study 1 survey, the Cronbach's α values are all above .721:
    Project Type    Cronbach's α    Mean for Test    Standard Deviation for Test
    Immersive       .906            4.036            .647
    Traditional     .925            3.831            .723

    Interpretation of Cronbach's α (internal consistency):
    α ≥ .9          Excellent
    .9 > α ≥ .8     Good
    .8 > α ≥ .7     Acceptable
    .7 > α ≥ .6     Questionable
    .6 > α ≥ .5     Poor
    .5 > α          Unacceptable

    Pearson coefficient definition: the Pearson product-moment correlation coefficient, r, in [−1.0, +1.0], reflects the extent of a linear relationship between two variables X and Y: r = Σ(xᵢ − x̄)(yᵢ − ȳ) / √(Σ(xᵢ − x̄)² · Σ(yᵢ − ȳ)²).

    Pearson's coefficients for the Study 1 survey are all > 0 (Immersive / Traditional project):
                                                          Q1     Q2     Q3     Q4     Q5     Q6     Q7     Q8     Q9     Q10
    Q1: Communication/presentation skills                 1      .713   .876   .695   .618   .474   .487   .508   .607   .677
    Q2: Technical skills                                   .755  1      .692   .601   .583   .518   .490   .399   .595   .613
    Q3: Team-building skills                               .764   .722  1      .752   .608   .428   .463   .470   .632   .658
    Q4: Leadership skills                                  .649   .540   .746  1      .524   .278   .429   .386   .496   .460
    Q5: Understand better course material                  .459   .635   .497   .490  1      .600   .526   .410   .379   .483
    Q6: Academically challenging                           .459   .696   .513   .438   .633  1      .710   .560   .451   .483
    Q7: Develop critical thinking and problem solving      .453   .646   .525   .464   .655   .705  1      .765   .463   .449
    Q8: Provoke curiosity and sense of discovery           .256   .334   .342   .434   .321   .335   .519  1      .617   .543
    Q9: Engaging and fun experience                        .396   .225   .377   .579   .467   .220   .439   .720  1      .700
    Q10: Would like similar in other classes               .368   .271   .383   .597   .491   .272   .359   .611   .770  1

    Results
    • Immersive projects: engaging and fun, help develop communication/presentation and technical skills; students want them in other classes.
    • Traditional projects: academically challenging, help understand the material and develop critical thinking, problem solving, and leadership skills.
    • Both types of projects: help develop team-building skills.

    Immersive vs. Traditional Project (immersive % / traditional %):
    Survey Question                                   Definitely Not   No        Maybe      Yes        Definitely Yes   All Yes
    Communication/presentation skills                 3 / 3            9 / 7     15 / 22    52 / 43    21 / 26          73 / 69
    Technical skills                                  3 / 4            3 / 12    15 / 19    55 / 42    24 / 23          79 / 65
    Team-building skills                              3 / 3            12 / 7    6 / 11     42 / 46    37 / 34          79 / 80
    Leadership skills                                 3 / 3            9 / 8     30 / 22    33 / 50    25 / 18          58 / 68
    Understand better course material                 3 / 3            6 / 3     27 / 16    46 / 52    18 / 26          64 / 78
    Academically challenging                          0 / 1            12 / 5    18 / 16    49 / 54    21 / 23          70 / 77
    Develop critical thinking and problem solving     0 / 1            9 / 3     12 / 11    46 / 57    34 / 28          79 / 85
    Provoke curiosity and sense of discovery          0 / 1            0 / 1     3 / 19     30 / 51    67 / 27          97 / 78
    Engaging and fun experience                       0 / 1            0 / 8     0 / 32     39 / 39    61 / 19          100 / 58
    Would like similar in other classes               0 / 4            3 / 16    15 / 30    36 / 31    46 / 19          82 / 50

    Correlations: students want immersive projects because of the engaging and fun experience, which is correlated with provoking curiosity and a sense of discovery.
    Strong correlations (r > .7) by project type:
    • Immersive only: "Technical skills" and "Team-building skills"; "Engaging and fun experience" and "Provoke curiosity and sense of discovery".
    • Traditional only: none.
    • Both: "Academically challenging" and "Develop critical thinking and problem solving"; "Communication/presentation skills" and "Technical skills"; "Communication/presentation skills" and "Team-building skills"; "Leadership skills" and "Team-building skills"; "Would like similar in other classes" and "Engaging and fun experience".

    Additional Correlations. Additional correlations reveal some intriguing facts:
    • Immersive: "Time spent on research" does not correlate with "Academically challenging" (.024), but correlates negatively with "Team-building skills" (−.371), "Leadership skills" (−.316), and "Engaging and fun experience" (−.306). "Time spent on presentations" correlates negatively with the student's technical level. "Time spent on all research, collaboration, and presentations" correlates negatively with "Develop critical thinking and problem solving" (−.436) and "Engaging and fun experience" (−.312).
    • Traditional: "Time spent on research, collaboration, and presentations" correlates with "Would like similar in other classes" (.302). 44% of the students who worked on traditional projects had the technology to work on immersive projects; 34% of them did not check whether they had the technology.

    Main Findings
    1. Students' engagement in group projects is a distinguishable construct worth studying further.
    2. There is a need for a reliable and valid measure of student engagement in group projects.

    Study 2: Examination of conducted student engagement research and available measures. First stage: the Student Group Project Engagement Questionnaire (SGPEQ) scale. Second stage (further work): SGPEQ exploratory factor analysis and reliability estimates.
    Purpose of the first stage: review conducted research and available measurement instruments of student engagement; start development of a reliable, valid, multidimensional measure of student engagement in group projects.

    Data and Method. Data: invited the same group of students as for Study 1 and all 19 active ITEC620 faculty members; collected data from 40 students and 12 faculty members. Method:
    • Review publications on student engagement research for definitions, approaches, and valid measurement instruments.
    • Administer an instrument asking students and faculty to describe what students who are engaged in group projects do, feel, and think.
    • Apply an inductive approach to capture the many potential dimensions of student engagement in group projects.
    • Refine the developed measure via expert panel discussions.

    Definition of Student Engagement
    • Student engagement is defined as "the extent of a student's behavioral intensity, emotional quality, and personal investment in a learning activity" (O'Donnell, Reeve, & Smith, 2009, Educational Psychology: Reflection for Action, Chapter 11, Wiley).
    • Engaged students "show sustained behavioral involvement in learning activities accompanied by a positive emotional tone.
They select tasks at the border of their competencies, initiate action when given the opportunity, and exert intense effort and concentration in the implementation of learning tasks; they show generally positive emotions during ongoing action, including enthusiasm, optimism, curiosity, and interest" (Skinner & Belmont, 1993, Journal of Educational Psychology, 85(4), p. 572).

    Measuring Student Engagement
    • Research on student engagement dates to the mid-1980s.
    • Student engagement has been assessed at the institutional and program level (e.g., HERI, NSSE, CCSSE) and at the course level, including specifically in online courses.
    • However, no reliable and valid measure of student engagement in group projects was identified.

    Developing a New Measure. A questionnaire was administered asking students and faculty to describe what students who are engaged in group projects do, feel, and think.
    Student survey questions: What do you do when you are truly engaged in group project work? What do you feel when you are truly engaged in group project work? What do you think when you are truly engaged in group project work?
    Faculty survey questions: What do students who are truly engaged in group project work do? What do they feel? What do they think?

    Results: an inductive approach was applied to capture the many potential dimensions of student engagement in group projects.

    Preliminary SGPEQ Scale. The preliminary scale was further refined by a focus group. Instructions: "To what extent do the following behaviors, thoughts, and feelings describe you in this group project? Please rate each of them on the following scale: 1 = Very little; 2 = Some; 3 = Quite a bit; 4 = Very much."
    SGPEQ questions by dimension:
    Effort: Worked on the project on a regular basis; Put forth effort; Preferred to work on my own; Took detailed notes during discussion meetings; Wished my teammates were working harder than me; Completed all assigned tasks on time; Rehearsed for project presentation.
    Relevance: Was motivated and enthusiastic; Found project activities relevant to my life; Thought about project activities between meetings; Found ways to make project interesting to me; Was inspired to learn and contribute; Felt presence of team members during meetings/presentations (as if in person); Found project academically challenging; Would like to have similar projects in other classes.
    Participation/Collaboration: Fulfilled the assigned role; Contributed to discussions with ideas and opinions; Got to know teammates' strengths; Had fun during team activities; Incorporated teammates' ideas and opinions; Helped/tutored teammates during project activities; Preferred teamwork to working on my own; Experienced sense of discovery and accomplishment; Trusted teammates will do well on their project parts; Found ways to involve non-participating team members; Stepped in when a teammate was not performing (+ N/A option if eve…).
    Performance: Was organized and prepared; Communicated clearly and effectively; Attended all group meetings; Applied critical thinking and problem solving; Did good work on my part; Was creative and productive; Developed leadership skills; Presented the final research product clearly and effectively; Was confident that we can learn and do well on the project.

    Further Work. Second stage: exploratory factor analysis and reliability estimates. Data: 180 students from 11 ITEC 620 sections will be invited.
    Method: a study on SGPEQ initial data reliability and validity:
    • Initial item reduction: exploratory factor analysis and reliability estimates.
    • Further verification of the measure's validity via students' self-reported engagement, endorsement of self-theories, and goal orientation.

    Additional Validity Questions
    • Global engagement: "How engaged were you in this group project?" (1 = not at all, 6 = extremely); "How engaged were you in this group project compared to other group projects you worked on during the same semester?" (1 = less engaged, 6 = more engaged).
    • Incremental theory: "I have a certain amount of intelligence and I cannot do much to change it." (1 = do not agree, 6 = strongly agree).
    • Motivational goals (learning vs. performance): "If I had to choose between getting a good grade and being challenged by the group project activities, I would choose: ___ 'good grade' ___ being challenged."
    • General: How much time did you spend on discussions, research, and the group project overall? What is your general opinion about group projects?

    References
    [1] Bojanova, I. (2010). Immersive Group Projects for Graduate IT Courses. 17th Sloan-C International Conference on Online Learning.
    [2] Bojanova, I. (2011). Team-Building with Virtual Simulations and Scavenger Hunts. 9th International Conference on Education and Information Systems, Technologies, and Applications (IESTA).
    [3] Carini, R., Kuh, G., & Klein, S. (2006). Student Engagement and Student Learning: Testing the Linkages. Research in Higher Education, 47(1).
    [4] Clark, L. & Watson, D. (1995). Constructing validity: Basic issues in objective scale development. Psychological Assessment, 7, 309-319.
    [5] Community College Survey of Student Engagement. University of Texas at Austin.
    [6] Cooperative Institutional Research Program (CIRP) Surveys. Higher Education Research Institute (HERI).
    [7] DeVellis, R. (2003). Scale development: Theory and applications. Thousand Oaks, CA: Sage Publications.
    [8] Dweck, C. (1999). Self-Theories: Their role in motivation, personality, and development. Philadelphia: The Psychology Press.
    [9] Handelsman, M., Briggs, W., Sullivan, N., & Towler, A. (2005). A Measure of College Student Engagement. Journal of Educational Research, 98(3).
    [10] Hinkin, T. (1998). A brief tutorial on the development of measures for use in survey questionnaires. Organizational Research Methods, 1.
    [11] Holbeche, L. (2005). The High Performance Organization. Elsevier.
    [12] Likert, R. (1931). A technique for measurement of attitudes. Archives of Psychology. New York: Columbia University Press.
    [13] Litwin, M. S. (2003). How to assess and interpret survey psychometrics, 2nd edition. Thousand Oaks, CA: Sage Publications.
    [14] McMillan, J. & Schumacher, S. (2001). Research in education: A conceptual introduction. New York: Longman.
    [15] Molinari, J. & Huonker, J. (2010). Diagnosing student engagement in the business school classroom. Journal of the Academy of Business Education.
    [16] National Survey of Student Engagement (NSSE). Center for Postsecondary Research, Indiana University Bloomington.
    [17] O'Donnell, A., Reeve, J., & Smith, J. (2009). Educational Psychology: Reflection for Action, Chapter 11. Wiley.
    [18] Pike, G. & Kuh, G. (2005). A Typology of Student Engagement for American Colleges and Universities. Research in Higher Education, 46(2).
    [19] Robinson, C. & Hullinger, H. (2008). New Benchmarks in Higher Education: Student Engagement in Online Learning. Journal of Education for Business.
    [20] Skinner, E. A. & Belmont, M. J. (1993). Motivation in the classroom: Reciprocal effects of teacher behavior and student engagement across the school year. Journal of Educational Psychology, 85(4), 572.