
    Freshmen Who Plan to Transfer (Analysis)

    On the 2009 WELS baseline survey of incoming fall 2009 freshmen, thirteen percent indicated some likelihood of transferring prior to graduation. Western administrators are interested in the retention rate of these students, as well as their demographic and educational history characteristics. This is a brief exploratory analysis of these questions.

    Academic Performance of Native and Transfer Students

    In the Fall quarter of 2009, Western Washington University enrolled about 900 transfer students, roughly one-third the size of the incoming freshman class that quarter. More transfers were later admitted in the winter and spring quarters. Given the large numbers of transfer students attending Western and the likelihood of increased reliance upon transfers in the future, it is important to understand what, if any, performance differences exist between transfer and native students. This report compares the academic success of natives and transfers using two measures: grades earned after reaching 90 credits and completion of a Western degree. In order to make comparisons as precise as possible, this paper pools transfers and native (non-Running Start) students over a 7-year period (Fall, 2002 through Fall, 2009). To make transfers and natives as comparable as possible, the analysis excludes all natives who failed to earn 90 credits and all transfers who arrived at Western with fewer than 90 credits. The remaining 23,951 students are at roughly the same place in their academic careers: both groups need to earn about 90 more credits to graduate, and both should be beginning to focus on their major and upper-division coursework. Basic descriptive statistics suggest that natives hold a significant advantage over transfers in their probability of graduating. Of the 11,784 native students who achieved 90 credits at Western, 64.6% eventually graduated, while 51.6% of transfer students who came to Western with 90 or more credits graduated. On average, native students also earn higher GPAs than transfers. In courses taken after their 90th credit, natives average GPAs of 3.13 while transfers who come to Western with 90 credits average a GPA of 3.02. However, if one restricts the sample to students who attempt 30 credits at Western (after earning their 90th), native and transfer GPAs are statistically indistinguishable (3.15 vs. 3.14). What appears to be happening over those first 30 credits is that transfers perform significantly worse than natives and many transfers drop out; those who remain perform as well as natives. The differences in overall GPA also appear in selected “gateway” courses. Fourteen courses were identified by Institutional Research as courses that large numbers of students must complete to enter one or more majors. In six of these fourteen courses, natives hold a statistical edge in GPA relative to transfers; in the other eight, transfers and natives are statistically indistinguishable. While both measures of academic success suggest a native-transfer difference, one must take care when making these types of comparisons. As a group, transfers differ significantly from natives in ways other than academic performance. Transfers to Western are 50% more likely than natives to be first-generation college students. Transfers are older, more likely to be from disadvantaged racial groups, less sure of their field of study, and interested in different academic fields than natives who completed 90 or more Western credits. Given these differences, this paper explores whether academic success is driven by a true native-transfer difference or whether transfers underperform relative to natives because they have different backgrounds (for instance, they are more likely to come from environments that undervalue higher education). 
After controlling for these observables using various statistical methods, there is no evidence to suggest that transfers and natives differ in their conditional performance in either the fourteen gateway courses or in their overall Western GPA. Said another way, despite natives averaging higher GPAs and performing better in select gateway courses, these differences are explained by the fact that transfers are more likely to be first-generation students (among other differences), and, after accounting for these initial differences, transfers and natives average similar GPAs. The similarity in average GPAs does not mean, however, that the distribution of GPAs is the same across both groups. This paper provides evidence that past academic performance is positively correlated with GPAs earned at Western. However, the relationship between past performance and Western GPA differs between natives and transfers. Specifically, natives earning a high GPA on their first 90 credits average a significantly higher GPA on their subsequent 90 credits than do transfer students who earned the same high initial GPA at their prior institution. Interestingly, students transferring to Western with a low GPA earn higher Western GPAs than natives earning the same initial low GPA. A few hypotheses strike me as plausible and, in order to save space, I suggest only one here: strong natives may more quickly identify their field of study and, because of their interest in this field, earn higher grades than similarly strong transfer students. Even though average GPAs are no different between natives and transfers, a large difference in the likelihood of graduating remains even after controlling for observables like first-generation status. One might expect that this occurs because a new transfer student, unused to the rigors of Western and its attendant stresses, would be likely to drop out shortly after arriving at Western. Yet, even after excluding transfers who failed to attempt 30 credits at Western (their 120th higher education credit), the probability of a native student graduating is 9.9% higher than that of a transfer. This native advantage remains even after controlling for a student’s background, prior academic performance, and field of study. Not only are natives more likely to graduate than transfers, they are likely to do so faster. After controlling for observables, natives are 23.4% more likely to graduate within 2.5 years of earning their 90th credit than are transfer students. A number of factors may contribute to this, including the ability to gain direction during a native’s early years on campus, greater difficulty among transfer students in obtaining the courses necessary to declare a major, and a higher propensity among transfer students to drop out of the university after receiving poor grades shortly after earning their 90th credit. In addition to comparing native and transfer academic performance, the data used in this paper provide the opportunity to compare transfer students by their originating institution. Among community college students, there are large differences in performance upon arrival at Western, as measured by both GPA and likelihood of graduating. For instance, North Seattle Community College students average a 3.24 Western GPA while Bellingham Tech students average a 2.53. Half of Spokane Community College students graduate from Western while over 82% of Lower Columbia Community College students do. 
However, after controlling for observables, there are few community colleges that produce students who perform better or worse than others upon arriving at Western. Nor are there differences between students from 2-year public community colleges and students who transfer from 4-year public or private schools. The one group of students that consistently underperforms at Western is students who arrive from 2-year private schools. Yet even these students are primarily products of one institution: the Northwest Indian College. Distinguishing between the success of students from this particular college and that of their peers from other 2-year private schools is beyond the scope of this work.
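
    The report says only that it controls for observables using “various statistical methods.” Purely as an illustration of the kind of analysis involved, and not the report’s actual code, the sketch below estimates a conditional native-transfer gap with a logistic regression of graduation on transfer status plus background controls. The file name and column names (graduated, transfer, first_gen, age, prior_gpa, major_field) are hypothetical.

```python
# Illustrative sketch only: compare raw and conditional native-transfer gaps.
# All file and column names are hypothetical; the underlying report does not
# publish its data layout or exact specification.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("students.csv")  # hypothetical pooled 2002-2009 cohort

# Raw (unconditional) graduation rates by native/transfer status.
print(students.groupby("transfer")["graduated"].mean())

# Conditional gap: logistic regression of graduation on transfer status,
# controlling for first-generation status, age, prior GPA, and field of study.
model = smf.logit(
    "graduated ~ transfer + first_gen + age + prior_gpa + C(major_field)",
    data=students,
).fit()
print(model.summary())
```

    In a specification of this kind, the coefficient on transfer captures the native-transfer difference that remains after the controls, which is the quantity the report describes.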

    Western Educational Longitudinal Study (WELS) Baseline Survey of Students Entering Western in the Fall, 2009

    The 2009 WELS survey of incoming freshmen (the Survey) continues the Office of Survey Research’s (OSR) efforts to collect information on all students prior to the start of their academic careers at Western Washington University. This survey represents the initial contact in a longitudinal process that makes additional inquiries of students at the end of their sophomore year, when they graduate from the university, and one to two years after graduation. The purpose of the incoming freshmen survey is threefold: (1) to assess student needs based upon their self-reported characteristics, perceptions, and concerns; (2) to provide data that can assist university assessment and accreditation activities; and (3) to provide baseline observations of students prior to their Western experience which can be used to forecast and enhance student success. The OSR uses a mixture of online and telephone survey methodologies to obtain survey responses. Incoming freshmen who attended Western’s Summerstart program were provided an opportunity to complete this survey as part of their Summerstart experience. Students not attending Summerstart, and those who chose not to complete the survey while at Summerstart, were invited to complete the survey via e-mail. After the initial e-mail, OSR sent two e-mail reminders to non-responders. These reminders were followed by phone calls placed by WWU students encouraging completion of the survey. The survey was then left open until the weekend before Fall quarter courses began on campus. Of the 2,696 Fall 2009 freshmen, 2,454 responded to the survey (a response rate of 91.0%). Of the 2,454 respondents, 2,306 provided responses to the final question of the survey, suggesting low attrition within the survey. This report provides data from all questions that lent themselves to a numerical summary and lists the open-ended questions asked of students. Responses to the open-ended questions are available upon request.

    Fall 2013 Survey of Non-Returning Students: Descriptive Statistics

    The 2013 Non-Returning Student Survey (NRS) is Western’s first large-scale survey of students who dropped out or stopped out since OIART conducted a Survey of Non-Returning Students in June, 2001. The goal of the NRS was to identify reasons for students’ failure to continuously enroll, and to identify improvements Western could implement to aid at-risk students. The NRS was designed in conjunction with Western’s Office of Institutional Research and Division of Enrollment and Student Services. The sample for the NRS covers all undergraduate, degree-seeking students who were enrolled during fall, winter, or spring quarter between fall, 2011 and spring, 2013. From these, OSR removed all students who graduated, all post-baccalaureate students, and any student who was dropped from the university due to poor academic performance. From the remainder, OSR identified 2,333 students who failed to enroll at Western during fall quarter, 2013. These 2,333 students are OSR’s survey sample. A unique feature of the NRS is that all students, not just those who complete OSR’s surveys, can be tracked after leaving Western through the National Student Clearinghouse (Clearinghouse), a service that follows individual students as they enroll in nearly any U.S. 2- or 4-year institution of higher education. Section A.1 of this report provides information on Western’s non-returning students gleaned from the Clearinghouse. Of the 2,333 students failing to return to Western, 1,329 (57%) were recorded by the Clearinghouse as attending at least one institution after leaving Western. Of these, just over one-half (52%, 690 students) attended a public two-year college while 42% (561 students) attended a public four-year university. All but one of the remainder (6%) attended a private, four-year school. The most common 2-year schools were Whatcom Community College (193 students), Bellevue (85), Everett (54), Skagit (40), Bellingham Tech (34), and Olympic College (34). The most common 4-year schools attended by former Western students were the University of Washington (86), Washington State University (64), Eastern Washington University (34), and Central Washington University (26). Western has admission index (AI) measures for 1,729 of the 2,333 non-returning students in the sample. Thirty-five percent had an AI greater than 60 and nearly another 25% had AIs between 50 and 60. By the time of their withdrawal, nearly one-third of non-returning students had earned cumulative WWU GPAs greater than 3.0. On the other hand, 19% of non-returning students earned cumulative WWU GPAs less than 2.0. Interestingly, 219 students left Western having already accumulated more than 180 credits, a group that perhaps, with a little effort, could be convinced to return to earn their degrees. Using the Clearinghouse data, OSR can also identify the types of students who attend other institutions after leaving Western. Of the 2,333 non-returning students, 214 had declared a major or pre-major in, or had expressed interest in, Physical Education, Health & Recreation; 172 in Psychology; 133 in Biology; 124 in Engineering Technology; and 106 in Elementary Education. It is important to note that many of these students may have simply expressed an interest in these subjects and had yet to declare a major in them. Among those who had actually declared a major, 81 students who failed to return to Western were in PEHR, 43 in English, 37 in Fairhaven, and 33 in Art. 
Beginning on October 8, 2013, OSR sent e-mail invitations to the sample using the last known external e-mail address of these students. OSR also attempted to use students’ internal e-mail addresses on the off chance that some students continued to use their Western accounts. Students failing to respond to e-mail solicitations were then called at their last known cell phone or permanent phone number. The survey concluded on November 22, 2013. OSR received survey responses from 946 students, a response rate of 40.5%. Of these 946, 212 responded to the survey over the telephone. As with any survey, readers should be concerned with sample selection bias; that is, bias that arises because respondents are often a non-random selection of the population of potential respondents. While sample selection bias is mitigated by proper survey techniques and a relatively high survey response rate, it is of special concern in a survey of individuals who have left Western because many have varying degrees of commitment to the university. For instance, 24% of respondents claim to be “Very likely” to return to Western to continue their education. Of course, it is exactly these types of students one would expect to respond to a survey e-mail or phone call. To explore sample selection bias, section A.2 of this report compares a number of observable characteristics across respondents, non-respondents, and students who remained at Western. Unlike in most surveys, respondents were slightly less likely to be female (52.4% of respondents were female whereas 54.8% of all non-returning students were female). As mentioned in the introduction, this survey spanned two academic years (2011-2012 and 2012-2013). Students who attended in the most recent academic year were more likely to respond to the survey than those in the prior year: fully 63% of the responses come from the most recent attendees whereas they make up 57% of the population. Respondents also tended to be slightly stronger academically, with average AIs of 54.3 and average Western GPAs of 2.65, relative to the entire population, which averaged AIs of 53.3 and Western GPAs of 2.53. At the same time, respondents and non-respondents had similar measures of accumulated credits, Running Start and transfer student status, racial profiles, and Washington State residency status. Turning to the survey results, 91% of respondents originally enrolled at Western in hopes of earning a degree from Western. Nearly all of the remainder enrolled in hopes of transferring to another institution. Of those originally wishing to transfer, almost three-fourths had since enrolled in their preferred institution. When asked about their current activity, nearly one-half of non-returning students had enrolled at another institution while 40% were working for pay. As noted above, 24% of survey respondents indicated they were “very likely” to return to Western. As of the beginning of Winter quarter (2014), 142 respondents, or 6% of the non-returning students, had re-enrolled at Western. Respondents were allowed to choose as many reasons as they liked to describe why they left Western. The three most common “broad” reasons given were finances, academics, and family/personal reasons. When asked to be more specific about financial reasons, nearly one-in-five students claimed their student loans were too large, and a similar number stated that their own or their family’s financial situation had changed. About one-in-eight claimed that not receiving financial aid contributed to their decision to leave Western. 
Forty percent of respondents cited academic reasons for leaving Western. Of those, nearly one-third claimed to have had academic problems at Western, about one-third believed another school had a better program in their field, and a similar number remained unsure about what they wanted to study. A smaller fraction of students left because Western did not offer an academic program of interest. The most common program mentioned, by 17 students, was nursing, and 23 students have since enrolled in a nursing program. The NRS concludes with two open-ended questions asking students if there was “anything that could have been done to make your experience at Western more successful?” and “Is there anything else you would like us to know about your experiences at Western?” Of the 623 individuals who responded to the first of these questions, nearly one-half indicated that there was nothing Western could have done to make them more successful. Of the remainder, the most frequent responses were to provide better advising or access to advising (45), greater access to financial aid (39), and lower tuition (39). The NRS data is linkable to other Western data sources by unique student identifier. OSR welcomes and encourages campus researchers to use this data in further investigations of issues affecting student success.
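
    The respondent/non-respondent comparison in Section A.2 is a standard nonresponse-bias check: compare observable characteristics across groups. The following is only a sketch of how such a comparison might be computed; the file and column names are assumptions, not OSR’s actual data layout.

```python
# Illustrative sketch of a nonresponse-bias check: compare means of observable
# characteristics for respondents and non-respondents. All names are hypothetical.
import pandas as pd

sample = pd.read_csv("nrs_sample.csv")  # one row per non-returning student

# Academic measures by response status.
print(sample.groupby("responded")[["admission_index", "wwu_gpa", "credits_earned"]].mean())

# Shares female, transfer, and Running Start by response status.
print(sample.groupby("responded")[["female", "transfer", "running_start"]].mean())
```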

    Electronic Course Evaluations at Western Washington University: A Report of the Spring Quarter, 2010 Pilot

    Electronic course evaluations are becoming a popular, inexpensive substitute for traditional paper course evaluations. Electronic evaluations are easy to implement, reduce the impact on instructor time, are more uniform in their administration, and can reduce printing and paper costs. Further, some unanticipated benefits can accrue from electronic evaluations. For instance, students appear to respond in more detail to open-ended electronic questions than they would to the same question posed in paper format. While there are clear benefits from electronic course evaluations, there also exist pitfalls. Research suggests students view electronic evaluations as less anonymous, thereby bringing into question the validity of student responses. Two other common and related concerns are that electronic course evaluations receive fewer student responses and that those who do respond are not representative of the population of enrolled students. Student response rates and the impact of electronic course evaluations on instructor ratings are the focus of this report. The Office of Survey Research (OSR) conducted a controlled pilot of electronic course evaluations during Spring Quarter, 2010. This pilot provided the opportunity to learn about OSR’s ability to implement large-scale electronic evaluations and simultaneously investigate the impact of these evaluations relative to traditional paper evaluations. OSR piloted electronic evaluations with 21 WWU instructors teaching 23 different CRNs. Of these 23 CRNs, 3 were part of large, multiple-CRN courses whose other CRNs were evaluated with traditional paper forms, thus providing a control group with which to measure the impact of electronic course evaluations. Seven CRNs were taught by instructors who were simultaneously teaching at least one other section of the same course; these other sections serve as a control group. Thirteen CRNs were taught by instructors who taught the same course in a previous quarter; the courses in the prior quarters serve as a control group for these instructors. Student response rates on the electronic evaluations were considerably lower than the response rates in the paper evaluation control groups: 74.2% of enrolled students completed the paper evaluations while 56.8% completed electronic evaluations. This lower response rate is quantitatively consistent with the best peer-reviewed research estimate OSR could locate (an estimated decline of about 12%) and qualitatively consistent with the findings of institutional research directors at local colleges and universities. When within-instructor response rate estimates are computed, the response rate difference rises to almost 20%; thus OSR’s best estimate of the impact of electronic evaluations on student responses is that an additional one-in-five students will choose not to complete an electronic evaluation relative to a traditional paper evaluation. Given that student responses to any evaluation system are voluntary, it is interesting to ask whether student participation (or lack thereof) in electronic evaluations is random or systematic. One can think of arguments for why a decline in participation is not random. OSR’s electronic evaluations were completed on a student’s own time. Students who felt strongly about a course (either positively or negatively) would be more likely to use their time to complete an evaluation, while students who felt less strongly would be less likely to do so. As a result, the distribution of student evaluations may become bimodal. 
While OSR did not link individual student responses with identifying student information, OSR did track responses to specific evaluation questions like question #20 of the teaching evaluation form: “{The} Instructor’s contribution overall to the course was:” Relative to their control groups, the overall variance of responses to this question was considerably larger for electronic evaluations, a result consistent with response distributions becoming more bimodal. Further, the average electronic response to question #20 was two-tenths of a point lower (on a five-point scale) than the average paper response. Similar differences occurred in the other questions investigated. In summary, it appears that electronic evaluations reduce response rates by about 20%, reduce average instructor scores by a small amount (two-tenths of a point), and increase the variance of responses. While these differences may be attributable to the electronic format, some care should be taken in using these numbers. First, the psychological literature on the Hawthorne effect points out that individuals may be more likely to participate in an experiment because they believe they are helping in the creation of knowledge. If this occurred in our pilot, then one might expect even lower response rates after electronic evaluations are adopted. Further, the instructors participating in the experiment may not be representative of the population. If these instructors volunteered to participate because of their enthusiasm for electronic evaluations, then their enthusiasm may have been transmitted to their students, thus increasing response rates. A less enthusiastic instructor might receive fewer responses and possibly different ratings on items like question #20. The remainder of this report documents a listserv discussion regarding electronic course evaluations that took place among members of the Pacific Northwest Association of Institutional Researchers. This discussion involves many local institutions that have experimented with or implemented electronic course evaluations. This is followed by a literature review and a complete discussion of the Western Washington University pilot. The report concludes with an estimate of what it would take for OSR to implement a campus-wide electronic course evaluation system. To summarize the final section, OSR estimates that it would require a technically skilled employee to spend about 40 hours in initial setup time and about 50 hours per quarter to implement electronic course evaluations. However, this time commitment would serve only to program and e-mail the electronic evaluations to students. Additional time and computing storage space would be needed to store and disseminate results. Of course, these costs may be offset by the elimination of paper surveys.
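
    The within-instructor estimate described above pairs each pilot instructor’s electronic section with that same instructor’s paper-evaluated control before differencing response rates. As a rough sketch only (the data layout, column names, and mode labels are assumptions, not the pilot’s actual files), the computation could look like:

```python
# Illustrative sketch: within-instructor response-rate gap and the spread of
# question #20 scores by evaluation mode. All names are hypothetical.
import pandas as pd

sections = pd.read_csv("pilot_sections.csv")  # one row per CRN (pilot or control)
sections["response_rate"] = sections["responses"] / sections["enrolled"]

# Average paper-minus-electronic gap, computed instructor by instructor.
rates = sections.pivot_table(index="instructor", columns="mode", values="response_rate")
gap = (rates["paper"] - rates["electronic"]).mean()
print(f"Average within-instructor drop in response rate: {gap:.1%}")

# Mean and variance of question #20 responses by mode.
q20 = pd.read_csv("q20_responses.csv")  # one row per student response
print(q20.groupby("mode")["q20_score"].agg(["mean", "var"]))
```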

    Spring 2011 Follow-up Survey of Freshmen Who Entered Western in Fall of 2009

    The Spring 2011 Follow-Up Survey of Freshmen Who Entered Western in 2009 (2nd Year Survey) is part of a longitudinal effort to survey students with the goal of improving educational programs and providing self-assessment data. Together with the Vice Provost for Undergraduate Education, OSR designed this survey in an attempt to shed light on the efficacy of, and satisfaction with, first-year and GUR programs. In addition, a number of campus offices submitted questions to help assess their programs. Among these are the Math Center, the Honors Program, Western Libraries, University Residences, Environmental Health and Safety, and the Renewable Energy Degree initiative.

    Western Educational Longitudinal Study (WELS) Baseline Survey of Transfer Students Entering Western in the Fall, 2013: Descriptive Statistics

    The WELS Baseline Survey of Transfers Entering Western in the Fall, 2013 (Transfer Survey) is the companion survey to the Office of Survey Research’s (OSR) survey of incoming freshmen. Together, these surveys elicit information from students prior to the start of their Western academic careers and provide an initial contact in a longitudinal survey design that follows students through graduation and into their initial years as alumni. The Transfer Survey is designed with three purposes in mind: (1) to provide baseline observations of students prior to the Western experience that can be used to forecast and enhance student success; (2) to provide data that can assist university assessment and accreditation endeavors; and (3) to assess student needs based upon their self-reported characteristics, perceptions, and concerns. To accomplish these goals, the Transfer Survey organizes its questions into seven sections: prior engagement and experiences, the college application process, course scheduling, academic skills and goals, major choice, expenses and employment, and demographics. In addition to these, various Western offices submitted questions that dealt with academic advising and the use of technology. The questions on the Transfer Survey are a mixture of open-ended, numerical, and multiple-choice types. This report lists all questions and reports basic descriptive statistics for those that lend themselves to numerical analysis. Responses to open-ended questions are available upon request.

    Exit Survey of Undergraduate Students Completing Degrees in Summer 2011, Fall 2011, Winter 2012, and Spring 2012: Descriptive Statistics

    The Exit Survey of Undergraduate Students Completing Degrees in Summer, 2011 through Spring of 2012 (Exit Survey) is the fourth survey of graduating students conducted at Western Washington University. This survey is designed to illuminate departmental-, college-, and university-level information on student satisfaction, barriers to success, experiences in upper-division courses, and post-graduation plans. The Exit Survey also includes questions submitted to the Office of Survey Research (OSR) by the Division of Enrollment and Student Services, University Residences, and the Vice Provost of Undergraduate Education. The Exit Survey consists of a mixture of open-ended, multiple-choice, and numerical-response questions. This report provides descriptive statistics of the multiple-choice and numerical-response questions. In previous exit surveys, OSR surveyed only spring graduates. This Exit Survey includes responses from students graduating in the summer 2011, fall 2011, and winter 2012 quarters, in addition to spring 2012; that is, OSR contacted every student graduating between summer 2011 and spring 2012. Because of this, the number of graduates contacted nearly doubled, from 1,574 in the 2011 survey of spring quarter graduates to 2,964 graduates between summer, 2011 and spring, 2012. OSR is pleased to note that 2,150 students responded to the survey, representing 72.6% of all graduates, a response rate nearly 10% higher and covering a broader base than a year ago. OSR initiated the Exit Survey during the fifth week of each quarter with an e-mail sent by the chair of the recipient’s major department. This e-mail requested that respondents complete the Exit Survey using a link embedded within the e-mail. A follow-up e-mail from OSR was typically sent three days later to non-respondents, and the process was repeated at non-respondents’ off-campus e-mail addresses about one week later. OSR then sent a reminder to internal e-mail addresses the following week, and again to external addresses the subsequent week. Non-respondents were then contacted with phone call requests for their participation. This process ended the day before each quarter’s graduation exercises. As with any voluntary survey, readers should be concerned about sample selection bias; that is, bias which arises because survey respondents are not a random selection from the population of survey recipients. While sample selection bias for the Exit Survey is mitigated through proper survey techniques and a high response rate, its presence should be considered when evaluating the data. Section A of this document reports basic demographic and academic statistics of the 2012 spring graduates who responded to the survey and compares these to non-respondents. As found in the general literature on surveys, women were more likely to complete the survey (64.1% of respondents were women whereas 61% of graduates were women). Respondents appear to be slightly better students as measured by the admissions index (average of 58.7 for respondents versus 57.6 for all graduates) and WWU GPA (average of 3.20 for respondents versus 3.17 for all graduates). In other ways, respondents and non-respondents were remarkably similar. For instance, 23.2% of respondents were minorities while 21.6% of graduates were minorities. 
The average and median ages of respondents and non-respondents were nearly identical, and measures of first-generation status, transfer status, Washington residency, cumulative WWU credits earned, and credits taken during their final quarter on campus were very similar. Because graduates of the summer, fall, and winter quarters are potentially different from students who graduate in the spring, the inclusion of these students may make comparisons with prior surveys that excluded them difficult. Indeed, Section A.3 demonstrates that spring quarter graduates tend to have higher admissions indices than those graduating in other quarters (60.0 for spring quarter, 55.9 for summer quarter), have higher WWU GPAs, are less likely to be former Running Start students (especially relative to those who graduate in the fall), and are more likely to be in a non-Bellingham program. Because of these differences, OSR will provide comparison statistics on 2012 spring quarter graduates to those who request them. The remainder of this report contains university-level summary statistics for each question asked (Section B). This data is then disaggregated by college (Section C) and again by department (Sections D through J). Section K presents data from all questions by transfer/native-freshman status. The appendices to this report present count data on two of the open-ended questions: “In what ways has Western exceeded your expectations?” and “In what ways has Western fallen short of your expectations?” Hopefully, this disaggregation of data will aid colleges and departments in their self-assessment efforts. While OSR will leave it to the reader to decide what is informative or striking in this report, we point out some findings which the wider campus may find interesting. If provided the opportunity to start over, 84% of respondents would attend Western again, a number similar to those reported each year since OSR initiated exit surveys in 2009. Of those who would not attend Western again, the most frequently given reasons were that another school had a better program in the student’s field of study and that the student felt they had settled for a second-rate experience when they should have tried harder to get into a better school. It is important to note that these responses varied considerably across colleges, with relatively few CBE or Woodring students claiming another school had a better program. When asked about the length of time it took to graduate relative to their expectations at the time of enrollment, 65% of students claimed it took “less time than expected” or “as long as expected.” However, among those who graduated in the spring quarter, this rises to 74%, a result similar to prior surveys. For students who took longer to graduate than expected, the most frequently cited reasons for the delays were “I could not get the classes I needed” and “I changed my major.” This survey represents the first time the response regarding unavailable classes was one of the top two reasons listed for delayed graduation. This is reinforced by the fact that only 63% of students felt “very satisfied” or “satisfied” with course availability within their major. Again, these data vary considerably by college, with Fairhaven students demonstrating the highest average satisfaction with course availability and Huxley students the least. 
When asked about their upper-division studies, 90% of students expressed a positive level of satisfaction with the knowledge and expertise of faculty, and 85% were positively satisfied with the quality of instruction and the level of academic challenge. Sixty-five percent of students collaborated with a professor on a research or creative project outside of class, an increase of 7% over the prior year. Seventy-five percent of these students indicated that this experience contributed “quite a bit” or “a lot” to their learning. The average student graduated with an educational debt of just over $14,000, an increase of $1,000 over the prior year. However, this average hides the fact that 40% of graduates completed their education with no debt whatsoever. The average debt of those who did borrow was $25,445, an amount almost $2,000 greater than the prior year. Eighteen percent of students indicated that their student loans affected their decision to pursue a particular career. Fifty-nine percent of students indicated that their principal activity upon graduation would be full-time employment, while 17% expected to work part-time, increases of 3% and 4% over the past year. Of those expecting to work, 45% were looking for, but unable to find, a job at the time of survey completion, a number similar to the prior year. Thirteen percent of graduates hoped to attend a graduate program and, of these, 36% had accepted an offer of admission. One feature of the Exit Survey is that respondents are tracked using their W number, which provides OSR the opportunity to merge the student data with Western’s records and past OSR surveys. This ability opens the door to analyses of longitudinal issues that would otherwise be impossible. OSR is happy to share data or provide survey services upon request.

    Spring 2010 Follow-up Survey of Freshmen Who Entered Western in Fall of 2008: Descriptive Statistics

    The Spring 2010 Follow-Up Survey of Freshmen Who Entered Western in Fall 2008 (2nd Year Survey) holds particular importance to Western in that it focuses on student experiences in first-year programs and GUR courses. Together with the Vice Provost for Undergraduate Education and the Committee for Undergraduate Education, the Office of Survey Research (OSR) created this survey in an attempt to shed light on the efficacy of, and satisfaction with, programs designed to foster student success early in students’ Western careers. The 2nd Year Survey consists of a mixture of open-ended, multiple-choice, and numerical-response questions. This survey targeted native freshmen (including Running Start students) who entered Western in the Fall of 2008. These students were completing their second full year on campus at the time of the survey (Spring 2010). As part of OSR’s efforts to paint a longitudinal portrait of Western’s students, these students were also surveyed immediately prior to beginning their Western careers (Fall 2008 Baseline of Incoming Freshmen). In an attempt to measure the success of pre-calling, OSR initiated the 2nd Year Survey by first calling potential respondents and informing them that they would receive an e-mail survey shortly. These pre-calls were made on May 3, 2010, and the e-mail with an embedded link to the survey was sent on May 5th. Electronic reminders were sent to non-respondents three days later and phone call reminders were placed on May 12th. For non-respondents who provided the university with an external e-mail address, additional invitations and reminders were sent on May 13th and 17th. This was followed by a final phone call reminder during the following week. Of the 2,148 valid 2nd year students, OSR received survey responses from 1,355, a response rate of 63.1%. In addition to the contributions of the Vice Provost for Undergraduate Education and the Committee for Undergraduate Education, a number of other campus offices contributed questions to the 2nd Year Survey. Among these contributors are the Math Center, University Residences, and the Office of Sustainable Transportation. Because of the large number of questions these offices included, OSR assigned a number of questions to be randomly skipped by students (a brief sketch of this kind of random-skip assignment follows this summary). In effect, this random skipping shortened the time it took any individual student to complete the survey while still allowing a large number of questions to receive enough responses for statistical analysis. For each question, this report notes when randomization occurs. OSR did encounter a flaw in its programming of this randomization routine. On May 11th, after 569 completed surveys and 65 partially completed surveys had been received, OSR realized that the randomized questions were not being asked. This problem was corrected and the remaining 721 responses successfully received the appropriate randomized questions. As with any survey, readers should be concerned about sample selection bias; that is, bias which occurs because survey respondents are not a random selection from the population of survey recipients. While sample selection bias for the 2nd Year Survey is mitigated through proper survey techniques and a high response rate, its presence should be considered when evaluating data. Section A of this document reports basic demographic and academic statistics for all students who responded to the survey and compares them to non-respondents. 
As is consistent with OSR’s experience surveying Western students, women were more likely to respond to the 2nd Year Survey than men (63% of respondents were women whereas 59.2% of the population were women). Likewise, respondents hold slightly better Western grade point averages; the average respondent earned a cumulative GPA of 3.04 compared to a population average of 2.97. Respondents were also more likely to live on campus (32.1% of respondents versus 29.5% of the population) and attempted more credits during spring quarter than non-respondents (an average of 14.7 credits for respondents versus 14.5 for the population). Despite these differences, respondents and non-respondents were very similar in terms of race, Running Start status, residency in Washington, and first-generation status. The remainder of this report is composed of twelve additional sections, each focusing on an aspect of the student experience. Rather than describing each of these sections, here we focus on two: Section L (Department-Level Data) and Section M (Items Requested by Departments, Offices, and Programs). Some of the questions on this survey are better analyzed at the departmental level. For instance, questions about the quality of advising within a major or the likelihood of staying in a major are most helpful to specific departments. However, because some of these questions were randomly excluded from surveys and because many 2nd year students have yet to declare a major, only a small number of departments received enough responses to warrant documentation here. For departments which did receive a large number of responses, we included department-level breakdowns of questions dealing specifically with departmental issues in Section L. OSR will happily share data with departments that are not listed in this section. Section M contains summary data for questions submitted by other offices and departments. These include questions from the Math Center, the Western Reads Program, University Residences, and the Office of Sustainable Transportation. It is OSR’s intent to expand this section of the survey in the future as other departments add their own questions. While we leave it to the reader to decide what is informative or striking in this report, we highlight some findings which the wider campus may find interesting. Ninety-one percent of students claimed to be “satisfied” or “very satisfied” with their Western experience, but only 66% of students claimed that they were “very unlikely” to leave Western prior to graduation. For the sixty-one students who thought it probable that they would transfer from Western prior to graduating, the most common reasons given were that another school has a better program in their field, that they wanted to go somewhere new and different, or that Western does not offer a major of interest. In order to gauge the difficulty of registering for courses, students were asked how many classes that they wanted to take in the spring quarter were full. The average number of full courses was 1.66; about half of these were GUR courses and about 90% were required for the student’s major or pre-major. Students were also asked about courses that were too large. The three courses most frequently cited by 2nd year students as being too large were PSY 101, ESCI 101, and BIOL 101. Sadly, only 8% of students strongly agreed with the statement “Taking GUR courses gives me useful skills” and only 7% of students were “very satisfied” with GUR course availability. 
For students who had already declared a major, 63% claimed they were very unlikely to change their major prior to graduation. For undeclared students, 57% were certain as to what their major would be and 34% had some idea. Sixty percent of non-declared majors had contacted someone within a major department about their interest. The most common reason for not having declared a major was a need to take more courses to qualify for the major. Over the entire academic year, 35% of students claimed they did not write a single paper longer than 5 pages and 56% claimed to have written between one and four such papers. Given that 34% of students claimed that learning writing skills is very important to them, this lack of writing experience may explain why only 15% of students are “very satisfied” with the writing skills they have developed. Two features of the 2nd Year Survey are worth mentioning. First, it is part of a longitudinal cohort study which began with an OSR baseline survey prior to the beginning of the freshman year. OSR is happy to package this data and share it with interested researchers. Researchers may view the contents of the baseline survey given to these students at http://www.wwu.edu/socad/osr/wels. Second, each respondent in the 2nd Year Survey is tracked with a unique tracking number which OSR can match with university records. This ability opens the door for research on issues which affect students and the university. OSR will happily provide such data to researchers, departments, and offices upon request.
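
    The random question-skipping mentioned in the summary above amounts to giving each respondent a random subset of the optional question blocks, so that individual surveys stay short while every block still accumulates responses. A minimal sketch of that idea, with hypothetical block names and an assumed 50% keep probability (the report specifies neither), follows.

```python
# Illustrative sketch of random question-block skipping. Block names and the
# keep probability are hypothetical, not taken from the survey.
import random

OPTIONAL_BLOCKS = ["math_center", "western_reads", "residences", "transportation"]

def assign_blocks(keep_probability=0.5, seed=None):
    """Return the optional blocks one respondent will actually be asked."""
    rng = random.Random(seed)
    return [block for block in OPTIONAL_BLOCKS if rng.random() < keep_probability]

# Example: the question set for one respondent.
print(assign_blocks(seed=42))
```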