
    Freshmen Who Plan to Transfer (Analysis)

    On the 2009 WELS baseline survey of incoming fall 2009 freshmen, thirteen percent of respondents indicated some likelihood of transferring prior to graduation. Western administrators are interested in the retention rate of these students, as well as in their demographic and educational history characteristics. This is a brief exploratory analysis of these questions.

    Electronic Course Evaluations at Western Washington University: A Report of the Spring Quarter, 2010 Pilot

    Electronic course evaluations are becoming a popular, inexpensive substitute for traditional paper course evaluations. Electronic evaluations are easy to implement, reduce the impact on instructor time, are more uniform in their administration, and can reduce printing and paper costs. Further, some unexpected benefits can accrue from electronic evaluations. For instance, students appear to respond in more detail to open-ended electronic questions than they would to the same questions posed in paper format. While there are clear benefits from electronic course evaluations, there also exist pitfalls. Research suggests students view electronic evaluations as less anonymous, thereby bringing into question the validity of student responses. Two other common and related concerns are that electronic course evaluations receive fewer student responses and that those who do respond are not representative of the population of enrolled students. Student response rates and the impact of electronic course evaluations on instructor ratings are the focus of this report.
The Office of Survey Research (OSR) conducted a controlled pilot of electronic course evaluations during Spring Quarter, 2010. This pilot provided the opportunity to learn about OSR’s ability to implement large-scale electronic evaluations and simultaneously investigate the impact of these evaluations relative to traditional paper evaluations. OSR piloted electronic evaluations with 21 WWU instructors teaching 23 different CRNs. Of these 23 CRNs, 3 were part of large, multiple-CRN courses whose other CRNs were evaluated with traditional paper forms, thus providing a control group with which to measure the impact of electronic course evaluations. Seven CRNs were taught by instructors who were simultaneously teaching at least one other section of the same course; these other CRNs serve as a control group. Thirteen CRNs were taught by instructors who taught the same course in a previous quarter; the courses in the prior quarters serve as a control group for these instructors.
Student response rates on the electronic evaluations were considerably lower than the response rates in the paper evaluation control groups: 74.2% of enrolled students completed the paper evaluations while 56.8% completed electronic evaluations. This lower response rate is quantitatively consistent with the best peer-reviewed research estimate OSR could locate (an estimated decline of about 12%) and qualitatively consistent with the findings of institutional research directors at local colleges and universities. When within-instructor response rate estimates are computed, the difference rises to almost 20%; thus OSR’s best estimate of the impact of electronic evaluations on student responses is that an additional one-in-five students will choose not to complete an electronic evaluation relative to a traditional paper evaluation.
Given that student responses to any evaluation system are voluntary, it is interesting to ask whether student participation (or lack thereof) in electronic evaluations is random or systematic. One can construct arguments for why a decline in participation is not random. OSR’s electronic evaluations were completed on a student’s own time. Students who felt strongly (either positively or negatively) about a course would be more likely to use their time to complete an evaluation, while students who felt less strongly would be less likely to do so. As a result, the distribution of student evaluations may become bi-modal.
While OSR did not link individual student responses with identifying student information, OSR did track responses to specific evaluation questions like question #20 of the teaching evaluation form: “{The} Instructor’s contribution overall to the course was:” Relative to their control groups, the overall variance of responses to this question was considerably larger for electronic evaluations, a result consistent with response distributions becoming more bi-modal. Further, the average electronic response to question #20 was two-tenths of a point lower (on a five-point scale) than on the paper evaluations. Similar differences occurred in the other questions investigated. In summary, it appears that electronic evaluations reduce response rates by about 20%, reduce average instructor scores by a small amount (two-tenths of a point), and increase the variance of the responses.
While these differences may be attributable to the electronic format, some care should be taken in using these numbers. First, there is a psychological literature on the Hawthorne effect which points out that individuals are more likely to participate in an experiment because they believe they are helping in the creation of knowledge. If this occurred in our pilot, then one might expect even lower response rates after electronic evaluations are adopted. Further, the instructors participating in the experiment may not be representative of the population. If these instructors volunteered to participate because of their enthusiasm for electronic evaluations, then their enthusiasm may have been transmitted to their students, thus increasing response rates. A less enthusiastic instructor might receive fewer responses and possibly different ratings on items like question #20.
The remainder of this report documents a listserv discussion regarding electronic course evaluations that took place between members of the Pacific Northwest Association of Institutional Researchers. This discussion involves many local institutions that have experimented with or implemented electronic course evaluations. This is followed by a literature review and a complete discussion of the Western Washington University pilot. The report concludes with an estimate of what it would take OSR to implement a campus-wide electronic course evaluation system. To summarize the final section, OSR estimates that it would require a technically skilled employee to spend about 40 hours in initial setup time and about 50 hours per quarter to implement electronic course evaluations. However, this time commitment would serve only to program and e-mail the electronic evaluations to students. Additional time and computing storage space would be needed to store and disseminate results. Of course, these costs may be offset by the elimination of paper surveys.
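To make the within-instructor comparison and the variance result concrete, the following is a minimal Python sketch using invented counts and ratings; it illustrates only the style of calculation and does not reproduce OSR’s pilot data.

    # Hypothetical illustration (invented numbers, not OSR's pilot data).
    from statistics import mean, pvariance

    # 1) Within-instructor response-rate decline: compare each electronic
    #    CRN's response rate with its paper-evaluated control CRN.
    pairs = [
        # (electronic responses, electronic enrolled,
        #  paper responses, paper enrolled)
        (22, 40, 32, 42),
        (25, 38, 29, 35),
        (26, 45, 36, 44),
    ]
    declines = [p_r / p_n - e_r / e_n for e_r, e_n, p_r, p_n in pairs]
    print(f"average within-instructor decline: {mean(declines):.1%}")

    # 2) A more bi-modal distribution of ratings (five-point scale, as on
    #    question #20) raises the variance while the mean drifts down.
    paper = [3, 4, 4, 4, 4, 5, 4, 3, 4, 5]        # clustered near the mean
    electronic = [2, 5, 5, 2, 5, 1, 5, 2, 5, 5]   # strong opinions dominate
    print(f"paper:      mean={mean(paper):.2f}, variance={pvariance(paper):.2f}")
    print(f"electronic: mean={mean(electronic):.2f}, variance={pvariance(electronic):.2f}")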

    Western Educational Longitudinal Study (WELS) Baseline Survey of Students Entering Western in the Fall, 2009

    The 2009 WELS survey of incoming freshmen (the Survey) continues the Office of Survey Research’s (OSR) efforts to collect information on all students prior to the start of their academic careers at Western Washington University. This survey represents the initial contact in a longitudinal process that makes additional inquiries of students at the end of their sophomore year, when they graduate from the university, and one to two years after graduation. The purpose of the incoming freshmen survey is threefold: (1) to assess student needs based upon their self-reported characteristics, perceptions, and concerns; (2) to provide data that can assist university assessment and accreditation activities; and (3) to provide baseline observations of students prior to their Western experience which can be used to forecast and enhance student success. OSR uses a mixture of online and telephone survey methodologies to obtain survey responses. Incoming freshmen who attended Western’s Summerstart program were provided an opportunity to complete this survey as part of their Summerstart experience. Students not attending Summerstart and those who chose not to complete the survey while at Summerstart were invited to complete the survey via e-mail. After the initial e-mail, OSR sent e-mail reminders to non-responders twice. These reminders were followed by phone calls placed by WWU students encouraging completion of the survey. The survey was then left open until the weekend before Fall quarter courses began on campus. Of the 2,696 Fall 2009 freshmen, 2,454 responded to the survey (a response rate of 91.0%). Of the 2,454 respondents, 2,306 provided responses to the final question of the survey, suggesting low attrition during the survey. This report provides data from all questions that lent themselves to a numerical summary and lists the open-ended questions asked of students. Responses to the open-ended questions are available upon request.

    Fall 2013 Survey of Non-Returning Students: Descriptive Statistics

    The 2013 Non-Returning Student Survey (NRS) is Western’s first large-scale survey of students who dropped or stopped out since OIART conducted a Survey of Non-Returning Students in June, 2001. The goal of the NRS was to identify reasons for students’ failure to continuously enroll, and to identify improvements Western could implement to aid at-risk students. The NRS was designed in conjunction with Western’s Office of Institutional Research and Division of Enrollment and Student Services. The sample for the NRS covers all undergraduate, degree-seeking students who were enrolled during fall, winter, or spring quarter between fall, 2011 and spring, 2013. From these, OSR removed all students who graduated, all post-baccalaureate students, and any student who was dropped from the university due to poor academic performance. From the remainder, OSR identified 2,333 students who failed to enroll at Western during fall quarter, 2013. These 2,333 students are OSR’s survey sample.
A unique feature of the NRS is that all students, not just those who complete OSR’s surveys, can be tracked after leaving Western through the National Student Clearinghouse (Clearinghouse), a service that follows individual students as they enroll in nearly any U.S. 2- or 4-year institution of higher education. Section A.1 of this report provides information on Western’s non-returning students gleaned from the Clearinghouse. Of the 2,333 students failing to return to Western, 1,329 (57%) were recorded by the Clearinghouse as attending at least one institution after leaving Western. Of these, just over one-half (52%, 690 students) attended a public, two-year institution while 42% (561 students) attended a public, four-year institution. All but one of the remainder (6%) attended a private, four-year school. The most common 2-year schools were Whatcom Community College (193 students), Bellevue (85), Everett (54), Skagit (40), Bellingham Tech (34), and Olympic College (34). The most common 4-year schools attended by former Western students were the University of Washington (86), Washington State University (64), Eastern Washington University (34), and Central Washington University (26).
Western has admission index (AI) measures for 1,729 of the 2,333 non-returning students in the sample. Thirty-five percent had an AI greater than 60 and nearly another 25% had AIs between 50 and 60. By the time of their withdrawal, nearly one-third of non-returning students had earned cumulative WWU GPAs greater than 3.0. On the other hand, 19% of non-returning students earned cumulative WWU GPAs less than 2.0. Interestingly, 219 students left Western having already accumulated more than 180 credits, a group that perhaps, with a little effort, could be convinced to return to earn their degrees. Using the Clearinghouse data, OSR can also identify the types of students who attend other institutions after leaving Western. Of the 2,333 non-returning students, 214 had declared a major or pre-major in, or expressed interest in, Physical Education, Health & Recreation; 172 in Psychology; 133 in Biology; 124 in Engineering Technology; and 106 in Elementary Education. It is important to note that many of these students may have simply expressed an interest in these subjects and had yet to declare a major in them. Among those who had actually declared a major, 81 students who failed to return to Western were in PEHR, 43 in English, 37 in Fairhaven, and 33 in Art.
Beginning on October 8, 2013, OSR sent e-mail invitations to the sample using the last known external e-mail address of these students. OSR also attempted to use students’ internal e-mail addresses on the off chance that some continued to use their Western accounts. Students failing to respond to e-mail solicitations were then called at their last known cell phone or permanent phone number. The survey concluded on November 22, 2013. OSR received survey responses from 946 students, a response rate of 40.5%. Of these 946, 212 responded to the survey over the telephone.
As with any survey, readers should be concerned with sample selection bias; that is, bias that arises because respondents are often a non-random selection of the population of potential respondents. While sample selection bias is mitigated by proper survey techniques and a relatively high survey response rate, it is of special concern in a survey of individuals who have left Western because they have varying degrees of commitment to the university. For instance, 24% of respondents claim to be “Very likely” to return to Western to continue their education; of course, it is exactly these types of students one would expect to respond to a survey e-mail or phone call. To explore sample selection bias, section A.2 of this report compares a number of observable characteristics across respondents, non-respondents, and students who remained at Western. Contrary to the usual pattern in surveys, respondents were slightly less likely to be female (52.4% of respondents were female whereas 54.8% of all non-returning students were female). As mentioned in the introduction, this survey spanned two academic years (2011-2012 and 2012-2013). Students who attended in the most recent academic year were more likely to respond to the survey than those in the prior year: fully 63% of the responses come from the most recent attendees whereas they make up 57% of the population. Respondents also tended to be slightly stronger academically, with average AIs of 54.3 and average Western GPAs of 2.65 relative to the entire population, which averaged AIs of 53.3 and Western GPAs of 2.53. At the same time, respondents and non-respondents had similar measures of accumulated credits, Running Start and transfer student status, racial profiles, and Washington State residency status.
Turning to the survey results, 91% of respondents originally enrolled at Western in hopes of earning a degree from Western. Nearly all of the remainder enrolled in hopes of transferring to another institution, and of those originally wishing to transfer, almost three-fourths had since enrolled in their preferred institution. When asked about their current activity, nearly one-half of non-returning students had enrolled at another institution while 40% were working for pay. As noted above, 24% of survey respondents indicated they were “very likely” to return to Western; as of the beginning of Winter quarter (2014), 142 respondents, or 6% of the non-returning students, had re-enrolled at Western. Respondents were allowed to choose as many reasons as they liked to describe why they left Western. The three most common “broad” reasons given were finances, academics, and family/personal reasons. When asked to be more specific about financial reasons, nearly one-in-five students claimed their student loans were too large and a similar share stated that their or their family’s financial situation had changed. About one-in-eight claimed that not receiving financial aid contributed to their reason for leaving Western.
Forty percent of respondents cited academic reasons for leaving Western. Of those, nearly one-third claimed to have had academic problems at Western, about one-third believed another school had a better program in their field, and a similar number remained unsure about what they wanted to study. A smaller fraction of students left because Western did not offer an academic program of interest; the most common program mentioned (by 17 students) was nursing, and 23 students have since enrolled in a nursing program. The NRS concludes with two open-ended questions asking students if there was “anything that could have been done to make your experience at Western more successful?” and “Is there anything else you would like us to know about your experiences at Western?” Of the 623 individuals who responded to the first of these questions, nearly one-half indicated that there was nothing Western could have done to make them more successful. Among the remainder, the most frequent responses were to provide better advising or access to advising (45), greater access to financial aid (39), and lower tuition (39). The NRS data are linkable to other Western data sources by unique student identifier. OSR welcomes and encourages campus researchers to utilize these data in their further investigation of issues impacting student success.
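As a companion to the section A.2 comparison described above, here is a minimal pandas sketch of how respondents and non-respondents can be compared on observable characteristics; the column names and values are invented for illustration and are not OSR’s actual schema.

    # Hypothetical sketch of a respondent vs. non-respondent comparison.
    import pandas as pd

    sample = pd.DataFrame({
        "responded":       [True, True, False, False, True, False],
        "female":          [True, False, True, True, False, True],
        "admission_index": [58.0, 52.5, 55.0, 49.0, 56.0, 51.0],
        "wwu_gpa":         [2.9, 2.4, 2.6, 2.1, 3.1, 2.5],
    })

    # Mean of each observable, split by response status; differences hint
    # at how non-random the responding group may be.
    print(sample.groupby("responded")[["female", "admission_index", "wwu_gpa"]].mean())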

    Spring 2011 Follow-up Survey of Freshmen Who Entered Western in Fall of 2009

    The Spring 2011 Follow-Up Survey of Freshmen Who Entered Western in 2009 (2nd Year Survey) is part of a longitudinal effort to survey students with the goal of improving educational programs and providing self-assessment data. Together with the Vice Provost for Undergraduate Education, OSR designed this survey in an attempt to shed light on the efficacy of, and satisfaction with, first-year and GUR programs. In addition, a number of campus offices submitted questions to help assess their programs. Among these are the Math Center, the Honors Program, Western Libraries, University Residences, Environmental Health and Safety, and the Renewable Energy Degree initiative.

    Western Educational Longitudinal Study (WELS) Baseline Survey of Freshmen Entering Western in the Fall, 2013: Descriptive Statistics

    The Fall, 2013 Baseline Survey of Freshmen Entering Western (Freshmen Survey) continues the Office of Survey Research’s (OSR) efforts to collect information on all students prior to the start of their academic careers at Western Washington University. This survey represents the initial contact in a longitudinal process that makes inquiries of students at the end of their sophomore year, when they graduate from the university, and one to two years after graduation. The Freshmen Survey is designed with three purposes in mind: (1) to provide baseline observations of students prior to their Western experience which can be used to forecast and enhance student success; (2) to provide data that can assist university assessment and accreditation efforts; and (3) to assess student needs based upon their self-reported characteristics, perceptions, and concerns. To accomplish these purposes, the Freshmen Survey integrates questions into five major sections: pre-collegiate engagement and experiences; the college application process; class scheduling and expectations; skills, goals, and expectations; and expenses and employment. In addition to these, Western’s Division of Enrollment and Student Services submitted questions regarding the expected use of technology. The questions on the Freshmen Survey are a mix of open-ended, numerical, and multiple-choice responses. This report lists all questions and reports basic descriptive statistics from questions which lend themselves to numerical analysis. Responses to the open-ended questions are available upon request.

    Western Educational Longitudinal Study (WELS) Baseline Survey of Freshmen Entering Western in the Fall, 2010: Descriptive Statistics

    The Fall, 2010 Baseline Survey of Freshmen Entering Western continues the Office of Survey Research’s (OSR) efforts to collect information on all students prior to the start of their academic careers at Western Washington University. This survey represents the initial contact in a longitudinal process that makes possible additional inquiries of students at the end of their sophomore year, when they graduate from the university, and one to two years after graduation. The Freshmen Survey is designed with three purposes in mind: (1) to provide baseline observations of students prior to their Western experience which can be used to forecast and enhance student success; (2) to provide data that can assist university assessment and accreditation efforts; and (3) to assess student needs based upon their self-reported characteristics, perceptions, and concerns. To accomplish these purposes, the Freshmen Survey integrates questions into five major sections: pre-collegiate engagement and experiences; the college application process; familiarity and comfort with Western; academic skills, goals, and expectations; and expenses and employment. The questions on the Freshmen Survey were a mixture of open-ended, numerical, and multiple-choice responses. This report lists all questions and reports basic descriptive statistics from questions which lend themselves to numerical analysis. Responses to the open-ended questions are available upon request.
OSR used a mixture of online and telephone survey methodologies to obtain responses. Incoming freshmen who attended Western’s Summerstart program were provided an opportunity to complete this survey as part of their Summerstart experience. Students not attending Summerstart and those who chose not to complete the survey while at Summerstart were invited to complete the survey online. E-mails were initially sent to the student’s external e-mail address. After the initial e-mail, OSR sent e-mail reminders to non-responders twice. The survey was then left open online until the weekend before Fall quarter courses began on campus. Of the 2,920 Fall 2010 freshmen, 2,427 responded to the survey (a response rate of 83.1%).
As with any survey, readers should be concerned with sample selection bias; that is, bias which arises because survey respondents are not a random selection of the population of survey recipients. While sample selection bias for this survey is mitigated through proper survey techniques and a high response rate, its presence should be considered when evaluating data. Section A of this document compares respondents to all incoming freshmen. Relative to all freshmen, respondents were more likely to be female (61.4% of respondents versus 59% of all freshmen), averaged a slightly higher admission index (57.2 versus 56.7), and were more likely to be first-generation college students (32% of respondents versus 30.2% of all freshmen). On the other hand, respondents were nearly identical to non-respondents in measures of age, SAT scores, and high school percentile.
OSR is excited to share its individual survey results with campus researchers so they may answer their own questions. To familiarize readers with the content of the survey, here we make a few observations regarding the survey results. Out of a list of 16 possibilities, the three most important reasons students gave for coming to Western were the recreational opportunities in the area, Western’s good academic reputation, and Western’s size.
Almost two-thirds of students first learned about Western through a relative or friend, and almost nine out of ten learned of Western prior to their senior year in high school. Including Western, the median student applied to three colleges and was accepted to two of them. Besides Western, the three schools most commonly applied to were the University of Washington, Washington State, and Central Washington, although other commonly named schools included the University of Oregon, University of Portland, Gonzaga, and Seattle University. Among the schools to which they applied, 69% of incoming freshmen claimed that Western was their first choice; the next closest was the University of Washington (14%). Nearly one-third of students claim to be certain about their major and another half of students have some idea of what they will study. About three-fourths of students expect to graduate in four years or less; no students expect to take longer than five years to graduate. Twelve percent of students claimed some positive likelihood that they would transfer from Western prior to graduation and an additional 26% were unsure whether they would transfer. Among those likely to transfer, the most common reasons given were that Western did not offer a degree program that interested the student, a perceived lack of prestige, and that friends or family attend a different school. Of students who attended Summerstart, 77% were either “very” or “somewhat” satisfied with their class schedule; for those expressing some level of dissatisfaction, the most common reason given was that needed classes were full. All of OSR’s survey data is linked by a unique student identification number, allowing the survey data to be merged with Western’s data warehouse or with data collected by future surveys. Using this identifier, OSR can provide open-ended responses or specific data to departments that want to investigate further.

    Spring 2010 Follow-up Survey of Freshmen Who Entered Western in Fall of 2008: Descriptive Statistics

    The Spring 2010 Follow-Up Survey of Freshmen Who Entered Western in Fall 2008 (2nd Year Survey) holds particular importance to Western in that it focuses on student experiences in first-year programs and GUR courses. Together with the Vice Provost for Undergraduate Education and the Committee for Undergraduate Education, the Office of Survey Research (OSR) created this survey in an attempt to shed light on the efficacy of and satisfaction with programs designed to foster student success early in their Western careers. The 2nd Year Survey consists of a mixture of open-ended, multiple choice, and numerical response questions. It targeted native freshmen (including Running Start students) who entered Western in the Fall of 2008; these students were completing their second full year on campus at the time of the survey (Spring 2010). As part of OSR’s efforts to paint a longitudinal portrait of Western’s students, these students were also surveyed immediately prior to beginning their Western careers (the Fall 2008 Baseline Survey of Incoming Freshmen).
In an attempt to measure the success of pre-calling, OSR initiated the 2nd Year Survey by first calling potential respondents and informing them that they would shortly receive an e-mail survey. These pre-calls were made on May 3, 2010, and the e-mail with an embedded link to the survey was sent on May 5th. Electronic reminders were sent to non-respondents three days later and phone call reminders were placed on May 12th. For non-respondents who provided the university with an external e-mail address, additional invitations and reminders were sent on May 13th and 17th. This was followed by a final phone call reminder during the following week. Of the 2,148 valid 2nd year students, OSR received survey responses from 1,355, a response rate of 63.1%.
In addition to the contributions of the Vice Provost for Undergraduate Education and the Committee for Undergraduate Education, a number of other campus offices contributed questions to the 2nd Year Survey. Among these contributors are the Math Center, University Residences, and the Office of Sustainable Transportation. Because of the large number of questions these offices included, OSR assigned a number of questions to be randomly skipped by students. In effect, this random skipping shortened the time it took any individual student to complete the survey while still allowing a large number of questions to receive enough responses for statistical analysis (a sketch of this approach appears below). For each question, this report notes when randomization occurs. OSR did encounter a flaw in its programming of this randomization routine: on May 11th, after 569 completed surveys and 65 partially completed surveys had been received, OSR realized that the randomized questions were not being asked. This problem was corrected and the remaining 721 responses successfully received the appropriate randomized questions.
As with any survey, readers should be concerned about sample selection bias; that is, bias which occurs because survey respondents are not a random selection from the population of survey recipients. While sample selection bias for the 2nd Year Survey is mitigated through proper survey techniques and a high response rate, its presence should be considered when evaluating data. Section A of this document reports basic demographic and academic statistics for all students who responded to the survey and compares them to non-respondents.
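Returning to the randomized question-skipping mentioned above, the following Python fragment is a minimal sketch of the idea: each student receives all core items plus a random subset of office-submitted items. The question identifiers and the skip fraction are hypothetical; OSR’s actual routine is not documented here.

    # Hypothetical sketch of randomized question skipping.
    import random

    CORE_QUESTIONS = ["Q1", "Q2", "Q3"]                  # asked of every student
    OPTIONAL_QUESTIONS = ["M1", "M2", "R1", "R2", "T1"]  # office-submitted items
    KEEP_FRACTION = 0.6  # share of optional items each student sees (assumed)

    def build_questionnaire(rng):
        """Return one student's question list: all core items plus a random
        subset of the optional items, kept in their original order."""
        k = round(KEEP_FRACTION * len(OPTIONAL_QUESTIONS))
        kept = set(rng.sample(OPTIONAL_QUESTIONS, k))
        return CORE_QUESTIONS + [q for q in OPTIONAL_QUESTIONS if q in kept]

    rng = random.Random(2010)  # fixed seed for reproducibility
    for student in ("s001", "s002"):
        print(student, build_questionnaire(rng))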
As is consistent with OSR’s experience surveying Western students, women were more likely to respond to the 2nd Year Survey than men (63% of respondents were women whereas 59.2% of the population are women). Likewise, respondents hold slightly better Western grade point averages; the average respondent earned a cumulative GPA of 3.04 compared to a population average of 2.97. Respondents are also more likely to live on campus (32.1% of respondents versus 29.5% of the population) and attempted more credits during spring quarter than non-respondents (an average of 14.7 credits for respondents versus 14.5 for the population). Despite these differences, respondents and non-respondents were very similar in terms of race, Running Start status, residency in Washington, and first-generation status.
The remainder of this report is composed of twelve additional sections, each focusing on an aspect of student experience. Rather than describing each of these sections, here we focus on two: Section L (Department-Level Data) and Section M (Items Requested by Departments, Offices, and Programs). Some of the questions on this survey are better analyzed at the departmental level. For instance, questions about the quality of advising within a major or the likelihood of staying in a major are most helpful to specific departments. However, because some of these questions were randomly excluded from surveys and because many 2nd year students have yet to declare a major, only a small number of departments received enough responses to warrant documentation here. For departments which did receive a large number of responses, we included department-level breakdowns of questions dealing specifically with departmental issues in Section L. OSR will happily share data with departments that are not listed in this section. Section M contains summary data for questions submitted by other offices and departments. These include questions from the Math Center, the Western Reads Program, University Residences, and the Office of Sustainable Transportation. It is OSR’s intent to expand this section of the survey in the future as other departments add their own questions.
While we leave it to the reader to decide what is informative or striking in this report, we highlight some findings which the wider campus may find interesting. Ninety-one percent of students claimed to be “satisfied” or “very satisfied” with their Western experience, but only 66% of students claimed that they were “very unlikely” to leave Western prior to graduation. For the sixty-one students who thought it probable that they would transfer from Western prior to graduating, the most common reasons given were that another school has a better program in their field, that they wanted to go somewhere new and different, or that Western does not offer a major that is of interest. In order to gauge the difficulty of registering for courses, students were asked how many of the classes they wanted to take in the spring quarter were full. The average number of full courses was 1.66; about half of these were GUR courses and about 90% were required for the student’s major or pre-major. Students were also asked about courses that were too large. The three courses most frequently cited by 2nd year students as being too large were PSY 101, ESCI 101, and BIOL 101. Sadly, only 8% of students strongly agreed with the statement “Taking GUR courses gives me useful skills” and only 7% of students are “very satisfied” with GUR course availability.
For students who had already declared a major, 63% claimed they were very unlikely to change their major prior to graduation. Of undeclared students, 57% were certain as to what their major would be and 34% had some idea. Sixty percent of non-declared majors had contacted someone within a major department about their interest. The most common reason for not having declared a major was a need to take more courses to qualify for the major. Over the entire academic year, 35% of students claimed they did not write a single paper longer than 5 pages and 56% claimed to have written between one and four such papers. Given that 34% of students claimed that learning writing skills is very important to them, this lack of writing experience may explain why only 15% of students are “very satisfied” with the writing skills they have developed.
Two features of the 2nd Year Survey are worth mentioning. First, it is part of a longitudinal cohort study which began with an OSR baseline survey prior to the beginning of the freshmen year. OSR is happy to package this data and share it with interested researchers; researchers may view the contents of the baseline survey given to these students at http://www.wwu.edu/socad/osr/wels. Second, each respondent in the 2nd Year Survey is tracked with a unique tracking number which OSR can match with university records. This ability opens the door for research on issues which impact students and the university. OSR will happily provide such data to researchers, departments, and offices upon request.
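Because each response carries a tracking number, linking the survey to university records amounts to a simple key-based join. The sketch below shows the idea in pandas; the tables and column names are invented for illustration and do not reflect OSR’s actual records.

    # Hypothetical sketch of linking survey responses to university records.
    import pandas as pd

    survey = pd.DataFrame({
        "tracking_id": [101, 102, 103],
        "satisfaction": ["very satisfied", "satisfied", "satisfied"],
    })
    records = pd.DataFrame({
        "tracking_id": [101, 102, 103],
        "cumulative_gpa": [3.2, 2.8, 3.5],
        "major": ["BIOL", "PSY", "ENG"],
    })

    # An inner join keeps only students present in both sources.
    linked = survey.merge(records, on="tracking_id", how="inner")
    print(linked)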

    Western Educational Longitudinal Study (WELS) Baseline Survey of Transfer Students Entering Western in the Fall, 2013: Descriptive Statistics

    The WELS Baseline Survey of Transfers Entering Western in the Fall, 2013 (Transfer Survey) is the companion survey to the Office of Survey Research’s (OSR) survey of incoming freshmen. Together, these surveys elicit information from students prior to the start of their Western academic careers and provide an initial contact in a longitudinal survey design that follows students through graduation and into their initial years as alumni. The Transfer Survey is designed with three purposes in mind: (1) to provide baseline observations of students prior to the Western experience that can be used to forecast and enhance student success; (2) to provide data that can assist university assessment and accreditation endeavors; and (3) to assess student needs based upon their self-reported characteristics, perceptions, and concerns. To accomplish these purposes, the Transfer Survey integrates questions into seven sections: prior engagement and experiences; the college application process; course scheduling; academic skills and goals; major choice; expenses and employment; and demographics. In addition to these, various Western offices submitted questions that dealt with academic advising and the use of technology. The questions on the Transfer Survey are a mixture of open-ended, numerical, and multiple-choice types. This report lists all questions and reports basic descriptive statistics from questions which lend themselves to numerical analysis. Responses to open-ended questions are available upon request.