Guidelines for Conducting Student Surveys

Purpose of guidelines

The purpose of providing guidelines for conducting and sharing results of student surveys is to assist student affairs practitioners in determining when to administer a survey and how widely to share data and findings that may be relevant to others in the campus community.


What is a student survey?

For purposes of these guidelines, there are two types of surveys typically conducted within student support areas: general student surveys and programmatic student surveys.

General Student Survey
A general student survey is an assessment tool composed primarily of survey questions (e.g., multiple choice, Likert scale, open-ended response) in which individual students answer questions about their experience as a Marquette student. For example, a general student survey could include questions about students' perceptions, beliefs, demographics, satisfaction, activities, future plans, etc. General student surveys go beyond asking questions that are inherently tied to a particular program, initiative, or service offered by Marquette.

Programmatic Student Surveys
A programmatic student survey is inherently tied to a specific program or initiative in which a student participates. The questions on this type of survey assess students' learning and/or satisfaction with participation in an event or experience.

Considerations in determining whether and how frequently general student surveys should be administered

  • Step 1: Prior to conducting a survey, determine what information is needed from students.
  • Step 2: Consult the Assessment website for related executive summaries, and ask colleagues for results of previously administered surveys that may contain the information you are seeking.
  • Step 3: If applicable, consult colleagues in the Office of Institutional Research and Analysis, and/or members of the Division of Student Affairs Assessment team to ask if there are others on campus who have sought similar information from students.
  • Step 4: If the information you seek doesn't already exist, determine your method of collecting data (see "Selecting a Method" below). Whenever possible, collaborate with colleagues on the development and administration of assessments.
  • Step 5: Review Marquette's Online Survey Policy to determine if your project requires review by the Online Survey Review Group.

Reporting data and executive summaries

Sharing data and information gathered from a variety of assessment projects is important for a number of reasons, including:

  • To uphold the integrity of the assessment process.
  • To honor the information students provide by using data to inform decision-making when relevant.
  • To provide transparency in decision-making within departments and across units.
  • To advocate for students' needs and identify trends across units.
  • To avoid asking students to respond to the same questions on multiple instruments.

Sharing survey results in an executive summary is required when a project meets one or more of the following criteria:

A) The survey was approved by the Online Survey Review Group.

and/or

B) The information gathered from the survey may be useful in informing decision-making or resource allocation within other units.

and/or

C) The survey was administered to students from underrepresented backgrounds or to students with factors that may indicate a need for additional resources from other units (e.g., lower retention rates or lower graduation rates).

If your survey meets one or more of the criteria above (A, B, and/or C), results should be made public to the Marquette community via an executive summary on the Division of Student Affairs Assessment website. Units that may benefit from the information should also receive a copy of the executive summary.

While executive summaries are not required for programmatic surveys, sharing the results of programmatic surveys is important for the reasons indicated above.

Results may also be shared in ways that do not involve publicly posting an executive summary, including:

  • In an annual departmental report
  • As a basis for discussion in staff meetings or with colleagues
  • On departmental or program websites
  • Directly with your participants

A note about response rates: If the response rate for your survey was especially low, consider how it may affect the validity of your results and/or the confidentiality of respondents. Consult the Office of Institutional Research if you are unsure whether you should share results because of a low response rate.
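
To make the response-rate concern concrete, here is a minimal sketch (in Python, which these guidelines do not prescribe; the survey numbers are hypothetical) of how a low response rate widens the uncertainty around a reported percentage. It assumes a simple random sample and a 95% confidence level.

    import math

    def margin_of_error(respondents, invited, proportion=0.5, z=1.96):
        """Approximate 95% margin of error for a sample proportion,
        with a finite-population correction for the invited pool."""
        se = math.sqrt(proportion * (1 - proportion) / respondents)
        fpc = math.sqrt((invited - respondents) / (invited - 1))
        return z * se * fpc

    invited, completed = 2000, 160  # hypothetical invitation and completion counts
    print(f"Response rate: {completed / invited:.1%}")                        # 8.0%
    print(f"Margin of error: +/-{margin_of_error(completed, invited):.1%}")   # about +/-7.4%

A margin that wide means small differences between groups may not be meaningful, and nonresponse bias can add further error that this calculation does not capture.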

 

Selecting a Method

Things to consider
Before selecting a method, take a few minutes to reflect on the following questions:

  1. What type of assessment are you planning to conduct?
    • Usage Numbers: track participation in programs or services
      • Consider the following methods: existing data, tracking system, calendar system, KPI
    • Student Needs: keeps you aware of the needs of the student body or specific populations
      • Consider the following methods: survey, focus group, visual methods
    • Program Effectiveness: level of satisfaction, involvement, effectiveness, helpfulness, etc.
      • Consider the following methods: survey, focus group, observation
    • Cost Effectiveness: how does the benefit of a program/service compare with its cost?
      • Consider the following methods: existing data, comparative data, KPI
    • Campus Climate or Environment: assess behaviors/attitudes on campus
      • Consider the following methods: focus group, document analysis, survey, existing data, case study, observation
    • Comparative (Benchmarking): comparing a program/service against a comparison group
      • Consider the following methods: survey, rubric, existing data, KPI
    • Using National Standards or Norms: comparing a program/service with a set of pre-established standards (e.g., CAS, Information Literacy) or normative data (e.g., ACT scores)
      • Consider the following methods: survey, document analysis, existing data
    • Learning Outcomes: assess how a participant will think, feel, or act differently as a result of your program/course/service

Overall, your assessment method should be a reflection of the learning that you are seeking to assess. In terms of Bloom's taxonomy, different levels of thinking require different assessment methods; in other words, a more in-depth thinking level necessitates a more in-depth assessment.

    • For example, an assessment at the synthesis and evaluation levels would be more in-depth and require more complex assessment methods, such as rubrics, content analysis, or interviews/focus groups, compared to the knowledge or comprehension levels, which are less complex and can be assessed using surveys and quizzes.
    • Consider the following methods: survey/quiz, rubric, portfolio, one-minute assessment
  2. If you are assessing learning, do you need direct or indirect evidence of learning?
    • Direct Methods: any process employed to gather data that requires students to display their knowledge, behavior, or thought processes.
      • e.g., Where on campus would you go, or who would you consult with, if you had questions about which courses to register for in the fall?
      • Direct measures of learning are usually accomplished through assessment methods such as a "quiz"-type survey, rubric, document analysis, observation, portfolio, visual methods, one-minute assessment, and/or case study.
    • Indirect Methods: any process employed to gather data that asks students to reflect upon their knowledge, behaviors, or thought processes.
      • e.g., I know where to go on campus if I have questions about which courses to register for in the fall. (strongly agree, moderately agree, neither agree nor disagree, moderately disagree, strongly disagree)
      • Indirect measures of learning are usually accomplished through assessment methods such as a survey, focus group, document analysis, and/or one-minute assessment.
  3. Do you need quantitative data, qualitative data, or both?
    • Both methods can produce data/information that can be presented in number or narrative form, so at this point your decision should be based on the depth of the information you need.
    • Quantitative Methods: produce data that convey simple facts or figures
      • Looks at questions that concern who, what, where, when
      • Matches with outcomes about knowledge and comprehension (define, classify, recall, recognize)
      • Examples of quantitative methods: survey, existing data, rubric (if assigning numbers), tracking system, observation, document analysis, KPI
    • Qualitative Methods: produce data with more depth and description
      • Looks at questions that concern why and/or how
      • Matches with outcomes about application, analysis, synthesis, evaluation
      • Examples of qualitative methods: focus group/interview, portfolio, rubric (if descriptive), visual methods, one-minute assessment, open-ended survey question, observation, document analysis, case study
    • Mixed Methods: assessment is not always completed with just one method
      • For example, a social responsibility outcome such as "student articulates the unfair, unjust, or uncivil behavior of other individuals or groups" might best be assessed both through an interview or focus group and by rating a role-play exercise on a rubric.
  4. Based on the possible methods you may use, weigh the advantages and challenges of the specific methods below to help determine your best possible choice(s).

Existing Data: Data that has already been collected, usually from previous assessment projects, student information systems, office systems, or tracking systems.

Strengths:

  • No time needed to collect data
  • No risk of survey fatigue or response-rate issues
  • Mines data from current collection processes/systems
  • Capitalizes on previous assessment efforts
  • Unobtrusive in nature

Challenges:

  • Reliant on the reliability/validity or trustworthiness of the source
  • Non-responsive in nature (no follow-up option)
  • Response rates are pre-determined by the data that exists
  • Gaining access to data that may be housed elsewhere
  • Creating internal systems to collect the data you need may require adjusting current systems
  • Data may not be sufficient, may require follow up

Things to consider:

  • How do you gain access to data?
  • Will you have the ability to analyze/manipulate the data in the way you need?
  • Where is the data coming from, and in what form will you receive it? This will drive decisions about how you analyze the data: you may need to know how to conduct a document analysis, use a database, or analyze data in Excel or SPSS (a brief sketch follows this list).
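
As one illustration of that last point, here is a minimal sketch, assuming the existing data has been exported to a CSV file; the file name and column names are hypothetical, and Python/pandas stands in for whatever tool (Excel, SPSS, a database) your office actually uses.

    import pandas as pd

    # Hypothetical export from an office tracking system
    df = pd.read_csv("service_usage_export.csv")

    print(df["visit_reason"].value_counts())         # usage counts by reason
    print(df.groupby("class_year")["visits"].sum())  # total visits by class year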

Next steps: Once you have the data, submit it to be uploaded to Campus Labs Baseline. Be sure to link your data with program goals and objectives, as well as with key performance indicators, through the management system.

 

Survey: Asking open- and closed-ended questions in a questionnaire format. A survey is a self-report of anything, including opinions, actions, and observations.

Strengths:

  • Can include large numbers of participants
  • Relatively fast and easy to collect data
  • Many design resources and sample instruments available
  • Requires minimal resources to administer
  • Fast to analyze
  • Good for surface-level or basic data

Challenges:

  • Survey fatigue and response rates
  • Non-responsive in nature (no follow-up option)
  • Limited in the types of questions that can be asked
  • Lacks depth in data
  • Requires skill in both designing questions and analyzing data properly (see the sketch after this method's lists)

Resources needed:

  • What is the best administration method (paper, web, mobile, etc.)?
  • How will you draft and review the questions?
  • Do you want to offer incentives for completing the survey?
  • Do you have a data analysis plan? Do you need to use comparative tools?
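
As a minimal sketch of the analysis skills mentioned above, the following summarizes one Likert item coded 1-5 (strongly disagree to strongly agree) using Python/pandas; the responses shown are hypothetical, and these guidelines do not prescribe a particular tool.

    import pandas as pd

    # Hypothetical responses to one Likert item, coded 1-5
    responses = pd.Series([5, 4, 4, 3, 5, 2, 4, 5, 3, 4], name="q1")

    print(responses.value_counts().sort_index())                      # frequency of each rating
    print(f"Mean rating: {responses.mean():.2f}")                     # 3.90
    print(f"Agree or strongly agree: {(responses >= 4).mean():.0%}")  # 70% (top-two-box)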

Rubric: A scorecard used to rate student learning through either observation or artifacts. Includes a scale, key dimensions, and descriptions of each dimension along the scale.

Strengths:

  • Clearly states standards and expectations
  • Can be used as both a learning tool and an assessment tool
  • Provides for consistency in rating/grading
  • A participant can use the rubric to gauge his/her own performance
  • Provides both individual and program-level feedback
  • Provides both numbers and descriptive information

Challenges:

  • Developing a rubric takes time
  • Training of raters is needed
  • Use is largely limited to student learning outcomes
  • Beware of inter-rater and intra-rater reliability issues (see the sketch at the end of this section)
  • Depending on technology resources, combining aggregate data can take time

Resources needed:

  • How will you design and test your rubric?
  • How will you train raters?
  • What learning opportunities do you have to observe? Or what mechanism will you use to collect artifacts?
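
As a minimal sketch of the inter-rater reliability check flagged under Challenges, the following computes simple percent agreement and Cohen's kappa for two raters scoring the same artifacts; the ratings are hypothetical, and Python is used only for illustration.

    from collections import Counter

    # Hypothetical rubric scores (1-3) from two raters on the same ten artifacts
    rater_a = [3, 2, 3, 1, 2, 3, 2, 1, 3, 2]
    rater_b = [3, 2, 2, 1, 2, 3, 2, 2, 3, 2]

    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Agreement expected by chance, from each rater's marginal score distribution
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[s] * counts_b[s] for s in counts_a) / n**2

    kappa = (observed - expected) / (1 - expected)
    print(f"Percent agreement: {observed:.0%}")  # 80%
    print(f"Cohen's kappa: {kappa:.2f}")         # 0.68

Kappa corrects percent agreement for the agreement raters would reach by chance alone; values below roughly 0.6 are often read as a signal that raters need more training or that the rubric's descriptions need sharpening.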

Focus Groups or Interviews: Asking face-to-face, open-ended questions in a group or one-on-one setting. Questions are meant to prompt discussion.

Strengths:

  • Allows for follow-up and clarifying questions
  • Provides depth and context beyond what a survey can capture
  • Captures student voice in participants' own words
  • Group discussion can surface ideas individuals might not raise on their own
  • Flexible; unexpected topics can be explored as they arise

Challenges:

  • Getting participants (think of time/places)
  • Data collection and analysis takes time
  • Data is only as good as the facilitator
  • Beware of bias in analysis and reporting
  • Meant to tell a story; may not help if numbers are needed
  • Data is not meant to be generalizable

Resources needed:

  • How will you develop questions and protocols?
  • Who is the best facilitator of the interview or focus group? What level of objectivity does he/she need and what knowledge of the subject/situation?
  • How will notes be taken? Do you have recording devices?
  • What logistics do you need to consider as far as finding space, etc.?
  • Do you need consent forms?

Portfolio: A collection of artifacts or work that provides evidence of student learning or program improvement.

Observation: Watching and systematically recording behaviors or events as they occur, without interacting directly with participants.

Strengths:

  • Unobtrusive: does not require participant engagement
  • Captures behavior from a natural, real-world perspective
  • Often effective for observing the physical plant and watching for student trends
  • Useful for gathering initial data to couple with survey or focus group
  • Provides both numbers and descriptive information

Challenges:

  • Requires planning ahead (e.g., protocols, charts, journals)
  • Non-responsive in nature
  • Limited in the type of data it can collect
  • Need trained observers
  • Need system of collecting information

Resources needed:

  • Do you have a protocol?
  • Do you need to train observers?
  • What is your timeline?

 

Document Analysis: A form of qualitative research in which documents are used to give voice, interpretation, and meaning. Any document can be used; common examples include application materials, student newspapers or publications, marketing materials, meeting minutes, strategic planning documents, etc.

Strengths:

  • Documents are readily available
  • Documents are already collected or easily collected
  • Low costs
  • Documents are a stable data source (they don't change)
  • Can be collected on a quick timeline

Challenges:

  • Non-responsive in nature
  • Documents are context and language specific
  • Documents are often disconnected from their creator
  • All documents are written through a particular lens; be aware of that lens in order to assess objectively
  • Data analysis takes time

Resources needed:

  • How do you gain access to the documents?
  • Do you know how to set up a coding system? (A naive starting sketch follows.)
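
As a naive starting point for that last question, the sketch below tallies how many documents mention each code using a small keyword list; the documents, codes, and keywords are hypothetical, and real coding typically relies on trained human coders rather than keyword matching.

    # Hypothetical documents and codebook for a first-pass coding system
    docs = [
        "Students asked for later tutoring hours and more advising staff.",
        "Meeting minutes note ongoing concerns about advising wait times.",
    ]
    codebook = {
        "advising": ["advising", "advisor"],
        "hours/access": ["hours", "wait"],
    }

    tally = {code: 0 for code in codebook}
    for doc in docs:
        text = doc.lower()
        for code, keywords in codebook.items():
            if any(kw in text for kw in keywords):
                tally[code] += 1  # count documents that mention the code

    print(tally)  # {'advising': 2, 'hours/access': 2}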

 

One-Minute Assessment: Very short assessments of what a participant is "taking away" from his/her experience. Should be targeted at a specific learning or program outcome.

Strengths:

  • Provides a quick summary of takeaways from the student perspective
  • Quickly identifies areas of strength and weakness for formative assessment
  • Can track changes over time (short-term)
  • Non-verbal (provides classroom feedback from all students)
  • Captures student voice
  • Short time commitment
  • Provides immediate feedback

Challenges:

  • Non-responsive
  • Short (so you may lose specifics)
  • Sometimes hard to interpret
  • Need very specific prompts in order to get "good" data
  • Logistics must be planned ahead of time, with time set aside during the program/course
  • May need to be collected over time

Resources needed:

  • Do you have a strong prompt?
  • Have you reserved time to collect data?
  • Do you have a system for collecting data in a non-rushed manner?

 

Visual Methods: Capturing images as the main form of data collection; usually also includes captions or a journal to accompany the images. Most often used for photo journals, video projects, and visual art projects.

Strengths:

  • More detail and depth to data
  • Visual aspect allows for depth in sharing results
  • High levels of student investment
  • Can use images captured for multiple uses
  • Very descriptive in nature

Challenges:

  • Beware of image alteration (especially with digital technology)
  • Usually smaller number of perspectives
  • Time for implementation and follow-through
  • Analysis takes time
  • Resources may be needed in order to capture images

Resources needed:

  • How will your participants capture images (resources)?
  • What prompt will you use to make sure participants have a clear direction?
  • Do you have time to gather and process information in your timeline?
  • Have you accounted for time for member checking?

 

Case Study: A form of qualitative descriptive research, the case study looks intensely at an individual, culture, organization or event/incident.

Strengths:

  • More detail and depth to data
  • Multiple perspectives are gathered
  • Tells a story
  • Very descriptive in nature

Challenges:

  • Takes significant time to gather information and analyze
  • More perspectives mean more time
  • Narrow purpose when it comes to sharing data afterward
  • Analysis takes time
  • Resources may be needed in order to capture data
  • Not meant to be generalizable, but can be transferable

Resources needed:

  • How will you capture data?
  • Do you have a clear understanding of what you are profiling and why?
  • Do you have time to gather and process information?
  • Have you allocated time for member checking?

Key Performance Indicator: Helps an organization define and measure progress toward organizational goals. Usually broad-picture, quick snapshots of information.

Strengths:

  • Provides information on direction of organization
  • Identifies trends
  • Focuses on "key" measures
  • Concise in communicating (especially "upward")
  • Often already available

Challenges:

  • Determining measures
  • Deciding how to collect information
  • Lack of context
  • Identifies trends but often lacks ability to be attached to specific programs, courses or services

Resources needed: 

  • How will you capture data?
  • Do you have a clear understanding of your measures and how they are linked with goals?
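
As a minimal sketch of a KPI snapshot, the following computes the year-over-year direction of a single "key" measure; the measure and numbers are hypothetical, and Python is used only for illustration.

    # Hypothetical KPI: total program visits per academic year
    visits_by_year = {2021: 1420, 2022: 1510, 2023: 1685}

    years = sorted(visits_by_year)
    for prev, curr in zip(years, years[1:]):
        change = (visits_by_year[curr] - visits_by_year[prev]) / visits_by_year[prev]
        print(f"{prev} -> {curr}: {change:+.1%}")  # e.g. 2021 -> 2022: +6.3%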

Additional Tips for Choosing Methods:

  • Build up your assessment toolbox by getting experience with different methods and knowing when it is appropriate to use them.
  • Keep it simple! Assessment is "good enough" research. Choose a method that is manageable so you can complete the project.
  • Start with the ideal design for your assessment and then work backward to what is possible. There is always more than one source for collecting data. Use what works best for you knowing that you can add on other sources later.
  • Start off small to get experience; don't try to complete a "dissertation"-sized project the first time around.
  • Get feedback from colleagues, peers and your Campus Labs Baseline consultant. A new set of eyes on your methods may reveal an important piece that you have not seen.
  • Read the literature and attend conferences through a new lens; look for ideas on how others conduct assessment and how you may also use the same methods.
  • Ask if the data already exists somewhere else before choosing a different method that will use valuable resources.
  • Look for potential to collaborate with other divisions and units.
  • Include culturally sensitive language and facilitators when using assessment methods. If you are not sure about language, ask someone to look over your assessment method.
  • Include stakeholders from the beginning; this builds credibility in your methods and assessment results.
  • Keep in mind how the method you choose will affect your results and make note of that for your report.
  • Reflect on the process/results of assessment and do not be afraid to change your method. Assessment is an ongoing process.