Frequently Asked Questions about GE Assessment

GEAR group members often receive questions about assessment in general, assessing general education, and the specifics of assessing the SUNY GE program. This page collects frequently asked questions and their answers so that others involved in assessing the SUNY GE Learning Outcomes may benefit from this knowledge base.
If you have a question that you would like to submit to the GEAR group for their consideration, please contact us using the Submit Question button.

"Jump To" Index

Assessment Cycle | Critical Thinking/Information Management | External Review/Measures | Grades | Listserv | Process | Reporting | Sampling | Standards | Students


Questions & Answers

Assessment Cycle

Q: Do we have to assess each of the 12 Knowledge and Skills Areas and Competencies every year?

A: No. GEAR's Guidelines allow campuses to focus on fewer outcomes per year, as long as each of the 12 areas is addressed once in a three-year cycle. Typically, campuses assess four areas per year.

Assessment Cycle

Q: It looks as though our plan will be a three-year plan, assessing 4 of the 12 areas each year, and the plan submitted to GEAR will articulate which of the four will be assessed in each year. How much detail do you need in terms of the second and third year?

A: There needs to be sufficient detail about all 12 areas so that GEAR can determine if the assessment plan is sound. You certainly don't have to spell out the individual courses that will be sampled for each outcome, but a general methodology that makes it clear what measures will be used and how data will be collected, scored, and aggregated should be included.

Critical Thinking and Information Management

Q: We are beginning to discuss how best to assess Critical Thinking and Information Management. One thought we had was to assess these areas in courses that most students are required to take (like English 101, for example), so that we do not have to come up with measures for these areas in every Knowledge and Skills Area. Any ideas?

A: There's lots of information on the Web with respect to how institutions are assessing Critical Thinking, in particular. You might want to consider "weaving it" into the measurement of other outcomes, since it would be fairly easy and efficient to do. For example, in measuring Information Management within the Social Science outcomes, you might require students to complete an application in Excel to demonstrate their information management skills.

External Review/Measures

Q: Is external peer review required in the assessment of general education?

A: No. It's entirely up to the faculty who teach courses that are part of the campus's general education plan to determine the best way of assessing student learning. (External reviewers are required for assessment of the major, but GEAR is not involved in that process.)

External Review/Measures

Q: How critical is it to incorporate assessment methods external to the campus, such as employer surveys, student surveys, and the like?

A: The Guidelines call for assessment measures that will directly measure student learning (i.e., as differentiated from the perception that learning has taken place). While you may wish to include such things as employer and student surveys in your broader effort to understand more about your general education program, they are not required in your assessment plan.


Grades

Q: If we use course-embedded assessment, can we just use grades as assessment results?

A: Course grades are seldom a workable proxy for assessment practices designed to measure learning. A course grade usually, and often necessarily, includes many things other than an assessment of a student's ability to demonstrate learning for a particular outcome, and rightly so. Grades generally reflect papers, examinations, class presentations, attendance, class participation, extra credit, etc., in addition to, most importantly, a whole body of knowledge that represents the course's goals, not simply a set of particular learning outcomes. In other words, course grades are, by definition, global, not specific. It would be very difficult to parse a grade in such a way as to back out all non-relevant factors to determine what the achievement level was for a particular competency. A suggestion: if there are assignments or tests that are very closely linked to end-of-course achievement with respect to certain learning outcomes, why not use those as assessment measures? Be sure that they reflect the ultimate goal within the course, not an intermediate one.


Listserv

Q: How can I become a participant in the assessment listserv?

A: Simply send a request to Patricia Francis, Assistant Provost. We'd be pleased to add you to the list.


Process

Q: Has GEAR established any guidelines and format for the assessment plan?

A: GEAR has approved a set of Guidelines, which have been distributed on the ASSESS-L listserv and are available on our Web page. There is no specified format for the plan; we simply want campuses to address the criteria in the Guidelines.


Process

Q: Does the assessment plan need to include an assessment plan for each course?

A: You do not have to include an assessment plan for each course--that would be totally unreasonable. GEAR is looking for an overall plan that makes it clear how each learning outcome will be assessed through the whole general education program.


Reporting

Q: Has GEAR established any guidelines and format for the reporting of assessment results?

A: GEAR will follow the recommendations of the Task Force Report, which indicate that campus reports should include, at a minimum, the percentages of students exceeding, meeting, approaching, and not meeting the learning outcomes. We will discuss the reporting format at our December meeting, and we expect to also discuss the possibility that a campus report might also include additional narrative and qualitative information.


Sampling

Q: There are a number of the Provost's Advisory Council on General Education (PACGE)-approved General Education courses for each of the Knowledge and Skills Areas. If, for example, there are 22 courses approved for Mathematics, does that mean that we will have to assess student learning in all 22 courses?

A: In order to be truly representative and to ensure that all courses intended to focus on or cover a particular outcome are, in fact, doing so, the assessment process should potentially focus on all courses approved for that outcome. But that doesn't mean that all courses need be assessed every cycle. For example, out of 22 sections covering a particular outcome, 11 might be randomly selected for assessment in one cycle, and the others in the next.


Sampling

Q: If we decide to assess a random sample of students, what is a desirable percent to select?

A: Some experts feel that around 20% of students enrolled in a particular learning experience is a good number to aim for. That means you may have to include a few more, since you always lose data in various ways. Also, keep in mind that you do need a critical mass in any assessment you're doing, and we suggest having no fewer than 30 students in a given sample.
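The rule of thumb above (roughly a 20% sample, a small cushion for lost data, and a floor of 30 students) can be sketched as a short script. The roster, cushion size, and function name here are illustrative assumptions, not part of the GEAR Guidelines:

```python
import random

def draw_sample(roster, rate=0.20, floor=30, cushion=0.10):
    """Randomly sample students for assessment.

    rate:    target fraction of enrolled students (~20%, per the answer above)
    floor:   minimum sample size (at least 30 students)
    cushion: hypothetical oversampling to offset data you inevitably lose
    """
    target = max(int(len(roster) * rate * (1 + cushion)), floor)
    target = min(target, len(roster))  # cannot sample more than are enrolled
    return random.sample(roster, target)

# Hypothetical enrollment of 400 students across sections for one outcome
roster = [f"student_{i:03d}" for i in range(400)]
sample = draw_sample(roster)
print(len(sample))  # 88: 20% of 400, plus a 10% cushion
```

With a small course of, say, 60 students, the same call would return 30 students (the floor), illustrating why very small enrollments may need to be assessed nearly in full.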


Standards

Q: Our Assessment Committee has asked if the assessment plan need describe only the measurement tool, or if performance standards should also be included. For example, if a multiple-choice examination is used to determine mastery of terminology in a discipline, does the plan have to indicate a performance standard of, say, 60% or above?

A: The Guidelines call for campuses to include, for each learning objective, the standard defining what level of student performance the faculty considers as "exceeding," "meeting," "approaching," and "not meeting" standards. So, if the examination you describe is a major indicator of your efforts to assess learning, you should define the standards, as you see fit.
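The four performance labels above come from the Guidelines, and the reporting answer earlier asks for the percentage of students in each band. A minimal sketch of that bookkeeping might look like the following; the cutoff scores (85/70/55) are purely illustrative, since each campus sets its own standards:

```python
from collections import Counter

LABELS = ("exceeding", "meeting", "approaching", "not meeting")

def band(score, cutoffs=(85, 70, 55)):
    """Map a percent score to a performance band.

    The labels come from the GEAR Guidelines; the cutoffs here are
    hypothetical examples, not prescribed values.
    """
    exceeding, meeting, approaching = cutoffs
    if score >= exceeding:
        return "exceeding"
    if score >= meeting:
        return "meeting"
    if score >= approaching:
        return "approaching"
    return "not meeting"

def summarize(scores):
    """Percent of students in each band, the minimum the report asks for."""
    counts = Counter(band(s) for s in scores)
    n = len(scores)
    return {label: round(100 * counts.get(label, 0) / n, 1) for label in LABELS}

print(summarize([92, 88, 74, 71, 60, 58, 40, 95]))
# {'exceeding': 37.5, 'meeting': 25.0, 'approaching': 25.0, 'not meeting': 12.5}
```

Whatever thresholds a campus chooses, defining them up front (as the answer above recommends) is what makes the resulting percentages comparable from one assessment cycle to the next.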


Students

Q: Has there been any discussion about the implications of informing students of assessment activities? Many of our faculty feel that students should know that we are assessing their learning, but we are not sure what to tell them and how it might affect their motivation.

A: Many campuses across the country publicize the fact widely to students that they will be expected to take part in assessment activities while at the college. This fall, SUNY Cortland, for example, will begin distributing a pamphlet to students that includes its campus philosophy of assessment, as well as information related to the process. If campuses are relying exclusively on course-embedded assessment, this isn't much of an issue, because the same information you're using to grade students is also helping you assess your program's effectiveness. It really only becomes an issue if you're requiring students to take part in assessment activities outside their courses and programs. Generally, it's a good idea to let students know: after all, you're doing this to improve teaching and learning. If the pamphlet idea seems too much for your campus, you might consider simply including a statement in your catalog about assessment and the college's expectation that students will participate.