<?xml version="1.0" encoding="UTF-8"?>
<item xmlns="http://omeka.org/schemas/omeka-xml/v5" itemId="9141" public="1" featured="0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://omeka.org/schemas/omeka-xml/v5 http://omeka.org/schemas/omeka-xml/v5/omeka-xml-5-0.xsd" uri="https://caspercollege.cvlcollections.org/exhibits/show/school-of-social-and-behaviora/item/9141?output=omeka-xml" accessDate="2026-04-04T04:48:21+00:00">
  <fileContainer>
    <file fileId="9492">
      <src>https://caspercollege.cvlcollections.org/files/original/f354f9812d38ff91045a1550862c28ab.pdf</src>
      <authentication>7b104bbb572dc64b8069eeeabfe92e7c</authentication>
      <elementSetContainer>
        <elementSet elementSetId="4">
          <name>PDF Text</name>
          <description/>
          <elementContainer>
            <element elementId="92">
              <name>Text</name>
              <description/>
              <elementTextContainer>
                <elementText elementTextId="96796">
                  <text>IMPROVING THE ASSESSMENT OF STUDENT LEARNING:
ADVANCING A RESEARCH AGENDA IN SOCIOLOGY
*
In the last two decades, formal assessment of student learning in higher
education has become institutionalized. This paper summarizes current re­
search and writing about the key components of assessment plans (statement
of purpose, goals and outcomes objectives, and assessment mechanisms) and
about the work involved in conducting an annual assessment program. We
discuss the evolution of assessment within sociology and the paucity of both
descriptive and explanatory research on assessment of student learning. We
also pose important research questions that sociologists could pursue to
enhance understanding of the context, content, process, and effects of
assessment. The paper also examines the assessment movement itself: forces
that have stimulated the movement, the demonstrated benefits of conscien­
tious assessment of student learning, sources of resistance to assessment, and
the general status of assessment in higher education today.

Gregory L. Weiss
Roanoke College

Janet R. Cosbey
Eastern Illinois University

Shelly K. Habel
Georgetown University

Chad M. Hanson
Casper College

Carolee Larsen
Millsaps College

In the last two decades, an assessment movement has emerged and spread
throughout a variety of social sectors including businesses, social services,
and education. Within higher education, assessment can occur at the
institutional, divisional, departmental, program, and class level and on both
the academic and administrative sides of the institution. While the layers of
assessment are naturally inter-related, this paper focuses on the assessment of
student learning by academic departments. As such, the focus is on a process
that includes (1) the department’s development of an explicit mission or purpose
statement, (2) the formulation of broadly stated goals and more specifically
stated outcomes objectives for student learning, (3) the systematic collection
of information relative to the extent to which the objectives are being
accomplished, and (4) based on the information obtained, collective efforts to
identify and implement specific program changes to enhance student learning.
Assessment is grounded in the belief that effective institutions and departments
engage in a systematic and continuous process of improvement in order to better
achieve their goals and objectives.

Work on this paper began when the five of us collaborated at a conference on The
Scholarship of Teaching and Learning in Sociology held in Harrisonburg, Virginia
in July 2000. The workshop brought together more than 40 sociologists from all
types of institutions to review and extend current scholarship in sociology
related to (1) the integration of styles of teaching and styles of learning, (2)
the assessment of faculty, (3) partnerships between community and academy, (4)
technology and its uses in teaching and learning, (5) the impact of

*The authors wish to thank two anonymous reviewers for their very helpful
comments on an earlier draft of this paper. Please address all correspondence to
Gregory L. Weiss, Department of Sociology, Roanoke College, Salem, VA 24153;
e-mail: weiss@roanoke.edu

Editor’s note: The reviewers were, in alphabetical order, Charles Powers and
Stephen Sharkey.

Teaching Sociology. Vol. 30, 2002 (January:63-79)


institutional contexts on teaching and learn­
ing, and (6) the sociology curriculum and
assessment of student learning. The authors
of this paper worked to identify knowledge
that already existed about the assessment of
student learning and what still needed to be
known. In the course of doing this work we
hoped to establish and promote a research
agenda for sociologists interested in con­
tributing to an understanding of assessment
and its effects on teaching and learning.
FORCES PROMOTING THE
ASSESSMENT MOVEMENT

The primary stimulus for the assessment
movement occurred in the early 1980s
through increasingly vocal public dissatis­
faction with the quality of higher education
and increased calls for institutional account­
ability for educational promises. Well-publicized exposés of students graduating
from college without fundamental reading,
writing, and mathematics skills contributed
to the suspicion that institutions were not
fulfilling their obligations to students. When
these complaints were coupled with requests
by institutions for more public funds, legis­
lators, parents, and students began demand­
ing empirical evidence of the real—as op­
posed to claimed—outcomes and benefits of
attending a college or university (Terenzini
1989).
These public outcries caught the attention
of two especially important groups: the six
regional accrediting agencies for colleges
and universities and several state legislatures
(Schechter, Testa, and Eder 2000). Though
the terminology used by the accrediting
agencies differs, all of them now place some
emphasis on assessment (Maki 1999). For
example, the Southern Association of Col­
leges and Schools (SACS) has placed ex­
tremely high priority on assessment (in re­
cent years assessment has been the area in
which institutions have been most likely to
be judged in non-compliance by SACS), and
places it in the context of institutional effec­
tiveness. SACS’ Criteria for Accreditation
(1998:19) describes institutional effective­
ness as being “at the heart of the commis­
sion’s philosophy of accreditation” and
“central to institutional programs and opera­
tions.” It expresses an expectation that
“each member institution develop a broad­
based system to determine institutional ef­
fectiveness appropriate to its own context
and purpose, to use the purpose statement as
the foundation of planning and evaluation, to
employ a variety of assessment methods,
and to demonstrate use of the results of the
planning and evaluation process for the im­
provement of both educational programs and
support services.”
State legislatures have also become cen­
trally involved in the assessment and ac­
countability movement. Several legislatures
have applied significant pressure on public
institutions of higher education to engage in
a systematic process of assessment of their
programs and to document the outcomes and
value of the education provided. Some states
have created specific performance standards
to evaluate institutions and upon which to
base budget allocations (Burke, Modarresi,
and Serban 1999; Wellman 2001).
The same forces have in part been respon­
sible for stimulating considerable interest in
assessment throughout a wide variety of
academic groups, academic foundations, and
disciplinary associations. Groups such as the
American Association for Higher Education
(AAHE) now promote the value of assess­
ment as a legitimate means to improve
student learning, encourage institutions to
shape their own programs to be maximally
beneficial, and sponsor annual assessment
workshops.
Disciplinary associations, including the
American Sociological Association (ASA),
also support and encourage assessment ac­
tivity and have become advocates for the
development of sound assessment programs.
For example, the widely-read Liberal Learn­
ing and the Sociology Major (ASA 1992:22-23), prepared by the ASA in conjunction
with the Association of American Colleges
(AAC), identifies assessment as one of 13
recommendations for all sociology depart­
ments:

Departments should assess the major (curriculum, courses, and instruction) on a
regular basis using multiple sources of data. To implement this recommendation,
departments should routinely collect data by:
• examining the department’s goals, missions, needs, facilities, access to
resources, etc.;
• examining the faculty’s goals, needs, resources, and perspectives on
instruction;
• surveying present students, both majors and non-majors, on needs, goals,
levels of satisfaction with courses and advising, social networks, career
goals and actual plans, etc.;
• surveying graduates on similar issues, as well as on their identification
with sociology;
• monitoring similar data in other “sibling” institutions and departments;
• articulating the findings’ implications for departmental programs.

RATIONALE FOR ASSESSMENT

The literature contains four primary argu­
ments for conscientious assessment pro­
grams as they relate to student learning: increased faculty conversation about teach­
ing and learning, improved classroom teach­
ing, effective curricular reform, and, most
importantly, enhanced student learning.
Conducting meaningful assessment re­
quires faculty colleagues to engage in seri­
ous conversation about teaching and learn­
ing: about the mission of the department or
program, about explicit goals and objectives
held for students, about ways to determine
the best manner to assess the extent to which
students are achieving the objectives, and
about ways that department organization,
curriculum, and course instruction can be
modified to enhance student learning. While
many departments never engage in this type
of discussion, proponents of assessment ar­
gue that they should and that the teaching­
learning process will inevitably benefit
(Howery 1992; Sharkey 1990).
The assessment literature suggests that
conversation about teaching and learning and
the self-reflection that it engenders leads to
a second benefit: improved teaching. The
process of mission, goal, and objective artic­
ulation forces faculty to think about their
own courses and course components in a
more focused way (Good and Brophy 1994;
Posner 1995). It creates a greater awareness
of how individual courses fit into the cur­
riculum, and it helps guide course construc­
tion and delivery.
This benefit may be especially true in the
case of classroom-embedded assessment
techniques such as the use of primary-trait
analysis (that is, the development of
rubrics), which uses graded assignments that
directly correspond to department and
course learning objectives (Huba and Freed
2000), and with classroom assessment tech­
niques (CAT), which use ungraded feedback
mechanisms to monitor the progress of stu­
dent learning during a course (Angelo and
Cross 1993; Brookfield 1995; Nilson 1998;
Tebo-Messina and Van Waller 1998). Be­
cause meaningful classroom assessment oc­
curs only when we test what we teach (Cross
1999; Hilton 1993; Lovell-Troy 1989) and
when we are willing to continually evaluate
what we are teaching (Angelo and Cross
1993), assessment creates a cycle of feed­
back, self-reflection, and effort to improve
teaching. There is evidence that the use of
these techniques constitutes good teaching as
they both require students to focus on what
they are learning (Eisenbach, Golich, and
Curry 1998) and enable students to more
actively monitor their own learning process
(Cross 1999). Students typically respond
favorably to these classroom techniques,
expressing greater satisfaction with courses
that use them (Steadman 1998). For these
reasons, it is not unusual to see discussions
of student assessment and faculty assessment
in tandem, as willingness to engage in stu­
dent assessment is seen as one measure of
faculty effectiveness (Centra 1993; Weimer
1990).
A third benefit of assessment cited in the
literature is support for curricular reform.
An obvious sociological insight is that aca­
demic departments rarely equal the sum of
their parts—they are greater or lesser de­
pending upon the ability and willingness of
members to work together for the common

good. Similarly, sociology curricula do not simply equal the sum or aggregation
of individually constructed sociology courses. Many of the key curricular issues
identified in Liberal Learning and the Sociology Major—integration of the
curriculum, meaningful sequencing of courses, requiring developmental levels,
and overall curricular coherence—cannot be addressed by individual faculty
members, no matter how conscientious they are. Assessment provides a means for
discussion of the collective departmental mission, goals, and objectives; a
technique for systematically collecting data to determine the extent of success
in achieving the objectives; and a forum for consideration of curricular changes
that would strengthen student learning (Ellis and Fouts 1993; Howery 1992;
Posner 1995).

Ultimately, the legitimacy of assessment rests with the final perceived benefit:
the genuine enhancement of student learning. The theoretical foundation of
assessment is that the focus of higher education should shift from being
teaching-oriented and input-oriented to learning-oriented and outcome-based. The
measure of success shifts from what is being given to students or done to or for
students to what happens to students as a result of their educational
experience. Those who work in assessment agree that this is a profound change.

Much of the evidence that exists regarding the positive effect of assessment on
student learning is based on either (1) the positive effects of clear goal- and
objective-setting on student learning or (2) case studies of institutions and
departments that have experienced enhanced student learning as a result of
serious assessment. Examples in sociology of the latter include Sharkey’s report
(1990) on Alverno College’s success in teaching analysis and valuing; Jackson et
al.’s report (1992) on Rhode Island College’s success in teaching research
methods and social theory; and Bradfield’s report (1992) and Eck’s reflection
(2001) on James Madison University’s success helping students understand the
core paradigms of critical, interpretative, and naturalistic analysis.
Certainly, many more such examples exist, but most are only now finding their
way into the literature.

THE CURRENT STATUS
OF ASSESSMENT

In the last 15 years, assessment has been a primary movement within higher
education. The insistence of state legislatures and accrediting agencies; the
endorsement of higher education associations, foundations, and disciplinary
associations; the institutionalization of assessment on campuses in the form of
assessment directors and committees; and the ever-burgeoning growth of
assessment conferences and assessment literature testify to the secure hold that
assessment has obtained. Yet, most higher education experts around the country
would agree that, to date, assessment’s record of accomplishment has fallen far
short of the ideal or expected (Angelo 1999; Burke 1999; Lazerson, Wagener, and
Shumanis 2000; Maki 1999).

In 1997 the New England Association of Colleges and Schools (the accrediting
agency in the New England area) surveyed its 188 constituent institutions about
activities directed toward assessing student outcomes. A dozen years into the
assessment movement, the great majority (92 percent) of the institutions
indicated that they could “demonstrate not very well or only moderately well the
success of their efforts to assess, verify, and enhance the achievement of their
mission and purposes through student outcomes assessment” (Maki 1999:2). Sixty
percent of administrators rated assessment as extremely important, but they
estimated that 70 percent of the faculty viewed assessment as only somewhat
important or not at all important.

A subsequent survey conducted by the National Center for Postsecondary
Improvement (NCPI) of 1,400 public and private institutions examined the nature,
extent, and impact of student assessment strategies on a national basis (Wright
2000). Although the survey found fairly substantial institutional

activity on collecting student assessment
data, most of what was collected were insti­
tutional data on students’ academic progress,
basic college readiness skills, academic in­
tentions of students, and satisfaction with the
undergraduate experience. Few institutions
indicated an engagement in more complex
assessment activities focused on evidence of
student learning. Results also indicated that
few institutions were making academic plan­
ning decisions based on the findings from
assessment data. The NCPI report stated
that while many state agencies and institu­
tional accrediting bodies have stimulated the
adoption of assessment activities, these ac­
tivities have had little impact on how institu­
tions have used student assessment data to
improve student performance. While assess­
ment holds much promise, the NCPI con­
cluded that it hardly constitutes an academic
revolution (National Center for Postsec­
ondary Improvement 1999).
Lazerson et al. (2000), commenting on
their survey of 320 institutions that under­
went reaccreditation reviews between 1997
and 1999, acknowledged that some changes
in teaching and learning in higher education
have occurred in recent decades, “but there
is little evidence that the changes add up to a
systematic reconsideration of how and why
students learn or of how institutions, rather
than simply individual professors, can revise
their approaches to teaching.” Palomba and
Banta (1999) charge that although most insti­
tutions are now involved in assessment, their
actions constitute little more than “a thin
veneer of compliance.” Acknowledging that
some academics and professional staff take
their assessment activity seriously, most are
only tangentially involved or not at all in­
volved in genuinely using assessment to
enhance student learning.
Yet, there are many examples of institu­
tions and academic departments that do take
assessment seriously and report positive
findings. Publications such as Assessment
Bulletin and Assessment Update routinely
report specific examples of successful as­
sessment programs. Books such as Banta et
al.’s Assessment in Practice: Putting Princi­
ples to Work on College Campuses (1996)
and Nichols’ A Practitioner’s Handbook for
Institutional Effectiveness and Student Out­
comes Assessment Implementation (1995)

offer exemplars of effective assessment ac­
tivity. It seems clear, according to Hutch­
ings and Marchese (1990), that where as­
sessment works, it does so because it is an
integral part of the entire educational experi­
ence. It is not a separate function on its own
but rather a process woven into the daily
fabric of college.
REASONS WHY ASSESSMENT
HAS NOT HAD MORE SUCCESS

What guidance does the literature offer to
help explain why assessment has not reached
a higher level of acceptance nor led to more
frequent genuine enhancements of student
learning? Four of the key factors are sum­
marized here.
First, conscientious assessment constitutes
a significant departure from the traditional
academic culture throughout higher educa­
tion but especially in larger, research-oriented institutions. Traditional academic
culture often is very individualistic, with
maximum emphasis placed on each faculty
member enacting his or her own career
without much interference or influence by
department or college. Teaching and curric­
ular matters may rarely be discussed, and no
one is likely to impinge on what occurs in
individual classrooms. Meaningful assess­
ment requires faculty to discuss matters such
as the mission of the program and its spe­
cific objectives, mechanisms to genuinely
assess student learning, and changes in cur­
riculum, policies, standards, course organi­
zation, and pedagogy that could positively
impact student learning. This requires coor­
dination, collaboration on teaching-learning
matters, and some willingness to prioritize
the common good or the student good over
personal desires.
In a paper presented at the 1999 Assess­
ment Institute, Banta et al. posited that the
very definition of assessment implies collab­
oration, but that higher education contains a

variety of barriers to collaboration which
make genuine commitment to assessment
very difficult. While the reasons for lack of
greater assessment success at large universi­
ties and small colleges may not be identical,
Banta et al. identified the following barriers:
disciplinary traditions, the faculty reward
structure (as it relates to engaging in individ­
ual research projects and securing grants),
and the traditional configuration of teaching
as an individually practiced profession.
Keith and Myers (1992) concluded that
many faculty have little interest in collabora­
tion for the sake of student learning, have
not been convinced of its desirability, and
resent its interference with their own inter­
ests and autonomy.
Second, the manner in which the assess­
ment mandate has been presented to faculties
has further contributed to their resentment of
it. The traditional mistrust between college
faculties and state legislatures comes to the
surface in cases where legislators enact poli­
cies that directly or indirectly dictate to the
professorate (as occurs with performance
standards). While regional accrediting agen­
cies have been among the leaders in support­
ing and requiring assessment, they have
sometimes acted with such heavy­
handedness in working with constituent insti­
tutions that faculties come to see assessment
mostly or entirely as a mandate associated
with reaccreditation rather than a genuine
technique to enhance student learning
(Sharkey 1990). This factor is further exac­
erbated by the fact that doing good assess­
ment does require time. Faculty members
will inevitably resent having any time­
significant task imposed on their workload
without some corresponding reduction in
other responsibilities.
Finally, Peter Ewell (1997) has suggested
two additional reasons that institutions of
higher education have not had more success
in enhancing student learning. He suggests
that those in higher education lack a clear
understanding of what collegiate learning
really means. This is consistent with the
oft-expressed idea that student learning is
such an obvious and taken-for-granted con­
cept that neither individually nor collectively
have we thought through its meaning. An­
gelo (1999:3-4) argues that “most assess­
ment efforts have resulted in little learning
improvement because they have been imple­
mented without a clear vision of what
“higher” or “deeper” learning is and with­
out an understanding of how assessment can
promote such learning.” This brings us to
Ewell’s other conclusion regarding assess­
ment: that assessment initiatives have, for
the most part, been attempted in a piecemeal
fashion within and across institutions. This
perception of haphazardness also contributes
to a failure to view the teaching/learning
process as part of the social institution of
education.
THE ENACTMENT OF ASSESSMENT:
ASSESSMENT PLANS AND ANNUAL
ASSESSMENT ACTIVITIES

As the assessment movement has evolved
and as the literature on assessment and the
number of workshops and sessions at profes­
sional meetings of academic disciplines and
higher education associations have in­
creased, the basic expectations for assess­
ment have crystallized. Essentially, but with
some important variations from region to
region, academic majors and programs
within higher education institutions must (1)
construct an assessment plan consisting of a
statement of purpose, a set of outcome
objectives for students majoring in the disci­
pline and a roster of mechanisms used to
assess success in achieving the objectives,
and (2) conduct an annual assessment pro­
gram in which a limited number of the
objectives are assessed and the results used
to identify and implement changes to en­
hance student learning. The following sec­
tions summarize current knowledge about
these components of assessment and suggest
questions to which sociological analysis
might be brought to bear.
Sociology is especially well-positioned to
offer both descriptions of what currently
exists with regard to assessment plans and
programs and analyses of factors that con­
tribute to or impede effective assessment. A
sociology of assessment practice could easily
draw upon the insights of the discipline with
regard to the institutionalization of a new
paradigm and new activities. Such an ap­
proach would be enhanced both by macro­
level investigations of political, economic,
and socio-organizational factors that affect
the introduction and dissemination of assess­
ment and by micro-level analyses of the
processes by which assessment questions are
framed and assessment results used to mod­
ify curriculum and pedagogy. It could bene­
fit from all major theoretical analyses and
use both quantitative and qualitative methods
to obtain comprehensive understanding of
assessment. Potentially, these analyses can
offer important practical guidance to faculty
and departments as they work and some­
times struggle to understand and effectively
utilize results of assessment to enhance stu­
dent learning of sociology.
The Statement of Purpose
The statement of purpose is designed to be
the foundation and inspiration for all assess­
ment activity. Sociology purpose statements,
like those for all majors and programs, are
intended to articulate the contribution of the
discipline to the mission and goals of the
institution and describe the purpose of study­
ing the discipline (for example, what sociol­
ogy majors ought to learn and be able to do
as a result of studying sociology). While
purpose statements often are written generi­
cally, the best statements are tailored to the
specific programs and emphases of a partic­
ular department (Gardiner 1989).
The Program Assessment Consultation
Team (PACT) at California State University
at Bakersfield (a typical institutional assess­
ment committee but one that has published
an impressive assessment document) notes
that the mission statement should identify the
values and philosophy of the department and
a vision of what the department is doing. It
can include a brief history and philosophy of
the unit, the type of students to be served
and their geographic area, the academic
environment and primary focus of the cur­
riculum, faculty roles, contributions to and
connections with the community, the role of
research, and a nondiscrimination statement
(PACT Outcomes Assessment Handbook,
2000). A well-written statement should be
used to guide decision-making about curricu­
lum, policies, and standards and should
provide the framework for the program’s
goals and objectives.
The literature on higher education contains
numerous essays and reflections on the im­
portance of purpose statements, key compo­
nents of well written statements, successful
processes for writing statements, and ways
to link purpose statements with goals and
objectives. However, there has been almost
no systematic research on the content of
purpose or mission statements and on varia­
tions in statements based on type of institu­
tion. Many assessment experts believe that,
despite the attention given to assessment in
the last two decades, institutional purpose
statements remain largely generic and not
substantially different among various types
of institutions—a pattern first reported by
Weick in the mid-1970s (Weick 1976).
This pattern was affirmed in recent re­
search by Delucchi (2000) who analyzed the
academic mission statements of 303 liberal
arts colleges in the United States. Among his
findings was that 70 percent of colleges
making liberal arts claims in their mission
statements actually awarded degrees primar­
ily in professional disciplines. He concluded
that the claim to liberal arts often failed to
reveal the actual motives that shaped the
curriculum, and that it may be both politi­
cally and methodologically more valuable
for sociology faculty to frame their assess­
ment of student learning within the context
of the discipline as opposed to the mission of
a particular institution.
Powers (2000:42) offers an excellent ex­
plication of the role of a guiding mission
statement in his description of Santa Clara
University’s efforts to build a developmental
curriculum. His department’s mission is to
“offer students sociological tools and in­
sights they can use to improve the effective­
ness of the organizations they are a part of


and enhance the quality of the communities they live in.” This mission statement
has become the department’s organizing focal point for the construction of
specific learning objectives, curriculum design, and assessment techniques to
gain feedback on the department’s success.

Clearly there is an important research agenda for sociologists who want to shed
light on the context in which sociology purpose statements are written, the
content of the statements, the process by which they were written, and the
effects or outcomes of having a purpose statement in place. The following offers
a beginning list of questions that sociologists might pursue. Given that we are
at a very early stage of sociological research on assessment, most of the
identified questions are descriptive in nature.

Context. What is the institutional setting in which the department has been
asked to write a purpose statement? Is there administrative support for the work
of departments in writing a mission statement? What guidance do institutional
administrators offer, and what resources, if any, have been made available to
assist in the work? What rationale do administrators and department chairpersons
offer for faculty to take assessment responsibility seriously? How much time is
given to draft the statement? What institutional rewards and sanctions are used
to stimulate this work? What is the influence of these factors on the content,
process, and outcomes of writing the statement?

Content. To what extent and in what ways are sociology purpose statements linked
to institutional purpose statements? What do sociology faculties identify as the
fundamental purpose of their department and its fundamental contribution to the
institution? To what extent are statements focused on students’ learning versus
a research mission or a community or public service mission? To what extent does
the purpose statement focus on the cognitive abilities of students versus more
generic skills (such as critical thinking and effective writing), value
acquisition, and post-graduation employment? What major emphases and directions
are identified as characterizing the sociology program? What elements of
educational philosophy are included? Is attention given to the purpose of
sociology as a general education requirement? To what extent does the content of
purpose statements differ in private versus public institutions, church-related
versus secular institutions, small versus large institutions, single-sex versus
coeducational institutions, or by institutional location (e.g., urban versus
small town and south versus midwest)?

Process. What process is used to write the statement? To what extent is it
created out of group discussion and collaboration versus the work of the chair
or a single individual? How is the process subsumed within the traditional
organizational bureaucracy of the department? What occurs when most or even some
faculty refuse to participate or undermine the work of others? How does the
department assessment leader encourage follow-through on collective decisions?
What strategies are used to encourage faculty to keep an open mind about or
support conscientious assessment?

Effects. Is the statement supported by faculty? Is it used as a framework for
identifying goals and objectives? Is it viewed as a useful activity? Does the
process by which the statement is written affect the level of acceptance of the
statement? Does the purpose statement lead to departmental, curricular, or
pedagogical decisions that affect the teaching and learning of sociology? Are
changes made to enhance curricular coherence and student learning? Are
improvements made in positive student outcomes?

Goals and Objectives
“The departmental purpose statement is to lead to the formulation of a list of
goals and outcomes objectives for students majoring in the field. Goals are
statements about general aims or purposes of education that are broad,
long-range intended outcomes that can be used in policy making and general
program planning” (Johnson, Potts, and Hood 1999). They are “general aims or
purposes of the program and the curriculum;

ASSESSMENT OF STUDENT LEARNING
include broad, long-range intended outcomes, wishes, desires, and intentions, as well as statements about content knowledge, skills, attributes, broad knowledge/values, and perspectives expected in graduates” (PACT Outcomes Assessment Handbook 2000:11).

Each goal has one or more corresponding outcomes objectives. The best-written objectives are concise, clear statements that describe a measurable learning outcome. They are focused on the specific types of performance that students are expected to demonstrate at the end of instruction (Johnson, Potts, and Hood 1999). Objectives “...flow from the goals; [they are] operational definitions that let you know if goals are being reached; [they are] tangible/observable outcomes expected in your students” (PACT Outcomes Assessment Handbook 2000:11). They should include the knowledge, skills, attitudes, behaviors, and achievements expected of students in the major.

The emphasis within the assessment movement is on objectives that are clear, measurable, and outcome- or result-oriented rather than process-oriented. This is contrary to the tradition of most academic departments, which are more accustomed to identifying what the faculty will do than what will happen to students. While some departments still include some process-oriented objectives, the emphasis now is on identification of what students will be able to do as a result of studying sociology.

Much education literature exists on the importance of clear articulation of goals and objectives. Empirical research (both laboratory and field) across a wide range of organizational settings strongly supports the positive effects of goal-setting on learning and performance improvement. In an early review of 17 empirical studies, Locke et al. (1981) found that setting challenging goals versus more vague “do your best” goals increased performance in every one of the studies. How did clear goal setting lead to improved performance? It led to more focused attention and action, greater mobilization of energy and effort, greater persistence on task, and increased motivation for goal attainment. Educational treatises today routinely cite the positive effects of carefully designed, clearly expressed, explicit learning goals (see Ellis and Fouts 1993 on clear goal-setting contributing to effective schools; Good and Brophy 1994 on goals as motivators; Posner 1995 on the contribution of goal setting to coherent curricula; Cross 1999 on goal-setting as a motivator specifically among college students; and Johnson, Potts, and Hood 1999 on the contribution of department goal setting to course instruction).

There is a small body of literature in sociology that specifically addresses the importance of goal- and objective-setting in shaping the curriculum and in assessing student learning. Given the demonstrated value of goal articulation for student learning, it is surprising that more sociologists have not addressed this issue. Interestingly, some sociologists were addressing this issue before the assessment movement in higher education really became visible. In an early Teaching Sociology article, Miriampolski (1978) reflected on goals that would be appropriate in a humanistic approach to introducing students to sociology. (He recommended understanding social determinism, relativizing culture, instilling a sense of social realism, and developing skills in critical evaluation.)

Among the earliest attempts to discover the content of goals emphasized in sociology is the work of Vaughan (1980:268). She analyzed textbooks; study guides and instructor’s manuals for the introductory sociology course; course syllabi for the introductory course (that were collected by the ASA Section on Undergraduate Education); and surveys of departmental chairpersons to assemble a list of Stated Goals for Undergraduate Instruction in Sociology. She identified the following goals: (1) to transmit a body of knowledge to the students, (2) to develop certain substantive understandings in the students, (3) to contribute to the general intellectual and personal development of students, and (4) to contribute to

students’ vocational preparation. Others who reflected on appropriate goals for sociology include Bradshaw and McPherron (1980), Hazzard (1991), McMillan and McKinney (1985), Rhoades (1980), and Stephan and Massey (1982).

In 1991, Wagenaar extended and formalized much of this thinking by drafting a list of 10 goals for undergraduate sociology (which linked goals and outcomes objectives). These goals later became the basis for a list of 12 learning goals for the sociology major that was included in Liberal Learning and the Sociology Major (ASA 1992). These two sets of goals have been widely cited and used by sociology programs around the country as a foundation for their own goal-setting. The Liberal Learning goals are contained in Table 1. Also, in 1992, the ASA’s Teaching Resources Center published Assessing Undergraduate Learning in Sociology, edited by Sharkey and Johnson, which has also been an extremely useful document to many sociology programs.
TEACHING SOCIOLOGY

Much of the writing by sociologists on goals and objectives has been prescriptive, reflecting efforts to recommend goals and objectives that sociology departments might consider. What is needed is more empirical research, both descriptive and explanatory, that relates to the context, content, process, and effects of goal- and objective-writing.
Context. What is the institutional setting in which the department has been asked to write learning goals and objectives? Is there administrative support for this work? What guidance is offered by institutional administrators, and what resources, if any, have been made available to assist in the work? What rationale do administrators and department chairpersons offer to faculty to justify this work? How much time is given to write goals and objectives? What institutional rewards and sanctions are used to stimulate this work?

Content. What are the goals and outcome objectives that sociology programs have adopted? How do they compare with the goals and objectives identified in Liberal Learning: are they more or less inclusive? Do they follow from the institutional and program statement of purpose? How do they
compare with the goals and objectives writ-

Table 1. Learning Goals for the Sociology Major (from Liberal Learning and the Sociology Major)

The sociology major should study, review, and reflect on:

(1) the discipline of sociology and its role in contributing to our understanding of social reality, such that the student will be able to:
• describe how sociology differs from and is similar to other social sciences, and give examples of these differences;
• describe how sociology contributes to a liberal arts understanding of social reality; and
• apply the sociological imagination, sociological principles, and concepts to her/his own life.

(2) the role of theory in sociology, such that the student will be able to:
• define theory and describe its role in building sociological knowledge;
• compare and contrast basic theoretical orientations;
• show how theories reflect the historical context of times and cultures in which they were developed; and
• describe and apply some basic theories or theoretical orientations in at least one area of social reality.

(3) the role of evidence and qualitative and quantitative methods in sociology, such that the student will be able to:
• identify basic methodological approaches and describe the general role of methods in building sociological knowledge;
• compare and contrast the basic methodological approaches for gathering data;
• design a research study in an area of choice and explain why various decisions were made; and
• critically assess a published research report and explain how the study could have been improved.

Table 1. (continued)
(4) basic concepts in sociology and their fundamental theoretical interrelations, such that the student will be able to:
• define, give examples, and demonstrate the relevance of the following: culture, social change, socialization, stratification, social structure, institutions, and differentiations by race/ethnicity, gender, age, and class.

(5) how culture and social structure operate, such that the student will be able to:
• show how institutions interlink their effects on each other and on individuals;
• demonstrate how social change factors such as population or urbanization affect social structures and individuals;
• demonstrate how culture and social structure vary across time and place, and the effect of such variations; and
• identify examples of specific policy implications using reasoning about social structural effects.

(6) reciprocal relations between individuals and society, such that the student will be able to:
• explain how the self develops sociologically;
• demonstrate how societal and structural factors influence individual behavior and the self’s development;
• demonstrate how social interaction and the self influence society and social structure; and
• distinguish sociological approaches to analyzing the self from psychological, economic, and other approaches.

(7) the macro/micro distinction, such that the student will be able to:
• compare and contrast theories at one level with those at another;
• summarize some research documenting connections between the two; and
• develop a list of research or analytical issues that should be pursued to more fully understand the connections between the two.

(8) in depth at least one area within sociology, such that the student will be able to:
• summarize basic questions and issues in the area;
• compare and contrast basic theoretical orientations and middle-range theories in the area;
• show how sociology helps understand the area;
• summarize current research in the area; and
• develop specific policy implications of research and theories in the area.

(9) the internal diversity of American society and its place in the international context, such that the student will be able to:
• describe the significance of variations by race, class, gender, and age; and
• know how to appropriately generalize or resist generalizations across groups.

(10) one or more areas within sociology, such that the student will be able to:
• compare and contrast the basic theoretical orientations in the area;
• show how sociology helps understand the area;
• summarize current research in the area; and
• develop policy implications of the research and theory in the area.

Two more generic goals that should be pursued in sociology are:

(11) to think critically, such that the student will be able to:
• move easily from recall, analysis, and application to synthesis and evaluation;
• identify underlying assumptions in particular theoretical orientations or arguments;
• identify underlying assumptions in particular methodological approaches to an issue;
• show how patterns of thought and knowledge are directly influenced by political-economic social structures; and
• present opposing viewpoints and alternative hypotheses on various issues.

(12) to develop values, such that the student will see:
• the utility of the sociological perspective as one of several perspectives on social reality; and
• the importance of reducing the negative effects of social inequality.

ten in other social science departments at the
same institution?
To what extent do they focus on the cognitive learning of sociology versus skill development (for example, theory construction or interviewing skills) versus generic skills (for example, effective writing and oral presentation)? Is there any focus on value socialization or on enabling students to understand their own education as being part of a larger social institution that has social, political, cultural, and economic consequences? To what extent does the content of goals and objectives differ in private versus public institutions, church-related versus secular institutions, small versus large institutions, single-sex versus coeducational institutions, or by institutional location (e.g., urban versus small town and south versus midwest)?

Process. What process is used to write the goals and objectives? To what extent are they created out of group discussion and collaboration versus the work of the chair or a single individual? How is the process subsumed within the traditional organizational bureaucracy of the department? What occurs when faculty refuse to participate or undermine the work of others? How does the department assessment leader encourage follow-through on collective decisions? What strategies are used to encourage faculty to keep an open mind about or to support conscientious assessment?

Effects. Are the goals and objectives supported by faculty? Does the process by which they have been written affect their level of acceptance? Do the goals and objectives genuinely influence decisions about curriculum, policies, and standards? Are faculty familiar with them? Are students familiar with them? Do they influence course content and pedagogy? Are they periodically reviewed? What factors influence the degree to which the goals and objectives are used as guides? To what extent do they differ in private versus public institutions, church-related versus secular institutions, small versus large institutions, single-sex versus coeducational institutions, or by institutional location?
Assessment Mechanisms

Being accustomed to conceptualizing, operationalizing, and measuring human attitudes and behaviors, sociologists likely have as much experience as any academicians with mechanisms for assessing student learning. Early assessment programs often consisted of little more than administering tests to measure student outcomes (Schilling and Schilling 1998). Today, however, assessment of student learning occurs through a wide variety of techniques based on the collection of information from current students, from alumni, from relevant constituencies (for example, employers and graduate schools), from external reviewers, and from the monitoring of institutional data. Table 2 identifies some of the commonly used assessment mechanisms in sociology.

Most of the increasingly voluminous literature on assessment mechanisms focuses on one of three topics: (1) a discussion of methodological issues involved in assessing outcomes (e.g., Assessment in Higher Education: Issues of Access, Quality, Student Development, and Public Policy [Messick 1999]), (2) a description and evaluation of one or more specific mechanisms (e.g., Classroom Assessment Techniques: A Handbook for College Teachers [Angelo and Cross 1993]), and (3) a description of the mechanisms used in particular institutions (e.g., Assessment in Practice: Putting Principles to Work on College Campuses [Banta et al. 1996]).

The literature on assessment methodology typically centers on recommendations for ensuring the validity and reliability of assessment findings. Common in these recommendations (American Association of Colleges 1992; Banta et al. 1996; Schilling and Schilling 1998) are the following five features:
(1) Assessment mechanisms should provide answers to genuine questions that faculty have. If faculty do not care about the results of the questions asked, then the mechanism is a poor one.

(2) Assessment mechanisms should actually measure what they are intended to measure; they should enable faculty to draw correct conclusions about the extent to which objectives are being met. (One of the best condensed discussions of the strengths and weaknesses of specific assessment mechanisms is the Cal State-Bakersfield PACT Outcomes Assessment Handbook described earlier.)

(3) Both quantitative (e.g., numerical data such as scores on comprehensive exams and the number of students doing independent studies) and qualitative (e.g., assessment of student portfolios) measures should be used.

(4) At least two mechanisms should be used to assess each of the objectives.

(5) Assessment information should be collected from a variety of constituencies. For example, departments might focus on current majors and minors, students in the introductory class, non-majors taking electives in the department, recent or older alums, faculty in other departments, student services staff, etc.

Table 2. A Partial List of Assessment Mechanisms

From Current Students
Classroom-embedded techniques that focus on a direct educational outcome
Performance in senior capstone course
Major papers and projects
Nationally-normed examinations
In-house examination administered in the capstone course
In-house essay administered early and late in major
Student portfolios
Surveys and focus groups
Awards/grants/publications/presentations/honors
Senior exit interviews

From Alumni
Placement records (education, employment) of graduates
Alumni surveys

From Relevant College and External Constituencies
Focus groups of faculty in related programs and with staff in Admissions, Academic Services, and the Registrar’s Office
Surveys of employers and of faculty in graduate programs in which graduates have matriculated

From Program Reviewers and College Data
External reviews
Monitoring of background/quality of students declaring the major, grades, and performance in campus-wide competitions

There is significant sociological literature (primarily in Teaching Sociology) about the use of various pedagogical strategies and classroom projects and activities. Some of these articles include systematic evaluation of the technique, while others are simply anecdotal reports. Many have the potential to be considered in an assessment context, though this connection typically has been implicit. Nevertheless, there are several good examples of assessment-related thinking. Watts and Ellis (1989) discussed using occupational status and mobility of graduates as an assessment mechanism; Sharkey (1990) discussed several issues related to organizing an entire curriculum around learning outcomes; and the ASA Teaching Resources Center publication on assessment, edited by Sharkey and Johnson (1992), includes pieces by Thompson on assessing learning in research methods, by Vera on assessing reading ability, and by Hartmann on using a senior-level paper to assess the major.

Given the expertise of many sociologists in the conceptualization and operationalization of social variables and the exercise of sound research techniques, there is considerable potential for greater sociological contributions to understanding assessment mechanisms. Research is needed on the advantages and disadvantages of each assessment mechanism relative to the quality, depth, and quantity of learning that occurs. In sociology, examples of specific questions are: To what extent do we obtain accurate assessment data from examining performance on the Educational Testing Service Major Field Test or on some other nationally-normed examination? Are helpful in-house comprehensive examinations being used? If so, what can we learn from them? Is performance in capstone courses or in internships and independent studies genuinely reflective of student learning in the major? How are student portfolios used to assess sociology learning? How do sociology faculty compare assessment mechanisms on the quality of information collected?

Which mechanisms are most appropriate to use with particular learning objectives? Do we receive reliable data when we use multiple mechanisms to assess a single objective? Do external assessors (e.g., graduate schools, employers, and outside reviewers of a department) confirm other indicators of student learning? What is learned by studying the observations of alumni? Are their perceptions similar to those reported by seniors?

An Annual Assessment Program

The Assessment Plan simply identifies the working pieces of the assessment process. Assessment becomes real each year when faculty select a small number of outcome objectives (typically about three to five) upon which to focus, carry out the necessary mechanisms, examine the results and compare them to pre-formulated expectations (that is, criteria of success), and, most importantly, identify and implement specific actions designed to enhance student learning (this final process is referred to as closing the loop). This final stage is the ultimate purpose of assessment and reconnects the process to its underlying rationale: that genuinely effective departments and programs continually look for ways to improve student learning and that they base their analysis on data that have been systematically collected (Nichols 1995).

With regard to the assessment program, there is again an extraordinary need for empirical research. In addition to the same kinds of questions suggested for analysis of purpose statements and the writing of goals and objectives, other specific questions can be asked: How has sociology shaped the implementation of annual assessment programs? How are assessment activities scheduled into the routine administrative tasks that are accomplished during the academic year? How do departments configure themselves to accomplish annual assessment?

Are there particular objectives that sociology programs typically achieve? Are there particular objectives on which sociology programs typically fail? In what ways have sociology programs changed in response to assessment? What kinds of changes typically succeed and what kinds typically fail? What factors influence the conscientiousness with which departmental assessment is conducted and with which efforts to enact program improvements are made and carried out?

THE NEXT DECADE: THE SCHOLARSHIP OF TEACHING AND LEARNING AND THE ASSESSMENT MOVEMENT

Assessment-focused research indicates the potential for substantial enhancement of student learning. Yet much remains to be learned about the manner in which assessment makes a positive contribution, about how assessment can be configured to provide the most and best data about student learning, and about ways to overcome resistance to assessment. This is true for all of higher education, and it is true for sociology.

The need for more research and reflection on assessment of student learning coincides with the development and continued refinement of a scholarship of teaching and learning (SOTL) within sociology. Defined as the systematic reflection on teaching and learning made public (McKinney 2000), SOTL seeks to promote the research that faculty members conduct on their daily activities. Research on assessment can contribute to the knowledge base about this important movement within higher education and can provide information that contributes in a very practical and direct way to improved teaching and learning. This paper ends with a call and encouragement for sociologists to participate in the scholarship of teaching and learning in general and to assist in conducting research on the assessment of student learning in particular.
REFERENCES

Pro­
gram Review and Educational Quality in the
Major. San Francisco. CA: Jossey-Bass.

American Association of Colleges. 1992.

American Sociological Association (in conjunc­
tion with the Association of American Col­
leges). 1992. Liberal Learning and the Sociol­
ogy Major. Washington, DC: American Socio­
logical Association.
Angelo. Thomas A. 1999. “Doing Assessment as
if Learning Matters Most.” AAHE Bulletin
52:3-6.
Angelo, Thomas A. and K. Patricia Cross. 1993.

Classroom Assessment Techniques: A Hand­
book for College Teachers. San Francisco. CA:
Jossey-Bass.
Banta, Trudy W., Jane L. Lambert, and Karen E.
Black. 1999. “Collaboration Counts: The Im­
portance of Cooperative Work in Assessing
Outcomes in Higher Education.” Presented at
the 1999 Assessment Institute, November 7-9,
Indianapolis, IN.
Banta, Trudy W., Jon P. Lund, Karen E. Black,
and Frances W. Oblander. 1996. Assessment in

Practice: Putting Principles to Work on Col­
lege Campuses. San Francisco, CA: JosseyBass.
Bradfield, Cecil D. 1992. “Assessing Student

77

Outcomes in the Sociology Major: The James
Madison University Program.” Pp. 121-28 in

Assessing Undergraduate Learning in Sociol­
ogy, edited by Stephen Sharkey and William S.
Johnson. Washington, DC: American Socio­
logical Association Teaching Resources Cen­
ter.
Bradshaw, Ted, and Sharon McPherron. 1980.
“Issues and Resources in the Undergraduate
Sociology Curriculum.” The American Sociol­
ogist 15:6-21.
Brookfield, Stephen D. 1995. Becoming a Criti­
cally Reflective Teacher. San Francisco, CA:
Jossey-Bass.
Burke, Joseph C. 1999. “The Assessment
Anomaly: If Everyone’s Doing It, Why Isn’t
More Getting Done?” Assessment Update
11:3,14-15.
Burke, Joseph C., Shahpar Modarresi, and Andreea M. Serban. 1999. “Performance:
Shouldn’t It Count for Something in State
Budgeting?” Change 31:17-23.
Centra, John A. 1993. Reflective Faculty Evalua­

tion: Enhancing Teaching and Determining
Faculty Effectiveness. San Francisco, CA:
Jossey-Bass.
Cross, K. Patricia. 1999. “Assessment to Im­
prove College Instruction.” Pp. 35-46 in As­

sessment in Higher Education: Issues of Ac­
cess, Quality, Student Development, and Pub­
lic Policy, edited by Samuel J. Messick. Mahwah, NJ: Lawrence Erlbaum Associates.
Delucchi, Michael. 2000. “Staking a Claim: The
Decoupling of Liberal Arts Mission Statements
from Baccalaureate Degrees Awarded in
Higher Education.” Sociological Inquiry 70:
157-71.
Eck, Beth A. 2001. “Taking Theory Last: A
Different Approach to Organizing the Curricu­
lum.” Presented at the 96th Annual Meeting of
the American Sociological Association, August
18-21, Anaheim, CA.
Eisenbach, Regina, Vicki Golich, and Renee
Curry. 1998. “Classroom Assessment Across
the Disciplines.” New Directions for Teaching
and Learning 75:59-66.

Ellis, Anhur K. and Jeffrey T. Fouts. 1993.

Research on Educational Institutions. Prince­
ton, NJ: Eye on Education.
Ewell, Peter. 1997. “Organizing for Learning.”
AAHE Bulletin 49.
Good, Thomas L. and Jere E. Brophy. 1994.
Looking in Classrooms. New York: Harper­
Collins.
Gardiner, Lion F. 1989. Planning for Assess­

ment: Mission Statements. Goals, and Objec-

�78

teaching sociology

lives. Trenton, NJ: Office of Learning Assess­
ment. New Jersey Department of Higher Edu­
cation.
Hartmann, David J. 1992. “Assessing the Major

with a Bachelor’s Paper.” Pp. 175-82 in As­

sessing Undergraduate Learning in Sociology,
edited by Stephen Sharkey and William s’

Johnson. Washington, DC: American Socio­
logical Association Teaching Resources Cen­
ter.
Hazzard, John, 1991. “Student Competencies and
the Goals of the Undergraduate Curriculum: A
Response to Theodore Wagenaar.” Teaching

Sociology 19:532.
Hilton, Peter. 1993. “The Tyranny of Tests.”

American Mathematical Monthly 100:365-69.
Howery, Carla.

1992,

“Assessment: It’s the

Right Thing to Do.” Pp. 191-97 in Assessing
Undergraduate Learning in Sociology, edited
by Stephen Sharkey and William S. Johnson.
Washington, DC: American Sociological Asso­
ciation Teaching Resources Center.
Huba, Mary E. and Jann E. Freed. 2000.

Learner-Centered Assessment on College Cam­
puses: Shifting the Focus from Teaching to
Learning. Des Moines. lA: Allyn and Bacon.
Hutchings. Pat and Ted Marchese. 1990.
“Watching Assessment: Questions, Stories and
Prospects. ” Change 22:12-39.
Jackson. Pamela I.. Roger D. Clark. Thomas W.
Ramsbey. and Rachel D. Fillinson. 1992.
“Assessment and Engagement in Learning and

Teaching.” Pp. 89-100 in Assessing Under­
graduate Learning in Sociology, edited by
Stephen Sharkey and William S. Johnson.
Washington, DC: American Sociological Asso­
ciation Teaching Resources Center.
Johnson, William S., Shelly A. Potts, Denice W.
Hood. 1999. “Using Learning Outcomes to
Improve University Teaching and Learning.”
Presented at the American Association for
Higher Education Assessment Conference,
June 12. Denver, CO.
Keith. Novella Z. and John Myers. 1992.
“Assessment and Undergraduate Sociology De­

partments.” Pp. 62-77 in Assessing Undergrad­
uate Learning in Sociology, edited by Stephen
Sharkey and William S. Johnson. Washington,
DC: American Sociological Association Teach­
ing Resources Center.
Lazerson, Marvin, Ursula Wagener, and Nicholas Shumanis. 2000. “What Makes a Revolution?: Teaching and Learning in Higher Education, 1980-2000.” Change 32:12-19.
Locke, E.A., L.M. Saari, K.N. Shaw, and G.P. Latham. 1981. “Goal Setting and Task Performance.” Psychological Bulletin 90:125-52.
Lovell-Troy, Larry. 1989. “Teaching Techniques for Instructional Goals: A Partial Review of the Literature.” Teaching Sociology 17:28-37.
Maki, Peggy L. 1999. “A Regional Accrediting Commission’s Survey on Student Outcomes Assessment and Its Response.” Assessment Update 11:1-2, 10-11.
McKinney, Kathleen. 2000. “Research on Undergraduate Education: Past, Present, and Future.” Presented at the Midwest Sociological Society Annual Meeting, Chicago, IL.
McMillan, Martha and Kathleen McKinney. 1985. “Reorganizing Sociology Undergraduate Curricula: A Case Study and Discussion of the Issues.” Teaching Sociology 12:425-48.
Messick, Samuel J. 1999. Assessment in Higher Education: Issues of Access, Quality, Student Development, and Public Policy. Mahwah, NJ: Lawrence Erlbaum Associates.
Miriampolski, Hyman. 1978. “Thoughts About Reasonable Goals for Introductory Sociology: A Humanistic Perspective.” Teaching Sociology 5:141-50.
National Center for Postsecondary Improvement. 1999. “Gauging the Impact of Institutional Student-Assessment Strategies: Revolution or Evolution?” Change 31:53.
Nichols, James. 1995. A Practitioner’s Handbook for Institutional Effectiveness and Student Outcomes Assessment Implementation. 3d ed. New York: Agathon Press.
Nilson, Linda B. 1998. Teaching at Its Best: A Research-Based Resource for College Teachers. Bolton, MA: Anker.
Palomba, Catherine A. and Trudy W. Banta. 1999. Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education. San Francisco, CA: Jossey-Bass.
Posner, George J. 1995. Analyzing the Curriculum. 2d ed. New York: McGraw-Hill.
Powers, Charles H. 2000. “Evolving a Developmental Curriculum in Sociology: The Santa Clara Experience.” Teaching Sociology 28:41-49.
Program Assessment Consultation Team (PACT). 2000. PACT Outcomes Assessment Handbook. Bakersfield, CA: California State University at Bakersfield.
Rhoades, Lawrence. 1980. “The Undergraduate Sociology Curriculum: A Proposal.” The American Sociologist 15:21-29.
Schechter, Ephraim, Alec Testa, and Douglas Eder. 2000. “Assessment and Accreditation.” Assessment Update 12:12-13.
Schilling, Karen M. and Karl Schilling. 1998. Proclaiming and Sustaining Excellence: Assessment as a Faculty Role. ASHE Higher Education Report. Washington, DC: The George Washington University Graduate School of Education and Human Development.
Sharkey, Stephen R. 1990. “An Approach to Organizing the Undergraduate Social Science Major Around Learning Outcomes.” Teaching Sociology 18:472-81.
Sharkey, Stephen and William S. Johnson, eds. 1992. Assessing Undergraduate Learning in Sociology. Washington, DC: American Sociological Association Teaching Resources Center.
Southern Association of Colleges and Schools. 1998. Criteria for Accreditation. Decatur, GA: SACS.
Steadman, Mimi. 1998. “Using Classroom Assessment to Change Both Teaching and Learning.” New Directions for Teaching and Learning 75:23-35.
Stephan, G. Edward and Douglas S. Massey. 1982. “The Undergraduate Curriculum in Sociology: An Immodest Proposal.” Teaching Sociology 9:423-34.
Tebo-Messina, Margaret and Chris Van Aller. 1998. “Classroom Research and Program Accountability: A Match Made in Heaven?” New Directions in Teaching and Learning 75:87-99.
Terenzini, Patrick T. 1989. “Assessment With Open Eyes: Pitfalls in Studying Student Outcomes.” Journal of Higher Education 60:644-64.
Thompson, Martha E. 1992. “How Do I Know if Students Have Learned Anything About Research Methods?” Pp. 160-64 in Assessing Undergraduate Learning in Sociology, edited by Stephen Sharkey and William S. Johnson. Washington, DC: American Sociological Association Teaching Resources Center.
Vaughan, Charlotte. 1980. “Identifying Course Goals: Domains and Levels of Learning.” Teaching Sociology 7:265-79.
Vera, Hernan. 1992. “To Create New Meanings: A Method to Assess Students’ Reading.” Pp. 170-74 in Assessing Undergraduate Learning in Sociology, edited by Stephen Sharkey and William S. Johnson. Washington, DC: American Sociological Association Teaching Resources Center.
Wagenaar, Theodore. 1991. “Goals for the Discipline?” Teaching Sociology 19:92-95.
Watts, W. David and Ann M. Ellis. 1989. “Assessing Sociology Educational Outcomes: Occupational Status and Mobility of Graduates.” Teaching Sociology 17:297-306.
Weick, Karl. 1976. “Educational Institutions as Loosely Coupled Systems.” Administrative Science Quarterly 21:1-19.
Weimer, Maryellen. 1990. Improving College Teaching: Strategies for Developing Effectiveness. San Francisco, CA: Jossey-Bass.
Wellman, Jane V. 2001. “Assessing State Accountability Systems.” Change 33:46-52.
Wright, Barbara D. 2000. “Assessing Student Learning.” Pp. 209-304 in Learning From Change: Landmarks in Teaching and Learning in Higher Education from Change Magazine 1969-1999. Sterling, VA: Stylus Publishing.
Greg Weiss is professor of sociology at Roanoke College. His research interests center around the sociology curriculum and assessment, end-of-life decision-making, and health protective behaviors. He is coauthor with Lynne Lonnquist of The Sociology of Health, Healing, and Illness.
Janet Cosbey is associate professor of sociology at
Eastern Illinois University. Her research interests focus
on gender, family, gerontology, and teaching styles and
techniques.

Shelly Habel is visiting assistant professor of sociology at Georgetown University. Her research interests focus on teaching and learning with Web-based computer course delivery systems and assessment of student learning.
Chad Hanson is a member of the social science faculty at Casper College. His research interests focus on issues in higher education, specifically teaching and learning. He has recently published in The Teaching Professor, The National Teaching and Learning Forum, and College Teaching.

Carolee Larsen is assistant professor of sociology at Millsaps College. Her research interests include assessment of student learning, the impact of technology on society, and welfare-to-work issues.

</text>
                </elementText>
              </elementTextContainer>
            </element>
          </elementContainer>
        </elementSet>
      </elementSetContainer>
    </file>
  </fileContainer>
  <itemType itemTypeId="1">
    <name>Text</name>
    <description>A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.</description>
    <elementContainer>
      <element elementId="7">
        <name>Original Format</name>
        <description>The type of object, such as painting, sculpture, paper, photo, and additional data</description>
        <elementTextContainer>
          <elementText elementTextId="96807">
            <text>Print Journal</text>
          </elementText>
        </elementTextContainer>
      </element>
    </elementContainer>
  </itemType>
  <elementSetContainer>
    <elementSet elementSetId="1">
      <name>Dublin Core</name>
      <description>The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information see, http://dublincore.org/documents/dces/.</description>
      <elementContainer>
        <element elementId="50">
          <name>Title</name>
          <description>A name given to the resource</description>
          <elementTextContainer>
            <elementText elementTextId="96797">
              <text>Improving The Assessment of Student Learning: Advancing a Research Agenda in Sociology</text>
            </elementText>
          </elementTextContainer>
        </element>
        <element elementId="47">
          <name>Rights</name>
          <description>Information about rights held in and over the resource</description>
          <elementTextContainer>
            <elementText elementTextId="96798">
<text>&lt;a href="http://rightsstatements.org/vocab/InC/1.0/"&gt;http://rightsstatements.org/vocab/InC/1.0/&lt;/a&gt;</text>
            </elementText>
          </elementTextContainer>
        </element>
        <element elementId="56">
          <name>Date Created</name>
          <description>Date of creation of the resource.</description>
          <elementTextContainer>
            <elementText elementTextId="96799">
              <text>2002-01</text>
            </elementText>
          </elementTextContainer>
        </element>
        <element elementId="51">
          <name>Type</name>
          <description>The nature or genre of the resource</description>
          <elementTextContainer>
            <elementText elementTextId="96800">
              <text>Text</text>
            </elementText>
          </elementTextContainer>
        </element>
        <element elementId="39">
          <name>Creator</name>
          <description>An entity primarily responsible for making the resource</description>
          <elementTextContainer>
            <elementText elementTextId="96801">
              <text>Chad Hanson</text>
            </elementText>
          </elementTextContainer>
        </element>
        <element elementId="44">
          <name>Language</name>
          <description>A language of the resource</description>
          <elementTextContainer>
            <elementText elementTextId="96802">
              <text>ENG</text>
            </elementText>
          </elementTextContainer>
        </element>
        <element elementId="70">
          <name>Is Part Of</name>
          <description>A related resource in which the described resource is physically or logically included.</description>
          <elementTextContainer>
            <elementText elementTextId="96803">
              <text>Chad Hanson Journal Publications, CCA 04.ii.e.2025.01 WyCaC US. Casper College Archives and Special Collections.</text>
            </elementText>
          </elementTextContainer>
        </element>
        <element elementId="43">
          <name>Identifier</name>
          <description>An unambiguous reference to the resource within a given context</description>
          <elementTextContainer>
            <elementText elementTextId="96804">
              <text>CCA 04.ii.e.2025.01_ChadHansonPapers_04</text>
            </elementText>
          </elementTextContainer>
        </element>
        <element elementId="42">
          <name>Format</name>
          <description>The file format, physical medium, or dimensions of the resource</description>
          <elementTextContainer>
            <elementText elementTextId="96805">
              <text>Searchable PDF</text>
            </elementText>
          </elementTextContainer>
        </element>
        <element elementId="45">
          <name>Publisher</name>
          <description>An entity responsible for making the resource available</description>
          <elementTextContainer>
            <elementText elementTextId="96806">
              <text>&lt;em&gt;Teaching Sociology&lt;/em&gt; is published by the American Sociological Association</text>
            </elementText>
          </elementTextContainer>
        </element>
      </elementContainer>
    </elementSet>
  </elementSetContainer>
</item>
