2017 Summer Institute Report

Summer Assessment Institute
June 26 – June 30, 2017
Green River Community College

Institute Summary/Description

During the week of June 26-30, seventeen full-time and adjunct instructors from English, Transitional Studies, Biology, Communication Studies, and Occupational Therapy Assistant gathered to participate in this year’s Summer Assessment Institute. The Institute is sponsored by the Learning Outcomes Committee and was facilitated this year by Kelsey Denton and Julie Moore. It had four parts. First, participants presented research about assessment practices and philosophy in their disciplines. Second, participants aligned Campus-wide, Program, and Course Outcomes, making sure there was a clear connection across all of these levels and into their courses. Third, participants created or revised an assignment, paying attention to clear outcomes language and assignment design. Finally, participants created or revised a rubric that aligned with the assignment. Throughout the Institute, participants learned about outcomes assessment and the Campus-wide Outcomes at Green River College.

For the first part of the Institute, participants researched and then presented to the group answers to the following questions:

  1. What are the attitudes toward assessment in your discipline?
  2. How do people in your field practice assessment?
  3. What areas still need to be explored about assessment? What are the gaps in the research?

This was a time for participants to learn from each other what assessment looks like as it crosses different disciplines. This was paired with a lecture from the Institute facilitators about Green River College’s framework for assessment across programs at the college. By the end of this first part of the Institute, participants emerged with a big picture of what assessment practices include and how that can be adapted for various disciplines and institutions.

The second part of the Institute asked participants to align outcomes from the Campus-wide level to the Program level to the Course level. Participants learned how and why to do this and then created an alignment map that included all of these levels of outcomes. This work established an important foundation upon which participants then designed an assignment and rubric of their choice, as it helped them connect their specific assignment to larger outcomes in the CAR, their program, and the overall campus.

The third part of the Institute asked participants to create or revise a specific assignment for one of their classes. Kelsey and Julie reviewed with the participants principles of assignment design and then participants used their outcomes maps to identify specific outcomes they wanted to target with the assignment. Participants got feedback on their assignments and revised them according to the criteria for strong assignment design.

The final part of the Institute focused on rubrics, guiding participants through the process of writing one to match their assignment. Participants used elements of strong rubric construction, an activity they completed on writing strong outcomes, and student work they brought with them to design their rubric. They then got feedback from their peers and the Institute facilitators on their rubric. The final test was to bring the rubric and assignment together to make sure these were fully aligned to each other and to the larger course/program/campus outcomes.

The Summer Assessment Institute is a key component of Green River Community College’s Learning Assessment Plan as it trains faculty in the full assessment process and guides them through implementing this process for their course outcomes, program outcomes, and/or campus-wide outcomes.

Closing the Loop: What We Learned

Part I: What We Learned Through Studying Assessment Across the Disciplines

Following the first part of the Institute, in which participants shared research about assessment practices in their disciplines, each participant was asked to reflect on what he/she learned through the process and how it might impact his/her teaching practice or materials. The following participant reflections come from this first section of the Institute; participants used the question below to guide their responses. Themes that emerged included the tension between freedom to design courses and the standardized practices that govern classroom content and pedagogy; the difficulty of quantifying deep and nuanced student learning, especially critical thinking; and the confirmation that assessment challenges cross disciplines.

  • How did researching assessment scholarship in your field change or solidify your own thoughts on assessment theory/practice? Did it make you rethink any larger design or alignment issues within your course?
    • “Because writing and communication are deeply personal, idiosyncratic, contextual, and complex, they are hard to teach in a standardized way.”
    • “If anything, the research I conducted reaffirmed my beliefs on assessment. I was surprised to hear that the struggle between standardization and instructor's free license was such a commonality among all the programs at GRC.”
    • “I was pretty amazed at how many gaps I realized there were in the field. Not only at wider level (i.e. how I wish there was more emphasis on technology/social media in the field) but also at a more detailed level. This examination made me think about all the different rubrics I see as a coach in the PSC, and via conversations with colleagues. I often get committed to my own rubrics, just as others do. I am puzzled now at how there can be consistency that works for all.”
    • “It was so interesting to see that despite belonging to different fields, we face similar issues in the construction and implementation of assessment. The belief in improving and constantly evaluating my assessment practices has been solidified. Also, I still believe in aligning, as much as possible, my assessment practices with others in my field. The students, I believe, would welcome this consistency. As we discussed, the trick is to balance academic freedom and consistency. Lastly, I learned through the research that assessment can be messy and imperfect, and that's OK!”
    • “Researching the literature on assessment confirmed how challenging it can be to develop specific assessments and rubrics that truly measure some of the most important outcomes.”
    • “This was basically a new concept for me. I have been aware of assessments at the k-12 level, but was quite surprised that it is consuming a lot of time and money at the college level as well.”
    • “I really enjoyed doing the research because it helped me to better understand how vast and complicated assessment is, not just for English, but for all disciplines.”
    • “I found that assessment is highly prized in our field, from the National Communication Association as well as through the textbooks and instructor's manuals. Especially in the instructor's manuals, each of the chapters have learning objectives, but I see a lacking when the manuals then propose assignments. There's a lack of alignment between them.”
    • “One of the things that was really emphasized yesterday with everyone's presentations is that the term “assessment” encompasses so many practices, some of which don't overlap. However, it was instructive to discover that so many assessment practices do correlate and feed into one another. It’s helpful to think about each assessment fitting into a broader and broader scaffolding – the way someone described assessment yesterday - that becomes broader and broader the further the assessment criteria move from the assignment/course level. Fitting each classroom assessment into outcomes for the course, the program, and the college seems difficult to me, but our discussions yesterday made me think it’s possible to interpret all three levels of assessment as ultimately—or at least, ideally!—having similar goals.”
    • “Researching assessment scholarship solidified my own thoughts on assessment theory and practice, but it also brought to my attention some new ideas I hadn't considered, especially due to the subjectivity issues that arise in grading essays. Any time I consider assessment it causes me to think more closely about my lesson design and whether or not I'm truly measuring what I intended to measure, as well as whether or not my assessments align with the various objectives (course, department, campus-wide).”

Part II: What We Learned Through the Process of Creating/Revising an Assignment

Following the part of the Institute in which participants used an assignment design framework and the outcomes on the CAR to revise an assignment, each participant was asked to reflect on what he/she learned through the process and how the process might impact his/her teaching in the years to come. The following are participant reflections that address how this part of the Institute impacted participant teaching and learning. Participants used the questions below as guides for their responses. Many of these responses discuss the value of outcomes mapping. This activity, done before they revised/created their assignments, helped them gain clarity on what they were asking students to accomplish in the assignment as well as its place and purpose in the course overall.

  • What did you learn about the quality and alignment of your assignments today?
    • “Specificity is an important component to the development of alignments in the course. In order to view the course assignments from a holistic view it is nice to be able to break down the assignment and align it with course and college and department outcomes and also identify the learning outcome that is used in the assignment. It is also good to check the wording so it remains cohesive and makes sense in simple language to the student. I also feel as though communicating the reasoning makes the student more comfortable with the perceived learning intention.”
    • “This new version [of the assignment] moves the outcomes from where they were--legalese on the syllabus--to where they should be--active language discussed in the class. It will also require a bit more work from me as we move forward, and (as Kelsey had mentioned earlier) a chance to really assess the success of my own class in meeting these outcomes.”
    • “I have learned about how I really didn't include a lot of outcomes in one of the most important assignments I have of the quarter. This was pretty eye opening for me!”
    • “Another discovery I've made is that while I might cognitively have the alignment in my mind, this does not always translate clearly for students or through to the assignments I've crafted for students. I have to continually remind myself to be clear on the purpose of each activity/assignment.”
    • “It is important to be specific, measurable, and that all the language is consistent.”
    • “I learned that when I take a step back and read my assignments looking for quality and alignment, that it isn't nearly as clear and concise as I initially thought.”
    • “I wrote a new assignment today and I found the process so much easier because I knew going in what I wanted to actually assess (thanks outcome mapping).”
    • “I learned that moving from campus-wide learning outcomes to the class goals means that the assignment not only aligns but also sort of maps itself. Also, once the goals align in the assignment, it seemed that the tasks became much clearer, so it should be clearer for the students as well.”
    • “That there are great gaps in the formality of my assignments. While the assignments themselves are relatively in line with outcomes, the specifics of exactly how they align has been almost completely internal (ie: in my head) until now.”
    • “I learned that my original prompt was a hot mess! It's one that I've used for many years now, but that I've always adapted and changed and I realize, after my peer review, that I never bothered to go back after all these changes to make it cohesive. So, thank heavens for the opportunity to do that!”
    • “I found that there is a fine balance between providing too much information and not enough.”

  • What strategies might you use in the future to ensure and maintain alignment between assignment/syllabus/CAR in your courses?
    • “Always match assignments with the outcomes to ensure you are meeting the guidelines in order to convey the importance of the assignment to meeting the goals of the course.”
    • “One thing I've learned here: learning outcomes can be their own teaching tools.”
    • “I'd like to continue to go through the process of printing out the CAR for each big assignment and incorporate more language into my prompts. I think I also need to revisit my syllabus and include more outcome language in the description of the assignments that I do include.”
    • “Maintaining that alignment between all of the pieces will definitely improve the way I word my assignments, especially with stating the purpose and how it fits in to the outcomes set by the campus and the program.”
    • “Whereas before I found the outcomes very overwhelming to work with and difficult to wrestle into practical applications, now I can think of them linearly and connect them to real assignments. I'm also able to articulate the relationship between the outcomes and the assignments to my students because I'm thinking of each assignment as a measurable part of a measurable whole.”
    • “I am going to check on all of my assignments in my current courses to see if they align with the course objectives. I plan to do more outcome mapping as well.”
    • “I will consult the CAR constantly and create assessments based on outcome maps. I plan to use concrete, specific, quantifiable language in my assignments, in class, and in my rubrics, so that even when the activity is abstract, such as analyzing a piece of literature, there will be a quantifiable measure to guide the students and keep their instructor on track.”
    • “Map out campus-wide outcomes, program, and assignment outcomes for alignment and to determine what types of assignment are really needed. This will avoid having students spend their limited time completing extraneous assignments.”
    • “Use an “assignment template” as a working tool to be sure that I include all the assignment components so I won’t forget to put in the time frames, etc.”
    • “Use an “assignment rubric” to self-assess and have someone within and outside my field review critical assignments to see if everything is included and is clear.”
    • “Complete that outcome map, for one!”
    • “I realize that I need to go back and really do a more thorough outcomes map for each course I teach to ensure everything I teach aligns with the outcomes.”
    • “As we discussed in the session, rather than just "dragging forward" an assignment, it's a good idea to reevaluate the purpose and objectives of the assignment and that it's meeting the learning outcomes.”

Part III: What We Learned Through the Process of Creating/Revising a Rubric

Following the third part of the Institute, which focused on rubric creation, participants reflected on what they learned. The following reflections address how this part of the Institute impacted participant teaching and learning; participants used the questions below as guides for their responses. Participants were especially motivated by the distinction between analytic and holistic rubrics, and many seized the opportunity to mindfully select one or the other as a grading tool. This included exploring how different types of rubrics might work within Canvas. As in years past, participants also commented on how powerful it is to have peers review a rubric and to take the time to review and revise rubric language.

  • What did you learn that was new and/or helped to deepen your understanding about how to write rubrics?
    • “I learned there are two different kinds of rubrics, which really helped clarify my own style, which is more holistic.”
    • “Writing them is difficult as I realize how invested I am in "gut" grading.”
    • “I have learned to be a lot more descriptive in my rubrics. I was telling Kelsey "I can't believe I used this rubric last quarter!" Working on this particular rubric, I'm pretty happy with the new progress I'm making, while also being kind to myself in understanding that probably when I visit this rubric again in the future, I will have the same thoughts. My hope is that with this more detailed rubric, I can save time on the comments I write within the paper.”
    • “I have never used a holistic rubric in my own classes, so that is what I made today! I actually started with an analytical rubric and adapted it to a holistic rubric. The process really made clear the similarities and differences between them.”
    • “I was reminded of the two different types of rubrics…I was also reminded of the importance of using precise language consistently in each rubric, regardless of type. I love working with rubrics, although they are time-consuming to craft.”
    • “I learned the difference between a holistic and an analytic rubric. I realized that I have been using both types, but that the analytic rubrics I have used need to be more fully fleshed out and clearly defined.”
    • “It has been great getting feedback on our assignment and rubric as there never seems to be enough time to ask or get feedback from others.”
    • “Not having had much experience with rubrics, learning both how to effectively tie them to outcomes, and how to make them accessible to students has been really helpful.”
    • “It's clearer now how to build a fully-developed analytic rubric in Canvas.”

  • Which part of the rubric development and revision process is most challenging for you and why?
    • “I find it challenging to get the verbiage correct, simple, and concise enough to both link clearly back to the outcomes, and to still be legible to students.”
    • “Giving a point value to criteria that are not easily quantifiable is challenging for me.”
    • “For me, finding that balance between ease of use and precision in my rubrics is very difficult to achieve. To elaborate, I am always trying to add components that will focus and pinpoint growth areas that students in my writing class need to work on. Sometimes these components need to be combined and other times they need to be split up…It was interesting that my peer whom I worked with took an opposite approach, creating a rubric with many components (27!). When assessing the assignments, I was surprised how easy it was to use and how accurate it was. This made me rethink my approach: Maybe I need more components with more detail. Ironically, my peer, after using mine, thought that hers need to be more streamlined! Go figure. It just goes to show how challenging striking this balance can be.”
    • “Developing a rubric is…a time-consuming activity …done in this independent setting where we don't have the opportunity to receive feedback. So as instructors we continue to perpetuate the same mistakes over and over because of a direct lack of feedback from colleagues who could provide valuable insight into this process; from assignment to rubric and statistical analysis at the end.”
    • “I have trouble matching my rubric language to my assignment. I have lots of great ideas for my assignment, and I know exactly what I'm looking for in the rubric--now I need to "marry" these two together.”
    • “The hardest part of rubric development is describing different levels of mastery, especially describing abstract concepts (how to describe "vagueness"?). I also find it challenging to differentiate related ideas into discrete categories on an analytic rubric.”
    • “I have a very analytic rubric for my speech class, and that has greatly influenced my past development of rubrics. Not everything needs to be analytic, especially when the rubric is helping students with developing skills -- in a formative way -- rather than a more comprehensive summative manner at the end of the course.”
    • “My challenge is wording the rubric so students can use it as a guide and produce quality work that might go beyond what is a "Meets Expectation" on the rubric. I want them to use the rubric but don't want it to limit their work.”

Overall Reflections from Entire Institute

At the end of the Institute, participants were asked to reflect on how the week’s activities helped them to better understand outcomes assessment, alignment of outcomes across course materials, and their role in assessment at the larger institution of GRC. The following are their responses to those questions.

  • What did you learn about designing and aligning courses, assignments, and rubrics? How will this impact your design choices in the future?
    • “[I] learned the importance of vying for consistency and uniformity in language--institutional outcomes down to the assignment objectives themselves.”
    • “As a Canvas "power user", I also picked up some great tips for baking in and using rubrics in [Canvas].”
    • “I was not even aware of the CAR's, so this was an "eye-opener" for me on what I need to align. This institute gave me the opportunity to now align my course with the outcomes.”
    • “I learned about the importance of language alignment. It not only makes it easier to create assignments, but also it shows students (and instructors) the relevance and connection of class tasks and outcomes.”
    • “Holistic rubrics were a new concept to me. I love them! I think they fit my teaching style and many of my students’ approach as well.”
    • “I didn't realize how much language mattered! To be honest, I was not even paying attention to outcomes when writing my assignments. I have learned now to check the outcomes.”
    • “I learned the value of aligning vocabulary. It seems so simple, but when evaluating many of the assignments I noticed inconsistency in this way. And it is such a simple fix!”
    • “Being open and clear to students about what the outcomes are for the course and class and including these outcomes on assignments and rubrics makes so much sense! I've already started to revise assignments so that the relationship of assignment to assignment (and the relationship of assignments and rubrics to the class itself) are clear and stated.”
    • “I'm already taking a much more critical approach to how I design lessons, projects, and indeed just about everything in my classes. I've really taken to treating the outcomes as "this is what you're paying for."”
    • “I think it will change my entire outlook on developing assignments in the future. It actually makes using assignments much easier when the course outcomes provide a foundation for building the assignment as well as assessing the assignment with specific, concrete criteria that benefits me as an instructor and clarity for the students.”

  • How do you plan to respond to the findings from this week’s activities, and how do you think this will improve teaching and learning in your classes and/or overall program?
    • “Beginning with the end in mind, will improve the outcome of my courses. I think having high expectations of where students should be at the end of the course will drive quality instruction throughout the course.”
    • “[This week’s work] has inspired me to look at another summative assessment for my reading class, where I have been consistently disappointed with student performance. In reflecting on the assignment, it is now abundantly clear that I am not communicating my expectations for the final product clearly, which is essentially setting up my students for failure.“
    • “I think writing rubrics and teaching the specific markers that match up with each item on the rubric will help me improve transparency in my classes.”
    • “The construction of more effective rubrics will make the entire process of teaching, learning, and grading more transparent to the students. It will also make me feel as though I’m meeting the college and program requirements. This is very empowering.”
    • “Being intentional about stating the assignment's relationship to the outcomes has changed my thought-process in organizing and planning the classes, i.e. I'm much more outcome-oriented than previously.”
    • “I think I had blinders on when creating rubrics such that they (1) had to be analytic, and (2) they had to equal "x" number of points. For some larger, more complex assignments, a holistic rubric with criteria is more appropriate.”

  • What challenges do you anticipate as you implement any of the identified class and/or program improvements?
    • “The challenge will be to put in the time needed to make changes that are needed right away when something isn’t working as well as it should.”
    • “One big challenge, as an adjunct, is the inconsistency in classes. I don't know from quarter to quarter what I'll be teaching (or even where) and the outcomes for classes and courses are not consistent across community colleges. This process will be continuous, but it is time consuming and it is probable that I can do all of the work and never re-use an assignment, rubric, or outcome map.”
    • “One [challenge] is simply finding the time to do this, and finding colleagues who'd be willing to and have the time to edit my revisions.”
    • “Another challenge is keeping focused as I design assignments and corresponding rubrics - it is often easy to lose that focus and/or find myself in the situation of being rushed.”
    • “I think more than anything, it's just going to be time constraints. I need to find and make the time to really create in-depth analytic rubrics for all my assignments.”
    • “I think clearly explaining the differences between A level work and B level work etc. is the most difficult part.”
    • “I think students might have grade complaints because they see one element on a rubric as an absolute, even when it's only part of a larger competency. Like, for example, if they see that an “A” range in one criterion required five sources, along with other factors, they might hang on too tightly to the one absolute. This isn’t really anything new, but something to stress to my classes: that I will be interpreting all criteria; they're not simply boxes to check off.”
    • “I think going back and aligning language between the CAR, syllabus, assignments, AND rubric. I want to continue to map this out. I have a feeling I will find some beloved assignments that don't line up.”
    • “I find that my less successful rubrics tend to be top-heavy and confusing or too vague. This session definitely helped me come up with strategies to help me here (again, that conciseness with language was really helpful), and I'll be applying what I've learned to future rubric creation sessions.”

  • What did you gain this week that you're likely to share with others in your division or program?
    • “Having a retreat to discuss the threads in the course would be helpful but that is not always practical as it is difficult to get all adjunct instructors together at the same time. Maybe having a document that can be filled out by instructors can start to provide an awareness that alignment is needed and that course outcomes must match the CAR, etc.”
    • “[M]any of our course outcomes are not written with clear, measurable action verbs. I think it's time to go through the CARs and update the language and wording. I think it would also be good to share some of the principles with our adjuncts, as well as keep assessment as a regular agenda item on our department meetings.”
    • “I already shared the link to rubrics that are available for us on T and L site on the GRC webpage and have shown others how this site, as well as department resources are available in order to assist us all in having more clear goals and assessment work”
    • “I will share the importance of alignment of the course with the outcomes and the importance of action verbs in the learning objectives.”
    • “Alignment of the course objectives and assignments should be consistent!”
    • “Honestly, the fact I can make rubrics for small-scale assignments and integrate them onto Canvas was a major revelation.”
    • “Peer review can be very beneficial in this process.”
    • “The Canvas stuff was really useful. So many people use Canvas here, yet many of them have never received the proper training to utilize its useful assessment tools and features. I will definitely want to spread the word about these!”
    • “The idea of alignment is really significant and I’ll likely share it with others. I found the part about using consistent terminology to be the most useful and will probably tell other people about how much that will help us reinforce the learning outcomes to students.”

What the Facilitators Learned

The following are some insights from the facilitators, Kelsey Denton and Julie Moore, and reflect important points and realizations they saw while facilitating the Summer Institute. Overall, the Institute continues to be a valuable place for all faculty to get professional development on outcomes, assessment, and course materials design. This is especially true for adjunct faculty and probationary faculty.

  • The need for more professional development like this on campus: Over the past 4-6 years, professional development opportunities for faculty that focus on classroom practice, materials, and assessment have eroded on our campus. The Summer Institute remains a place where this work continues, and we are thankful for that! However, participants noted the need for more training on syllabus design, critical thinking, and outcomes writing. Such offerings existed in the past but have slowly disappeared with the termination of the TLC, the Syllabus Workshop, and other trainings supported by Faculty In-Service monies. Feedback from Summer Institute participants is growing evidence of the need to revive and expand professional development opportunities for faculty at GRC.
  • Importance of Summer Institute on campus assessment: The Summer Institute continues to be a place that helps to develop and drive forward other larger assessment efforts on campus at the program and campus levels. For example, one participant used this year’s Summer Institute to further hone and design new elements of program assessment in English. It also provided Julie with additional ideas for how to develop and streamline program and campus assessment for next year and beyond, especially using Canvas as a means of gathering assessment data across campus.
  • Impact on Adjunct Faculty: The value to adjunct faculty is extremely evident; adjunct faculty continue to be the primary participants of the SI. Besides the opportunity to work on courses they might teach for GRC, adjuncts get to connect with full-time and adjunct faculty members both within and outside their divisions. They get a rare chance to check and compare their ideas and teaching materials with other faculty rather than working in a vacuum. They also gain clarity on key elements such as course outcomes, which we realized remained unclear for some attending this year.
  • Skill Development: Many participants noted that the Institute was an opportunity to develop both technological and pedagogical skills that they will continue to use in their classrooms. For many, the Institute is their first exposure to outcomes mapping, Canvas rubrics, and other tools that allow faculty to assess, align, and carry out outcomes assessment at the campus, program, and course levels. In future years it will be important to continue bringing new tools and approaches to participants.
  • Networking: The Institute is a formal space for faculty to network across divisions and discuss different approaches to teaching, learning, and assessment. For many participants, it is their only opportunity to meet and get to know faculty outside their discipline. In the future, offering workshops like this throughout the year as part of ongoing professional development would further support and foster campus community.