Finding high-quality, student-created projects online is difficult. Just recently, our Core Learning team spent hours scouring the hinterlands of the web in search of exemplar products, in a variety of media, that would showcase what a quality product should look like in terms of significant content, student voice, and revision and reflection: three of the eight essential elements of a PBL project.
The products we narrowed down did not all meet the three criteria, and some were included to serve as contrasts to the higher-quality products. But the question still remained: What specific elements do high-quality project-based learning products contain? And how do we communicate and discuss these elements with our students and colleagues, in order to ensure that the final products meet certain quality standards?
A Google search for rubrics will lead you down an endless path of very specific, very thoughtfully created matrices and documents that serve one or two purposes very well. You can find rubrics to grade slide presentations, iMovies, digital storytelling projects, Prezis, and more. But there was nothing we could apply to all media products, regardless of platform. So we decided to roll our own. The typical rows-and-columns rubric did not work for our purposes. We needed an adaptable checklist document, one that would be specific enough in a variety of areas, but not cumbersome to the point of annoyance.
Thus, our Digital Project Evaluation Checklist was born. We broke down the elements that make a project great, and simply ask the user to check off whether each attribute is present or not. The checklist also includes columns for more detailed explanations as to why the work may not have met the criteria, and a follow-up column to ensure accountability. Is a Driving Question missing from the slide presentation? Well, then who is responsible for including it in the next draft, and by when?
The checklist tool includes the areas of design, organization, voice, content, and audience, which can be useful when looking at most media projects, from Scratch animations to photo-essays to poster-board triptychs.
What makes a project look good? While this may typically be seen as a subjective measure, the reality is that there are actual design elements that, when included, work together to create the appearance of "high quality." For example, are the fonts in the project consistent? Is attention given to the choice of color palette? How does the layout of visual elements on the page interact with the viewer? All of these attributes should be explicitly discussed with students before a PBL project is launched, and reviewed along the way, to ensure that students maintain consistency in their design choices, which in turn results in a more visually appealing product.
Whether one is talking about a video or a podcast, the way the information and content are organized goes a long way in making that content more appealing and effective. Poor organization almost always detracts from the content and can be confusing to the audience. But what specific elements should we be looking for? Titles and headings should match the content beneath them, for example. Storyboarding the project often helps in structuring the final product, and helps students decide who will be responsible for which components, thereby avoiding the uneven quality of some of the poorer exemplars many of us are used to seeing.
When thinking about student-centered projects, voice is an essential element that needs to be defined. What evidence can we find of authentic student voices? It is often hard to pinpoint whether a project topic or focus was born out of a student's own interests and desires, or whether they had a choice in the medium or mode of expression. Often, these attributes become more apparent in projects that include an element of reflection in the final product: "I chose this topic because…" or "This was meaningful to me because of XYZ." Including a segment, or even a caption, about why and how the idea came to be, and how it was constructed, shows that the student was at the center of the creative process, and thus more engaged in its outcome.
Content is key. Without significant content, it is often hard to see that authentic and rigorous learning took place. With the many digital tools available, it is easy to hide behind a well-produced template or theme. What, then, do we mean by significant content? The product should show evidence of research. And the content should, of course, be authentically written in the students' own words, with appropriate grammar and spelling conventions. In the case of a video production, the content should show evidence of a script that includes these attributes. Tying the content to standards also shows that the student(s) understand what they were expected to learn. Students need not include alphanumeric codes that mean nothing to them, but they should be able to articulate what they are learning and why it is appropriate.
The last element we considered was audience. Audience response is often how we ultimately determine whether any content we encounter meets a certain degree of quality. Do we want to finish watching it? Are we, the viewers, compelled to view more? If the product is informative, creative, and original enough, then we are more likely to be engaged by it. If most of the previous attributes are included in a project, then this last element has a much greater chance of hitting all of its marks as well. Having the audience in mind from the very conception of a project also helps students rise to the occasion and create higher-quality products. We hustle when we know our peers and our community, and yes, even the world at large, will take a discerning peek at our work.
Final thoughts on the Digital Project Evaluation Checklist
Teachers who had a chance to evaluate different projects, from kindergarten to high school, have found this tool useful, not only for analyzing student work, but also for reflecting on their own expectations of what quality products can and should contain. While the tool could be shared with upper-elementary and secondary students to fill out on their own, younger students will engage with it more effectively as part of a one-on-one teacher conference, where teacher and student together can check off the list and comment on what needs to be "fixed" or improved, and how. Showing students exemplars of finished projects, such as those included in this post, might also help them make connections between the different attributes and what they actually look and feel like.