Welcome, everyone. I'm Karen Fasimpaur, part of the team facilitating DL MOOC. It is March 3rd, and we are now in week 7 of DL MOOC, and this week we're looking at assessment for deeper learning. We're happy to have you here with us, whether you're watching live or watching the archive.

This week, DL MOOC got some great press from KQED's MindShift blog, which explores the future of learning in all of its dimensions. MindShift is a great column to read if you don't already follow it. In their recent article, "Beyond Knowing Facts: How Do We Get to a Deeper Level of Learning?", writer Katrina Schwartz acknowledges that while deeper learning isn't necessarily new, there is now a movement to help codify and spread the practices of deeper learning to teachers and learners everywhere. DL MOOC is a part of that movement, and we appreciate everyone in our community helping build this energy and momentum.

Before we get started this evening, I also want to highlight a few of our participants who have recently been awarded deeper learning badges. As evidence of her deeper learning, Celeste Kirsch wrote a blog post about how academic mindsets were brought to life in her recent cross-country skiing trip with her students. Bart Miller made a video about rites of passage and his students' exhibitions, and said in his reflection that the deepest learning occurs in the process of creation; we agree wholeheartedly with that statement. If you'd like to see some of this work or apply for your own deeper learning badge, please visit the DL MOOC website at dlmooc.net for information. It's really easy to apply for a badge if you participated in DL MOOC and experienced deeper learning. All you have to do is supply a link as evidence: it could be a blog post, links to your comments in the Google+ community, Twitter, or whatever you'd like to share to show that you've been a deeper learner.
If you are watching this program on Google+, remember that you can use the Q&A feature to pose questions to our panelists, and you can also tweet out your questions or comments with the hashtag #dlmooc. And with that, I'm going to turn it over to Rob Riordan to get us started.

Thank you, Karen, and welcome, everyone, to this seventh session of the DL MOOC. I just want to remind folks that we are operating around deeper learning, the elements of which are articulated by the Hewlett Foundation as content mastery, critical thinking, problem-solving, collaboration, effective communication, self-directed learning, and academic mindsets. The last two are qualities that the Raikes Foundation identifies as agency. Now, that's deeper learning, and that is a lot to assess. So we're going to have a wide-ranging discussion tonight about assessing those elements. We have a panel that spans from the classroom to the international scene, and it's going to be very interesting to make connections across those areas. Let's get right to introductions of the panel, and then we'll get to our first question.

I'm Rob Riordan. I'm the co-founder of High Tech High and the president of the High Tech High Graduate School of Education, where we devote a lot of thought to designing assessment practices that foster student learning and growth, both for our adult learners at the GSE and for students in our K-12 schools, and where we understand that we still have a lot to learn and a long way to go in terms of assessing deeper learning. Next, Peter, let's hear from you.

Hi, my name is Peter Kannam. I work with a national education nonprofit called America Achieves that's committed to raising the bar in U.S. education. In particular, I've been a coordinator of a project in the U.S.
called bringing PISA, the international assessment that's been done at the country and national level, down to the school level. I'm going to be speaking about some of that today.

Great. Thank you, Peter. Megan?

Hi, my name is Megan Pacheco. I'm from the New Tech Network, where I'm the senior director of school design and implementation. We're a network of schools across the country that implement project-based learning as a primary mode of instruction, and over the last couple of years we've been deeply focused on assessment practices and the types of outcomes articulated in the deeper learning outcomes. We've also been looking at how we're assessing the systems piece and what makes a school successful overall.

Great. Megan, we'll be eager to hear about both of those pieces. Katie?

My name is Katie Staff. I'm an eighth grade humanities teacher at High Tech Middle Chula Vista. I feel like my connection with assessment is kind of complicated. I find assessing for deeper learning really challenging, and I feel like I have more questions than answers. But last year I got to live and work in England, and I found that the schools that were trying project-based learning for the first time requested workshops on assessment the most. It's something that we all struggle with as teachers, and talking to our students is a good way to figure out what a meaningful way to assess looks like.

Thanks, Katie. And finally, Bob.

Hi, Bob Lenz from Envision Education. I'm the founder and CEO. We have two divisions: Envision Schools, a network of small, innovative charter high schools here in the Bay Area that are organized around a deeper learning student assessment system, and Envision Learning Partners, a consulting, coaching, and professional development division that works with public schools around the country that are interested in adopting and adapting a deeper learning student assessment system.

Great. Bob, thank you.
And Bob, I'd like to start with you for our lead-off question. Assessment at Envision Schools is an integral part of the learning process, I know, and you've described that process as a cycle of know, do, and reflect. Could you tell us more about that process and how it relates to deeper learning?

Sure. When we think about assessment as know, do, and reflect, I think most schools, and most of our learning, stop at knowing, and we need to broaden that to the doing and the reflecting. So we think of it in a lot of ways like a triangle. The knowing, in the deeper learning outcomes, is the content mastery and the skills, often assessed through fairly traditional assessments. So kids and learners still have the opportunity to be assessed on their knowledge mastery. But we don't think that goes far enough. The next step is demonstrating and applying that knowledge, and that's where we believe performance assessment and project-based learning come into play: public exhibition, creating authentic products. If you want to start to assess collaboration, communication, and critical thinking, you actually have to do it in order to assess it. The third angle of the triangle is the reflection, and we think that is really the magic, the piece that is missing from our traditional assessment systems: giving learners the opportunity to reflect, to metacognize, and to think about what happened. What was the learning, both the knowledge mastery and the skills they learned? Then they can reflect on that, look to the future, and ask how they would do things differently if they were to face a similar project.
It's also a time to look at growth over time: how did you do on this project, or on a similar project, and how are you progressing? So we think of it as a cycle of continually going back through those things. We designed an assessment system with Stanford when we launched Envision Schools in 2003 that includes a set of performance frameworks and rubrics that culminate in a portfolio and a defense. That is what we like to think of as a four-year project: by the end of the four years, you can demonstrate your knowledge, your skills, and your ability to learn to learn through a portfolio and a defense. And that cycle goes on over and over again through the four years.

And for the rest of their lives. Yeah, yeah. Just one quick follow-up. As I look over these elements of deeper learning, I see content mastery, okay; critical thinking, okay; but self-directed learning is one that's hard to get a handle on in terms of assessment. It seems as though your reflection piece really digs into that piece, into self-directed learning.

Yes, Rob. I think we subscribe to this at Envision, and I personally believe, that some of these deeper learning outcomes are really more powerful as an opportunity for self-reflection and evaluation, in conversation with your peers and your adults, in the context of a rubric or a standard that gives you a benchmark to draw from. As the learners grab the language and get in the habit of this reflection, it becomes part of who they are and they can embed it. I think collaboration works the same way. We spend a lot of time trying to figure out lots of tools, observation protocols, and more systems in an already burdened learning environment, when the real power comes in the reflective process, both individually and with peers.
So for self-directed learning, collaboration, really any of the deeper learning outcomes, the reflective process is where the power is. And it puts the onus back on the learner, as opposed to the teacher standing in judgment.

Great. Assessment, in a sense, as dialogue, as opposed to an occasion for judgment.

Right. Back, I don't know, 20 years ago, maybe not that long, when we were working with Project Zero and Steve Seidel, and I was working at Sir Francis Drake High School in San Anselmo, he took us through the collaborative assessment process that he developed. I've always subscribed to this view of assessment: it's standing side by side with the learner, looking out at where the standard or the outcomes are, where you want to go. I think that's where that reflective piece comes in, where the teacher is standing side by side with the learner and they're reflecting together on how to move forward.

Great. I want to return to that notion. Thank you, Bob. Now I want to go over to Peter, and we'll move from the local scale to the international scale. Peter, would you tell us a little bit about the work you're engaged in around assessing for deeper learning?

Sure. When we think about deeper learning, one of the gold standards out there is PISA, the international assessment developed by the Organisation for Economic Co-operation and Development. Every three years an assessment is done, and countries across the globe get to see how they stack up on deeper learning assessment items in math, reading, and science. While that's helpful for understanding where different countries stand against each other, what we realized at America Achieves is that it's not as meaningful as it could be: what does that mean for me, in my individual high school, where I am?
So what we did was work with the OECD to help develop a school-level assessment, called the OECD Test for Schools, so that any high school in this country, and now in other countries, can take the PISA items and see how it stacks up against not only the United States but other countries around the globe. And it's not just about the rankings, yeah? People want to see how they do and where they rank. But what we found from the participants (we have over 300 high schools that have voluntarily signed up for this assessment) is that you can see how you're doing on the critical thinking and problem solving that PISA is known for. There are six proficiency bands, one through six, ranging from simpler problem solving to more complex problem solving. For schools that have taken this assessment, it's a dipstick. It's for 15-year-olds in high school, it does not get tied to the individual student, and it takes a sample of your 15-year-olds, giving you, as a school, a sense of how you're doing on these deeper learning, critical thinking, and problem-solving skills. We've seen school communities really dive in to see how they're doing. So I actually wanted to point to a sample item, just to show the group real quick, to give people a sense of the different levels of PISA. Ryan, could you pull up the... It's up there. Oh, it's on there for the group. Okay, thanks. So this is a PISA level two item. Again, there are six proficiency bands in reading, math, and science. This is a math one, "Helen the Cyclist." As you can see, the United States fares pretty well on this question, and on level two questions overall. About 74% of our kids are able to get this right; this is all OECD information.
But it's fairly straightforward: students can interpret and recognize situations in context, and it's pretty clear how to solve the problem. I also have an example of a level six assessment item that I wanted to show the group. Could you just tell me, Ryan, whether that's up on the screen? It is? Okay, thank you. So, according to the OECD, 2% of United States students are able to complete this one. If you look at this "Helen the Cyclist" question, it's a little more complicated. It takes multiple steps and piecing together multiple pieces of information. It's a multi-step problem, which we need to be teaching our kids how to attack, and which I think in the United States we're not doing as much of. So when I hear Bob talking about project-based learning and how to get to the critical thinking, it's just so important, because I think what we've demonstrated is that we can do pretty well on straightforward problems, but we have to focus more time and energy on the critical thinking, deeper learning questions, because they really matter. When you really think about it, that's what you're going to be doing out there in the real world, in jobs: critically applying knowledge and solving problems. So this assessment that we've been rolling out across the country as a voluntary, no-stakes learning opportunity is something people can really learn from. We're really encouraging school communities just to see how they do, and it's not a judgment. It's a dipstick to see how we're doing on these deeper learning skills, so that faculties can see how to take action to improve.

Great. Thank you, Peter. Now I'm going to move over to Megan. Megan, you're at New Tech.
You've got a systems approach to getting high fidelity to your model and good results, and you've applied that model in a lot of different settings. So what does student assessment look like in the New Tech schools, and how do you assess your development work overall?

Well, I think assessment in our schools looks very similar to what I heard Bob describe. We've been really heavily focused on how to seamlessly integrate assessment into project-based learning so that it feels like a natural part of it, not a separate thing or an endpoint but, as Bob described, an opportunity for reflection, growth, and continued development in all of those deeper learning skills. The deeper learning work has really influenced how we're thinking about assessment, particularly the agency piece, which we see as the glue that binds it all together: helping students, like I said, not think about assessment as just for a grade or as the endpoint of learning, but as a continued path toward developing all of the types of skills that we know they need. And that cycle that Bob talked about with students, we have tried to transfer up to the system level, because that's what we want to see our schools doing as well: engaging in that cycle of inquiry, growth, development, and reflection as a school system. Early on in our school development work, as we grew as a network, what we looked at primarily when we assessed the success of a school or a system was mostly teacher practices. We were looking at things like: are teachers implementing PBL? What's the quality of those PBL units? What do the leadership structures at the school look like? What are the staff collaborative structures? What are the student cultural aspects? Is there an advisory? Are students doing internships?
Things of that nature. What the work with the Deeper Learning Network has done for us over the last couple of years is force us to think: that's going to get us to a certain level, but we really need to shift to measuring the student outcomes that we're after. So where previously we looked primarily at fidelity to the practices that I shared, now we're looking entirely at student outcomes, because ultimately what matters is what students know and are able to do. We have a school success rubric that outlines all of those things. We look at cultural outcomes, like how connected, engaged, and challenged students feel, but also the learning outcomes, the things Peter was talking about: the levels of deeper learning that students are getting to. So not only their knowledge, but the application of that knowledge; their skills, the deeper learning skills; and then also the attributes, which reflect the agency-type things. We're using that school success rubric with our schools to actively reflect on their own growth and development as a school system and to track how they're doing in the different areas of those outcomes.

Megan, I'm just thinking of one of those academic mindsets: "I belong to a community of learners." I'm wondering if you're taking a look at the schools in your orbit around that business of the community of learning, school climate, and so on.

Yeah, the agency indicators have been a huge conversation amongst our schools over the past year or so. It's new work for us, so schools are trying out a lot of different things and trying to unpack what a good indicator would be of whether students feel like they belong.
So we're using surveys like the YouthTruth survey, and we have our own cultural survey, and again we're really engaging in that cycle of inquiry: collecting data, analyzing it, and then looking at whether our practices are getting us where we want to be. Who is this working for? What's working, what's not working, and what do we need to shift as a result of what the data is showing?

Right. We use YouthTruth also at High Tech High and have found it to be very useful data for us; more about that later. Unless anyone on the panel has questions at this point for anyone who's presented so far, I'm going to move over to Katie. Katie, you recently wrote something; well, it was actually a couple of years ago, but I found it. You wrote: "It is easier to judge student work than it is to judge my assessment practices. In that department my students are my jury and I must remember to spend time in the deliberation room to listen to the reasoning behind their verdict. Ultimately their judgment is what matters most." So Katie, what's happening by way of assessment in your classroom, and what are you hearing from your students?

Well, I feel like assessment practices are ever-evolving when I ask my students what they find most meaningful in the feedback. This class, at least, has said that the feedback we give each other, whether it's peer assessment, feedback from me, or our self-assessment at the beginning, has to be pushing them toward another draft or another iteration of the work. They don't really like to get feedback on something they're done with, because they don't see the value in it. The exception, they've said, is a presentation of learning, which is kind of like those portfolio presentations; they do see value in that, because they know another one is coming in another semester.
But the feedback they value the most is the feedback that will help them get to that next draft, or to a final product that they're going to show to an audience beyond me. As a teacher, I feel like that's really helpful information, because assessment is really time-consuming and exhausting. So when I do it in a meaningful way, when I give a lot of feedback, I want it to be something the students value, something they will look at and that will help them improve, because that's the purpose of it. It's helped me to ask them: do you want feedback on this? We're going to do another draft; what do you want feedback on? If I have the students write me a specific question, it really helps make that time well spent and effective in giving them feedback for assessment.

Katie, as you're speaking, I'm getting the image of someone who is standing beside her students.

I hope so.

And looking out to where we're headed.

Yeah, when I heard Bob talking, I was thinking about that self-reflection piece, having the students identify where they need to grow and improve. That's when I really feel like I've made it as a teacher, and I think it's such a challenge to get there with every student: to help them, through that self-reflection piece, know where they need to improve without anyone having to tell them.

Yeah, Katie, I see on the board behind you it says "be descriptive," and you're very descriptive, so thank you.

Oh, thanks.
Bob?

Katie, I thought that was great, and I think it's so right. For the teachers who are listening in: you often hear teachers say, oh, I spent so much time commenting on the kids' work, and then they don't even look at it; they just stick it in their binder, or maybe they throw it in the trash. We think about all the time that goes into assessing kids' work, but when it's done, it's done. Unless there's an audience outside of the classroom, or the feedback is for improvement, kids won't value it. And that's not a big shift in practice. For teachers asking how they might move toward assessment for deeper learning: assess in the process of iteration. Have kids work deeply on something where they're getting feedback along the way, instead of spending all your time at the end, when the kids don't really care. So I thought that was really important; I'm glad you pointed that out.

Thanks, Bob. We have a question from Michael Klein in the audience, who writes in: I'm really interested in the school success rubric. What do the outcomes look like for the skills and mindsets of successful learners at New Tech? Sounds like that's a question for you, Megan.

Yeah, and I'd be happy to share our school success rubric after this; I think we can post it. We can put it up on the DL MOOC site. Okay. As we worked on making this shift from teacher practices to student outcomes in that rubric, we really thought about what it is we want every student to graduate from our schools with. We came up with those cultural outcomes, the things that they experience, and then the learning outcomes: the knowledge, skills, and attributes. So what do the skills and mindsets look like for successful learners?
I think the agency growth mindset is critical in all the things we just heard Katie and Bob talking about: being focused on improvement and growth, persevering through things, and continuing to stretch yourself as a learner. Those are some of the indicators we try to describe in that rubric, and the type of attributes that we're after, because those are things that can transfer outside of school to any kind of experience students find themselves in, whether it's post-secondary schooling, careers, or their lives. So we spend a lot of time talking about those attributes with students, helping them to be reflective and to track their own growth, as I heard Katie describe.

Great, thanks, Megan. Peter, bringing assessment of deeper learning to scale is a huge project, and I wonder if you could say something about how these tests get generated. How is it decided which items would lead us to deeper learning, and what kind of information do we get?
Yeah, and this is actually a question we're still trying to figure out, because if we're talking about multi-step, deeper learning critical thinking and problem solving like we've identified, the actual generation of the assessment, the scoring of it, and the reports that give good information take time and energy and cost money. For instance, with our school-level PISA, the OECD Test for Schools, there's a service provider that charges schools $11,000 per school to take the assessment in the United States. That's an investment, but the school gets back a 100-page report that shows how it's doing on all of those proficiency bands I explained in math, reading, and science, and it gets a student questionnaire back. At the end of the day, it costs money to grade. I think it's so critically important, and I'm so adamant about this: we don't necessarily need more assessments; we need to do even better assessments. A lot of the assessments we're taking that just spit back basic feedback are not as necessary, but we've got to spend more time on quality assessments that assess deeper learning, and that's going to take time to grade and cost money. As we look at Common Core implementation and what PARCC and Smarter Balanced, the two consortiums, are looking to do, if you look at their sample items, they are looking to use more evidence-based texts to support answers. There's not just a right and a wrong answer; there's more showing your work across multiple steps. So we're moving in this direction, but the key thing we haven't figured out is the human element of assessing and giving good feedback in a timely manner. The more complex the item, and the more it engages people in showing their thinking, the harder it is to score in a way that's efficient and cost-effective. Yeah, so...
Bob Lenz asked me a question that I'd like to respond to: how are some schools using results to move deeper learning forward? It's a great question, and out of the pilot we saw schools responding to the data. One school in particular, North Star Academy in Newark, New Jersey, saw how its kids were performing in science, in application of knowledge, and saw a real disconnect between what they were teaching and real-world life. So what the school did was make that connection explicit: they looked at their juniors and matched each of them up with an internship with a scientist, to give them real, practical experience of how the learning they're doing connects to the real world. Another set of schools learned from the assessment that their kids were reading, but they weren't reading deeply or really understanding at the higher levels of reading. Two things were happening (this is some data coming out of the Fairfax County schools that were participating in the pilot): one is that they realized the kids weren't reading for enjoyment anymore; they were reading to get the work done, but not reading texts for pleasure, and the schools realized that was really hurting how kids were approaching their reading and their comprehension skills. So they infused more reading for pleasure to get kids more engaged in the reading they were doing. Those are two examples.

I think that's great; thanks for sharing that, Peter. It's a great example of using, as you mentioned earlier, a dipstick. If the school leader or the system leader is thinking, well, let's use PISA for Schools to really check on our progress here, they can then use that data to drive the changes they want to see
toward deeper learning. Whereas I think sometimes people are looking for ways to compare themselves and just stop at the comparison, and I don't think that's as helpful. The two examples you gave are really thoughtful ways that leadership is using that data to make the deeper learning outcomes happen for kids. I'm glad to hear one of them was North Star; we have a team from Envision going to visit them next week back in New Jersey, so that's really good to know.

Yeah, please have your team talk to Michael Mann; he's a real spokesperson on this, and North Star knows assessment as well as anybody. They really believe in the depth of the items and in how they're changing how they do business, and that's exactly what it's for. The comparison and the rankings only get you so far; that's at a very surface level. What we found is that school practitioners really want to dive in. They really want to think about how to get to these deeper learning skills, and this gives them some data to do that. So yeah, I totally agree with you.

I'm inspired by your anecdotes too, Peter, and by knowing that people are responding to and acting upon data in those ways. We've all heard the stories, or been part of the stories, where a leader will simply say: look, just get the scores up, whatever you can do. But to ask seriously about science, and to arrive at a solution that says let's put our kids together with scientists, that's a different action than simply "let's get the scores up."

We have a very interesting question coming in from the audience: how do we create assessments that address the key learning goals we are seeking, while at the same time enabling opportunities for students to surprise us? I want to add a little piece to that, because to me this connects with rubrics, and a related question is: what are the advantages and disadvantages of rubrics? Rubrics specify targets but perhaps don't
leave room for surprises. So the question is: how do we create assessments that get us what we're after but allow for surprises? Anybody?

Yeah, I can jump in there. Just as you said, rubrics play a really large part for us. We've developed a set of common rubrics for each of the deeper learning outcome skills that we're after, and we're in the process of calibrating around those rubrics with teachers so that we know what we're aiming for. Then teachers develop performance assessments aligned to those rubrics: these are the skills we want, these are the indicators of them, and here's a performance assessment that I can seamlessly embed in my project-based unit that will allow students to demonstrate those skills in a variety of ways. I think that's where the allowing-students-to-surprise-us piece comes in: if those performance tasks are really open-ended, students go about them in very unique ways. One key piece I would add is that a lot of times I see rubrics that are really checklists in disguise. They articulate a task list: do this, do this, do this. Really, we want them to articulate the knowledge and thinking that we want to see from students.

I love your mention of checklists, which applies across so many things. Peer critique can be turned into a checklist. There are so many things that can be turned into checklists that students think they need to find their way through, as opposed to opportunities for ownership of a process. I know there are others on the panel who would like to answer the question about assessments that address learning objectives while allowing room for surprises. I think, Bob, you're up next.

Before I get to my point: there's a great couple of slides the Buck Institute for Education uses in their training that show the difference between a checklist and a rubric. They describe a project to create a great cat box, and they give
you the qualities of it, and then they show you a picture of a cat in a cardboard box, and it actually meets all the criteria they laid out. I've seen them do it, and I don't think that when the teacher assigned this they thought they were going to get a cardboard box; they thought it was going to be something original and creative, but it checked everything off. So, my main point: the Envision system that we now call the Deeper Learning Student Assessment System, which is being used with folks like New Tech and the Linked Learning folks and other partners of Envision, started when we worked with the Stanford Center for Assessment, Learning and Equity. We said, well, we want teachers to be designers, and we want kids to have opportunities for expression and for surprises, but we also want to make sure that the work they produce is college-ready, so that when they get to college and get an assignment, they'll say, "Oh, I know how to do this, and I know what quality looks like." So we have a performance assessment framework, if you will, that really describes all the different criteria that would be in there. For scientific inquiry, for example, it lays out all the different components that would be in a great scientific inquiry performance assessment or project. Then we worked with them to design the rubrics, and we actually had the subject-area experts at the university work with the teachers and school leaders to design the rubrics to assess the work. So they're really a framework, and ideally, when it's working well, we get both high-quality work that meets the standards and lots of opportunities for surprises. In theory it sounds great, but often it can still be fairly restrictive. It really becomes the job of the school system and the school leaders to encourage design, both by the teachers and by the students, or else any time you start to use a rubric or any sort of pieces
they can start to become a box. I think it requires constant dialogue to allow opportunities for choice and surprises, and perhaps models of what kind of variety is possible, and so on. Peter? An example I saw recently was a group of students at a school who participated in National History Day. There was some guidance on how to approach different projects, but the constant theme was that they were using evidence-based texts and creatively working together to present information on different topics. The thing that really impressed me was one of the deeper learning pieces: just working collaboratively, and the learning I was seeing when reflecting with kids about how hard it is to work with others, and how to come up with real, authentic group work where you produce a product based on a rubric that has a high standard. And boy, were we surprised by some of the products the students developed, from videos to interactive presentations that, again, had a high level of content. There were some big surprises that just blew us away as judges. But when you really talked to the students, it was about working together; the hardest part of the project was how to work well with other people. I think that's something we've got to constantly emphasize as well, because when you really think about the real world and the workforce, working together collaboratively is so important. Thanks, Peter.
Katie, do you want to unmute yourself there? Oh, yeah. One thing I've noticed, and maybe it's the nature of an 8th grader, is that they often don't like what I bring to them. So I have found that when I do use a rubric, it has to be a co-designed rubric, because they like to have the ownership. We'll look at real-world models first and come up with what needs to be included in, for example, the op-ed article we're working on now. They'll come up with that list together, and then I'll merge the classes' lists together, and of course I'll add in those things that I definitely need to have in an op-ed article if they aren't hit on. If I just bring in an outside rubric with its own scoring, 8th graders don't usually pay attention to it until they've had their hands in it and modified it to make it their own. Also, I've been impressed by some of the work of Joe McDonald and Ron Berger around meeting real-world standards in project work. I'm thinking of a project that some kids in one of our schools are doing. The kids have been very concerned about gun violence and what they might do to act upon it, and they decided they wanted to run a campaign. They needed to raise money, so they did a Kickstarter campaign, and to do that they had to make a really compelling and persuasive website. Someone had to write the script for it, someone had to do the visuals and the camera work, and so forth, and they had to ask the question: what is going to get us over the top? So, in essence, they created the rubric as they went along, around what would constitute quality work, and they raised $30,000. What I'm saying is that I think there's a very important connection between the quality or effectiveness of a rubric and the authenticity of the work. Yeah, Rob, for me all three are really important: teachers working with students to create rubrics that are authentic and real and to the moment of what
they're doing, and building the students' capacity to use a rubric as well. Some projects and products have an authentic audience built in. You don't necessarily need a really good rubric to decide whether you're watching a quality film; you know when a student has created a quality film, and they can get feedback from industry along the way. We also believe in the power of a validated, college-ready rubric. For our ninth graders it takes a while, but by 12th grade they understand it, because they know that when they go to college they'll want to be able to write an effective research paper. We'll have students say, "I know that when I get to college I'm not going to have these rubrics any longer, so it's really important that I internalize what quality is, so that I'll be able to apply it when I'm in college, because they're not going to do the same things I'm doing in high school." So being really clear about standards matters, especially for our kids: over 65% of the students in Envision come from low-income backgrounds, about 90% are students of color, and over 70% are the first in their family to go to college. So we really think it's important to make the standards transparent, so that when they get to college it almost feels easy at that point. And it's bearing out: our college persistence rates through four and five years of college, with kids either graduating or staying in college, are almost 80%, which is the same as for white, affluent kids. So I think having all three of those opportunities is really important. Great. And it is about the internalization of standards, so that students become standard setters, not simply meeters of external standards. The internalization part is really
important. Megan, I think you wanted to add something about rubrics? Yeah, I would just add to the conversation that I feel this links back to what we discussed earlier: it's really about the process of learning. Whether it's a student-created rubric or something you give to them, if you introduce the rubric once and hope they'll use it effectively, that probably isn't going to be very successful. Having them actively unpack that rubric and make sense of it, assessing student samples against that rubric, doing their own self-assessment and peer assessment, doing regular reflection using that rubric, pulling descriptors off of it and having them do some written reflections: these are all good ways to help students really process and make sense of the rubric, and see it as a useful tool in their learning process. Yeah, and critiquing the rubric as we go along, and letting the rubric evolve as the learning evolves, perhaps, too. We have a question from Michael Klein: can you all talk about the transition you are seeing from schools that might have been more traditional or "no excuses" in their pedagogy moving toward deeper learning? I wonder if anybody wants to address that question, and I'm thinking also about the assessment angle: what happens to assessment practices as schools make that transition? I'm glad to start. I think what's driving some of this, and it was great to hear about Northstar getting the PISA for Schools data, is that one of the things we can all learn from the so-called "no excuses" schools is how adept they are at building the capacity of their leaders and teachers to use data, and the PISA data is a great example of how they did that. I think the other data they're concerned about, and I share that concern, is the lack of persistence of their students in college. So while they're achieving higher scores on standardized tests and
graduating in high numbers, their persistence rates in college, even through the first year, are alarmingly low. I think they're looking and saying, maybe we haven't figured this out completely, and deeper learning might be a really important thing for our kids to know. The kids haven't become self-directed learners; they don't have the critical thinking, the collaboration, and the communication skills that are required to be successful in college. So I think they're doing some really important reflecting on how they're going to give those kids that opportunity. I still think they're struggling with what it really means to go deep, but they're asking the right questions. Cool, thanks. Megan? Yeah, I would just add to that: for a school that's maybe more traditional in its practices and looking to make this shift, we really find it effective to have them start with the why, really understanding what types of student outcomes you're after, having all of the staff make sense of that and commit to that common why, and then developing the instructional practices that match it. Oftentimes I think we jump to "let's do project-based learning, let's try this new instructional approach" without really understanding the why behind doing it. So really spending the time on the vision of your graduate, and what kind of skills you want them to develop, is really important in that process. And back to the data piece: sometimes I think schools can get overwhelmed by the amount of data they look at and have access to. We support our schools in developing a clear theory of action and a clear focus for their work together as a school team, so that as you look at a piece of data, or sift through data, you can say, "This is what we really want to target; we really want students to be better collaborators or better critical thinkers," and look at the
data through that lens and think about how it can inform your instructional practices from there. Great, great, thanks, Megan. I think it's a matter of culture as well. The question behind it is: what is significant learning? That's a question we can lose sight of if all we're concerned about is performance on standardized tests, but it's helpful and healthy to ask that question again and again, and to ask what the place looks like where significant learning is going on all the time, and how we can get closer to that, which I think is everyone's aspiration no matter what the context. So we have another question. Katie, did you want to say anything? I was thinking about working with the schools that were transitioning in England, but they were doing it on a classroom-by-classroom basis; it wasn't necessarily a unified school effort. I did find that the doorway in seemed to be adding more critique and having a real audience. They were willing to take that step toward deeper learning opportunities while holding on to a lot of the traditional things that had to be part of the classroom, because those were part of the school culture. Just giving themselves a little more time to critique and revise, and then present to a real-world audience, seemed to be the small steps they were willing to take in individual classrooms. That then becomes an alternative assessment process as you begin to do that critiquing, which is a kind of peer assessment, and then the presentation, and so forth. We have one more question, and then we're going to move toward our final words, where we'll go around and give everybody a chance to say the one thing they still want to say. The question is: given the panel's experience with international students, how might you assess students for deeper learning while at the same time addressing their varying language challenges? For example, how well does this work with English
learners? I wonder if anybody would like to take a crack at that. Peter? Well, first of all, to me it's about the process of understanding. Again, when I look at the PISA data and think about how we're doing on the more basic, straightforward problems in math problem-solving, to me it's more important to really think about how we do multi-step and more complicated problems. So it's about tiering it to where students are and thinking about how to ramp it up as they improve, continuing to think about how we offer students, wherever they are, more and more complicated problems, or deeper, richer tasks. First you have to know where they are, and then you stair-step it with them and constantly think about how to improve. Just on a very practical note: with the assessment we're developing, we're working with the OECD, and they're producing the assessment in Spanish as well, so it will be able to reach many students across the world. Spain is piloting the assessment this year, as is England, and Mexico and Australia have given expressions of interest. So I think there's also this piece of making great assessment items available to all, in all different languages, and we're making progress there. Thanks, Peter. Anybody else want to tackle that one, working with English learners, for example? Well, I would just add that we've found English language learners do really well in a project-based learning environment, because they're actually practicing the language more actively; they're not sitting and just listening to teacher input. As Bob stated at the beginning, we would actively do these skills if we want students to develop them. If we want students to develop effective communication and collaboration skills, a learning environment built around authentic, open tasks allows them to practice those. So I think English language learners do really well in terms of
having smaller-group, more informal learning experiences with their peers, where they can get that support from their peers as well as from their teachers, in more informal ways. Okay, thanks. We do have one more question that has come in that I want to address briefly, assuming we have a little time before we get to final statements: has anybody here used learning contracts, student statements about the what, why, when, and how, and so forth, before they engage in courses or projects, and if so, are they valuable? That question comes from Derek Nicolai. Not on a systematic basis, no. Some teachers might, but it's not part of the process for Envision. Well, one thing I would mention, a little different from student contracts, is student questionnaires: getting feedback from students on whether they're engaged in learning and participating, whether, within the culture of their buildings, they feel valued and supported, and whether they own their own learning. I think that's really important for staffs, too. There is a questionnaire on the PISA for Schools, the OECD Test for Schools, where you get really good feedback on how the students are feeling about their learning, and I think that's a really important component to do in a systematic way within schools. Okay, we're running short on time now, so it is time to wrap up. Let's have a quick round-robin of the panel for your final word, and this must be brief, and of course we know it will be cogent and profound. Who will start? I'll start: know, do, reflect. Okay, thank you, Bob. I'll jump in: I would say authentic, engaging learning experiences paired with deep reflection. Cool. I'll go: "Four score and seven years ago..." No, I would just say that taking independent problem-solving, and the learning you get, to where students are, and then thinking about how to get kids to the deeper learning skills, is so important, not as accountability but as a learning tool, an improvement tool. I would say
striving to help students self-assess their deeper learning. And I'll have a final word here, too: assessment for deeper learning is a two-way street, and the question is not only, for the student, "How am I doing in this work?" but also, "How is this course, or how is this design, working for me?" So that will be it for our panel. Thank you so much, panelists; this has been a really, really interesting and lively discussion, and we're going to turn it over to Karen to wrap up. Great, thanks, Rob, and again, thank you to our panelists and our whole DL MOOC community. This is such a big and important topic, and we certainly didn't cover every aspect of it, so we will transfer this conversation to the G+ community, and any questions from the audience that we didn't get to we will post. I'd invite our panelists as well as our audience to chime in on those topics. We'll also post some of the resources that were mentioned on this panel, with links, as well as our normal archive. This week, our tweet-of-the-week question is: what is the best assessment or piece of feedback that you've ever had on your own work? So give that some thought and tweet your response with the hashtags #DLMOOC and #feedback. On this Thursday's Lens into the Classroom session, we will be talking about assessment again with New Tech Network schools, including how they use a collaboration rubric, and we hope you will join us for that. Thank you all very much, and we look forward to continuing this conversation online. Good night, everybody.