Measuring Students' Social & Emotional Learning: Review of Instruments & Implications for SEAs & LEAs

January 1, 2020 | By Kody Olson


>>JOSHUA COX: Good afternoon. My name is Josh Cox. I’m a researcher at the
Regional Educational Laboratory Northeast & Islands. I want to thank all of you for
taking time out of your busy schedules to
attend today’s webinar, Measuring Students’ Social and
Emotional Learning: A Review of Instruments and
Implications for State and Local Education Agencies. I’ll begin with a brief agenda
of what we’ll be covering today. First, I’m going to begin by
introducing my co-presenters. Next, I’m going to provide an
overview of our findings from our recently published report, A Review of Instruments for Measuring Social and Emotional Learning Skills Among Secondary School Students. After I provide that overview,
I have some questions that I’m going to ask representatives
from Champlain Valley School District who partnered
with us on the report. In that discussion, Champlain
Valley will offer some insight into their efforts to develop
their students’ social and emotional learning skills,
including how the district might use some of the findings
from the report that I’ll be discussing. Afterward, we’ll open it up
to any questions that you have regarding the report's findings or Champlain Valley School District's efforts to support their students' SEL. Finally, we'll wrap up the
presentation by inviting you to complete a short survey to
evaluate your experience participating in this webinar. All right. I’m going to
introduce today’s presenters. So, I’m joined by my colleague
and the Director of Regional Education Laboratory Northeast
& Islands, Dr. Julie Riordan. I’m also lucky to be joined by
representatives from Champlain Valley School District, and
that includes CVSD’s Director of Learning and
Innovation, Jeff Evans, as well as CVSD’s
Director of Behavior Systems, Cassandra Townshend. And as I mentioned
earlier, my name is Joshua Cox. I’m a researcher for the REL and
a co-lead for the REL’s social and emotional
learning research alliance. More about that alliance soon. So, let’s talk a little bit
about the goals for today. First, we’d like to help you to
learn about the availability of instruments measuring
collaboration, perseverance, and self-regulated learning
in secondary school students. Second, we’d like to help you
learn about the intended uses of the instruments. Third, we’d like you to
learn about the availability of reliability and validity information for the instruments. And last, we'd like to explore
the implications of the information presented,
including how Champlain Valley School District and other
districts can use the resource. Before we proceed, I just want
to mention that this work was conducted under the
Regional Educational Laboratory Northeast & Islands. The Regional Educational
Laboratory Northeast & Islands is one of ten Regional Educational Laboratories across the country that are funded by the Institute
of Education Sciences at the US Department of Education to
conduct applied research and trainings with a mission of
supporting a more evidence-based education system. One way that we conduct our
work is by partnering with practitioners, focusing on their research and technical assistance needs related
to a specific topic area. The work that I’m about to
present was conducted under our Social and Emotional Learning
Research Alliance, which helped us to create a research agenda
focused on topics that I’m sure many of you are thinking
about when trying to develop your students' social and
emotional learning skills. Those topics
include measurement, supports for SEL
in and out of school, instructional strategies
and professional learning for teachers and other staff. Before we proceed, I’d like to
open it up to a poll to get a sense from our practitioners
in the audience whether you’re currently using or considering
using an instrument to measure social and
emotional learning skills. Could we open that poll? The question is, are you
currently using an instrument to measure social and emotional
learning skills in your school or district? The response options are yes,
considering or making plans to do so; no, and no plans
to do so; and finally, you can indicate N/A if
you don’t work in a school or district. All right. Thank you so much
for your responses. It looks like many of you are
at least thinking about using instruments to measure
students’ social and emotional learning skills. All right. That’s great. And I hope this webinar and
the resource that I’m about to present will help to
inform those plans. Today, I’m going to discuss our
recently published resource, A Review of Instruments for
Measuring Social and Emotional Learning Skills Among
Secondary School Students. This work was authored by
Brandon Foster, David Bamat, and myself. I want to note that the resource
includes a systematic review of instruments measuring three
social and emotional learning skills. More specifically, our review identified instruments measuring collaboration, perseverance, and self-regulated learning. We also looked at the availability of reliability and validity information for the instruments included in our review, and I hope that
this resource will be helpful in introducing you to some of the
instruments that can be used to measure social and
emotional learning skills, as well as some of the important
considerations that you should be thinking about when
selecting instruments. Let’s start with a
little bit of background. So, we conducted this work
in partnership with Champlain Valley School
District in Vermont. Jeff Evans, who I
introduced earlier, is also a member of the core
planning group for the REL’s Social and Emotional
Learning Research Alliance. And a little bit about CVSD,
they’ve been implementing both proficiency-based learning as
well as personalized learning plans for quite some time. As part of that work, they’re
currently implementing standards related to social and
emotional learning skills. Jeff identified a need for this
review of instruments to help inform the development of
CVSD’s assessment systems, both formative and summative, in
an effort to help the district to be able to measure and track
students’ social and emotional learning skills. What types of skills is
CVSD interested in measuring? Let’s talk a little bit about
the evolution of the project. Initially, Jeff at CVSD reached
out and provided us with this mission statement the district
had recently developed. Jeff, can I have you talk a
little bit about the evolution of this document and the
specific SEL skills your district was focused on?
>>JEFF EVANS: Yeah, absolutely. So, hello from Vermont,
where, by the way, it is snowing right now, so
we can give you that winter wonderland backdrop to our work. So, yes. As Josh mentioned, about five
years ago our state passed a law that requires students in the
year 2020 and beyond to graduate based on
demonstrating proficiency on a set of graduation standards, and also to have personalized learning be a big piece of that. And so, it was incumbent
upon our district to create a document that represents our
expectations for graduation. So, we took a number of
national standards and created the document that
you’re looking at now. I know it’s hard to decipher
because of the size of the font, so I’ll describe a
little bit of it. What we did was we took our
mission statement and then combined that with
graduation standards. This graphic has five
leaves, what we call them, which are categories for
our graduation standards. The categories from top to
bottom are self direction, creative and practical
problem solving, informed and
integrated thinking, clear and
effective communication, and responsible and
involved citizenship. And the two that are circled
are really pretty specific to what are typically referred to as
nonacademic skills or soft skills, or what we often
call habits of learning. And so, at the time, I was the
principal at the high school in this district and we were
struggling with coming up with any kind of reliability when it
came to assessing a student’s self direction or
responsible citizenship. And so, that’s when we partnered
– or that’s about the time that we partnered with the REL
to ask them, you know, what kind of measurements
can we rely on to inform our instruction and inform our
decisions around determining graduation based on
demonstrated proficiency? And so, in the
self-direction leaf, there are three standards that
talk about things like taking initiative and
responsibility for learning, making informed
decisions, setting goals, taking constructive risks,
and demonstrating a growth mindset and persevering. Under responsible and involved citizenship, the standards include collaborating effectively and respectfully, taking responsibility
for personal decisions, demonstrating respect for
different cultures, values, and points of view, and
demonstrating a commitment to community and
personal well-being. So, those are the standards
that live within those two leaves that we call them. So, when we worked
with Josh and REL, we coalesced all those into the
three areas that Josh already referenced,
collaboration, perseverance, and self-regulation. So that's me, Josh. Thanks.
>>JOSHUA COX: Thank you, Jeff. Yeah. And Jeff has sort of already
talked briefly about this but – so, while these terms
self direction and responsible involved citizenship that CVSD
was using would make a lot of sense in the context of
their mission statement, we weren’t consistently
finding them used in the research literature. We did, however, recognize that
there were some terms that were commonly used in research
literature that aligned with the components that were
laid out for those terms, self‑direction
and responsible and involved citizenship. And so, taking initiative in and
responsibility for learning is really aligned to a term in the
research and literature called self‑regulated learning. Persevering when challenged
is represented by the term perseverance in the
research literature. And finally, collaborating
effectively and respectfully to enhance the learning environment
is just collaboration in the research literature. And so, we settled on
these three constructs: self-regulated learning, collaboration, and perseverance. We felt that these mapped nicely onto the components covered under self-direction and responsible and involved citizenship, and they are terms that are more commonly used in the research literature. I suspect that some of your
schools and districts might also be interested in some of the
social and emotional learning skills like
collaboration, perseverance, and self-regulated learning. Let’s just return to the purpose
of this resource, which is really to support stakeholders to, one,
be able to identify available instruments for measuring
collaboration, perseverance, and self‑regulated
learning, and two, to understand information about
reliability and validity that’s available for each
of those instruments. Again, CVSD expects to use
the resource to inform the development of its assessment
systems related to measuring and tracking students’ social
and emotional learning skills. Additionally, we hope that
schools and districts nationwide can draw on the report to
identify and vet assessments for use with SEL programs and
nonacademic data collection. All right. So, I don’t want to get
too in the weeds here, but I do just want to call out
that for each of the instruments reviewed, the report identifies
whether information was available for reliability and
seven components of validity. You can see the seven
components of validity listed here. I’ll let you look at the report
if you’re interested in the definitions for each of
those components of validity. I will just quickly define
reliability and validity here, though. So, reliability refers
to whether the instrument consistently measures the
skills across respondents, time, or raters. And validity refers to whether
an instrument measures what it intends to measure and whether
the inferences made from the instrument are appropriate. All right. So, how did we
identify instruments? I've already talked pretty extensively about why we looked for instruments that measured these three specific skills, but I'll just say it again. We were looking for instruments
measuring one of the three targeted social and
emotional learning skills, and those are
collaboration, perseverance, and self-regulated learning. CVSD was also specifically
interested in identifying measures for
secondary school students. And that’s another
requirement for this resource. The instrument needed to be used
with a population of secondary school students in
the United States. Because we wanted to make sure
that the resources that we were presenting were
accessible to practitioners, they had to be publicly
available online at no or low cost. And we also needed to
introduce some parameters around the timeframe for when the
instrument was developed. And so, the timeframe that we
used was from 1994 to 2017. 2017 was the year
we started our search. 1994 was the year that the term
social and emotional learning was coined by a
group of researchers, practitioners and policy-makers. Finally, we decided to not
include instruments that were published as part of a doctoral
dissertation, because those instruments often don't see the same level of scrutiny as instruments published in peer-reviewed journals. Alright. Now, let's talk a little
bit about the instruments that we found. In total, we identified sixteen
instruments measuring at least one of our three
targeted skills. For collaboration, we
found five instruments. Three of these were
self‑report surveys. One was a performance
based assessment. And one was a
teacher report survey. For perseverance, we found four
student self-report surveys. For self-regulated learning, we found four student self-report surveys. And finally, we found three student self-report surveys that measured both perseverance and self-regulated learning. Alright, so, actually, can I
open it up to the next poll? Now, I want to check in with
those of you that are interested in measuring your students’
social and emotional learning skills, so I just
want to poll you again. If you’re currently using or
planning to use an instrument to measure social and
emotional learning skills, how do you plan to
use the results? And so, there are three
categories: research use, formative use, and summative use. I think these terms are probably
somewhat familiar to all of you but I’ve also offered a
short description of each. With research use, the intention
is to use results produced by the instrument to describe
these skills for a particular population or
examine relationships. With formative use, the
intention is to use results produced by the instrument to
inform instructional change, like to influence
positive change in students. Lastly, with summative use, the
intention is to assign a final rating or score to each student
by comparing each student against a standard or benchmark. Okay. It looks like the majority of
us are interested in formative instruments, with a few interested in instruments used for research purposes. Okay. We'll go back to the slides now. Okay. So, for the
instruments that we found, let’s look at the intended
uses as described by the instrument’s developers. So, among these sixteen
instruments identified, eleven were developed
for use in research. Five were developed for
formative instruction. And none of the information
collected suggested that any of the instruments should be
used for summative purposes. That last bullet is really
important and I think it’s probably expected
for two reasons. First, instruments used for
summative purposes require more stringent reliability
and validity evidence. And second, many instruments
measuring social and emotional learning skills are fairly new
and instrument developers and researchers are still
collecting evidence on their reliability and validity. Okay. So, over the next few slides,
I’ve identified the names of the instruments
measuring each skill. I’m sure you’re going to notice
that some of the names are somewhat generic and I blame
that on the creativity of the instrument developers. That being said, let's
start with collaboration. As I mentioned earlier, we
identified five instruments measuring collaboration and
four of the instruments were used in research. And one of the
instruments, the Teamwork Scale, was used for
formative instruction. Again, we
identified four instruments measuring perseverance. Three of those instruments were
developed for use in research and one of the instruments, the
Expectancy-Value-Cost Scale, was developed for
formative instruction. We identified four
instruments measuring self-regulated learning. Two of those instruments were
developed for use in research and two instruments, the
Junior Metacognitive Awareness Inventory and the Self-Regulation Strategy Inventory, were developed for
use in formative instruction. Finally, three of the
instruments measure both perseverance and
self-regulated learning. Two of these instruments were
developed for use in research and one instrument, the
Motivated Strategies for Learning Questionnaire, was developed for use in formative instruction. For each of the
instruments that we identified, we also indicated whether
reliability and validity information was
available for the instrument. That’s what you’re
seeing on this table. On the left side, you’ll see
each of the sixteen instruments. To the right of the instrument name, you'll see columns indicating the availability of information for reliability and the seven components of validity that I mentioned earlier. And so, for each of these instruments, we used a filled-in dot to indicate that the information was available and an open dot to indicate that it was not available for each component. Now, I just want to note that
availability of information does not necessarily mean that the
instrument is reliable and/or valid; it just means that we
have information to inform us about reliability or a specific component of validity. Let's talk about some trends.
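Before turning to the trends, it may help to make "reliability information" concrete. Below is a minimal sketch, not from the report, of one widely used reliability statistic, Cronbach's alpha, which estimates internal consistency: whether a set of survey items measures a skill consistently across respondents. The four-item scale and the student responses are entirely hypothetical.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Estimate internal-consistency reliability (Cronbach's alpha).

    item_scores: rows = respondents, columns = survey items.
    """
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]                           # number of items
    item_var = items.var(axis=0, ddof=1).sum()   # summed per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)    # sample variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical responses from 5 students to a 4-item perseverance
# scale scored 1-5 (these numbers are illustrative only).
scores = [
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
]
print(round(cronbach_alpha(scores), 2))  # → 0.94
```

An alpha near 1 suggests the items hang together as a single scale; values below roughly 0.7 are often read as weak internal consistency, though acceptable thresholds depend on how the instrument will be used.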
all instruments had information on reliability. All instruments also had
information on content validity, and many had information
on structural validity, external validity, and
consequential validity. It’s important to note,
however, that no instruments had information on substantive
validity and this is important since substantive validity is
necessary to understand whether students process the
instrument’s questions or tasks as the developer intended. And so, if students aren’t
understanding the content of questions, I think you’ll
agree that that’s problematic. And so, we would have liked to
see these instruments offer some evidence of
substantive validity. It’s also important to note
that only three of the sixteen instruments had
information on fairness. And fairness is important
because it helps us to understand whether a measure
is valid for comparing scores between different
subgroups of students. This speaks to whether a
measure is biased against some groups of students. So, again, we would have liked
to see these instruments offer some evidence of fairness. And finally, I’ll just quickly
call out that only five instruments had
information on generalizability. You can see the definition here. Generalizability refers to
whether scores from the measure correlate with other modes of
measurement for the same skill. So, an example of that would be
a student self‑report instrument measuring perseverance that is
also correlated with a teacher report measure for perseverance. This helps us to
understand validity overall, which is to say that the
instrument measures what it intends to measure. I want to open it up
to another poll here. And I suspect that as I’m
talking about some of these psychometric terms, there may be
some confusion about what some of these terms mean. And so, I just want to get
a read on the audience. What is your level of
familiarity with psychometric terms such as
substantive validity and consequential validity? Do you recognize these terms
and understand them well? Do you just recognize them, or not recognize them at all? Okay. All right. Let me transition
to the next slide. We’re getting a few more. Okay. All right. So, as I suspected, it
looks like some of you are not incredibly familiar
with some of these terms. So, I want to
reassure you that first, we define these
terms in the resource. But also, we’ve developed
a worksheet for schools and districts that can be used to
help you better understand those components of validity and
reliability that might be most important to your
school or district. So, I’ll just begin with the
first page of this worksheet. It begins by asking some
overarching questions to help your school or district identify
an initial list of instruments. So, it asks about the specific
skills that are to be measured and those could include
collaboration, perseverance, or self-regulated learning. It also asks about your
target group of respondents. So, are you looking to implement
the measure with high school students or other
types of students? It also asks schools to
identify your purpose for using the instrument. So, all of these questions
really help your school or district to narrow in on
what you want to get out of an instrument. Then we move on to questions
that help you better understand what components of reliability
and validity might be most important to your
school or district. And so, using the highlighted
cells here as an example, you can see that
in the first column, we begin by asking the question
are you interested in connecting your students’ social and
emotional learning skills scores to other consequential outcomes
such as achievement scores, graduation rates,
and attendance? Now, if your school
indicates that yes, you are interested in
connecting the SEL scores with consequential outcomes,
then in the right column, we advise you to consider the information presented in appendix B for
consequential validity. We also direct you to relevant
tables within the body of the report and the goal here is to
clue readers into the specific components of reliability and
validity that are most important to them and then also direct
the reader to the section of the report that
provides information about those components. So, we really hope that this
worksheet can help our readers to make use of our review. Alright. Now that I’ve shared key
findings from the report, I’m going to transition
to a conversation with our representatives
from Champlain Valley School District. As I mentioned earlier, we’re
lucky to be joined by Jeff Evans, CVSD’s Director of
Learning and Innovation, and Cassandra Townshend, CVSD’s
Director of Behavior Systems. Jeff, I’d like
to start with you. Before we dive into more substantive questions, I was hoping that maybe you could tell us a little bit about your district. Perhaps you could share a little
bit about the community and the students that you serve.
>>JEFF EVANS: Sure. So, we are in the
Burlington area, or the Champlain
Valley in Vermont. And we have six
schools in our system. We have about four thousand
students total, which, believe it or not, is the largest
system in our state. We’re a very small state. And we have a pretty wide range
of socioeconomic positioning. And we also have, I think,
about 450 or so faculty members, about a thousand employees total
when you count up staff and faculty throughout our district. And I mentioned earlier that
we’re in a state that now requires
proficiency based learning. Our district had actually
started that work about three or four years in
advance of the law. And so, we're pretty pleased with the trajectory we've had. We feel like we're a pretty
highly functioning system with a transition that has been
significant and challenging at times.
>>JOSHUA COX: Thank you, Jeff. Could you talk to us a little
bit about what the district has been doing to support SEL over
the last couple years since developing the competencies
that you identified in that mission statement?
>>JEFF EVANS: Sure. If you could put that
next slide up there, that will help me do that. So, we started this partnership with the REL. We started having conversations around 2016. And so, in the time that
has passed since then, we’ve evolved quite a bit
around how we think about social and emotional learning. So, I’m going to look at
specifically some things the high school has
done in that time. The first thing is, they
created what they call an engagement survey that
they give all their students. They have about 1,300
students in their high school. And they give it to
them twice a year. And the survey asks a range of
questions aimed at finding out how connected students feel
to the school community, how engaged they feel, what kind
of relationships they have with the adults in the building
and with their peers. And so, a number of the
questions are academic in nature, but also, many of them
are social/emotional in nature as well. And what they do is they’ve
built a platform that creates a scoring
range, and actually, the screen is inaccurate;
they’ve made some changes. The scoring range now is from
zero to seventeen, and the higher the score, the greater the concern. I'd like to just tell a quick
story that I heard yesterday. This past Friday, the high
school gave their first round of the engagement survey. They do this through advisories. We have an
advisory system where every student has a teacher who is
an adviser of theirs and each adviser has about twelve
students, three from each grade. And students all took the survey
at the same time in the morning. And in the afternoon, the
principal got onto this platform and immediately he could do a
search for the students who had the highest concern scores. And he noticed a few and he
started digging in and looking at the actual
responses from students. There was a student particularly
that concerned him that he did not know. He got up, went and found
the adviser of this student, had a conversation. The adviser was quite
surprised by the responses. This girl presented as
a very high‑functioning, quiet student who was
doing quite well in school. And so, then they decided
let’s just go find her and have a quick conversation with her. They realized that she had left
the building without notifying anyone and they then
called the parents. The parents were
unaware that she had left. They searched for her. They eventually found her and
had a conversation with her. It was revealed that, you know, the girl had struggled with depression in the past, had had therapy, had stopped taking her medication for quite some time, and was having some pretty despondent thoughts. And so, they were able to
respond to this information immediately, information
we never had in the past. This girl would have flown
under the radar for years quite possibly. So, this particular
construct has been really useful. I’ll talk a little bit at the
end about how Josh’s work can combine with the work that
the high school is doing to enhance that. But that’s one of the things
the high school has done now. And also, another thing is they
started a program just this past year that we call RISE. It is around personalized
learning experiences where the last two weeks of the
school year are shut down. The school year is over and
the last two weeks are used for passion‑based student
interest experiences. So, every teacher
teaches something they’re passionate about. Students get to, through
surveys, identify passions. They also get to do
independent projects. Of the 1,300 students, about
250 did independent projects. We worked with community
partners in doing this. Basically, it’s a two
week curriculum based around these passions. Everybody has to create
learning targets for our proficiency based system. And every teacher who designs
an experience has to design a learning target or two that are
specific to one of our leaves around self direction or
responsible citizenship. And so, now we are
embedding this into the classroom experience as well. And so that has – that’s been
pretty exciting and it’s the first time that we’ve really put
responsible citizenship and the self-direction standards at
the center of the experience. They’ve been sort of peripheral
and casual in the past, but this is an experience where
this really is a focus of a lot of the learning. And then finally, I’ll point out
– and there’s a lot more than three things, but I want
to point out the three primary evolutions. The third thing is in the time
since we started this work with REL, we’ve adopted
the CASEL competencies. I’m sure many of the folks
in this audience are familiar with them. So, they have their
big five competencies. We’re doing sort of a crosswalk
between the language that we put in our standards, which the REL used in their report, and the language that we're now using. And we find a very tight
alignment there but we’re starting to embed these
competencies in our PLC work we do every week with our faculty
and staff and embedding it into a lot of our classroom
instruction as well. So, those are some of the
things we’ve done since then. I’ll talk a little later about
how this report can inform that.
>>JOSHUA COX: Thanks, Jeff. So, there is a question from Jim
Vetter that we received in the chat and it is since CVSD
has adopted the five CASEL competencies, what led you to
focus on assessing the three specific constructs that you
selected rather than assessing the broader range
of SEL competencies? I think that really kind of
speaks to where CVSD was at the time. I think you were really
focused in on responsible and involved citizenship and self-direction, and then we translated those into the
research literature at that point in time. But since then, CVSD has sort of transitioned to adopting the CASEL 5, and you're making that transition by doing these crosswalks of the terminology. Is that right, Jeff?
>>JEFF EVANS: Yeah,
that’s exactly right. It’s a matter of timing. When we started
the work with REL, we had not yet adopted CASEL. And so, since then, we have been
having those conversations about how to do this crosswalk.
>>JOSHUA COX: Great. The next question I
have is for Cassandra. And so, when we initially
started working together, I believe that you were mostly
focused on measuring SEL in your secondary school students. Do you have plans to track
those skills in other grade levels as well?
>>CASSANDRA TOWNSHEND:
Absolutely, Josh. Yes. We've gained some great momentum in establishing SEL as a district priority, and we really have the need and the desire to expand beyond just the high school. At CVSD our K-8 schools are all
implementing positive behavior interventions and
supports, PBIS. That is a very similar and wonderful framework that we have found really helps our younger students engage in social and emotional learning practices. So, we don't see
PBIS as contrary. In fact, it’s very
complementary to SEL work, especially in our K-8 schools. And so, having that consistency
across our K-8 schools is allowing us, across different buildings, to look at what we are teaching around social and emotional learning and those CASEL 5 competencies, and how we are measuring them. Some of our schools are using universal screeners to identify additional targeted or tier 2 supports for students needing a little bit more. Some of those, for example,
are the Strengths and Difficulties Questionnaire. We've had some schools engage
in that as a universal screener. We’ve also had some of our
schools engage in the Student Risk Screening Scale, which really is a simple tool that helps to identify, as Jeff mentioned previously, those internalizers. When we think about behavior,
it’s not only what we see on the outside but often what’s
really going on internally. So, we’re building on what our
schools are currently doing and then really expanding our
knowledge base as we continue to move forward with SEL at the K-8 level and across the K-12 district.>>JEFF EVANS: I can add
just a little bit there too. Josh, right now our
grades 5-8 all have common learning targets. They’re academic
and social/emotional. So, teachers are continuously
communicating with students and families about those standards
that live within what we call the SEL leaves.>>JOSHUA COX: Thank you. So, I was wondering if you could
talk a little bit about maybe what you learned from the
process of working together on this report, including how you
might use the report’s findings.>>JEFF EVANS: Yeah. If you go to the next
slide, I can address that. Thank you. So, I met with the high school
folks not too long ago and we started looking at the report. We are still in the process of
initially digesting the report and looking at the tools that you reviewed. But some of the possibilities
that have come out of those meetings are as follows. The engagement survey is a pretty fluid construct right now that we’re
constantly revising. We want to use some of this
information in the report to inform revisions as we
move forward with the survey. And I want to add that
the survey used to be high school only. Now it’s being given in our
K-8 as well for the first time this fall. Also, probably more important is that when we get concern scores that I referenced
earlier, we need a response. We need a systemic
response to those scores. And we are hoping that these
tools that you analyzed can help us dig deeper and get more
information about students who have, for lack of a better term,
popped on this survey and gotten on our radar screen in terms
of us building response plans for them. So, we’re really excited about
that because right now it has been a bit of a struggle to know
exactly how to respond and who should respond, but they’re
building those response protocols and looking at the
possibility of using these tools as part of those protocols. The other thing is we constantly
revise the learning targets and scales; we just started this process for RISE. These tools can help
us do that as well. And then we really think it’s
going to be helpful as we continue to look at universal
screeners and how we gather information with our CASEL
work moving forward throughout the district.>>JOSHUA COX: Thanks, Jeff. Cassandra, I was going to ask
a little bit about maybe your future plans related to SEL at
CVSD and I believe that you’re actually working on
some continuous improvement process plans. Is that right?>>CASSANDRA TOWNSHEND: Yeah. That’s accurate. One of the things that we’ve
really worked hard at is making SEL a priority. So, it is actually in our
continuous improvement plan so that we’re all working from
the same lens and priority, because we feel that it is
incredibly valuable to do that. Other future activities
definitely will include involving community, families, and stakeholders in this process. Social and emotional
learning, as we all know, is not just an isolated
fifty-minute course that you take once a day. It’s really embedded in all that
we do and has to be prioritized across the curriculum. One of the things that we’re
really focusing on, now that we have a sense of assessment, is specifically how we are developing district-wide social and emotional learning standards. Because we can
assess all day long, but how do we know we’re
explicitly teaching to those five CASEL competencies? And so that is some of the future work: we’re continuing to work on these SEL standards, as well as providing opportunities for teachers. Professional development
is key for sustainability. This is all of our work. I see one of the questions here
is about, you know, teachers. How do we get teachers
interested in this? Teachers have a lot
to do with this. They greet the students on
day one in the beginning of the class, middle of the
day, at the end of the day. But our job is to make it as
streamlined as possible so that it’s embedded in
everything that we do. It’s not viewed as an add-on. So, we’re continuing to provide
professional development opportunities so that our
teachers feel well equipped to deliver the instruction but also
make some positive relationships with our students as well.>>JEFF EVANS: If I could add
just a couple of things to that too, things that help us
integrate SEL pedagogy without it feeling like an add-on
are things like having consistent protocols for
morning meetings, for instance. So, everybody has got the
same thing happening in morning meetings. Lots of teams have put together
these social thinking units that start the year. And we’ve got teachers who
present to other teachers throughout our district at the
beginning of the year around a lot of these practices
and so we’ve seen these practices increase. Also, in terms of them treating
it like it’s just another thing they’re responsible for, often
they will speak to us about the challenge of classroom
management and an increase in dysregulated students. And we have to keep
bringing it back to, well, it really is about these social
and emotional learning skills and developing these at an
early age and developing and reinforcing them consistently
throughout the system.>>JOSHUA COX: Thank you. So, you’ve already started to
speak about some of the major challenges that you’re dealing
with and I’m wondering if maybe we can just have sort of a two-part
question where you can talk first about maybe some of your
biggest accomplishments related to SEL and second, about maybe
some of the biggest challenges.>>CASSANDRA TOWNSHEND:
Absolutely, Josh. I would say that in
any major initiative, the biggest accomplishment
is to prioritize it at the district level. And we’ve done that. We have a leadership team who
has prioritized this as one of our top three, top four
priorities in our district. We have embedded it in our
continuous improvement plan which is also our guiding
document that we reflect on. Another accomplishment I would
say is that not only have we written this down and
have prioritized it, but we’re
actually acting upon it. In our district
leadership team meetings, we have developed
an SEL design team. That team has been designed and
developed to talk to the folks in our district
about SEL practices. We recently conducted an
inventory of SEL practices to get a better understanding of
what is actually happening in our schools, what is all the
great work that is happening so that we can build
from what’s working well. And so, we continue to do that. That team is also working with
constituents in our district to help develop our social and
emotional learning standards as well. So, we’re not only
saying this is our priority, but we’re actually boots on
the ground and making sure that we’re moving the needle, essentially. Some of the biggest
challenges I would say, Josh, are not dissimilar to the
challenges many districts face in terms of implementing
large scale initiatives. For us, especially in the world
of SEL, and as you mentioned previously, there are multiple frameworks and multiple practices. We started with the
three competencies. Now we’ve adopted the
CASEL competencies. So, there’s lots of movement
back and forth but I do think that for us, we’ve been
fortunate and the state of Vermont has been very
generous in supporting PBIS as a framework not only for
behavior but for embedding best evidence‑based practices
across all things behavioral, social, and emotional as well. And so that’s been
something that we’ve been able to benefit from. Again, one other challenge that
I would highlight too is just building a
consistent understanding. Various people have different
understandings of what SEL is and how it should
be implemented. You know, it’s a paradigm shift. This is the work for
all of us in education. As Jeff mentioned previously, we
are seeing more students coming to school
dysregulated, unprepared, not necessarily having the
social and emotional readiness for school. And so, as a result of that,
our mission is to really have our students be positive and
responsible, involved citizens, and be self-directed. And so, it’s really incumbent
on us to focus on social and emotional learning so
that we can get to that end. And we’ll continue
to work on that.>>JEFF EVANS: I can add a
challenge that’s been pretty specific to the high
school because in K-8, there’s not a tremendous emphasis on measurement and grades
because you’re not creating a transcript. Once you’re in high school,
you’re creating this permanent transcript that is used to
demonstrate your college and career readiness. And so, one of the big
challenges for us in the high school and one of the reasons we
started this relationship with REL was that there was so much
variability in how teachers viewed students’ self-direction
skills, collaboration skills. It was incredibly inconsistent. And some teachers were really
struggling with creating that causal relationship between
social and emotional skills and academic performance. So many times, I would have
conversations with folks that would say, listen, now that we’ve separated in our grading system academic
performance from our habits of learning, I’ve got kids who
are still pretty successful and proficient in academics but
are woeful in some of their self‑direction skills. And my response is there
really ought to be that causal relationship where it should
be almost impossible to be academically proficient if
you are woeful in social and emotional learning. So, there has been a lot of
professional development. There has been a lot of work
done on unifying our view and message around how
to instruct, support, and assess the social
and emotional pieces. We still have a long way to go
there because there’s much greater
variability in how we do those than there is in how we
view our academic standards.>>JOSHUA COX: All right. I want to thank both you,
Jeff, and Cassandra for sharing those responses. With that, I do want to just
turn this over to Julie Riordan who is going to facilitate a
sort of Q&A with the audience with myself, Jeff, and Cassandra, using those questions we’ve received. I’ll turn it over to you now, Julie.>>JULIE RIORDAN: Great. Thanks, Josh. Thanks, everyone, for
keeping the chat active with your questions. So, I will try to get through
all of them if we can in the next few minutes. So, Paul Smith had a question
about using the surveys for screening purposes. I think this was when we had our
poll up about the systems for using these instruments. So, I’ll ask Josh this question,
if he knows anything about, you know, whether any of these
instruments have been used for screening purposes,
but at the same time, ask folks in the audience to
also let us know if you have used instruments in such a way. Josh, do you have any
information about whether these have been used as
screening tools?>>JOSHUA COX: Yeah, I’m not
sure if they have been used as screening tools. I can just say that, you know,
the measures included in this resource measure a student’s competence in three specific social and emotional learning skills. Universal screening typically looks at a comprehensive picture of a student’s social and emotional competence; targeted screeners, on the other hand, really look at a student’s competence in specific skills. Really, in this search, we
didn’t specifically set out to exclude universal screeners. We would have certainly
been open to including them, provided they met our
eligibility criteria. I will just call out that
the reason that many of these screeners, such as the Devereux Student Strengths Assessment, fail to meet our eligibility criteria is that they’re not available online
for free or at a low cost. And so, you know,
that being said, I just want to note that if
you’re using the universal screener and
you’re happy with it, I would invite you to share
the tool you’re using in chat so that others can learn
from your experience.>>JULIE RIORDAN: Great. Thanks, Josh. So, we got a couple questions
about professional learning and professional development. Diana asks whether or not any
of these instruments provide professional development
materials to help educators learn more, and Russel Montgomery asks whether any professional learning has been developed for
teachers to build social and emotional learning in students. So, that’s a more general
question, but maybe, Josh, you could start, and then Jeff and Cassandra can chime in
if you have anything to add.>>JOSHUA COX: So, this isn’t
a piece that we specifically looked at when we were reporting
out on the information related to the instruments. So, that being said, I think I
would turn it over to Jeff and Cassandra to see whether the instruments that you’re currently using have professional development materials available for them.>>CASSANDRA
TOWNSHEND: Yeah, Josh. So, some of the
materials, absolutely. Especially with PBIS. So, again, thinking about PBIS
as complementary and really the framework in which to embed
many of our SEL practices. So, for a lot of the evidence-based practices in our district, one in particular that we subscribe to is Second Step. Many of our schools – and we’re
trying to build consistency around this in our K-8 system. So, with this, we’re making sure that schools are attending to social and emotional learning, and Second Step is one of the evidence-based curricula out
there that can provide great resources,
professional development. We also have our school
counselors who often will co-teach with classroom teachers
as a way of modeling how to talk about social and emotional
learning competencies. Again, building the knowledge
base of our teachers who do have a lot on their plate. How do we make it easier for
them to understand and also realize this is just part of
what we do as educators when we’re doing math instruction? We do need to talk about
self‑regulation if we’re going into a small group activity. So, yeah. As a district, we’ve also subscribed to Responsive Classroom and Developmental Design, which are also complementary under our PBIS umbrella. Jeff spoke about
the advisory model. We use that in many
of our middle schools, making smaller communities
with multiple staff members and teachers across subject areas. And that’s really been a great
avenue and venue for teaching of some of the social and
emotional learning competencies. I would also
recommend CASEL’s website. It has great resources as well, not only on what your priorities are as a district, which it can walk you through, but also some great resources that other schools have been willing to share around professional development materials for teachers.>>JULIE RIORDAN: Great. Thank you, Cassandra and Jeff. We have another couple
of questions around the student population. So, someone asks have you seen
any implications for using these measures with your
students with disabilities? Are there frameworks that are
particularly better or less suited to that population? So, that’s one question. Another is I’m interested to
know if any of your schools or districts include
student teacher experiences at residential outdoor
learning centers. That is a more specific one. And then a third which is about
whether or not these instruments are mostly used for typical
students or students with special needs. So, Cassandra and Jeff, I think this might be one for you all to take first. And then Josh, if you have
anything that relates to that.>>CASSANDRA TOWNSHEND: In terms
of the question about students with disabilities, one of the
things that we really focus on is access to all. All means all. So, all students, regardless of disability, have access to all universal curricular supports. What we do, though, is in a
situation as we would with any student who has
different learning needs, is we look at the content and
would differentiate the content to meet those students’ needs. Again, we believe that all students – all means all – have access to all universal supports, and with students with diverse learning needs it’s important to really focus on that differentiation. That’s where we would spend most of our energy if that were the case.>>JULIE RIORDAN: Great. Thank you, Cassandra.>>JOSHUA COX: Yeah. I would just add that, so,
earlier in the presentation I mentioned that only three of
the sixteen instruments had information on fairness. Fairness information essentially indicates whether a measure is valid for comparing scores between different subgroups of students – in other words, whether a measure, like I said before, was biased against some groups of students. And so, the fact that only three of the sixteen instruments had information on fairness is something we should be mindful of; instrument developers putting instruments out there should, in fact, make sure that these are valid for all students.>>JULIE RIORDAN: Great. Thanks. And so, we did have
another question about school counselors also. Melissa Mariani asks, what
about school counselors? Did the district use any
SEL specific programming or curriculum?>>CASSANDRA TOWNSHEND:
That’s a great question. So, absolutely. Second Step is something
we have used as a district. One of the things that has
been really helpful is our school counselors in our district and their understanding of ASCA and the National Standards for School Counseling. They’ve been able to help us think about how to create learning targets for students at different grade levels based on our graduation standards. And so, that’s something that
we’re continuing to work on. But they’ve been really
helpful in that aspect of it. The Second Step is huge. I think that’s probably the
biggest one that we’re using right now in our district. Every school does have a lot of autonomy, and so the population that they’re serving may dictate what specific curriculum they choose.>>JULIE RIORDAN: Great. Thanks for that. We have a question
about the SEL program and evidence-based practices
that high schools are using, so somewhat related. I think it sort of
touches on that as well. So, related to Second Step, that other question asks: can you speak to the student engagement assessment? What does it look like, and how did you determine the questions? How is the data compiled?>>JEFF EVANS: Yeah, so I think
the original intent was to – actually, let me
back up a little bit. A couple years ago, I attended
a workshop on gathering data to inform school improvement and
presenters there highlighted four categories that you should
be gathering and analyzing data on all of the time and the four
categories are demographics, student learning, school
processes, and perception data. And so, perception data was
defined as how do students, staff, community members, family
members perceive the experience they’re having, the value of
the experience they’re having. And so, we as a district, when
we started analyzing these four categories, realized we were
pretty ill-equipped in the perception data realm. And so, what we did was we
looked at what do we have where we actually asked the students
and families how they feel about these experiences. It was pretty inconsistent
across the district. It wasn’t really used in any
kind of way that was advancing our practice. And so, the high school said
let’s start with perception data from our students first. And so, they built this. A team of educators
built this survey. They wanted it to
not be very long. So, I think it’s about
nineteen or twenty questions. It doesn’t take that long to do. It asks a range of questions
like I have at least one adult in the building whom I trust;
if I am having any trouble, I can go to them. My life outside of
school is manageable. The classes I’m most engaged
in are X and here’s why. Things like that. So, we really started by
asking a really wide range of questions. And now what we’re doing is
we’re trying to revise that tool more. But we built it
pretty organically. We looked at developmental
asset screeners. We borrowed some of the
language from them as well. But we really wanted it to be
an initial opportunity for us to give kids a voice and to
recognize some things in students that we probably
weren’t recognizing in the past because we, quite frankly,
weren’t asking them enough or efficiently enough.>>JULIE RIORDAN:
Thanks, Jeff, for that. Well, listen. I think we almost got to all of the questions here
and a few of you answered each other’s questions in the
chat, so it was great to see that as well. I want to thank you
for joining us today. As I said earlier, today’s
webinar recording will be archived and uploaded to
IES’s YouTube channel. If you registered
for today’s webinar, you will receive an email from
us with a link to the recording when it is ready. Just a reminder that
we’d like your feedback. So, please take our survey. Jenny has posted it in the chat. This kind of feedback helps us
produce events that are most engaging for all of you. So, we really appreciate
your feedback on that. Again, thank you for your time
and if you have any further questions that we didn’t get
to, please feel free to contact today’s presenters, Joshua Cox,
Jeff Evans, and Cassandra Townshend. Their emails are here on the screen and, again, will be on the slide deck, which you can download.