Frequently Asked Questions – School-Age

Welcome to our Frequently Asked Questions for Devereux Center for Resilient Children’s (DCRC) School-Age Initiative, Resources, and Professional Development opportunities!

Should you have general questions about DCRC’s philosophy, or questions that span age ranges for our Resources and Professional Development, visit our Main FAQs page here.

If you have a question about DCRC’s School-Age Initiative that is not included below, please submit your question here!

  • Q1. How well do you need to know the child to complete the DESSA or the DESSA-mini? Does this differ among various types of programs? How does one's knowledge of the child (or lack thereof) impact the rating?

    Answer: A rater must have sufficient opportunity to observe the child’s behavior over the four weeks prior to completing the DESSA. Therefore, we recommend that raters have contact with the child for two or more hours per day, at least three days per week, over a four-week period. This translates to approximately 24 hours of exposure to the child. This guideline was determined through feedback from teachers during the development of the DESSA.

    It is important to keep in mind that this recommendation is only an estimate. Especially in after-school settings, one must be aware of the many factors that shape a rater’s exposure to a child, such as staff-child ratios and the types of interactions in the program (e.g., homework help, large- or small-group activities, free play). Therefore, a rater who has less exposure than the manual recommends may still know a child well enough to complete the DESSA accurately, depending on these factors.

    Because the scores are based on the number of times specific behaviors have been noted, a rater’s insufficient opportunity to observe the child could lead to an erroneously low rating.
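    The recommended exposure above is a simple product of hours, days, and weeks. As a minimal sketch (the function name and defaults are illustrative, not part of the DESSA materials):

```python
def estimated_exposure_hours(hours_per_day=2, days_per_week=3, weeks=4):
    """Approximate a rater's total observation time under the guideline:
    at least 2 hours/day, 3 days/week, over a 4-week period."""
    return hours_per_day * days_per_week * weeks

print(estimated_exposure_hours())  # 24 hours
```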

  • Q2. What factors impact the integrity of the data? For example, if a staff member or parent is having a difficult day with the child, does this impact the rating?

    Answer: Because the DESSA and other parent, teacher, or staff report measures are based on the perspective of the rater, it is expected that responses may be influenced by the experiences a rater has with a child. To reduce this rater influence, it is important to carefully explain the purpose of the rating and how it will be used. It is also important to review the general administration guidelines with raters (p. 39 of the DESSA-mini Manual or p. 49 of the DESSA Manual).
  • Q3. Are there concerns if the pretest and posttest ratings are completed by two different staff members? Staff transience may limit continuity among pre/post raters.

    Answer: Ideally, the same rater should complete both the pretest and posttest ratings. However, if this is not possible, some variation in scores can be expected when two different staff members provide the ratings. If the ratings are tracking an individual child’s progress, it is important to consider possible rater differences when interpreting results. If the ratings are being used for program evaluation, fluctuations in scores due to different raters are expected to cancel out when combined with the ratings from an entire program. Also, as mentioned above, reviewing the purpose of the assessment and the general administration guidelines will produce more consistent use across raters.
  • Q4. What kind of training does one need to complete the tool? Do the various options on the Likert scale need to be defined?

    Answer: The training required to complete the DESSA or the DESSA-mini is minimal. Raters need to be aware of the general administration guidelines for completing the assessments. This type of training would take no more than 10 minutes.

    The various options on the Likert scale do not need to be defined for the rater. When individuals complete the assessments, they are reflecting back on a child’s behavior over the past four weeks and selecting the option based on their general impression. Providing a definition of the options would not necessarily make the ratings any more accurate, given that raters will ultimately base their rating on their recollections and general impression. For example, even if “frequently” were defined as once a day, raters do not actually record the number of times they see the behavior and then go back over the previous four weeks and calculate the average number of times a child engaged in that behavior per day.  They still rely on their general impression of the child’s behavior.

    Additionally, not all the behaviors on the assessments have the same base rate. In other words, children generally engage in some of the behaviors frequently, while others are rarer occurrences. Raters seem to take these differences into account when completing the assessments. Support for this comes from the high reliability of the assessments, meaning that raters generally complete them in a consistent manner without having the Likert scale options defined.

  • Q5. Is it necessary to define the DESSA or DESSA-mini items for the raters? Would it be appropriate to develop a local rubric to help staff understand how the items might look different for a first grader versus an eighth grader?

    Answer: It is not necessary to define the items of the DESSA or DESSA-mini for raters or to develop a local rubric. Doing so would actually make the assessments less valid, as they were developed to be completed using only the instructions at the top of each assessment. During development, we ensured that each of the items could be answered by teachers, after-school staff, and parents of children in grades K-8. Additionally, we determined that raters typically understand the developmental level of the child for whom they are completing the assessment and keep this in mind as they complete a rating.
  • Q6. Can we combine/aggregate both parent and staff responses?

    Answer: No. Each rating provides the unique perspective of the rater. A child’s behavior may differ depending on the environment, and parents and staff may observe genuine differences in the child’s behavior. We would not recommend combining these responses because meaningful differences in behavior may be lost. These differences in the child’s behavior across settings, activities, and adults can help us better understand the child’s social and emotional strengths and needs.
  • Q7. Are there concerns regarding quality control? Will agency staff be more likely to reflect youth more positively because the information is reported to a funder?

    Answer: It is important to communicate to staff both the purpose of completing an assessment and how the information from the assessment will be used. If staff members incorrectly view the assessment as a reflection on their own performance, they may answer inaccurately. Therefore, it is important to clearly articulate the purpose and use of the assessment before the ratings are completed.
  • Q8. What programs have the best capacity to incorporate the DESSA when considering retention and program dosage?

    Answer: Many factors play a role in a program’s ability to incorporate the DESSA as a planning tool. First, a program should have manageable staff-child ratios. Second, it will be easier to incorporate the DESSA in programs that provide a variety of opportunities for children to demonstrate social and emotional skills. Third, programs with experienced, trained staff members who are already mindful of children’s skills and interactions may have a greater capacity to incorporate the DESSA. In terms of retention, programs with lower staff turnover are better able to provide the ongoing support to children that was planned with the DESSA. Finally, we believe that program dosage would not impact a program’s ability to incorporate the DESSA; however, dosage may impact the outcome of any strategies or interventions planned.
  • Q9. What are the guidelines for sharing results with parents?

    Answer: When sharing results with parents, always start the conversation with a description of the purpose of the DESSA and how the information is being used. Then, we recommend focusing on the child’s strengths, if any are apparent. Both the DESSA and the DESSA-mini are strength-based assessments, meaning that all the behaviors they measure are positive behaviors children engage in. Starting with the child’s identified strengths often puts parents at ease during the meeting. Next, the child’s typical areas should be shared, and finally any identified areas of need. It is important to discuss any needs as areas in which staff and parents will provide additional support and instruction, not as deficits, and to be mindful of the language used in any conversation with parents. Most importantly, it should be stressed that the purpose of the DESSA is not to categorize or label children, but to identify their strengths and needs so that parents and teachers can work together to help the child acquire social and emotional skills that are essential to success in school and life.
  • Q10. Can you explain what population the norms for the DESSA are based on and how recently the norms were collected?

    Answer: The DESSA was nationally standardized in the U.S. in 2005-2006. Ratings were collected from two groups: 1) parents and other relatives living with the child, and 2) teachers and after-school program staff. The norming sample closely approximated the Kindergarten through 8th grade population of the U.S. (in 2006) with respect to age, gender, geographic region, race, ethnicity, and socioeconomic status. The final standardization sample included ratings on 2,494 children (parents provided ratings for 1,244 children; teachers/after-school staff provided ratings for 1,250 children). Full information on the demographic characteristics of the standardization sample is provided in the DESSA manual.

  • Q11. Can a parent provide ratings on the DESSA-mini?

    Answer: No, the DESSA-mini is only intended to be completed by teachers and after-school program staff. The DESSA-mini was only standardized using ratings from teachers and after-school program staff and therefore, only norms for this group are provided for scoring and interpretation. Because parents would be providing information only on one or a few children, we would recommend that they use the DESSA.
  • Q12. What are the ratings within each construct - typical has multiple numerical values (2 and 3) and need has 0, 1, 2? How does that work? Where do the numbers come from? What does that mean?

    Answer: This is individual item analysis. We reviewed and analyzed the ratings of approximately 2,500 students on each individual item. As you would expect, most students received ratings in the middle of the scale (the typical range, usually 2s or 3s), with fewer children receiving high scores (strength range) or low scores (need range). The exact values vary a little from item to item because the actual frequency of those behaviors varies in children. For instance, most children will “Frequently” (score of 3) follow rules (item 35), but most children only “Occasionally” (score of 2) do chores without being reminded (item 23). The key point is that we empirically determined what range of scores represented a need, a typical score, or a strength for a child on each item. In general, the lowest 16% of scores on an individual item was deemed the need for instruction range, the middle 68% was identified as typical, and the top 16% was labeled the strength range.
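    The 16/68/16 split can be illustrated with a small sketch. The ratings below and the cut-point logic are hypothetical stand-ins for the actual item analysis, which used the full standardization sample:

```python
# Hypothetical ratings (0-4 scale) for one item across 100 children.
ratings = [0] * 3 + [1] * 13 + [2] * 34 + [3] * 34 + [4] * 16

def cut_points(scores, low_pct=0.16, high_pct=0.84):
    """Return the score values sitting at the 16th and 84th percentiles."""
    ordered = sorted(scores)
    n = len(ordered)
    return ordered[int(low_pct * n)], ordered[int(high_pct * n)]

need_cut, strength_cut = cut_points(ratings)
# Scores below need_cut fall in the "need for instruction" range,
# scores at or above strength_cut in the "strength" range,
# and everything in between is "typical".
```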

    The importance of this is in planning to help the child. By looking at the individual items, you can determine what specific skills the child is struggling with (i.e., a need) and what skills the child is really good at (strengths). Using our “Strengths-Needs-Strategies” planning framework, you can ask the child, parent, or teacher which need items are of most concern. Then identify relevant strengths that the child can use to address the need, and develop a strategy. For instance, in one of our case studies, the child struggled with (had a need, or a very low score, on) item 69, “use available resources (people or things) to solve a problem.” However, he had a strength on item 11, “get along with different types of people,” item 16, “say good things about his/her classmates,” and item 22, “contribute to group efforts.” Our strategy was to remind him of how well he works in groups and gets along with others and suggest that the next time he is stuck trying to solve a problem, he could ask one of his friends for help. It is important to note that we always start with the child’s strength when presenting the strategy. So we might say something like, “You know, Charles, you are really good at getting along with others. You are always saying nice things about your friends and you work so well in groups. I bet the next time you are stuck trying to figure out a problem, you could ask one of your friends for help. You are always so nice to them and work so well in groups with them that I bet they would be happy to help you.”

    Finally, as is probably obvious by now, the individual item analysis is what enables the transition from an assessment (a score) to a strategy to help the child that is based on relevant, meaningful, empirically identified strengths. As the National Association for the Education of Young Children says, “Assessment only has value if it leads to an improved outcome for the child.”

  • Q13. Since the DESSA covers K-8, what would you suggest for our high school students? Could we use the DESSA with them if there are no other appropriate assessments?

    Answer: DCRC would not recommend using the DESSA with high school students; we have no norms for that group. We do plan to develop a self-report form for secondary school students soon. In the meantime, two tools to consider for this age group are:

    • “Resiliency Scales for Children and Adolescents” by Sandra Prince-Embury, published by Psychological Corporation
    • “BERS-2” by Michael Epstein, published by Pro-Ed.

  • Q14. What are some ways I can look at outcomes using the DESSA-mini?

    Answer: There are at least three ways to look at outcomes on the DESSA-mini:

    1) Examine percentages: With this approach, you examine the percentage of students falling within the strength, typical, or need for instruction ranges on their Social-Emotional Total (SET) score at pretest and posttest. In a way, this approach is the simplest and most informative. Ultimately our goal is for children to have social and emotional strengths: if we can shift the program or school as a whole toward fewer needs and more strengths, we are doing well. The limitation of this approach is that small changes can be over-interpreted. That is, if a student begins with a score of 59 and ends with a score of 60, the student will appear to have moved from the typical to the strength range, but a 1 T-score point change could be accounted for by measurement error.

    2) Use the progress-monitoring approach (Cohen’s d-ratio) described in the DESSA-mini manual. This approach allows you to characterize the amount of change between successive DESSA-mini administrations as no, small, medium, or large change using widely accepted guidelines.

    3) Use inferential statistics: Standard pretest-posttest comparisons can be done using statistical software that allows you to conduct t-tests or ANOVAs. These tests are useful in determining whether statistically significant change has occurred for groups of children, such as a class, program, or school.
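    As a rough illustration of approach 2, here is a generic pooled-standard-deviation Cohen’s d with Cohen’s widely cited benchmarks (0.2 small, 0.5 medium, 0.8 large). The scores are hypothetical, and the DESSA-mini manual’s exact formula and cutoffs may differ from this sketch:

```python
from statistics import mean, stdev

def cohens_d(pre, post):
    """Standardized mean difference between pretest and posttest ratings
    (pooled-SD form; the manual's exact procedure may differ)."""
    n1, n2 = len(pre), len(post)
    pooled_sd = (((n1 - 1) * stdev(pre) ** 2 + (n2 - 1) * stdev(post) ** 2)
                 / (n1 + n2 - 2)) ** 0.5
    return (mean(post) - mean(pre)) / pooled_sd

def describe(d):
    """Label an effect size using Cohen's conventional benchmarks."""
    d = abs(d)
    if d < 0.2:
        return "no change"
    if d < 0.5:
        return "small"
    if d < 0.8:
        return "medium"
    return "large"

# Hypothetical pre/post T-scores for a small group of students.
pre = [42, 45, 48, 50, 44, 47]
post = [46, 49, 52, 55, 47, 51]
d = cohens_d(pre, post)
print(round(d, 2), describe(d))
```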

  • Q15. What are the computer-generated options for DESSA and DESSA-mini? How would someone receive the computer-generated report and its recommendations?

    Answer: The DESSA and DESSA-mini are now being offered by our NEW school-age publisher, Apperson, Inc.! Apperson’s S.E.L.+ Compass product allows you to administer and score your DESSA assessment online. By taking your assessment online, you will no longer need to worry about tracking down the paper assessments, hand-calculating the scores, filing them, and creating your own reporting. Apperson offers two subscription types via our S.E.L.+ Compass product: a baseline needs assessment and a comprehensive assessment system. For more information about these web-based resources, click here.
  • Q16. Who helps design and implement strategies to support children’s DESSA results?

    Answer: Generally, strategies are chosen and implemented by staff on site. This may include teachers, program directors, counselors, or other school/program staff.  An essential part of our approach is that strategies to promote social and emotional competence are woven into the everyday activities that teachers and staff are already engaging in.  Parents are involved in the planning process whenever possible, especially if they also completed a DESSA.
  • Q17. Should teachers and staff administer the DESSA or DESSA-mini to all students 4 times per year? Or just administer the mini to students at the beginning of the academic year to determine who needs the full DESSA and follow up (monitor progress) with those students who have exhibited problems? Can you clarify?

    Answer: Generally the DESSA-mini would be administered to all students at the beginning (time 1) and at the end (time 4) of the year. DESSA-minis are typically used at times 2 and 3 only with those students receiving targeted strategies, to monitor their progress in acquiring social and emotional competence. Some schools and programs have chosen to use the full DESSA with every student at the beginning of the year rather than the DESSA-mini. Often the reason is that the teachers value the wealth of information provided by the classroom profile and feel that it is worth the extra time required to complete the full DESSA.
  • Q18. Some of our programs serve children with social and emotional disabilities. Any advice for interpreting results on the DESSA?

    Answer: By definition, children with emotional disabilities, as a group, are going to have lower social and emotional competence. In our criterion validity study included in the DESSA-mini manual, the average T-score across the four minis for a seriously emotionally disturbed (SED) sample was 39 with a standard deviation of about 6.  However, it is not uncommon for students who have been identified as having social and emotional disabilities to have one or more scale scores that fall in the typical or even the strength range.  These strengths are important in the planning process.
  • Q19. Can we fill out the 72 item DESSA as a team together (4-5 teachers that have the student) or would this impact the validity of the assessment?

    Answer: It is not recommended that your team fill out the DESSA together; instead, we recommend having the person who knows the child best complete it. Another alternative is to have a few team members who know the child well complete the DESSA individually and then discuss the results. The reason is that there is great value in getting the unique perspective of each rater. The child’s behavior may be quite different in the presence of different people, so individual ratings would reflect these differences and provide richer and more useful information to the team. Additionally, if the DESSA is filled out as a group, a person who sees the child’s behavior differently may go along with the rest of the group rather than voice their unique perspective.
  • Q20. What amount of a change in a student’s DESSA and or DESSA-mini results should our program expect?

    Answer: The amount of change in students’ scores depends on a number of factors, including how much risk and adversity the child is experiencing, the fidelity with which social and emotional learning strategies are implemented, the consistency of the raters, etc. However, in general, we typically find that a good program doing a decent job of implementing social and emotional learning activities sees on average a 3-5 T-score point change across one program year. Another way to consider this is that a common rule of thumb in the social sciences is that a change of only 1 T-score point is negligible or meaningless; a change of 2, 3, or 4 points is a small change; a 5, 6, or 7 point change is a medium change; and any change of 8 points or more is a large change. Usually, social science and education folks are very happy to get a medium change over a one-year period. Click here to access a template that calculates DESSA and/or DESSA-mini score changes.
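    The rule of thumb above maps directly onto a small helper. This is only an illustration of the thresholds stated in the answer, not part of the DESSA materials:

```python
def describe_t_change(pre_t, post_t):
    """Classify a T-score change: 1 point or less is negligible,
    2-4 small, 5-7 medium, 8 or more large."""
    change = abs(post_t - pre_t)
    if change <= 1:
        return "negligible"
    if change <= 4:
        return "small"
    if change <= 7:
        return "medium"
    return "large"

print(describe_t_change(46, 50))  # a 4-point gain: "small"
```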
  • Q21. Can the results of the DESSA-mini be used to determine different areas of a student's social and emotional competence on which to focus?

    Answer: The DESSA-mini does not include one question from each scale of the full DESSA; the fact that there are 8 scales on the DESSA and 8 questions on the DESSA-mini is a coincidence. In addition, as a screener, the DESSA-mini is not designed to provide data that would indicate exactly which scale(s) of the full DESSA you need to focus on. Rather, if the student’s DESSA-mini score falls in the need range, you can follow up with a full DESSA, and it is at that point that you can determine the areas of focus.