Frequently Asked Questions – School-Age

Welcome to our Frequently Asked Questions for Devereux Center for Resilient Children’s (DCRC) School-Age Initiative, Resources, and Professional Development opportunities! Should you have general questions about Devereux Center for Resilient Children’s (DCRC’s) Philosophy, or questions across age ranges for our Resources and Professional Development, visit our Main FAQs page here. If you have a question about DCRC’s School-Age Initiative that is not included below, please submit your question here!


    DESSA – Assessment – General

  • Q1. What is the DESSA Comprehensive System?

    Answer: The DESSA Comprehensive System is designed to help school and after-school program staff promote the healthy social-emotional development of all children. The Comprehensive System involves first using the 8-item DESSA-mini to screen the social-emotional competence of all children in a class, school, or program.  Children who are identified as having low social-emotional competence scores on the DESSA-mini should be considered for additional instruction.  The second step of the Comprehensive System involves assessing those students who receive low scores on the DESSA-mini with the full 72-item DESSA.  The DESSA provides more detailed information about a child’s strengths and needs across eight social-emotional competency domains which can be used to help individualize instruction.  As instruction is provided, the alternate forms of the DESSA-mini and the Ongoing Progress Monitoring Form can be used to monitor the progress children are making in acquiring social-emotional competence.  Additionally, both the DESSA and the DESSA-mini can be administered as pretest and posttest measures to evaluate individual child or program level outcomes.
  • Q2. What is the difference between the DESSA and the DESSA-mini?

    Answer: The DESSA and the DESSA-mini are measures of social-emotional competence of children in kindergarten through 8th grade. The primary difference between these two measures is that the DESSA-mini is designed to be a screener of social-emotional competence. A screener allows for a quick and efficient snapshot of one or more children’s overall social-emotional competence.  As such, it can serve as a baseline needs assessment tool to determine how many students within a population might be in need of social-emotional instruction. Additionally, the DESSA-mini can be used as a repeated measure for use in ongoing evaluation of improvements in overall social and emotional competence, as well as serve as an overall outcome measure for social and emotional learning program impact.  The full DESSA is designed to be a more detailed assessment of social-emotional competencies, providing more specific information about the strengths and needs of children across eight social-emotional competency domains.  The full DESSA provides the user with information that can directly lead to individualized instruction for children identified as needing additional social-emotional support across the eight domains.  Like the DESSA-mini, the DESSA can also be used to examine changes in social-emotional competence over time and as an overall outcome measure for social and emotional learning programs.
  • Q3. How were the items and scales developed for the DESSA?

    Answer: To create the items and scales for the DESSA, the authors began with a thorough literature review on resilience, social-emotional learning, and positive youth development, followed by an examination of other strength-based assessments, such as the Devereux Early Childhood Assessment (DECA). Next, the readability and usefulness of the items were examined during a national pilot study.  Upon obtaining a pool of strength-based, observable items, national standardization occurred.  Utilizing the standardization dataset, the DESSA items were organized into logically derived and statistically validated scales based on the CASEL Framework. The authors chose this framework because it is well established in the research literature and is being incorporated into many state and local social and emotional learning standards. Some minor adjustments were made, such as the inclusion of optimistic thinking as a construct.  Reliability and validity studies were then conducted.  The DESSA ended up with 72 items organized into 8 scales. For more information, please refer to pages 13-20 in the DESSA Manual.
  • Q4. How were the items on the four forms of the DESSA-mini selected?

    Answer: The eight items on each of the four forms of the DESSA-mini were selected from the 72 items on the DESSA using three criteria. First, it was important for the total score on the four sets of items (the Social-Emotional Total T-score) to correlate as highly as possible with the Social-Emotional Composite (SEC) T-score on the full DESSA. This is because the primary purpose of the DESSA-mini as a screener is to predict the score that the student would receive on the full DESSA assessment. To meet this goal, items with the highest correlations with the DESSA SEC T-score were rank ordered, and the first 32 items (eight items for each of four forms) were chosen.  Second, it was important for the four DESSA-mini forms to have as few items as possible while still ensuring high reliability and predictive validity; therefore, eight items were selected for each form.  It is important to understand that the eight items per form have nothing to do with the eight DESSA scales. Rather, these 32 items correlate the highest with the SEC T-score.  Third, it was important for the four DESSA-mini forms to yield the same score for a given child, indicating that they are equivalent and can be directly compared.  Combinations of the 32 items were tested until the means, standard deviations, and internal reliability coefficients were sufficiently similar to determine the final composition of the four DESSA-mini forms.
  • Q5. Do these assessments come in Spanish?

    Answer: Yes. A standardized Spanish version of the DESSA is available and can be scored and interpreted in the same way as the English version.
  • Q6. Can the DESSA be used in brief out-of-school-time programs, such as summer programs?

    Answer: Yes, a number of brief out-of-school time programs are currently using the DESSA.  Devereux recommends that programs such as these consider providing additional instruction to staff prior to using the DESSA in order to enhance staff’s comfort and reliability with completing the ratings.  In addition to discussing with staff the purpose, use, and general administrative guidelines for completing the DESSA (pg. 39-40 of the DESSA-mini Manual or pg. 49-50 of the DESSA Manual), it may also be useful to talk to staff about how to prepare for a rating during pre-service training.  For example, it may be helpful to spend time reviewing the DESSA items at the very beginning of the program so that staff know what kinds of behaviors they will be asked to rate.  Additionally, Devereux suggests that staff intentionally observe their students during a variety of different tasks and settings (e.g., small and large group activities; challenging and easy tasks; cooperative learning situations; etc.).  That is, staff should, to the extent possible, arrange a variety of different opportunities to observe the students’ social and emotional skills prior to completing a rating.
  • Q7. What programs have the best capacity to incorporate the DESSA when considering staff retention and program dosage?

    Answer: Many factors play a role in a program’s ability to incorporate the DESSA as a planning tool.  First, a program should have manageable staff-child ratios. Second, it will be easier to incorporate the DESSA in programs that provide a variety of opportunities for children to demonstrate social and emotional skills. Third, programs that have experienced and trained staff members who are already mindful of children’s skills and interactions may have a greater capacity to incorporate the DESSA.  In terms of retention, programs with lower staff turnover would be better able to provide ongoing support for children who have received planning with the DESSA.  Finally, we believe that program dosage would not impact a program’s ability to incorporate the DESSA; however, dosage may impact the outcome of any strategies or interventions planned.
  • DESSA – Administering the Assessment

  • Q1. Who can complete the DESSA and DESSA-mini?

    Answer: The DESSA can be completed by parents (including guardians or other adult caregivers who live with the child) and teachers (including after-school program staff or other professionals who interact directly with the child on a regular basis). The DESSA-mini can only be completed by teachers and after-school program staff.  Devereux recommends that parents use the full DESSA because they would only be providing information on one or a few children, rather than assessing an entire classroom of children.
  • Q2. Who should serve as the DESSA rater in a team teaching situation?

    Answer: Devereux recommends that the teacher who knows the child best complete the DESSA rating.  An alternative to this approach would be to have multiple team members complete the DESSA individually, as long as they know the child well, and then come together as a group to discuss the child’s results.  A child’s behavior may be quite different in the presence of individuals interacting with the child during different situations and experiences. Ratings completed separately may reflect these unique perspectives and provide richer and more useful information to the team.  When two teachers complete separate ratings of a child, the rater comparison table (found on page 74 in the DESSA Manual) can be used to determine where teachers see significant differences in children so that the information can be discussed and used in planning.
  • Q3. Is it appropriate to use the DESSA with high school students?

    Answer: Devereux would not recommend using the DESSA with high school students, because it was developed for students in kindergarten through 8th grade and we do not have norms on a high school sample. We are currently in the process of developing a high school version of the DESSA. If you would like to help us develop this assessment by participating in the national standardization, please visit this link:
  • Q4. Should the DESSA and DESSA-mini be used with children who have special needs?

    Answer: We encourage schools and programs to use the DESSA and DESSA-mini with all children in the class, not just those already showing behavioral concerns and not excluding those with specific disabilities. While we promote the use of the DESSA/DESSA-mini with all children, we also caution programs that staff need to be sensitive to specific situations and not use information gained from the DESSA in ways that would be harmful or detrimental to a child and his/her family. For instance, if a child with severe disabilities scored entirely in the area of need on the DESSA scales, the individual interpreting and sharing that information with families needs to be sensitive to the child’s particular condition. It should be noted that many developmental disabilities do in fact impact social and emotional development, and therefore in many cases we would expect to see lower DESSA scores. This information is still valuable, as it will likely support goals already identified as well as offer a strength-based method for setting goals.  The information gained through the DESSA should be considered one piece of a much larger system of information gathering and assessment. Results should be communicated with families and used to develop comprehensive plans that identify goals and strategies for all the areas of need, including those that will promote social-emotional competencies for the child. Additionally, for children who score in the need range across the board on the DESSA scales, we recommend looking at the individual items on the scales to determine the child’s relative strengths and goals. This information, as well as continuous observations, can be very useful in the planning process.
  • Q5. Is it appropriate to translate the DESSA Record Form for parents who do not speak English or Spanish, or who speak limited English or Spanish?

    Answer: Devereux recognizes that many schools and programs serve children whose parents do not speak English or Spanish well enough to complete the standardized English or Spanish version of the assessment. Devereux values and encourages the involvement of parents, and a translator can be used to ask the parents the DESSA items.  However, we recommend that any DESSA rating obtained through a translator should not be scored.  This is because we are not sure how the translation would affect the scores, since it is no longer standardized and we would need to conduct an equating study to really understand the impact on scores.  Instead, the rating should be used to gather qualitative information to gain a better understanding of the child. When a translator is used, it is recommended that the translator consider any cultural differences in the meaning of items.  That is, a behavior that is valued and viewed as appropriate in the dominant US culture for which the DESSA was developed may not be a culturally valued behavior in the culture of the parents receiving the translated version. Not only the language, but the meaning of the items themselves may differ, and this needs to be considered in reviewing the responses. Devereux collects translation requests, and if we receive enough requests for a given language, we will consider doing a formal translation and the necessary research studies so that it can be scored and interpreted like the English and Spanish versions of the DESSA. Please contact [email protected] to make a specific language request.
  • Q6. Is it appropriate to read the DESSA Record Form to parents who have difficulty reading?

    Answer: Yes, as long as the questions are read exactly as they are written, using a neutral tone of voice.  Readers should not attempt to influence the ratings in any way.
  • Q7. Does our school need parental permission to do the DESSA and DESSA mini?

    Answer: Each school adopts policies on what sorts of activities require parent permission and, in doing so, shapes the norms that parents come to expect. The best answer to this question is likely to come from your local administration. We find that the majority of school policies do not require parent consent for Tier 1 screeners in an RTI framework or for curriculum-based measures in the context of universal social emotional instruction.  Thus, many schools do not require specific consent for the DESSA when used for these purposes, just as they would not require parental consent for screening and assessment in other curricular areas (e.g., math, reading). A minority of schools do seek parental consent for matters of routine practice and/or when they are using the DESSA for other purposes (e.g., as part of a comprehensive evaluation being conducted on an individual child or a subgroup of children, or as part of a research study or enrichment activity rather than existing educational practice). We believe it is always important to adhere to district policies and local norms when beginning a new initiative, but we rarely encounter district policies or local norms that require parental consent for the universal administration of a strength-based instructional tool like the DESSA.
  • Q8. What guidance does Devereux offer about how well a teacher rater should know the child to complete the DESSA or DESSA-mini?

    Answer: A rater must have sufficient opportunity to observe the child’s behavior over the four weeks prior to completing the DESSA. Therefore, Devereux recommends that raters should have contact with the child for two or more hours for at least three days per week over a four-week period. This translates to approximately 24 hours of exposure to the child (2 hours × 3 days × 4 weeks). This guideline was determined through feedback from teachers during the development of the DESSA. It is important to keep in mind that this recommendation is an estimate. Many factors play a role in a rater’s exposure to a child, such as staff-child ratios and the types of interactions that occur within a program (e.g., mentoring, help with homework, large or small group activities, free play, etc.). This is especially true for out-of-school time and after-school settings.  Therefore, a rater who has less exposure than recommended in the manual may still know a child well enough to complete the DESSA and can do so accurately, depending on these various factors. The most important question for the rater to be able to answer is: “Do I feel that I know the child well enough to confidently respond to the questions on the record form?”  If the answer is ‘yes’ and he/she has known the child for a minimum of 4 weeks, he/she can proceed to conduct the rating.
  • Q9. Can we combine/aggregate both parent and teacher/staff responses?

    Answer: No. Each rating provides the unique perspective of the rater. A child’s behavior may differ depending on the environment, and parents and staff may observe genuine differences in the child’s behavior. We would not recommend combining these responses because meaningful differences in behavior may be lost. These differences in the child’s behavior in different settings, during different activities or with different adults can help us better understand the child’s social and emotional strengths and needs and lead to improved planning and intervention.
  • Q10. What advice do you provide for reducing rater bias?

    Answer:  Because the DESSA and other parent or teacher/staff report measures are based on the perspective of the rater, it is possible that ratings could be influenced by a variety of factors, such as the day-to-day experiences the rater has with a child or misunderstandings about how information from the ratings will be used.  To reduce this rater influence, it is important to carefully explain the purpose of the assessment and how the information from the assessment will be used, followed by the opportunity for raters to ask questions. If staff members incorrectly view the assessment as a reflection on their performance, they may answer inaccurately.  It is also important to review the general administrative guidelines with raters (pages 39-40 of the DESSA-mini Manual or pages 49-50 of the DESSA Manual) prior to completing any ratings.  Discussing with staff that the rating is based on the child’s behavior over the last four weeks may help reduce the possibility of raters being influenced by the child’s behavior (for example, during a “bad day”).  It is expected that responses may be impacted by the day-to-day experiences a rater has with a child.  It is also possible that outside factors could influence a rating (for example, reflecting a group of students more favorably because that information is being reported to supervisors or funders).
  • Q11. Is it necessary to define the items or the Likert scale options (frequency anchors) for parents or teachers completing the DESSA?

    Answer: No, neither the items nor the Likert scale options (frequency anchors) should be defined for the rater. During the development of the DESSA, the authors determined that each of the items could be answered by parents, teachers, and program staff of children in kindergarten through 8th grade using only the instructions at the top of the Record Form.  It was also determined that raters typically understand the developmental level of the child for whom they are completing the assessment and keep this in mind as they are completing a rating.  Raters should interpret each item according to their own experiences and perspectives.  For example, it is likely that the word “inappropriate” has different meanings to different raters.  The rating reflects the adult’s interpretation of the child.  If asked what an item on the assessment means, you should reply, “What does it mean to YOU?”  Once the rater has responded, encourage the rater to answer the question according to that interpretation and to use that same interpretation consistently when rating all additional children.

    The same is true for the Likert scale options. When individuals complete the assessment, they are typically reflecting back on a child’s behavior over the past four weeks and selecting the option based on their general impression. Providing a definition of the frequency options would not necessarily make the ratings any more accurate, given that raters will ultimately base their rating on their recollections and general impression. For example, even if “Frequently” were defined as once a day, raters do not actually record the number of times they see the behavior and then go back over the previous four weeks and calculate the average number of times a child engaged in that behavior per day.  They still rely on their general impression of the child’s behavior.  Additionally, not all the behaviors on the assessments have the same base rate.  In other words, children generally engage in some of the behaviors more frequently, while others are more rare occurrences.  Raters seem to take these differences into account when completing the assessments. Support for this recommendation comes from the high reliability of the assessments, meaning that raters generally complete the assessments in a consistent manner without having the Likert scale options defined.

    If you are asked about the meaning of the Likert scale options, the authors recommend making it clear to raters that their interpretation of the frequency terms, whatever that definition may be, should be used consistently when rating children. Also, it may be helpful if raters are not sure about how to most appropriately rate a child, to actually put the frequency labels right into the question when thinking about the item and reflecting on which statement most accurately reflects their observations of the child.  Following is an example for an item that reads “cooperate with peers or siblings”.  The rater might say, “Maria occasionally cooperates with peers or siblings” or “Maria frequently cooperates with peers or siblings.”  After hearing both statements and reflecting, the rater should feel more confident selecting the appropriate frequency option for that particular item.

  • DESSA – Scoring and Interpreting the Assessment

  • Q1. What scores are typically reported on the DESSA and DESSA-mini?

    Answer: Scores on the DESSA and DESSA-mini are reported as T-scores and percentile ranks.  Three descriptive ranges are also provided to characterize performance on the measures.

    – The “Strength” range refers to T-scores of 60 and above. This means the child’s scale score was one standard deviation (or more) above the norm.  In a typical population, about 16% of children will be classified within this range, based on the standardization sample.

    – The “Typical” range refers to T-scores of 41-59 inclusive.  In a typical population, about 68% of children will be classified within this range.

    – The “Need for Instruction” range refers to T-scores of 40 and below.  This means the child’s score fell one standard deviation (or more) below the norm. In a typical population, about 16% of children will be classified within this range.
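    The three descriptive ranges above amount to a simple lookup. Here is a minimal sketch in Python; the function name is ours, and the cutoffs are exactly those listed in this answer.

```python
def dessa_range(t_score: int) -> str:
    """Classify a DESSA or DESSA-mini T-score into its descriptive range.

    Cutoffs as described in this FAQ: 60 and above is a strength,
    41-59 inclusive is typical, and 40 and below indicates a need
    for instruction.
    """
    if t_score >= 60:
        return "Strength"
    if t_score >= 41:
        return "Typical"
    return "Need for Instruction"
```

    For example, `dessa_range(59)` returns `"Typical"` while `dessa_range(60)` returns `"Strength"`, which is the one-point boundary to keep in mind when interpreting small pretest-to-posttest changes.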

  • Q2. How do I score blank items on the DESSA?

    Answer:  Refer to the DESSA Manual (pages 53-55) for more information on scoring blanks.  General guidelines are as follows:  1) There can be no more than three items left blank on the entire DESSA, 2) There can be no more than one item left blank on any individual DESSA scale.  If these two conditions are met, the value that appears in the rectangular box with a black border on pages 4 and 5 of the DESSA Record Form should be used as the item raw score for that item. This is the typical or most common score for this item.
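    The two conditions above can be captured in a short check. This is a hypothetical sketch: the function name and the scale-to-blank-count mapping are ours, but the two rules are the ones stated in this answer.

```python
def can_score(blanks_per_scale: dict[str, int]) -> bool:
    """Return True if a DESSA record with these blank counts can be scored.

    Rule 1: no more than three items left blank on the entire DESSA.
    Rule 2: no more than one item left blank on any individual scale.
    If both rules are satisfied, the boxed typical value on the Record
    Form is substituted as the raw score for each blank item.
    """
    if sum(blanks_per_scale.values()) > 3:              # rule 1
        return False
    if any(n > 1 for n in blanks_per_scale.values()):   # rule 2
        return False
    return True
```

    For instance, one blank item on each of two scales is still scoreable, but two blank items on a single scale is not, even though the overall total is within the limit.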
  • Q3. Why aren’t there different norms for different age levels?

    Answer: All of the items on the DESSA are appropriate for children in kindergarten through the 8th grade. In addition, the parent and teacher norms are also appropriate for this age range. One of Devereux’s goals has always been to keep the administration, scoring, and interpretation of the DESSA as simple as possible. One way to keep things simple is to avoid age-based norms. Therefore, during the development of the DESSA the authors tested items for age trends and eliminated any items that showed such trends. As a result, there are not separate norms tables for students of different ages. As an example, consider that the DESSA only asks how often a student engages in a behavior, such as “ask somebody for feedback”.  It does not ask how they engage in that behavior. For instance, a student in kindergarten might say to a teacher, “do you like my drawing?” whereas a student in the 8th grade might ask a teacher to critique an essay that he wrote as part of applying to a special academic enrichment program.  In fact, based on the roughly 2,500 students in our standardization sample, we know that how often students engage in these behaviors does not differ across the kindergarten-8th grade range, although how the students engage in this behavior might. So you can use the same set of norms for all children in kindergarten through the 8th grade with confidence. 
  • Q4. Why are there different norms for parents and teachers?

    Answer: In the standardization sample, there were significant differences between the ratings provided by parents and teachers. This is to be expected, as behavior often differs across environments and in the presence of different adults. Consequently, separate norms for the two groups were created.
  • Q5. What is individual item analysis and how is it used?

    Answer: Individual item analysis is a way to determine whether a child’s score on each DESSA item is within the “strength”, “typical”, or “need for instruction” range.  During the development of the DESSA, the authors reviewed and analyzed the ratings of approximately 2,500 students on each individual item.  As you would expect, most students received ratings in the middle of the scale (the “typical” range, usually “Occasionally” or “Frequently”), with fewer children receiving high scores (“strength” range) or low scores (“need” range).  The exact values vary a little from item to item because the actual frequency of those behaviors varies in children.  For instance, most children will “Frequently” follow rules (item #35), but most children only “Occasionally” do chores without being reminded (item #23).  The key point is that the authors empirically determined what range of scores represented a “need”, a “typical” score, or a “strength” for a child on each item.  In general, the lowest 16% of scores on an individual item was deemed to be the “need for instruction” range; the middle 68% was identified as “typical”, and the top 16% was labeled the “strength” range.

    The importance of individual item analysis is in planning to help the child.  By looking at the individual items, you can determine what specific skills the child is struggling with (i.e., a “need”) and what skills the child is really good at (“strengths”).  Using our “Strengths-Needs-Strategies” planning framework, you can ask the child, parent, or teacher which “need” items are of most concern.  You can then identify relevant strengths that the child can use to address the need, and then develop a strategy.  For instance, in one of our case studies, the child struggled with (had a need, or a very low score, on) item #69 – “use available resources (people or things) to solve a problem.”  However, he had a “strength” on item #11 – “get along with different types of people,” item #16 – “say good things about his/her classmates,” and item #22 – “contribute to group efforts.”  Our strategy for this child was to remind him of how well he works in groups and gets along with others and suggest to him that the next time he is stuck trying to solve a problem, he could ask one of his friends for help.  It is important to note that we always start with the child’s strength when presenting the strategy.  So we might say something like, “You know, Charles, you are really good at getting along with others.  You are always saying nice things about your friends and you work so well in groups.  I bet the next time you are stuck trying to figure out a problem, you could ask one of your friends for help.  You are always so nice to them and work well in groups with them that I bet they would be happy to help you.”

    Finally, individual item analysis is what enables the transition from an assessment (a score) to a strategy for helping the child, one based on relevant, meaningful, empirically identified strengths.  As the National Association for the Education of Young Children (NAEYC) says, “Assessment only has value if it leads to an improved outcome for the child.”

  • Q6. Can you explain what population the norms for the DESSA are based on and how recently the norms were collected?

    Answer: The DESSA was nationally standardized in the U.S. in 2005-2006. Ratings were collected from two groups: 1) parents and other relatives living with the child and, 2) teachers and after-school program staff. The norming sample closely approximated the kindergarten through 8th grade population of the U.S. (in 2006) with respect to age, gender, geographic region, race, ethnicity, and socioeconomic status. This final standardization sample included ratings on 2,494 children (parents provided ratings for 1,244 children; teachers/after school staff provided ratings for 1,250 children). Full information on the demographic characteristics of the standardization sample is provided in the DESSA manual (pages 13-19).
  • Q7. What advice do you provide on interpreting DESSA results for programs serving children with social and emotional disabilities?

    Answer: By definition, children with emotional disabilities, as a group, are going to have lower social and emotional competence. In our criterion validity study included in the DESSA manual (pages 35-38), the average T-score on the DESSA Social-Emotional Composite (SEC) for a sample of children identified as seriously emotionally disturbed (SED) was 36.4 (standard deviation equaled 7.9).  However, it is not uncommon for students who have been identified as having social and emotional disabilities to have one or more scale scores and individual item scores that fall in the typical or even the strength range.  These strengths are important in the planning process.
  • Q8. What are the computer administration and reporting options for the DESSA and DESSA-mini?

    Answer: The DESSA, DESSA-mini, and DESSA-Second Step Edition are now being offered by our school-age publisher, Apperson, Inc.  Apperson’s Evo Social & Emotional platform allows you to administer and score your DESSA assessment online, and receive immediate individual student and group results.  In addition, the Evo platform now provides universal, group, and individual strategies that align with the eight DESSA constructs. For more information about these web-based resources, click here.
  • Q9. What are the recommended guidelines for sharing DESSA results with parents?

    Answer: When sharing results with parents, it is important to always start the conversation with a description of the purpose of the DESSA and how the information is being used.  We would then recommend that you focus on the child’s strengths, if any are apparent. Both the DESSA and the DESSA-mini are strength-based assessments, meaning that all the behaviors measured on the assessments are positive behaviors children engage in.  By first starting with the child’s identified strengths, parents are often more at ease and comfortable during the meeting. Next, the child’s typical areas should be shared with parents.  Finally, any identified areas of need should be shared with the parents. It is important to discuss any needs as areas in which staff/parents will provide additional support and instruction – not as deficits. It is very important to be mindful of the language used in any conversations with parents. Most importantly, it should be stressed that the purpose of the DESSA is not to categorize or label children, but to identify their strengths and needs so that parents and teachers can work together to help the child acquire social and emotional skills that are essential to success in school and life.
  • Q10. Is it appropriate to use the DESSA-mini for tracking progress on the eight DESSA scales?

    Answer: No. The purpose of the DESSA-mini is to provide an overall estimation of the student’s social and emotional competence.  The 8 DESSA-mini items are not intended to represent the 8 DESSA constructs; they were selected to predict the total score on the full DESSA assessment. The DESSA-mini is not intended to assess or measure progress on the 8 individual scale constructs – you will need to use the full DESSA to get at this level of specificity. 
  • DESSA – Outcomes and Results

  • Q1. Can pretest and posttest ratings be completed by two different teachers or staff members?

    Answer: Ideally the same rater should complete both pretest and posttest ratings on the same child or classroom/group of children.  If this is not possible, for example because of staff turnover, a posttest rating can be completed by a different teacher or staff member.  It is expected that there could be some variation in scores when two different individuals provide ratings. If the ratings are providing information on an individual child’s progress, it is important to consider the possible rater differences when interpreting results. If the ratings are being used for program evaluation purposes, it is expected that any fluctuations in scores due to different raters would be cancelled out when combined with the ratings from an entire program. To help reduce these differences, it is important to review the purpose of the DESSA, as well as the general administration guidelines with the raters prior to completing the assessment.
  • Q2. What are some ways I can look at outcomes using the DESSA-mini?

    Answer: There are at least three ways to look at outcomes using the DESSA-mini:

    1) Examine percentages: With this approach, you can examine the percentage of students falling within the “strength”, “typical”, or “need for instruction” ranges on their overall score (the Social-Emotional Total or SET score) at pretest and posttest.  In many ways, this approach is the simplest and most informative.  Ultimately our goal is for children to have social and emotional strengths.  If we can shift the school, classroom, or program as a whole toward fewer needs and more strengths, we are doing well. The limitation of this approach is that it can be unduly influenced by small changes.  That is, if a student begins with a score of 59 and ends with a score of 60, they will appear to have moved from the typical to the strength range, even though a 1 T-score point change could be accounted for by measurement error.

    2)  Use the progress monitoring approach (Cohen’s d-ratio) described in the DESSA-mini manual (pages 60-61). This approach allows you to characterize the amount of change between successive DESSA-mini administrations as no, small, medium or large changes using widely accepted guidelines. Click here to access a template that calculates DESSA-mini score changes.

    3) Use inferential statistics: Standard pretest-posttest comparisons can be made using statistical software that allows you to conduct t-tests. These tests are useful in determining whether statistically significant change has occurred for groups of children, such as a class, program, or school.
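For readers who track scores in a spreadsheet or script, the first two approaches above can be sketched in a few lines of Python. This is an illustrative example only, not an official DCRC tool: the cut points used for the descriptive ranges (a SET T-score of 60 or above as a strength, 40 or below as a need for instruction) and all of the student scores shown are assumptions made for demonstration.

```python
# Illustrative sketch (not an official DCRC tool): summarizing hypothetical
# pretest/posttest DESSA-mini SET T-scores for a small group of students.
from statistics import mean, stdev

def score_range(t_score):
    """Classify a SET T-score into a descriptive range (assumed cut points)."""
    if t_score >= 60:
        return "strength"
    if t_score <= 40:
        return "need for instruction"
    return "typical"

def cohens_d(pre, post):
    """Effect size: mean change divided by the pooled standard deviation."""
    pooled_sd = ((stdev(pre) ** 2 + stdev(post) ** 2) / 2) ** 0.5
    return (mean(post) - mean(pre)) / pooled_sd

# Hypothetical T-scores for seven students at pretest and posttest.
pre = [38, 42, 45, 50, 55, 39, 47]
post = [44, 48, 49, 55, 60, 45, 52]

# 1) Percentage of students in each range at pretest vs. posttest.
for label, scores in (("pretest", pre), ("posttest", post)):
    counts = {}
    for t in scores:
        r = score_range(t)
        counts[r] = counts.get(r, 0) + 1
    print(label, {r: f"{100 * n / len(scores):.0f}%" for r, n in sorted(counts.items())})

# 2) Effect size of the change (roughly: 0.2 small, 0.5 medium, 0.8 large).
print(f"Cohen's d = {cohens_d(pre, post):.2f}")
```

For the third approach, a paired t-test on the same pre/post lists can be run in any standard statistics package.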

  • Q3. How many times per year should a school or program administer the DESSA-mini and DESSA if using the DESSA Comprehensive System?

    Answer: Generally, the DESSA-mini would be administered to all students at the beginning (time 1) and at the end (time 4) of the school/program year.  DESSA-minis are typically used at times 2 and 3 only with those students receiving targeted strategies, to monitor their progress in acquiring social and emotional competencies.  Typically, the DESSA would be administered to students falling within the “Need for Instruction” range on the DESSA-mini at time 1 in order to gain a better understanding of the child’s strengths and needs across the eight DESSA scales for planning purposes.  The DESSA would then be readministered at the end of the year (time 4) for those students who received targeted planning throughout the school/program year. Some schools and programs have chosen to use the full DESSA with every student at the beginning of the year rather than the DESSA-mini. Often the reason is that teachers value the wealth of information provided by the Classroom Profile and believe it is worth the extra time required to complete the full DESSA.
  • Q4. Can I do a pretest-posttest comparison to examine changes in a child’s social-emotional competence across Devereux assessments (e.g., a pretest DECA-P2 and a posttest DESSA)?

    Answer: Yes, however a number of recommendations apply. Click here to access these recommendations as well as the values required for significance when comparing across Devereux assessments for both parent and teacher raters.
  • Q5. How can the DESSA be used to measure outcomes for individual children, groups of children, and program evaluation and quality improvement purposes?

    Answer: The value of the DESSA is that it not only provides a psychometrically sound measure of social and emotional competence in individual children and groups of children, but it can also be used for progress monitoring and program evaluation purposes. Progress monitoring consists of administering an assessment such as the DESSA multiple times throughout a school year in order to examine how children are responding to a particular intervention or program. The goal of progress monitoring is to obtain feedback during the intervention or program so that the nature or intensity of the intervention or program can be adjusted to maximize the chance of a successful outcome. Progress monitoring may occur multiple times over the year. In contrast, program evaluation refers to examining the progress or change observed over a defined period of time, such as a school year, to determine overall program effectiveness.  Download a document that will provide guidance on using our assessment both to monitor the acquisition of social-emotional competencies and to evaluate the impact of interventions such as social and emotional learning (SEL) programs.
  • Q6. Where do I find the information needed to interpret changes in a child’s T-scores from pretest to posttest on the DESSA?

    Answer: Refer to the DESSA Manual, pages 75-77, as well as the tables for parent and teacher ratings found in Appendix B in the manual. The complete “advanced interpretation” tables allow you to interpret all scores (both scores that improved as well as scores that worsened) between DESSA ratings using the standard error of prediction approach.
  • Q7. How much change in students’ DESSA and/or DESSA-mini results should our program expect?

    Answer: The amount of change noted in students’ scores depends on a number of factors, including how much risk and adversity the children are experiencing, the degree of fidelity with which social and emotional learning strategies are being implemented, the consistency of the raters, etc.  In general, however, we typically find that a good program doing a decent job of implementing social and emotional learning activities sees, on average, a 3-5 T-score point change across one program year. Another way to consider this is a common rule of thumb in the social sciences: a change of only 1 T-score point is negligible or meaningless; a change of 2, 3, or 4 points is a small change; a 5, 6, or 7 point change is a medium change; and any change of 8 points or more is a large change.  Usually, social science and education professionals are very happy to get a medium change over a one-year period.  Click here to access a template that calculates DESSA and/or DESSA-mini score changes.
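Programs that script their own score reports can encode the rule of thumb above in a few lines. This is a hypothetical helper, not part of any DCRC template; it simply restates the guideline (1 point negligible, 2-4 small, 5-7 medium, 8 or more large).

```python
# Hypothetical helper (not a DCRC tool): label the magnitude of a change in
# T-score points between two DESSA or DESSA-mini administrations, following
# the common social-science rule of thumb described above.
def change_magnitude(t_change):
    points = abs(t_change)
    if points >= 8:
        return "large"
    if points >= 5:
        return "medium"
    if points >= 2:
        return "small"
    return "negligible"

# e.g., a program moving students 6 T-score points on average in a year:
print(change_magnitude(6))  # prints: medium
```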
  • DESSA – Professional Development and Resources

  • Q1. Do I need to buy DESSA Record Forms if I use Apperson's web-based Evo Social & Emotional platform?

    Answer: No. In addition to completing the DESSA directly online, a pdf of the DESSA can be downloaded for parents and teachers/program staff who do not have access to the internet. Once the pdf is completed, the ratings can be entered into the system to obtain the scores and available reports. For more information on the Evo system, click here.
  • Q2. What training options are available?

    Answer: The training required to complete the DESSA or the DESSA-mini is minimal.  Raters need to be aware of the general administration guidelines for completing the assessments. A variety of options are available, including recorded and live webinars, in-person training, and ongoing technical assistance.  Click here to visit the Professional Development page on our website or contact Debi Mahler, Director of Professional Development, to talk about options for your school or program: (866) TRAIN-US or [email protected].
  • Q3. Are there strategies available that align with the eight DESSA competencies?

    Answer: Yes! Strategies that align with the DESSA are now available on Apperson’s Evo Social & Emotional platform. Universal, group, and individual student strategies are available for each of the 8 DESSA scales.  For more information about these web-based resources, click here.