Social Skills Programs

There are hundreds of social skills programs on the market. Most are untested and unproven in any rigorous, scientific way. Almost all claim to be “evidence-based,” but that does not mean they are research-based, or that they have been demonstrated to be effective and actually deliver measurable results in a scientific study or in practice.

Institute of Education Sciences

The Institute of Education Sciences (IES) is a government research institution that examines the research behind educational practices. One of its marquee brands is the What Works Clearinghouse (WWC). The WWC is a great resource for checking whether the program you want to use in your classroom is effective (scientifically speaking).

The IES recently released its findings on social skills training in early education settings, and the results are NOT encouraging. See their findings here and download the full report. Other meta-analyses by Sarup Mathur, Robert Rutherford, and others in the early 2000s confirm the low effect sizes found in most social skills training studies.

4 Reasons for the Low Effect Size of Social Skills Training Programs

There can be many reasons for the dismal effectiveness of a student training or intervention program.

  1. They simply don’t work and are poorly designed. (Probably not the case with most interventions).
  2. They are not implemented with fidelity or integrity, or at all. We adults don’t follow through with our part of the intervention.
  3. They are not implemented correctly. We adults sometimes pick and choose only some pieces of the intervention to implement. But, like weight-loss programs, if we choose to still eat chocolate and cake daily, we won’t see results whether we use Weight Watchers, Jenny Craig, or Atkins!
  4. The intervention does not match the needs of the student. Some programs work great when used for their intended purpose, but not so much when they are used in a way they were never intended to be used. Kind of like driving a convertible Corvette or Ferrari during a blizzard in Minnesota in February: not the best choice, and definitely not the best performance or traction!

The Conclusion:

Match your intervention to the student’s needs, then implement it correctly and with fidelity. And always monitor and measure the effects. That is the only true way to decide whether the intervention is working!

So, what will you do with this information? Let me know and leave a comment below.