Our team leader, Andrew Henry, previously served as the director of a state agency. While all of us are excited to put an effective solution into the hands of SEAs, he is especially eager to see positive change in his former work environment.
Here is, in Andrew’s own words, a breakdown of exactly how Stepwell can make self-assessment in the area of special education much easier.
In the business of providing software to educators, I am often asked by SEA administrators how systems like Stepwell can help them achieve better outcomes in their own work. It is a good question, and one we have spent years crafting better and better answers to.
Education is awash with data. In the space of a decade, simple student counts and summative assessment data have given way to real-time formative assessments and click-level data describing student interactions with electronic learning resources. By some estimates, more data has been created in the past two years than in all of prior human history. All this data gives us an unprecedented view into the processes that make us better. For an SEA, data should inform strategic planning for the highest impact on the education community.
One of the first things an SEA does to have the deepest effect on local programs is align its strategies with its goals for the system. The steps are familiar to anyone involved in School Improvement Planning:
- Gather data
- Study the data and reach conclusions
- Plan for success
- Implement and monitor the plan, course-correcting as necessary
The concept is simple enough, but as anyone who has worked through it knows, the challenges to successful strategic planning can be formidable. As the former director of a state agency whose job was to collect, analyze, and distribute education data, I wish I could say that gathering relevant and actionable information, the very first step, is easy. In reality, the challenges are many: data is out of date, exists in many formats, and is stored in disparate systems. Often, data quality is suspect, and analysts and technologists must expend significant effort to create comprehensible reports.
It is entirely reasonable that SEA users expect their systems to provide on-demand data, powerful performance metrics, and flexible reporting. How else can they make informed, data-driven decisions about policy effectiveness, program impact, or effective technical assistance practices? How else can they articulate the “why?” of a strategic decision or address the later question, “How well are we doing in achieving it?”
In designing Stepwell, we were conscious of the need to gather, organize, and visualize results, compliance, and process data. Performance and outcomes data are loaded into Stepwell through performance reports, letters of findings, and other official reports, putting the answers to questions such as these at SEA administrators' fingertips:
- How many outstanding findings are there from last year?
- How does that compare to this year, and what does the trend look like over time?
- How many districts responded in a timely way?
- How do students perform in districts that are or are not in compliance?
- How many districts need technical assistance with which indicators?
Process data is generated by the day-to-day use of Stepwell to complete workflows. These data reveal previously invisible nuances of the compliance process, including critical inflection points for intervening with LEAs, as well as how LEAs perform now and have performed in the past. This allows state administrators to observe patterns over time and compare districts, indicator completion, and results.
Bringing all of this data together in one place gives state education agencies a way of taking the temperature of the entire state. Armed with a clear, data-backed understanding of how the state is doing, stakeholders can move on to crucial improvement planning.