How Good Is Our Return on Learning for Classroom Apps?
School districts across the country spent more than $3 billion on ed tech products last year and chose from an assortment of 2,500 apps on the market. With so much money at stake and so many choices, educators need to know which products have a positive impact on student outcomes and the conditions under which impact is the greatest.
Join us for "What Ed Tech Apps Work Best for Learning?" on Monday, November 5, 2018, at 12:00 p.m. Pacific / 3:00 p.m. Eastern.
With learning today so dependent on technology, districts need to know whether they are investing in the programs that make the greatest difference for their students. Facing tighter budgets and more aggressive goals for student improvement, they need to spend their money on programs that show a return on learning.
Districts are asking: Are we getting our money’s worth?
The way to find out is to capture and leverage ed tech data to measure program effectiveness. Impact on student scores is key, but impact happens only if students use the apps and engagement with them is high.
To find out whether we are using the best possible apps for our needs, we need to assess data across three aspects: investment, engagement, and impact. Tracking subscription, implementation, and maintenance costs indicates investment. Measuring program adoption, usage, and perception shows engagement. And connecting program usage to student achievement demonstrates impact.
Recent advances in data mining provide a way to gather data on tool cost and usage, map this information to outcome data such as assessments, and then present these relationships in a form that is accessible and actionable.
The process has three steps: measuring engagement and usage, measuring investment costs, and measuring assessment data and impact. A sophisticated data retrieval tool streamlines each step.
To understand engagement across the district, measure usage by tracking activity for each app. Questions to ask are:
- Which students are using each tool?
- How often do students use the tool?
- How long do sessions last?
- How active are students in the tool?
In addition, measure how students feel about each tool. Questions to ask are:
- What percentage of students like the tool?
- What percentage of students dislike the tool?
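To make these questions concrete, here is a minimal sketch of how a district analyst might compute per-app engagement metrics from a raw usage log. It is illustrative only: the file name and columns (student_id, app, date, session_minutes, and a rating of 1 for "like" or 0 for "dislike") are assumptions, not the schema of any specific product.

```python
import pandas as pd

# Hypothetical usage log: one row per student session.
log = pd.read_csv("usage_log.csv", parse_dates=["date"])

engagement = log.groupby("app").agg(
    unique_students=("student_id", "nunique"),        # which students use each tool
    total_sessions=("student_id", "size"),            # how often the tool is used
    avg_session_minutes=("session_minutes", "mean"),  # how long sessions last
    pct_positive=("rating", "mean"),                  # share of sessions rated "like"
)

# Sessions per student gives a rough read on how active users are in the tool.
engagement["sessions_per_student"] = (
    engagement["total_sessions"] / engagement["unique_students"]
)

print(engagement.sort_values("unique_students", ascending=False))
```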
To understand costs, measure the investment of ed tech programs across the district. Questions to ask are:
- How many program licenses are at each site?
- How much is the dollar cost per license?
- What is the duration of the license?
In addition, note the distribution of licenses. Questions to ask are:
- How many active/inactive users are there for each license?
- How many active users are at each grade level or site?
- What is the depth of user activity, based on frequency, duration, and quality?
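A similarly minimal sketch can tie license costs to actual use, yielding a utilization rate and a cost per active user for each program. Again, the file and column names (app, site, licenses, cost_per_license) are hypothetical, and "active user" is defined here with a deliberately simple threshold that a district might well tighten.

```python
import pandas as pd

# Hypothetical license records: one row per program per site.
licenses = pd.read_csv("licenses.csv")
log = pd.read_csv("usage_log.csv")  # same usage log as the engagement sketch

licenses["total_cost"] = licenses["licenses"] * licenses["cost_per_license"]

costs = licenses.groupby("app").agg(
    total_licenses=("licenses", "sum"),
    total_cost=("total_cost", "sum"),
)

# An "active user" here is anyone with at least one logged session.
active = log.groupby("app")["student_id"].nunique().rename("active_users")

report = costs.join(active).fillna({"active_users": 0})
report["utilization"] = report["active_users"] / report["total_licenses"]
report["cost_per_active_user"] = report["total_cost"] / report["active_users"]

print(report.sort_values("cost_per_active_user"))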
To understand the impact, analyze the relationship between usage and student achievement. Ask what impact program usage has on the subjects in which the programs are used, such as math, science, and ELA. In addition, consider other relevant data, such as the impact of ed tech usage on attendance and on student behavior.
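As a first look at impact, a simple correlation between time spent in each app and score gains can be computed from per-student rollups. This is a sketch under assumed file and column names (student_id, app, total_minutes, score_gain, where score_gain might be a post-test minus pre-test difference); correlation shows association, not causation.

```python
import pandas as pd

# Hypothetical per-student rollups: total usage joined to assessment results.
usage = pd.read_csv("usage_by_student.csv")
scores = pd.read_csv("score_gains.csv")

merged = usage.merge(scores, on="student_id")

# Per-app correlation between time in the tool and score gains. A crude
# measure: it ignores confounders such as which students were assigned
# to which tool, so treat it as a screen, not a verdict.
impact = (
    merged.groupby("app")
    .apply(lambda g: g["total_minutes"].corr(g["score_gain"]))
    .rename("usage_score_correlation")
)

print(impact.sort_values(ascending=False))
```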
Taking these steps -- measuring usage, costs, and assessment data -- will help a district gain insight into the impact, engagement, and return on investment of its current technology programs.
The knowledge districts gain will in turn help them develop comprehensive, cost-effective tech plans and allocate their technology resources for maximum impact. They can identify the conditions that promote the best outcomes and devise strategies to replicate those conditions throughout the district. Using data to drive the impact of technology dollars is the way to know you are getting your money's worth.
Districts around the country have been assessing their technology implementations using BrightBytes' Learning Outcomes module, which captures data using a district's web proxy and employs advanced research-driven analytics to correlate usage with student achievement data from any system. Dr. Ryan Baker, Director of the Penn Center for Learning Analytics, analyzed the correlated data from these districts to learn what works around the country.
For example (spoiler alert), the study found that most of the top ten apps for math impact per day of use were also in the top ten for math impact per amount of time spent. However, an interesting exception was Kids Discover Online: students who spent a lot of time using the system did very well, but students who used the system on many days did very poorly. This pattern of results may suggest that Kids Discover Online is best used in an intensive fashion for specific content.
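The distinction between impact per day of use and impact per amount of time spent can be illustrated with the same per-student data as above. This is a hypothetical reconstruction of the idea, not BrightBytes' actual analysis; it also assumes the usage file carries a days_used column.

```python
import pandas as pd

usage = pd.read_csv("usage_by_student.csv")  # assumed: student_id, app, total_minutes, days_used
scores = pd.read_csv("score_gains.csv")      # assumed: student_id, score_gain
merged = usage.merge(scores, on="student_id")

# Rank apps two ways: correlation of score gains with days of use,
# and correlation of score gains with total time spent.
by_days = merged.groupby("app").apply(lambda g: g["days_used"].corr(g["score_gain"]))
by_time = merged.groupby("app").apply(lambda g: g["total_minutes"].corr(g["score_gain"]))

top_by_days = set(by_days.nlargest(10).index)
top_by_time = set(by_time.nlargest(10).index)

# Apps high on one metric but not the other show the
# Kids Discover Online pattern described above.
print("In both top tens:", top_by_days & top_by_time)
print("In only one:", top_by_days ^ top_by_time)
```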