They are essentially media comparison studies or, more precisely, license comparison studies. It should surprise no one that media comparison studies find no significant difference in student learning. This theme has been repeated in the media comparison literature for decades.
You probably wouldn’t be surprised to hear that every education technology (edtech) publisher says its product works, and each has some sort of supporting evidence. Even highly credible edtech evaluation lists give top marks for just one RCT (randomized controlled trial). Download the Program Evaluation Rubric (blank).
If you’re merely looking to check a box saying “It has evidence” before you commit to a program, you’re asking for something any program can generate. To measure recent program impact in Texas, we used quasi-experimental methods to compare the growth of ST Math students against matched comparison students within the same subgroup.
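To illustrate what such a matched-comparison growth analysis involves, here is a minimal sketch. The field names, the nearest-neighbor-on-pretest matching rule, and the sample data are all assumptions for illustration, not the actual design of the Texas ST Math study.

```python
# Hypothetical sketch of a quasi-experimental matched-comparison
# growth analysis. Matching rule and data fields are assumptions,
# not the actual ST Math study methodology.

def match_and_compare(treated, comparison):
    """For each treated student, find the comparison student in the
    same subgroup with the closest pretest score (nearest neighbor,
    with replacement), then average the difference in growth."""
    diffs = []
    for t in treated:
        pool = [c for c in comparison if c["subgroup"] == t["subgroup"]]
        if not pool:
            continue  # no comparable student in this subgroup
        m = min(pool, key=lambda c: abs(c["pre"] - t["pre"]))
        growth_t = t["post"] - t["pre"]   # treated student's growth
        growth_c = m["post"] - m["pre"]   # matched comparison's growth
        diffs.append(growth_t - growth_c)
    return sum(diffs) / len(diffs) if diffs else 0.0

# Toy data: two treated students and two comparison students.
treated = [
    {"subgroup": "grade3", "pre": 50, "post": 62},
    {"subgroup": "grade3", "pre": 40, "post": 50},
]
comparison = [
    {"subgroup": "grade3", "pre": 51, "post": 58},
    {"subgroup": "grade3", "pre": 39, "post": 45},
]

print(match_and_compare(treated, comparison))  # → 4.5
```

Matching within subgroup on a pretest score is the core idea: it approximates comparing each program student with a similar non-program student, which is what distinguishes a quasi-experimental estimate from a simple before/after comparison.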
Administrators selecting educational technology programs for their schools or districts face big decisions, given the time and money at stake. Accurate, relevant information about a program’s impact on student performance elsewhere should therefore be a critical part of the decision-making process.