For almost a decade, selling edtech products to schools and districts has felt dangerously like selling a home over the internet. We describe edtech products with all the excitement and adjectives of a fresh listing on Zillow. Now the good news: Over the past year, we’ve seen a broader set of research practices applied to edtech.
You probably wouldn’t be surprised to hear that every education technology (edtech) publisher says their product works, and they all have some sort of supporting evidence. In many cases, it’s just one study. Educators aren’t the only ones stuck in this “one good study” paradigm. Results may vary.
A new bar has been set for the evaluation of state-grant-funded digital math programs. The Utah STEM Action Center has released its 2020 annual report, which includes the results of its comprehensive statewide program evaluation. Utah’s commitment to rigor. A word about repeatable results at scale.
The federal Elementary and Secondary Education Act, for example, limits the use of school-improvement funds to interventions that benefit student learning as documented by at least one well-designed and well-implemented research study. It is possible that price transparency in edtech could lead to a similar market improvement.
Independent education research firm WestEd recently published the largest-ever national study evaluating a math edtech program. See an overview of the study in the following infographic: Download the Infographic. Which states were included in the study? EdTech Evaluation Resources: Program Evaluation Rubric.
Federal Program Evaluations and Program-Related Reports: The First-Year Implementation of the Technology Literacy Challenge Fund in Five States (American Institutes for Research, 2000). Federal Program Evaluations and Program-Related Reports: National Educational Technology Trends Study (NETTS).
While acknowledging that randomized controlled trials have their place in what should be an “edtech efficacy portfolio,” Andrew pointed out that the time and expense required for this type of study usually make them feasible only once every five or ten years, limiting their ability to show progress over time and include the latest data.
ST Math has been selected by the Department of Education (DOE) Institute of Education Sciences (IES) as one of six math programs to be further investigated under what IES terms “replication studies.” “We look forward to using ST Math to help IES realize its vision of a healthier and better-informed market in education program evaluations.”
Five PRIME Factors for EdTech Evaluation. As someone who has evaluated and published findings on an edtech program for two decades, closely watching the marketing hype in the math edtech market in particular, I’ve come to believe something you may find shocking: All edtech programs work.
Coming away with a thorough understanding of a study’s results and outcomes, and why they’re so significant, carries many challenges. When discussing research that evaluates edtech program effectiveness, we need to talk less about access to it and more about the nature of the information itself.
Christina leads the Marketplace Research initiative at Digital Promise, which is focused on increasing the amount of evidence in the edtech marketplace. Formerly a high school English teacher, she left the classroom to study education policy with a desire to improve student outcomes by offering a practitioner’s perspective to education research.