Could we all just pause before accepting the claim that a simple 4.7 minutes is sufficient to move kids from the lowest quarter of reading scores for their grade level to the "top half" of the grade? First, let's begin with some basics. How were the students' scores measured? I suspect it was with the assessment AR uses. That is a flaw in the first place. And if the same assessment is the post-measurement, there is a second flaw. Batting around percentages also bothers me. At the end of the "study" (and trust me when I assert this is not a study but simply number-crunching, data-mining at its worst), did the students move from 25% to 50%, or are we talking percentiles and not percentages? It's hard to tell from this article.
Students at the top read only 19 minutes a day; at the bottom (or what they designate the bottom), the amount of time reading using the software was 14.3 minutes. I have questions: Why were some kids given more time to read than others? Why were the lower-scoring kids not given more time? What are they reading online? Is any reading outside of online considered? Was there a control group? I could go on, but I think there are some gaping holes here that need to be filled in before anyone proclaims that 4.7 minutes is the key to getting scores higher. And that is the claim as it is presented.
So what happens when folks swallow this piece of misinformation? You got it. More programmed reading online. Less time for reading for pleasure, or simply time for reading, period. Get the program, dedicate 4.7 minutes, and VOILA!
The article goes on to throw in factors such as vocabulary growth and comprehension to really muddy waters that were already pretty murky. We have all seen the chart showing how 20 minutes a day increases vocabulary and test scores. Somehow, those extra 4.7 minutes amounted to hundreds of thousands of words and better comprehension. Of course, that is because the magical AR program makes sure kids are reading the right books. Of course, the article also claims that AR lets kids select their own books (it doesn't), and that kids were seeking harder books on their own (maybe, maybe not).
There are other things that trouble me here: references to immigrants, and a side note indicating that maybe some teachers helped out here. But this is exactly what happens when a powerful company publishes "research" citing data from millions of kids. For now I will head to the actual report mentioned in the article so I can finally know the answer to the question: what are their favorite books? It is something that brings me much amusement each year. In the meantime, I think I will spend at least 4.7 minutes reading a real book of my choice.