Issue 59

Technology is no panacea

Technological progress is essential for improving living standards. It plays a major role - albeit one that is difficult to accurately measure - in improving productivity, and therefore income, for people everywhere. There will always be short-term 'losers' (e.g. horse and buggy operators when motor vehicles were introduced), although even they - and to a greater extent, their children - will benefit from the increased productivity.

But sometimes new technologies are enthusiastically embraced without much concern for whether there are actually productivity gains to be had. Take classroom education technology:

Schools across the country have jumped on the education technology bandwagon in recent years, with the encouragement of technophile philanthropists like Bill Gates and Mark Zuckerberg. As older education reform strategies like school choice and attempts to improve teacher quality have failed to bear fruit, educators have pinned their hopes on the idea that instructional software and online tutorials and games can help narrow the massive test-score gap between students at the top and bottom of the socioeconomic scale.
...
But much of the data shows a negative impact [of education technology] at a range of grade levels. A study of millions of high school students in the 36 member countries of the Organisation for Economic Co-operation and Development (OECD) found that those who used computers heavily at school “do a lot worse in most learning outcomes, even after accounting for social background and student demographics.” According to other studies, college students in the US who used laptops or digital devices in their classes did worse on exams. Eighth graders who took Algebra I online did much worse than those who took the course in person. And fourth graders who used tablets in all or almost all their classes had, on average, reading scores 14 points lower than those who never used them—a differential equivalent to an entire grade level. In some states, the gap was significantly larger.

The [longish] article lists several reasons why classroom education technology might be producing those negative outcomes, including distraction, the lack of motivation when learning from a computer versus a person, and the inability of computers to demonstrate the "social usefulness" of knowledge.

I tend to agree. It would appear that educators (or rather, the administrators who spend educational budgets) have been seduced by the allure of technology as a solution to all of their woes, investing heavily in devices and programmes that will do little to help students who, for whatever reason (there are many), are already struggling. The approach reminds me a lot of the blind embrace of artificial intelligence and machine learning, the potential of which has been (and continues to be) massively overrated. The article continues:

Judging from the evidence, the most vulnerable students can be harmed the most by a heavy dose of technology—or, at best, not helped. The OECD study found that "technology is of little help in bridging the skills divide between advantaged and disadvantaged students." In the United States, the test score gap between students who use technology frequently and those who don’t is largest among students from low-income families.

Technology is no panacea. Politicians and educational administrators would do well to take a step back, slow down and actually think through the role technology can play in the classroom before bulk-buying the latest device or educational app that promises to solve everything. Implemented properly, technology should produce educational gains for all, not just for students who are already doing relatively well. Simply spending more money on classroom education technology without first addressing the more fundamental reasons why some schools and students are falling behind will only worsen the educational divide.

Enjoy the rest of this week's issue. Cheers,

— Justin


Other bits of interest

Another artificial intelligence misapplication

People, especially those in the tech industry, need to slow down and think a bit more. In this case, AI has been used to answer the wrong question. Essentially, Google's AI breast cancer screening system is flawed [it's a tweet storm so the grammar is poor and it's riddled with typos]:

...if you unleash AI on a problem and prove you are better at finding biopsy proven cancer you have no idea if you are changing the ratio of harmless to curable to spread-already.

And without knowing that you don't know you are helping.

When you change the rules around how the study is interpreted, you cannot be sure that the net result is better EVER if you find more cancer and EVEN if you find less non-cancer. Because you don't know: harmless from curable from spread-already ratios in what you find.

Don't get me wrong, AI can and will be used for good and I don't doubt the motivations of those at Google who ran the experiment. But people need to think things through a bit more (something AI cannot do) and only use it in areas to which it is suited.
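To make the tweet storm's point concrete, here is a toy sketch in Python. Every number and category split below is made up purely for illustration (none of it comes from the Google study): it simply shows how a reader that "finds more cancer" can look better on detection counts while telling you nothing about whether outcomes improved, because the extra detections might be dominated by harmless cases.

# Toy illustration of the overdiagnosis point above; all numbers are hypothetical.
# In this simplified model only detections of 'curable' cancers can change outcomes:
# 'harmless' cases never needed finding, and 'spread_already' cases are found too late.

def beneficial_detections(found):
    return found.get("curable", 0)

# Hypothetical case mix detected per 100,000 screens
human_readers = {"harmless": 40, "curable": 50, "spread_already": 10}
ai_readers = {"harmless": 65, "curable": 52, "spread_already": 11}

for name, found in [("human readers", human_readers), ("AI readers", ai_readers)]:
    total = sum(found.values())
    print(f"{name}: {total} cancers detected, "
          f"{beneficial_detections(found)} with any prospect of benefit")

# The AI 'finds more cancer' (128 vs 100 detections) but, in this made-up mix,
# nearly all of the extra detections are harmless, so the headline detection
# rate says little about whether the screening programme actually got better.

Without knowing how the extra detections split across those three categories, the raw detection numbers are exactly the kind of uninformative headline the tweet storm warns about.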

Learn more:

Regulation is also imperfect

Never forget about enforcement. Europe's 2018 "General Data Protection Regulation" (GDPR) was the holy grail of tech regulation, endlessly praised by techies as a model for the rest of the world to emulate. I was skeptical, writing in Issue 1/2019 that:

...while regulations such as GDPR might sound all warm and fuzzy, in practice they have few privacy-boosting effects. Indeed, the only thing the average user will have noticed as a result of the GDPR is the return of the annoying pop-up in the form of a “we use cookies” consent box. Worse, it has already worked to further centralise power in the hands of Google and Facebook.

It's now 2020 and that still holds:

Aside from a €50 million fine that France's privacy regulator imposed on Google in January, there have been no fines or remedies levied at a U.S. giant since the GDPR came into effect. And the two nations most directly responsible for policing the tech sector — Ireland and Luxembourg, where the largest tech firms have their European headquarters — have yet to wrap up a single investigation of any magnitude concerning a U.S. firm.

Big firms like regulation, particularly complicated regulation. They just hire lawyers and lobby politicians, delaying a resolution and in the process starving smaller competitors, both actual and potential. Note that the California Consumer Privacy Act (CCPA), which took effect on 1 January 2020 and has a more clearly defined scope than the GDPR, has already seen Microsoft and Mozilla agree to implement it across the United States and the world, respectively.

Learn more:

That's all for now. If you enjoyed this issue, feel free to share it via email.


Issue 59: Technology is no panacea was compiled by Justin Pyvis and delivered on 06 January 2020. Join the conversation on the fediverse at Detrended.net.