
One teacher's message to Tom Luna

Coeur d'Alene Press | February 3, 2011 5:24 AM

I received a copy of this letter from a friend with whom I worked 13 years ago in Provo, Utah. Rick, now a junior high school teacher in Pocatello, initially sent this letter to Tom Luna.

With the Legislature taking up Students Come First and a possible vote from the Senate Education Committee next week, it’s time to debate all aspects of the proposal. The Press will editorialize on the subject this Sunday. - Mike Patrick, editor

By RICHARD ROBBINS, ED.D.

I’m writing this letter in response to your Students Come First plan. Along with caring about children, which is clear to my colleagues and students alike, I want to establish my other bona fides for writing this letter. I have years of job experience in journalism and advertising, during which I spearheaded the computerization of many facets of those businesses. About eleven years ago, I returned to teaching. In addition, I hold a doctorate in Instructional Technology. To be quite candid, the current debate centers, in large part, on areas in which I am an expert.

Somehow, the agendas of contributors to this debate seem to be on the table in this discussion, so I want to establish that I am not a member of the Idaho Education Association and that I am a Republican. Besides that, with my present and likely future job assignments and a doctorate in the very field being thrust to the forefront of this debate, my continued employment is secure. In short, my agenda is to help Idaho students learn in the most effective and efficient manner possible.

 For the rest of this letter I will address two areas of your plan: first, to replace a certain number of teachers with student computers, and second, your discussion about increasing class size.

The idea that a computer in itself increases learning is flawed. An apt analogy would be telling someone that if you hand them a plough, they will therefore become a farmer. It just doesn’t work.

 My own, limited doctoral research showed no improvement in vocabulary retention through the use of a multimedia tool over a printed text. Broader studies show that this is not my conclusion alone. It is the conclusion of some of the pre-eminent researchers in multimedia education. Probably the best single compilation of research on computer-assisted education is The Cambridge Handbook of Multimedia Learning, edited by Richard E. Mayer, who is, by far, the world’s leading researcher in multimedia education. In the book is a meta-analysis of research in the field of multimedia learning by Richard Clark and David Feldon. They make a series of powerfully persuasive findings they entitle Five Common but Questionable Principles of Multimedia Learning. They address five claims about multimedia learning that are commonly repeated, but have research that runs solidly against them. These erroneous claims about multimedia are 1) it yields more learning than live instruction, 2) it increases learner motivation, 3) it provides animated pedagogical agents that aid learning, 4) it accommodates different learning styles and so maximizes learning for more students, and 5) it facilitates student-managed constructivist and discovery approaches that are beneficial to learning. (p. 97)

 Clark and Feldon, through research findings that have been substantiated by dozens of well-documented and peer-reviewed studies, conducted in the United States, France, Germany, Australia, and New Zealand across all social classes and grades, confirm that there is no research to show that simple access to a computer results in increased student learning. The results just don’t change. These five assumptions fail time and again. Unfortunately, your plan hinges on these five common errors.

 You must understand that virtually all studies that have shown a positive correlation between learning and computer-delivered instruction alone suffer from the same flaw. They are not isolating the effect of the delivery system. They are, in reality, measuring the effect of the instructional design behind the sample material. In other words, the effects rely on the quality of the methods used to present the material. This is entirely different than ascribing the learning effect to the technology used to present it.

 Your plan also suffers when it comes to organizational research. Putting vast resources into hardware or systems upgrades without training is a classic organizational mistake. Repeated organizational studies have shown that in order to make such widespread upgrades effective, the greatest part of any such investment needs to be in personnel training. Your plan not only doesn’t allow the training necessary to make such a plan work but it guts the ranks of the people who are needed to support it.

 In addition, there are the realities of handing a ninth grade student a laptop. Kids drop, lose, and kick things regularly. Computers are not known for their resistance to damage. A textbook dropped in the mud or off the bus will get a corner of binding tape and go on to be used for years. This is not the case with a computer.

 Then there are the infrastructure requirements for the districts. These amount to huge investments in firewalls, access, distribution portals, and so forth. Districts will be forced to bear the brunt of these costs and liabilities. These costs add to the cannibalization you decry.

I don’t think I’m inaccurate in saying that I’m easily in the top 1% of educators when it comes to the inclusion of on-line technology in my courses. Computers are great at replicating materials. They allow access to an enormous body of resources, they provide student-timed access to materials, and they can vastly reduce the cost of that access. They are excellent for showing conditions that change over time and for predicting the effects of those changes. Their uses as learning tools are great. However, they simply are not independent learning devices. Just having a computer is not a substitute for a qualified teacher directing student work. All of these uses require direction and resources to make them effective. What teachers and students need is the ability to access and archive digital materials from which teachers may draw appropriate selections of content. That content should be available, in controlled formats, to students to augment classroom instruction.

 The last thing I’m going to mention in relation to this portion of your plan is an intangible element. I’ve taught on-line. I’ve designed on-line courses. I use multimedia and distance-delivered content regularly in my classroom. I know that teaching students is not what computers do. There is energy, hope, and inspiration possible in a classroom that is a vital part of learning and growing. It is very difficult, if not downright impossible, to replicate that in an on-line environment.

 In your plan, you reference “some research” that indicates the importance of class sizes in primary grades but that these effects dissipate over time. Therefore you draw the conclusion that we can increase class sizes in the higher grades and not hurt student achievement. This is a highly flawed assumption and I’ll tell you why.

First and foremost, you must understand that absence of proof is not proof of absence. Just because you haven’t found an effect for something doesn’t mean that effect doesn’t exist.

Second, any study to determine the effect of class sizes in the upper grades will suffer from cross-contamination of data and too many variables to effectively measure what impacts a student’s day.

Suppose I come up with some measure of effectiveness that rates teachers high, medium, and low. Then we set up a study in third grade. Say we have Mrs. Smith and Mr. Jones. We've decided Smith rates high and Jones rates medium. One class has 25 kids and one has 20 kids. Our study is controlling for four to six variables. That's not too bad. We run an analysis of variance and figure it out. No big deal.

Now, we run the study again in 7th grade. We have students with five teachers during the day, three levels of effectiveness, and high and low class sizes. That's trying to control for 30 variables. Statisticians are always concerned about cross-contamination, or the effect one group involved in the study may have on another. That effect goes way up in the upper grades. In fact, we encourage students and teachers to work together.

So, at this point, I may have 30 treatment conditions in my study. Then, I add in the cross-contamination we encourage anyway.

If I have a study that runs over two or three trimesters, I will need to allow for 60 or 90 possible combinations of treatment conditions. Just for fun, I may choose to include whether students are involved in extracurricular activities, and which activities those are. To make all this work, I'll probably set up some sort of regression model to analyze where class size fits into all this. I'll run the numbers, and the equations will render results that appear realistic, but what I really have is useless, because there is just too much going on in secondary schools to isolate the single effect of class size.
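The arithmetic behind the scenario above can be sketched in a few lines. This is purely illustrative, using the hypothetical numbers from the 7th-grade example rather than any real dataset, but it shows how quickly the treatment conditions multiply:

```python
# Counting treatment conditions in the hypothetical 7th-grade study.
# All figures are the letter's illustrative assumptions, not real data.

teachers_per_day = 5       # students see five teachers each day
effectiveness_levels = 3   # teachers rated high, medium, or low
class_size_levels = 2      # high and low class size

# Conditions in a single trimester
conditions = teachers_per_day * effectiveness_levels * class_size_levels
print(conditions)          # 30

# Extending the study over two or three trimesters multiplies them again
for trimesters in (2, 3):
    print(trimesters * conditions)   # 60, then 90
```

Contrast this with the third-grade study, where two teachers and two class sizes give only a handful of conditions an analysis of variance can handle cleanly.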

Third, we have a sad and foolish reliance on NAEP data. Since the implementation of NCLB, we’ve been testing kids more and more. They are sick of it. On NAEP days the first question I hear from students is “Does this matter to my grade?” I always believe in being honest with kids, and I tell them it doesn’t impact their grades but that they should do their best because it’s a measure of the school’s effectiveness. Some respond that they will, but many respond with a wink and a smirk. I’ve heard kids talk about making patterns with their answers, filling in random spots, picking the most absurd answers – you name it. Sadly, I must say their response to the ISAT is often the same. They simply use different methods to vent their frustrations.

The interesting thing about test burnout is that it’s very difficult to quantify, by its very nature. A sound approach to measuring it would probably include qualitative surveys, but employing qualitative data seems beyond the scope of our current politically charged approach to education.

When you propose increasing class sizes, you seem to be saying that delivering an adverse treatment – increasing class sizes – will achieve a positive outcome: that the savings associated with this adverse treatment will result in more effective teachers.

By casting student migration aside, you also imply that it will cease to be a concern. It will not. Students who migrate between schools will simply be added on top of already larger classroom populations.

I know this has been a lengthy letter. I appreciate you taking the time to read it. These are important issues that go way beyond the media sound bites we are hearing. I hope I’ve established that I care about kids, that I have the bona fides to discuss these issues, and that the relevant research offers some excellent direction. We must avoid the path you have laid out. The results are known, and they are not good.

I am a teacher who instructs many students each day. I live on an Idaho teacher’s salary. I don’t have deep pockets to host elaborate media blitzes, but I have initiated an online newspaper designed to provide a voice for teachers regarding your plan. The first articles are drawn from this correspondence. I encourage you to look at the ideas in it. It may be reached at the following address: http://www.ida.net/users/rrobbins
