Algebra is just mumbo jumbo to most people. Seriously.

If you asked 100 high school graduates to explain how algebra works, and why it works, I’d guess that 99% of them couldn’t, not in sufficient detail to show that they really deeply understand it. Remember that I am talking about high school graduates, so these people have almost certainly had many years of algebra and algebraic concepts taught to them. Most of these people will only be able to give you some of the rules of algebra at best, and some of them don’t even remember that much.

Algebra is an amazing tool for solving problems though! Formulate a problem as an equation, and unless the equation is too complex, there is an algebraic algorithm to solve that equation, and hence the problem you formulated.
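To make that concrete, here is a toy sketch (my own illustration, not something from the original post) of the algebraic recipe for a linear equation ax + b = c: subtract the constant from both sides, then divide both sides by the coefficient. The function name and example numbers are mine.

```python
from fractions import Fraction

def solve_linear(a, b, c):
    """Solve a*x + b = c for x by mirroring the algebraic steps:
    subtract b from both sides, then divide both sides by a."""
    if a == 0:
        raise ValueError("not a linear equation in x")
    return Fraction(c - b, a)  # exact rational arithmetic, no rounding

# Example: 3x + 4 = 19, so x = (19 - 4) / 3 = 5
print(solve_linear(3, 4, 19))  # 5
```

This is exactly the sense in which algebra is an algorithm: a fixed sequence of steps that turns an equation into a solution.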

Maybe algebra is such a useful tool that people don’t really need to understand how it works; maybe they can get by without a deep understanding and still follow the rules of algebra to solve problems. I don’t really buy that argument, though, simply because people who don’t understand something are prone to making mistakes and unable to check their work with any reasonable accuracy.

Computers are also mumbo jumbo to most people. If you asked people to explain how computers work, most of them couldn’t. There are actually very few people in the world who can explain from start to finish how a computer works, and no one who can explain every single piece of one. Computers are still amazing tools, though, and give people the ability to solve problems that would otherwise be intractable.

I think computers are a useful tool despite our lack of understanding of how they work. Like algebra, a computer is a black box into which we put our inputs and get outputs without understanding how the inputs are related to the outputs. Given this similarity, we should look at other reasons why using a computer might be superior to algebra.

There are some significant differences between using computers to do computation and using algebra by hand. The first is that with a computer, the error rate is much lower. Obviously you can still press the wrong buttons, enter the wrong information, or misread the information the computer gives back to you, so there is error, but I’d argue that this error is much less than the typical error rate for by-hand algebra. The second benefit of computers is that they are much faster than doing even moderately complicated algebra by hand, even counting the time it takes to enter the computation into the computer. In the cases where doing it by hand is faster, then I’d say you should do the calculation by hand.

The largest difference between using a computer to do the calculation and using algebra is that algebra is a single use tool. It can only be used to turn an equation into a solution. A computer can be used for so much more.

Granted, we should consider computational mathematics to be a broader tool than plain algebra if we want a fairer comparison with a computer, but I’d argue that all of the same problems exist in other areas of computational mathematics. As we increase the scope of computations we can learn how to do, the power of the computer becomes even more evident. It takes much less effort to learn how to compute a broader range of problems using a computer than to learn all of the individual computational methods. Witness the power of **Wolfram Alpha**, for example: enter a search phrase and all sorts of useful information comes up.

So when we weigh using computers for computation against the by-hand approach, we can postulate that the computer will produce fewer errors, be generally faster, and be more multipurpose than the pencil-and-paper model. Furthermore, a computer can do far more as a tool than you can ever do with algebra alone.

Another issue I see is that our current mathematics curriculums leave very little time to learn more important skills than computation. As Dan Meyer (@ddmeyer) points out, the **formulation of a problem** is more important than the actual solution. Learn how to formulate problems and understand how to verify that what you are doing makes sense, then spotting errors in computation becomes that much easier. Furthermore, I’d like to see mathematics education be **much more grounded in what is relevant**, than be a collection of different types of math which are taught for historical purposes or because they are the **ground-work for calculus**.

The question for me is, why aren’t we using computers more to do mathematics in elementary and secondary education? It can’t just be because people are scared of change, can it?

As a high school math teacher, I agree with everything you have stated here. Unfortunately, those darn standardized tests get in the way. I would LOVE to have my students answering authentic problems using real-world tools. But when they CANNOT use those tools on their standardized test, I feel that I would be hurting my students by having them develop skills using computers in math and then NOT allowing them to use those tools on the standardized test.

Let’s get rid of standardized tests!!!! That would free us up a lot to do more real-world, authentic assessment.

OR we could do like Denmark does: let students use whatever tools they want on their standardized exam.

Yeah, but it’s still worth discussing this point. I think if enough math educators recognize that the old pencil and paper way of doing mathematics just isn’t efficient, realistic, or useful anymore, we might be able to see some changes made to standardized testing. Personally I’d like to throw them out completely, but given that this isn’t likely to happen, I’d like to see an overhaul of the curriculum so that the application of mathematics in the real world is emphasized and computational mathematics plays a much smaller role.

Look, now we have at least two math teachers that agree on this point, and I’m sure there are many more. It’s a start.

I think what’s important is that students acquire the ability to “think” and “reason”. Relying too heavily on technology when first learning basic concepts, such as multiplication as related to areas and volumes, can seriously undermine what is important: thinking and reasoning.

I’ll use an example of my experience in landscaping from a few years ago.

I went to a residence to measure an odd-shaped yard to determine the amount of topsoil and sod needed to finish that yard; this obviously affects my cost and is factored into the quote I give. I waited around the corner for nearly an hour, waiting for two members of another landscaping company to do the same. I was told by the homeowners afterward that these two characters had spent over an hour coming up with a quote; I was in and out within 10 minutes. My quote was $100 higher than that of the other company, but I still got the contract for some reason; I’ll let you figure out why. As an added bonus to me, the neighbor directly to the south also contracted me to do his yard, as did another neighbor two houses to the east… I give credit to my ability to think and reason; instilling confidence in those you work with and for is immeasurable.

Earl

I tend to agree with you on this one, Earl: one needs to learn the concept before turning to the machine to help out, but this is the same reason that I don’t like students turning to the standard algorithm before learning how to think through a problem. The algorithms we perform, whether on paper or in a calculator, are meaningless if you don’t understand what is going on.

The Russian peasant multiplication method you described before is a good example. It allows you to quickly multiply largish numbers, but more importantly, through the algorithm you gain a better understanding of what you are doing.
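For readers who haven’t seen it, the method can be sketched in a few lines (my own sketch, with made-up example numbers): repeatedly halve one factor (discarding remainders) and double the other, adding the doubled value whenever the halved one is odd.

```python
def peasant_multiply(a, b):
    """Russian peasant multiplication: halve a (dropping remainders),
    double b, and sum the b values wherever a was odd."""
    total = 0
    while a > 0:
        if a % 2 == 1:   # odd: this power of two appears in a's binary form
            total += b
        a //= 2          # halve, discarding the remainder
        b *= 2           # double
    return total

print(peasant_multiply(18, 23))  # 414
```

The understanding it builds is that multiplication decomposes over the binary expansion of one factor, which is the same insight behind how computers multiply.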

When I see people who have blindly learned the standard multiplication algorithm without understanding that they are essentially using the distributive property, or who get answers that make no sense (because they don’t know how to estimate the result of the operation), it is frustrating, because I know this almost certainly leads back to poor or incomplete instruction when they were learning the algorithm.
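That connection is easy to make explicit: the partial products the standard algorithm adds up are just a × b distributed over the decimal digits of b. A quick sketch of my own (the function name and numbers are illustrative):

```python
def partial_products(a, b):
    """Expand a * b into the partial products the standard algorithm
    adds, by distributing a over the decimal digits of b."""
    products = []
    place = 1
    while b > 0:
        digit = b % 10
        products.append(a * digit * place)  # a * (digit * 10^k)
        b //= 10
        place *= 10
    return products

# 23 * 47 = 23*7 + 23*40 = 161 + 920 = 1081
parts = partial_products(23, 47)
print(parts, sum(parts))  # [161, 920] 1081
```

Seeing the rows of the written algorithm as these distributed pieces is also what makes estimation possible: 23 × 47 had better land near 20 × 50 = 1000.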