Algebra
One of the bigger advances made by humanity in the past two or three thousand years was the development of algebra, first by Hindu mathematicians and then by al-Khwarizmi. This was the method by which arithmetic was turned into a detective game in which a certain unknown quantity was to be found. I feel that a large part of this advance was due to using single letters to denote unknowns. As long as words were used, the whole idea was clumsy. For example, to solve "three times a certain number plus five equals four," you say the same thing, only less by five: "three times a certain number plus five minus five equals four minus five." Then you have to write out the simplification of that, and so forth. It is a slow process, and one that many would not have the patience for. By using a single letter such as x to stand for the unknown, one can write it simply as:
3x + 5 = 4
Subtracting 5 from both sides yields
3x + 5 - 5 = 4 - 5, or
3x = -1,
and dividing both sides by 3 gives x = -1/3.
It is much easier to deal with. Note also the simplification that results from omitting the times or multiplication sign everywhere except between two numbers.
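The same two moves, subtracting from both sides and then dividing, can be sketched in a few lines of Python. (The names a, b, and c are my own labels for the coefficients of ax + b = c; they are not part of the original example.)

```python
from fractions import Fraction

a, b, c = 3, 5, 4        # the equation a*x + b = c, i.e. 3x + 5 = 4

rhs = c - b              # subtract 5 from both sides: 3x = -1
x = Fraction(rhs, a)     # divide both sides by 3: x = -1/3

print(x)                 # -1/3
```

Fraction keeps the arithmetic exact, so the answer comes out as -1/3 rather than a rounded decimal.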
Unfortunately, we seem to be getting away from this compact notation. Computer languages commonly allow multiple-character unknowns, or variables, such as COST or ACCOUNTS_PAYABLE. If you do this, you can no longer denote multiplication by juxtaposition. You have to say COST*HOURS. Further, statements can get quite complicated such as:
Set Northwind_DatabaseSet = ThisDatabase.OpenRecordset("MYTABLE", dbOpenDynaset)
No wonder people can't learn Visual Basic or C++. This sort of thing is just as long-winded as "Three times a certain number plus five equals four." We are back to the days before al-Khwarizmi. That is why I break the rules and frequently use i or a to denote a variable. It's time we shortened our variable names to make computer programming less of a tedious chore.