Webster’s dictionary defines technology as “the practical application of knowledge especially in a particular area” and “a manner of accomplishing a task, especially using technical processes or methods.” The etymology of both words, technology and technical, is the Greek word techne, meaning art or skill, maybe even craftsmanship.
Steve Wozniak wrote a piece about technology in the latest issue of Popular Mechanics. To my reading, Mr. Wozniak has a tough time making a point; in fact, I am not sure what point he is trying to make. Something about the pyramids of Egypt not being a technology.
I am not an expert on the history of architecture, but it should be clear that the pyramids were the manifestation of technology. The method of building something the size and complexity of the 138 pyramids in Egypt in less than a hundred years is clearly a process that involves considerable skill. This is all the clearer from the fact that no one was able to explain how it was done until very recently. That’s a 4,000-year-old technology.
Maybe stone construction is not a major technology in Mr. Wozniak’s view. He does, however, define technology as anything that increases our productivity: “Technology is really an amplifier of our abilities.” That’s probably a good operating definition for some situations, but it is not very helpful.
The transistor is Wozniak’s example of real technology, and he holds that all the enabling technology of modern electronics rests on it. I beg to differ. The underlying “technology” that created the need for something like the transistor was the notion of the computer, a general-purpose logic-solving machine, as proposed by Babbage and Turing. If no “computers” had been created experimentally out of mechanical and electrical technologies, the need for the transistor would never have arisen.
The transistor was a response to the inefficient methods of building logic-solving circuitry. The first computers were relay and vacuum-tube systems of massive size that were difficult to program. Though these systems were considered largely academic curiosities, their practical applications soon became clear. Solving tables of trajectory calculations for artillery, based on a particular cannon, shell size, and elevation angle, meant a huge increase in precision for the military. Had it not been for Turing’s breaking of the Nazi intelligence codes, we might all be speaking German today.
The notion of a calculating engine to solve problems became the computer of recent vintage because the speed and affordability of this capability have increased a millionfold since its inception. The complexity of programming has largely been hidden by operating systems and “canned” software programs that perform myriad tasks at ever-increasing speeds.
So which came first, the chip or the computer?