Okay, change of gears from the doom and gloom of my last post. Thanks to the US Army, I was thrust into cutting-edge technology when I was just 19. I had to learn all about microwave technology and a brand new type of communications: satellite. The first four years of my army career were spent learning such things. At my stop in Italy, I also had to learn about tropospheric scatter communications, digital communications, and my first computer. That was in 1970. Yes, I am that old.
In 1981 I saw my first minicomputer, a DEC PDP-1. The information to boot it had to be toggled in using a bank of switches on the front of the machine. But that was a big improvement over the torn-tape (paper tape) boot we had to use on my first computer.
Shortly after that I went to work for MIT Lincoln Labs and did satellite tracking. We had a huge Sperry analog computer and the DEC PDP-11, the latest thing in 1981. Come 1985, I was working at MIT’s campus and had been assigned the position of PC hardware engineer. I was a department of one — me — who had to be the guru of the IBM PC-XT. That computer was one step removed from IBM’s introductory PC, the first PC of any make offered for public sale. Remember, even though Apple developed the first home computer, they have always claimed they are not a PC. That was fine by me. Apple had an aversion to letting developers get at the inner workings of their computer, which meant writing our own software for it was somewhere between very difficult and impossible. From a laboratory perspective that mattered: I was developing for MIT’s numerous engineering laboratories, and we had to be able to write very specialized data-gathering programs.
After I left MIT in 1987, I went to work for the US Department of Transportation. They dearly wanted me because I was something of an expert, for them anyway, in an operating system that few understood back then: UNIX. It was the platform of choice at MIT. A year after I got there I suggested we install the Internet, and almost without exception, people said, “What’s that?” The Internet back then had grown out of the ARPANET, a network built by the Defense Advanced Research Projects Agency (DARPA). The Department of Defense asked several universities — Carnegie Mellon, Michigan, and MIT — to help advance the network as it was starting to grow in the mid-1980s.
About 1991 I had a boss who insisted I get certified in the predecessor to HTML, SGML (Standard Generalized Markup Language). This was a format used to print documents in the late 1980s and 1990s. Wang computers and a company called Interleaf had computers dedicated to word processing. But in 1988 I had been introduced to the original Microsoft Word, and then in 1991 to WordPerfect. I told my boss that the need for dedicated word processors was no longer valid. He reminded me that he had a PhD, this is true, and that I did not. I refused to learn it, saying it was worthless, and he shipped me off to another division. Thank you, Dr. Bob. You were a horse’s ass, but you helped me greatly.
In 1988 I headed up a US Air Force project to computerize the load planning of their C-141 and C-5A fleet. I used a platform called an “expert system.” This software incorporated the early ideas of artificial intelligence into mainstream computing. It was unpopular because of how specialized and how difficult it was supposed to be. I did not agree, but what did I know?
By 1993 we were using the first GPS satellites to track an Air Force aircraft. Our test used the Air Force Chief of Staff’s B-737. It was a total success.
After that I worked on an FAA project to digitize their pilot certification center in a manner that allowed a photograph to be quickly retrieved from mass storage. In 1998, a 10-second retrieval time was fast. Now people would be looking to see what was slowing things down.
Lastly, I worked on a thing called wake vortex turbulence on parallel runways. When an aircraft lands it leaves a wake, the same as you can see behind a boat. This wake can drift onto a parallel runway and cause problems for other aircraft that are landing or taking off. I had to mesh together millions of bits of data, create a graphical representation of the wake, and statistically analyze it. I did get credit for publishing a scholarly technical paper with the AIAA (American Institute of Aeronautics and Astronautics). If you need something to put you to sleep, I will happily forward you a copy.
The thing is, I have gone from the Intel 8086 processor of the IBM PC-XT to my triple-core Pentium today. We are ramping up our technology at a pace that is unheard of. Apple’s iPad and Amazon’s Kindle Fire are examples of things still on the R&D tables just five years ago, and these are fairly rudimentary. The things you can do on those platforms are truly revolutionary, but not particularly practical. That was the first PC’s problem too. That $3000 IBM PC-XT could not do much, but boy was it fun to brag about having one. Unlike the late 1980s, though, we now have plenty of companies who will rush to write apps for those devices to suit everyone’s needs. I am already looking for a “day planner” app that is not yet available for my Kindle Fire. The device will be a wonderful data collection tool once apps of that sort are written for it. Then it will have to get faster and support more graphics and multiple programs running at the same time. That will happen, probably within the next year.
Along with the computer advances come advances in telephony. Smartphones have become a must-have item. Ten years ago, just having a cell phone was a big deal. I got my first cell in 1996 from what was then Cell One. I fully expect the cellular industry to wipe out the desktop phone in the home. In time, businesses will also drop them. It only makes sense to have mobile phones as dedicated lines — a smart way of doing business.
In the next year or two the swipe style of credit card usage will start to disappear, replaced by smart-chip credit cards. Those are cards that only require you to tap them on a pad, which automatically picks up your personal data. They will also greatly increase card security, since creating bogus cards becomes extremely difficult.
Pretty soon we will all have identity cards that will allow us to transfer basic information about ourselves — address, phone number, etc. — to establishments like hotels and other places requiring such info. Again, a smart chip will make this possible.
Many colleges are going to offer “distance education.” Courses will be taught entirely over the Internet. I have taken one such course. It was particularly good because the video of each class was always there to review. It was like having instant notes on the entire class.
The growth industry of 2012 to 2020 is going to be wind farming. People who invest in such properties will be making money hand over fist. Along with solar cells and hydroelectric, wind is the epitome of green power production. Solar cells still have a way to go, but wind turbines are here now, and working well. They will become a regular part of our horizon. It won’t be long before people wonder why anyone would have objected to such an idea.
America is going to be the world leader for the foreseeable future in the export of food: corn, wheat, oats, barley, and other grains, along with vegetables. The world is starving, and we have both the land and the resources to export ever-growing supplies of food to a hungry world.
If you want to know where we are in computing, imagine it is 1914 and Henry Ford has just put his assembly line into full swing. The auto industry was going through rapid growing pains, but it grew. That is about where we are in computing. We are in the late Ford Model-T era.