Prior to Texas Instruments' commercial introduction of the handheld digital calculator in 1972 (TI was also the developer of the integrated circuit, a.k.a. the microchip), mathematical calculations were often performed on the slide rule, a handheld analog calculator that "miraculously" performs multiplication and division by adding and subtracting logarithmic scales via sliding bars. Using the slide rule back then was both a science and an art, an art now lost to engineers nurtured only on digital technologies. With the slide rule there was a certain innocence for those of us who used it, myself included, in that we had to accept a marginal level of accuracy (versus doing time-consuming, longhand mathematics) based upon the granularity of the scales on the sliding bars, i.e., the number of significant digits. With the digital calculator, by contrast, accuracy is restricted only by the number of digits on the display. For example, 8 / 3 on the slide rule would permit only a reading of 2.7 due to the principle of significant digits, whereas a digital calculator with an 8-digit display would show 2.6666666. One consequence of that innocence, now also lost, was having to determine the placement of the decimal point in a series of calculations. Using the same simple example, the manipulation of the slide rule is identical for 800,000 / 3 as it is for 8 / 3. Hence, the approximation of 2.7 read off the slide rule required the user to interpret it as 270,000, versus the 266,666 displayed on the digital calculator. And for a long series of calculations, keeping track of the decimal in one's head could indeed be a challenging effort.
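The contrast above can be made concrete with a short Python sketch (my illustration, not part of the original discussion). The `round_sig` helper is a hypothetical function that mimics a slide-rule reading by keeping only a couple of significant digits, versus the full-precision figure a digital display provides; note how the significant-digit answer is the same shape regardless of where the decimal falls.

```python
from math import floor, log10

def round_sig(x, sig=2):
    """Round x to `sig` significant digits, mimicking a slide-rule reading.

    A slide-rule scale only resolves the leading digits; the user supplies
    the decimal point mentally.
    """
    if x == 0:
        return 0.0
    # Shift the rounding position so that `sig` leading digits survive.
    return round(x, -int(floor(log10(abs(x)))) + (sig - 1))

# 8 / 3 on the slide rule: only the leading digits are readable.
print(round_sig(8 / 3))        # 2.7

# The manipulation is identical for 800,000 / 3 -- same scales, same
# reading; the user mentally interprets 2.7 as 270,000.
print(round_sig(800_000 / 3))  # 270000.0

# An 8-digit calculator display, by contrast, is limited only by its width.
print(f"{8 / 3:.7f}")          # 2.6666667
```

The scale invariance in the second call is exactly the "decimal point in one's head" skill described above: the slide rule returns the same significant digits for 8 / 3 and 800,000 / 3, leaving the magnitude to the user.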
The point of the above is that the innocence of approximating answers based upon significant digits and placing the decimal in one's head has been irreversibly taken from us by the onslaught of the digital age. Unfortunately, this loss of innocence, in my opinion, has instilled a fatuous belief in the importance of absolute accuracy, versus the pragmatic perspective of what is really required for a given situation. And I suggest this difference is proving unnecessarily expensive in the development and deployment of technologies across the majority of the rail industry, which does not deal with high-speed / high-density operations. Simply stated: Rail time IS NOT Real time for the majority of railroads across the globe.
A primary example of Rail vs. Real is the collection of PTC efforts in the U.S. For the pragmatist, the timeliness and accuracy of train position and speed required for traffic control, traffic management, and enforcement (my generic term for PTC) is, for the majority of railroads across the globe, rather basic and inexpensive to provide compared to the technical architecture being developed by the Interoperable Train Control (ITC) committees charged with designing PTC. To expand on my point, I refer you to my previous posting on this blog, "The Simplicity of Complexity," where I discuss the "80/20" rule, i.e., that 80% of an objective (choose a topic) can be achieved with 20% of the resources required to achieve 100% of the objective, if that 100% is even achievable. Applying the 80/20 rule to PTC and traffic control / management (see my postings on VCTC), the accuracy of train position AND speed required to support PTC and effective traffic control / management is greater than that provided by fixed-block signaling systems, but substantially less than real time, however counterintuitive that may seem to individuals raised on digital precision. That means the engineers charged with designing virtual positioning approaches (e.g., GPS) and the wireless data infrastructure to deliver that data for on-board enforcement, as well as to the back-office control / management systems, do not require anything approaching the complexity of the technologies being designed for PTC. Yet our current breed of technicians, raised exclusively in the environment of digital communications, video games, and the instantaneous, seemingly unlimited throughput of wired IT architecture, do not have that 80/20 perspective. For example, why does a railroad need a level of positioning accuracy for PTC that far exceeds the accuracy of the braking curve used for enforcement? And yet that is what ITC has designed.
As to wireless, why do the railroads need a 220 MHz network in parallel with the already installed 160 MHz infrastructure? Actually, I know the answer to this question, and it has to do with the failure of railroad technicians (and their management) to develop a pragmatic strategy for replacing their analog 160 MHz platform with a digital trunked system (e.g., TETRA) that would have greatly increased the capacity of that infrastructure, not only to handle PTC but also to readily handle the wireless voice requirements of crowded metropolitan areas such as Chicago and Kansas City. OK, so perhaps that last sentence is a bit technical, but it shouldn't have been for technicians who should have sought out pragmatic solutions.
The bottom line is that U.S. railroads, for PTC implementation alone, will be investing billions of dollars more than is really necessary. One could perhaps argue that such an investment will have other benefits in the future, such as enabling an industry-wide strategic operations plan involving the effective interchange of trains, chain of custody, asset and shipment management, etc. BUT that strategy does not exist ... nor will it until railroad executives are compensated, via their personal objectives and associated bonus programs, for taking such a viewpoint.
In a forthcoming posting I will be writing about Innocence Lost: Rail Operations. But unlike the innocence lost by engineers, that will be a very positive perspective, because it really addresses Ignorance Lost.