Wednesday, April 1, 2020

The Year 2000 Essays - Software Bugs, Computer Programming

The Year 2000

Introduction

Many may dismiss the predictions of worldwide chaos on January 1, 2000, when computers programmed with two-digit year fields would mistake the new year for 1900 and break down. However, we need not wait for the turn of the century for the trouble: signs of early trouble are already everywhere, sufficient to warrant that both IT and business managers take the issue seriously if they have not already done so. What seemed a reasonable solution to a costly storage problem in the past would now cost organizations worldwide, by some estimates, 600 billion dollars. No matter what the final figure turns out to be, even a conservative 300 billion dollars is no pocket change. All concerned parties must have a thorough understanding of the problem, its solutions, and its possible ramifications.

Importance of the Issue

IT managers are not the only ones who need to understand the depth of the Y2K problem. Business managers probably have more at stake here than anyone else. If timely solutions are not achieved, many businesses stand to lose billions of dollars in lost revenue, and that figure does not include possible losses resulting from litigation and out-of-court settlements. Many firms, including some large ones, have continued to drag their feet on fixing Y2K-related problems, and companies with Y2K problems now often cannot find people to work on them; shortages of qualified Y2K staff are evident globally. January 1, 2000 is an inflexible date that is sure to come without mercy and without any possibility of extension. If companies are not addressing the problem by now, they are simply playing catch-up (14). The good news is that the technical know-how exists and many tools are available. For many organizations the problem can still be adequately addressed even if they start now, though at a higher cost, of course.

Historical Perspective

In 1956, Howard Aiken, a computer pioneer from Harvard University, remarked: "If it should ever turn out that the basic logic of a machine designed for the numerical solution of differential equations coincides with the logic of a machine intended to make bills for a department store, I would regard this as the most amazing coincidence that I have ever encountered." (1) First, it was truly an amazing coincidence. Second, the first generation of computer programmers were also right to record years as two-digit numbers: it saved costly storage capacity. Third, by another amazing chance, the result of a correct decision will be an expensive (maybe the most expensive) industrial accident of our time.

Many articles have included stories that show the date problem surfacing as early as 1993. That year, the Associated Press reported that Mary Bandar of Minnesota was invited to attend kindergarten classes because she was born in '88; she was in fact born in 1888 and was 104 years old. C.G. Blodgett's auto insurance premium tripled when he was reclassified as a youthful high-risk driver; he had turned 101.

What is the Year 2000 problem?

It is not one problem but a series of problems, involving software, computer hardware, data, people, and large amounts of money. The nature of these problems can be explained simply: until 1989, all of the standards followed to create computer programs stated that only the last two digits would be used to identify the year. As the millennium nears, most computers would therefore regard the year 2000 as 1900.
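This failure is easy to demonstrate. The short Python sketch below is an illustration added for clarity; the function, names, and values are hypothetical rather than taken from any real system.

# A minimal sketch of the two-digit year bug: the program stores only
# the last two digits of each year and has no century information.

def age_two_digit(birth_yy, current_yy):
    """Age as computed by a system that keeps only two-digit years."""
    return current_yy - birth_yy

# In 1999 the shortcut still works: someone born in 1962 ("62") is 37.
print(age_two_digit(62, 99))   # -> 37

# On January 1, 2000 the current-year field rolls over to "00",
# which the subtraction effectively treats as 1900.
print(age_two_digit(62, 0))    # -> -62

# The same logic explains the anecdotes above: a birth year stored as
# "88" is indistinguishable from 1888, 1988, or 2088.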
In the late 1800s, Mr. Hollerith invented the punch card, with 80 columns, to help complete the U.S. Census on time. The same cards remained in use for programming until recent times. Eighty characters of information was barely enough for a full name and an address, so to save space for more important information, programmers designated a two-digit year field, assuming every year would carry the same prefix of 19. Even when computers were later equipped with larger storage capacities in magnetic form, memory remained the most expensive part of the machine, and programmers continued to use the two-digit year field to save memory; the record layout sketched below illustrates the convention.

How the problem continued

Programmers of the 1960s and 1970s assumed that these programs would long since have been replaced with new ones before the turn of the century. Though computer hardware has come a long way in the past 30 years, the mainframe machine is
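To make the storage argument concrete, the sketch below (again in Python, with a hypothetical field layout, address, and account number) shows an 80-column, card-style record in which the year occupies only two columns and the parser silently assumes the prefix 19.

# Hypothetical 80-column card image: every field has a fixed width,
# and the birth year gets just two columns to leave room for other data.
record = (
    "MARY BANDAR".ljust(30)      # columns  1-30: name
    + "123 MAIN ST".ljust(40)    # columns 31-70: street address
    + "88"                       # columns 71-72: birth year, two digits only
    + "00001234"                 # columns 73-80: account number
)
assert len(record) == 80

name = record[0:30].rstrip()
birth_yy = int(record[70:72])

# The parsing code quietly assumes every year begins with "19" ...
birth_year = 1900 + birth_yy            # 1888 comes back as 1988
print(name, "was born in", birth_year)  # off by a full century

# ... and the two columns saved per record were a real economy when
# millions of such records had to fit on cards or costly magnetic storage.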
