Ethical software development requires considering whether a development may, in the foreseeable future, cause damage or harm; however, “software design decisions often depend on more than one ethical issue, possibly conflicting, where the appropriate ethical choice is not always clear cut” (Thomson & Schmoldt, 2001). In the 1970s, the difference between a two-digit and a four-digit date representation made a significant difference to the amount of expensive (and perhaps unavailable) memory a program consumed. In addition, developers were forced by their specification briefs to adhere to pre-printed forms that carried only two digits for the year, and the expected longevity of their programs was a few years rather than decades. I see these as the reasons why the developers themselves were not unethical.
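To make the technical fault concrete, the sketch below (my own illustration, not from the sources above) shows how arithmetic on two-digit years wraps at the century boundary, which is exactly the class of error that surfaced in 2000:

```python
def years_between(start_yy: int, end_yy: int) -> int:
    """Naive interval arithmetic on two-digit years, as 1970s storage forced."""
    return end_yy - start_yy

# An account opened in 1985 (stored as 85) and checked in 2000 (stored as 00):
print(years_between(85, 0))  # -85 rather than 15: the Y2K arithmetic fault
```

Any calculation of an age, an expiry, or an interest period crossing the year 2000 produced a negative or wildly wrong result in this way.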
However, it has been argued that those who briefed the developers and controlled the process and budgets for critical developments were extremely naive, rather than unethical, in not considering the implications; some time spent on mathematical modelling would have highlighted the potential problems. For example, satellite programs intended to be launched and never retrieved could easily have had an estimated life span calculated.
We can only judge the ethics of these developments in retrospect, and in the event there were no major consequences when the two-digit year representation ceased to be adequate (something that happens at the turn of every century, not just every millennium).
Increasing the number of digits used to represent the year only buys time before the software becomes outmoded. The three-digit date representation used today is perfectly ethical, albeit confusing to the human eye because it is non-traditional; it is used only as an internal mathematical base (four digits are usually still displayed to the user), and it remains safe provided such systems are not used in critical applications after the year 2999, or we will experience the “millennium bug” again.
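One common way to get a three-digit internal representation is to store the year as an offset from a base year while displaying all four digits to the user; C's `struct tm` uses exactly this convention with a base of 1900. A minimal sketch (the base year here is my assumption for illustration):

```python
BASE_YEAR = 1900  # assumed base; C's struct tm stores tm_year as year - 1900

def store_year(full_year: int) -> int:
    """Internal value: a three-digit offset from the base year."""
    return full_year - BASE_YEAR

def display_year(stored: int) -> int:
    """Four digits are still shown to the user of the software."""
    return BASE_YEAR + stored

print(store_year(2009))   # 109: three digits, non-traditional to the eye
print(display_year(109))  # 2009: the traditional form the user sees
```

Note that the horizon depends on the base chosen: with a base of 1900 three digits last until 2899, while a base of 2000 reaches 2999.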
The four-digit date, in the same respect, takes us to the year 9999, or possibly to 11999 if we use software to establish that the first valid date of the new systems is in the year 2000 (i.e. a four-digit date of 1883 would then be interpreted as 11883 rather than 01883). There is already talk of a “Year 10,000 Problem” and of the five digits needed to represent the year beyond it. In the light of the unexpected date problem that the programs of the 1970s met in 2000, it would seem logical to prepare for such eventualities, given that we are not constrained by the problems of the 1970s: memory is cheap and plentiful, we are no longer reliant on print, and we have the benefit of hindsight to give us foresight. Moving from three digits to four seems reasonable; however, given the pace of software development technology, as a software specifier I would expect no need to take five digits into account, as the chance of software written today still being in use in 8,000 years is effectively zero. Perhaps I am being as naive as they were in the 1970s.
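The windowing interpretation described above can be sketched as follows (the epoch constant is my assumption, standing in for the system establishing that no date precedes the year 2000):

```python
EPOCH = 2000  # assumed: the first valid date of the new systems

def interpret_year(four_digit: int) -> int:
    """If no date can precede EPOCH, small four-digit values must mean 10000+."""
    return four_digit if four_digit >= EPOCH else 10000 + four_digit

print(interpret_year(1883))  # 11883, not 01883, per the scheme described above
print(interpret_year(2525))  # 2525, within the window, taken at face value
```

This is the same pivot-year trick used in Y2K-era “date windowing” remediation, simply shifted up by a factor of ten.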
BBC News, US satellites safe after Y2K glitch [Online]. Available at: http://news.bbc.co.uk/2/hi/americas/589836.stm (Accessed 8 November 2009).
Thomson & Schmoldt (2001) Ethics in computer software design and development [Online]. Available at: http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6T5M-429Y516-8 (Accessed 8 November 2009).
White, D., Y2K Frequently Asked Questions [Online]. Available at: http://homepages.wmich.edu/~rea/Y2K/FAQ.html (Accessed 8 November 2009).
Wikipedia, Year 2000 Problem [Online]. Available at: http://en.wikipedia.org/wiki/Year_2000_problem (Accessed 8 November 2009).
Wikipedia, Year 10,000 Problem [Online]. Available at: http://en.wikipedia.org/wiki/Year_10,000_problem (Accessed 8 November 2009).