Introduction

This web page is a rambling illustrated history of my lifetime experience with computers. I hope you find some of the pictures and anecdotes interesting. I have 15×11 fanfold printouts of most of my programs going back to the 1970s, but they're stacked in the garage and inaccessible until I find time to extract them and scan some pages for inclusion here.

1968–1973 (Pre-Computers)

When I was a youngster, the word "computer" wasn't often heard outside of science fiction movies and novels. Although many big companies were using mainframe computers for serious data processing during the 1960s (and earlier), the public rarely heard about what was going on, and computers maintained an air of mystery and power. As a child, I can only recall seeing computers in TV documentaries or general science books.

The portrayal of computers in science fiction is a fascinating topic that would take me too far astray on this web page, but for now I must say that computers were supposedly so big and powerful that they had to be housed inside mountains and protected by radiation barriers (see Colossus: The Forbin Project), or they were only used by big governments and mad scientists (see Dr. Strangelove). Computers were portrayed in print as gigantic boxes with tape spools and a ticker-tape output which dispensed answers to men wearing lab coats. Many famous SF authors (Clarke, Asimov and others) portrayed computers in ways that seem utterly ludicrous to us now. There are no "thinking machines", there are no "robots" doing our dishes and ironing, and we can't even get a computer to do what the HAL-9000 was supposed to be able to do in the future year 2001. Only the Polish writer Stanisław Lem seems to have understood the gulf between mind and machine. Most incredibly, no writers or filmmakers seem to have predicted the arrival of the "personal computer".

But back in the real world ... here are some of the computing tools available to students before they could get access to computers. Click to pop up enlargements.

Frank Castle (first printed in 1909; this 1969 edition by Macmillan Press, New York) | A Hemmi Pi-fold slide rule from about 1970 | Hemmi instruction book cover | Hemmi page 51 | No. 259 duplex | The Sharp PC-1001 scientific calculator (1973)
1974

In our last year of high school, those of us in a maths class who didn't go on a school excursion were asked by our maths teacher: "What special topic, not necessarily in the books, would you like to look at?" I suggested something about computers, which was lucky, as our teacher's wife worked at Monash University and had access to a MONECS system. We were taught the basics of FORTRAN variables, looping and printing statements, and we punched the chads out of our decks of cards using a paperclip. The cards were run overnight and we examined the results the next day. And so my life changed when I learned I could programme machines and bring order to chaos. My favourite FORTRAN function was F8(), a random number generator, thereby cementing another life-long hobby interest in random numbers.
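I can't verify how the MONECS F8() function was implemented, but mainframe library generators of that era were typically simple linear congruential generators. As an illustration only (not the actual MONECS code), here is a Python sketch of one famous example from the period, IBM's RANDU:

```python
def randu(seed: int):
    """IBM's RANDU linear congruential generator, widely used on
    1960s-70s mainframes: seed <- seed * 65539 mod 2**31.
    The seed should be odd; yields floats in the open interval (0, 1)."""
    while True:
        seed = (seed * 65539) % 2**31
        yield seed / 2**31

gen = randu(1)
first = next(gen)   # deterministic: 65539 / 2**31
```

RANDU is now notorious because consecutive triples of its output fall on just 15 planes in 3-D space, a flaw that helped spark the serious study of pseudo-random number generators.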

A 1972 edition of the MINITRAN manual from Monash University | MINITRAN Page 3 | MINITRAN Page 4 | MINITRAN Page 5 | MIDITRAN Front Cover
1975–1977

In these years I attended three different tertiary institutions: Monash, Caulfield Institute and RMIT. Monash did not have an IT/computer course in 1975, and first-year students only had access to the MONECS machine mentioned in the previous section and the slightly more advanced MIDITRAN machine. The Burroughs B7800 mainframe was only available to students in advanced specialist courses.

In 1976, Caulfield Institute had an ICL 1900 mainframe running the GEORGE 3 operating system, available to all students. The first subject of the first-year course involved writing HP assembler, which I now think was a cruel introduction for unwary students; it culled the initial class size by half within several weeks. Then we suddenly switched to COBOL, which thankfully prepared me for the business world a couple of years later. FORTRAN was still very popular with students for hobby coding. Around this time I found the ALGOL language manual and was stunned by the indented, structured clarity of the code, something that wouldn't become common until PASCAL and C/C++ became established. At that time FORTRAN was still written in fixed columns with copious GOTO statements. I tried evangelising to my fellow students about how superior ALGOL was, but they scoffed at me.

In 1977 RMIT had a dedicated computer science course that taught maths, statistics, FORTRAN, ALGOL and IBM 360 assembler. The assembler was actually emulated by software written by one of the staff. It was an incredible coincidence that I was learning 360 assembler, as five years later it would become my primary language for a decade of my career as a systems programmer. RMIT had a CDC CYBER 170 mainframe that was accessible to all students.

ICL COBOL Front Cover | ICL COBOL Page 11 | Guide to FORTRAN IV programming | ALGOL 60 and FORTRAN IV
1977–1980

Computer Technology was a batch payroll data-entry and processing company. The mainframes were Honeywell 2060 and 2070 computers. Most of the software was written in COBOL, with a small but vital component written in Easycoder assembler so that it could fit into the 128KB of memory. Only one man in the company could write and maintain the assembler component. I also taught myself Easycoder, and although I did not work on the payroll system, I wrote many important productivity tools for the operations staff. The name Easycoder was a complete misnomer, as it was one of the most complex processor definitions and assembly instruction sets I have ever seen.

Photos from about 1978 of the Honeywell 2060 and 2070 computers. The 2070 had a dual monitor display. One screen showed the processing log and the other could show the print queues and secondary information. The 2060 only had a keyboard and some push buttons to control job processing state. The 2060 was eventually cannibalised and added to the 2070 to give it 256KB of memory, 4 line printers, 12 removable disk drives and 10 tape drives.
1980–1993

In 1980 the space previously occupied by the Honeywell 2060 was filled with a FACOM M-140F computer, representing a giant technological leap forward for the company. The M-140F had a whopping 4MB of memory, 4 tape drives, a video monitor (with a light-pen) and unimaginable amounts of disk space (which soon filled up). It was a mammoth effort to port the Honeywell COBOL code of the payroll systems over to the FACOM compiler and X8 operating system. The Easycoder code was converted to COBOL because there were no longer any memory restrictions. One pitfall that drove the programmers mad during the conversion was the dreaded S0C7 ABEND. The Honeywell COBOL compiler and instruction set would happily convert arbitrary binary data into a number, whereas the IBM-style instruction set was not forgiving and would crash with the S0C7 error at any attempt to convert invalid data into a number. There were so many of these errors during conversion and testing that some people joked that the old system must have been processing garbage as valid numeric data for a decade without anyone realising.
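The S0C7 "data exception" fires when an arithmetic instruction touches bytes that aren't valid packed decimal (COMP-3) data: each byte holds two digit nibbles, with the final nibble reserved for the sign. As a rough illustration of what the hardware was checking (a Python sketch, not actual mainframe behaviour):

```python
def unpack_comp3(data: bytes) -> int:
    """Decode IBM COMP-3 (packed decimal) bytes into an int.

    Raises ValueError on an invalid digit or sign nibble -- the same
    garbage data that triggered a S0C7 on IBM-style hardware.
    """
    nibbles = []
    for b in data:
        nibbles.append(b >> 4)      # high nibble
        nibbles.append(b & 0x0F)    # low nibble
    *digits, sign = nibbles         # last nibble is the sign
    if any(d > 9 for d in digits):
        raise ValueError("invalid digit nibble (would S0C7)")
    if sign not in (0xC, 0xD, 0xF): # C/F = positive, D = negative
        raise ValueError("invalid sign nibble (would S0C7)")
    value = int("".join(map(str, digits)) or "0")
    return -value if sign == 0xD else value

unpack_comp3(bytes([0x12, 0x3C]))   # +123
unpack_comp3(bytes([0x12, 0x3D]))   # -123
```

Raw text or uninitialised storage almost never forms valid nibbles, which is why moving alphanumeric garbage into a numeric field and then doing arithmetic on it blew up instantly on the IBM-style hardware while the Honeywell had quietly carried on.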

The fisheye photo shows me sitting at the M-140 operator's console in 1980. The other photos show the installation of a new M-180 in the Homebush computer centre in NSW in 1988.

The book on the left was one of the most important in my life for several years. The Systems Programmer's guide defined the instructions, service calls and macros that allowed us to extend and customise the operating system in imaginative and useful ways.

In 1987 the X8 operating system (later renamed OS/X8 FSP) was being phased out due to limits on its parallel processing capacity. It was replaced with OS/F4 MSP, Fujitsu's equivalent of IBM's MVS operating system (actually stolen from IBM). This was a new and useful learning experience, as I would later work on "real" IBM MVS machines.

I must pay tribute to the X8/FSP operating system, as I consider it to be one of the neatest, friendliest and most complete operating systems I have ever seen. Everything you could need was present in some form or other. The scripting languages (SCF, CLIST, etc) were way ahead of their time and even had parallelism constructs built in. The JCL (job control language) was far more powerful than its F4/MVS equivalent.

1993–2002
My first PC (May 1992)

In May 1992 I purchased my first PC, which cost a whopping $5000 at the time. It had 4MB of RAM, a 240MB hard drive, a 14" monitor and two floppy drives. Sound cards were extra and CD drives were not common for another year or so. DOS 5.0 and Windows 3.1 had been released only a couple of weeks earlier. I also purchased the Microsoft SDK 3.1 and C/C++ 7.0, which came with complete sets of manuals in a huge box. Over the coming months (and years), as I tried to digest the huge Windows API and the C++ language (as it evolved and became more and more complex), I discovered that basically everything in the PC world is a jumbled mess. In my mainframe days I experienced great discipline and documentation, and everything generally melded together, probably because software came from a small number of large established companies. In the PC and Windows world, I soon discovered that nothing is likely to work the first time. I am still mentally scarred by my first year of attempting to write Windows apps in C++ and MFC: wading through the jumbled mountains of functions and classes, then running endless experiments to find out how things really behaved, before trying to salvage the useful working scraps into real apps.

I must pay a small tribute to OS/2 which came and went in a blink in the mid 1990s. Had the drivers and applications arrived more rapidly, OS/2 could have been a Windows killer. I actually wrote COBOL, C and REXX for 18 months on OS/2 and found it to be a surprisingly pleasant experience. I was also very impressed with the dynamic REXX language which was invented by IBM as a replacement for the reliable (but verbose) CLIST language.

I must also praise the arrival of the Java language in the late 1990s. After struggling with C++ and MFC for years it was a pleasure to work with a new language and environment that had clearly been designed carefully and elegantly with the future in mind. I was also impressed by the Java community who shared high quality code and libraries. It's sobering now (in 2021) that Java is regarded as an "old" language and has fallen a long way behind modern languages like C#, Kotlin and Swift. Kotlin is now the preferred language for app developers on the Android platform, a sure sign that the end of the age of Java is within sight.

2002–2024

The "Computer Room" (Oct 2019) See also: Rear View
After the .NET Framework became established around 2002, I left the Java and C++ worlds behind and moved primarily over to writing software using the C# language. The .NET platform and the C# language along with the expanding sets of built-in libraries and NuGet packages have evolved into the most productive software ecosystem I have ever used. The only problem now is that (arguably) we're getting too much of a good thing… the software development ecosystem is becoming so large that it's becoming bewildering. Read on…

There are so many platforms, languages, kits, changing standards, fads, tools, delicate dependencies and operating systems that it's becoming a life-consuming task for a developer to try and cover all the required skills. Microsoft, Apple, Google, Amazon (and others) all have different cultures and styles and they're all pushing different Cloud platforms and their favourite languages like C#, Kotlin, Swift, Rust and Go. Sometimes I feel like I'm a struggling car mechanic working and training on six brands of cars simultaneously.

Software development platforms seem to be becoming more brittle. As I write software these days, whenever I'm using some new tool, kit or language I find I have to have a second browser window permanently open to search for reasons why everything I try doesn't f***ing work, and I can be sure that any or all of the following will happen:

  • An incomprehensible runtime crash, and Google searches for help will produce nothing useful.
  • Nothing will happen. No output, no results, with no clue why.
  • A set of delicate conditions that only Sherlock Holmes could uncover is required to make things work.
  • There will be a mountain of dependencies to get things working, which will require fresh installs and updates, possibly even complete operating system updates (for example, I need to purchase a new M1 Mac as I write this).

The worst thing in 21st-century software development is unfortunately The Web ... seriously, read my blog post I'm in the future of the web and it doesn't work.

The story ends here for now, for in the 30 years since I joined the home PC revolution there is nothing really startling to describe that you probably haven't seen yourself. I can assume that most people reading this have lived through the PC years and have seen the gigantic leaps in processing power, capacity and networking. I must say though that I am eternally grateful for the introduction of these things (roughly in ascending time order): the sound card, the CD drive, the DVD drive, the CD/DVD burner, the scanner, the LCD monitor, the memory stick, the portable hard drive, and cloud storage and hosting. The things I am most glad to see gone are floppy disks and the CRT monitor.

2024–Future

In late 2024 a lot of my regular work dried up, but I didn't go looking for fresh work; I decided to retire. See my blog post Easing into retirement for details about how I've become utterly fed-up with computers and software. I never thought the day would come, but my recent blog post titled Technology idiocy log provides growing examples of how I'm hovering on the edge of madness all the time because "everything f***ing doesn't work" (my epitaph).


See also: Windows 3.1 Screenshots | Random Numbers