Saturday 10 May, 2008

The computers that were...

It was the year 1987 and I was in class VI of a boarding school in Mussoorie. Our class teacher came in one morning and told us that we would start having a computer class once a week. There were mixed reactions among us: some, like me, were very excited at the possibility, some were sad that we would have to learn one more subject, and some did not care at all. Our principal had decided that all students from class VI onwards needed computer education. What a visionary!

The next Saturday morning we gathered in front of the Computer Lab, which had been established a year earlier but until then was used only by senior students of classes X, XI and XII. That was when I saw and touched the first computer of my life. To me computers were mysterious things capable of doing amazing tasks.

That computer was a BBC Micro, a machine housed in a single unit containing both the CPU and the keyboard, attached to a 14'' green monochrome monitor. It had an amazing 32K of RAM and 32K of ROM, 16K of which held the operating system, the Acorn Machine Operating System, and the other 16K the BASIC interpreter. Any programming was to be done in BASIC. It had an external 5¼-inch floppy drive for persistent storage. It was on this machine that I wrote my first program, which added two numbers.



As I moved into the senior classes, our school managed to get HCL PCs and we moved over from the BBC Micro. These machines seemed light years ahead, with 640K of RAM, a separate CPU unit with built-in floppy drives, and a separate keyboard. The processor, as I now know, was an Intel 8086, and the computer ran the IBM DOS 3.0 operating system. It could run many languages besides BASIC and seemed extremely fast to me. The output of a "DIR" command would just zoom past.



In school we then moved on to a later version of the HCL PC called the PC-XT, which was based on the Intel 8088 processor and had a 20MB hard drive. At that point in time it seemed an infinite amount of storage, and the fact that you didn't need a floppy any more was amazing. We also started using the Pascal and C programming languages, which seemed ages ahead of BASIC.


When I was in class XII, for the end-of-year project we wrote software in Pascal that could manage the customer records of a telephone company and print telephone bills. Come to think of it, I have not moved very far from that project.

During my holidays, one of my father's friends, who ran a DTP (desktop publishing) business, had a few computers. He invited me to visit his office and allowed me to use some of them. There, for the first time, I saw a Modi Olivetti PC AT-286. It was the fastest computer I had seen so far, with 1MB of RAM and a 12MHz processor. It ran an amazing program called Windows 3.11, which had a graphical user interface; however, Windows 3.11 was just software and the OS was still DOS 5.0. The PC also had a mouse connected to it, and using that was pretty cool.


Then I went to college thinking that my seven years of computer education in school would put me ages ahead of the rest of my classmates. I realized how wrong I was when I saw the first computer in my college. There I had my first encounter with a computer running the UNIX operating system.

As I walked onto the 3rd floor of the computer centre (the only floor accessible to first-year students), all I could see were rows of monitors with keyboards in front of them. I looked for the CPUs but couldn't find any. I was told that the lab had only UNIX servers with dumb terminals. "Dumb terminal" sounded very dumb to me. There were two UNIX servers with 12 dumb terminals attached to them. For those who don't know what a dumb terminal is: it is a screen and keyboard attached to a central server, acting only as a VDU (Visual Display Unit) and an input device, with no processing power of its own. You flick a switch on the terminal, a UNIX login screen appears, and you log in and start a session.

UNIX was difficult at first, especially with only the vi editor, but it became fun eventually. Being on a network with many other people had its own pleasures, such as the UNIX talk program. That was also when I made my first attempt at hacking, by emulating a dummy login screen to capture the passwords of unsuspecting friends. Running a C program was not easy, with the compile, link and run-the-"a.out" cycle every single time, but it was nevertheless a lot of fun.
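For anyone who never went through that cycle, here is a rough sketch of what it looked like: the kind of two-number addition I started out with, written in present-day C rather than exactly what we typed then (the file name add.c is just illustrative).

    /* add.c - adds two numbers, much like the first programs mentioned above */
    #include <stdio.h>

    int main(void)
    {
        int a = 2, b = 3;
        printf("%d + %d = %d\n", a, b, a + b);
        return 0;
    }

Compiling and running it went something like this: "cc add.c" to compile and link, which by default produced an executable called a.out, and then running that a.out by name. Forget a step or make a typo and you went round the whole edit-compile-run loop again.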

The next change came a year later, when our college established a PC lab. The lab consisted of Pentium 1-based PCs, each running the Red Hat Linux operating system. All the PCs were on a network based on coaxial cables and had an amazing thing called a connection to the "INTERNET". The internet connection was provided by the government's ERNET via a VSAT antenna and was available 6 hours a day; in those 6 hours our mail server would send and receive mail from the outside world. We could use Linux mail to email any of our friends (though very few of our friends had access to email then) and the Lynx text-based browser to browse the net. The popular search engines were Lycos, Yahoo and AltaVista. In 1996 access to the internet was a pretty cool thing, and we were able to register usernames on Yahoo and Hotmail that people would be jealous of now!

However, things moved on, and soon the PCs in our college ran the Windows 95 operating system and things were much more like what we are used to now. Yet when I meet people who came to computers after the PC boom era, I feel they have a different view of them. That's when I look back and think about the computers that were…

Tuesday 2 October, 2007

IT Schedules & Business Priorities

IT schedules are extremely difficult to predict and plan. This is at odds with how business operates: with plans and procedures. A schedule that is too aggressive or too relaxed has a negative impact on optimum progress. An overly aggressive schedule leads to technical decisions made to fit the schedule rather than the technical constraints, and that will hit the schedule sooner or later anyway.

A relaxed schedule makes the IT team work at a sub-optimal level and hence also has a negative impact. From a pure technology perspective a relaxed schedule is better than an aggressive one: though more expensive, it is at least predictable. A perfect schedule is impossible to produce.

A way forward to get IT work done at an optimum pace is Agile. With Agile a long-term plan is not defined, so IT delivery proceeds at the best possible pace, but then IT cannot tell the business what it can deliver in the long term. This makes things very difficult for the business, because without predictability it is very hard to operate any business.

Business wants IT to be agile and able to turn things around quickly; it also wants IT to be cost effective; but on the other hand it wants very high predictability and certainty. These are contradictory requirements, like having your cake and eating it too.

But generally this is what happens:
- Business wants IT to deliver some business requirements
- IT would give a relaxed plan
- Business for obvious reasons would want it much earlier
- IT would agree to an early delivery but would fail to deliver
- Business would blame IT, but IT would say "I told you so"
- Eventually the IT delivery would take much longer than it should, at much greater cost and with many compromises

How do we solve this never-ending story?
- Use agile as the delivery methodology (XP/Scrum)
- Business to take the bitter pill that uncertainty is a way of life in the changed business and technological scenario
- IT to take the bitter pill that there are no permanent or perfect designs, and that architectural purity is only good for "over a beer" chats
- Stop talking and start working