These problems are a spin-off of ‘Programming Pearls’ by Jon Bentley. If you’d like a copy, please drop me an email at firstname.lastname@example.org!
#1. Is it cost effective to supply a programmer a terminal at home?
A terminal costs a few thousand dollars. A programmer's salary is about $50/hr, or roughly $100,000 a year (about $400 a day).
Thus, once a programmer has put in about 40-50 extra hours of work from home, the terminal has paid for itself and everything after that is free work for the employer. This is a frequently used tactic by management!
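The break-even point above can be sketched with a quick back-of-envelope calculation (the $2,000 terminal price is an assumed figure within the "few thousand dollars" range from the problem):

```python
# Break-even point for giving a programmer a terminal at home.
# Assumed figures: a $2,000 terminal and a $50/hr programmer salary.
terminal_cost = 2_000   # dollars, rough price of a terminal
hourly_rate = 50        # dollars per hour of programmer time

breakeven_hours = terminal_cost / hourly_rate
print(f"Terminal pays for itself after {breakeven_hours:.0f} extra hours")
# → Terminal pays for itself after 40 extra hours
```

With a pricier terminal, the same arithmetic lands in the 40-50 hour range quoted above.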
#2. Which has the most computational power: a second of supercomputer time, a minute of midi-computer time, an hour of microcomputer time or a day of BASIC on a personal computer?
Hint: In a second, a supercomputer can do a hundred million 64-bit floating point operations, a midi-computer can do one million 16-bit integer additions, a microcomputer can execute half a million 8-bit instructions, and BASIC on a personal computer can execute one hundred instructions.
While this may look daunting, we only need to convert everything to a common scale: multiply the midi-computer, microcomputer and BASIC rates by a minute, an hour and a day respectively, weighting each operation by its bit width.
We then find the first three machines almost on the same track: roughly 10^8 * 64 ≈ 6.4 * 10^9 bit-operations for the supercomputer, 10^6 * 16 * 60 ≈ 10^9 for the midi-computer, and 5 * 10^5 * 8 * 3600 ≈ 1.4 * 10^10 for the microcomputer. But a day of BASIC on a personal computer, at only 100 * 86400 ≈ 8.6 * 10^6 instructions, is way behind!
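The comparison above can be tabulated directly. The bit widths come from the hint; the hint gives no width for BASIC instructions, so counting each as 8 bits is a generous assumption:

```python
# Total bit-operations each machine performs in its allotted time.
# Rates and bit widths are from the problem's hint; BASIC's bit width
# is not given, so we generously assume 8 bits per instruction.
machines = {
    "supercomputer (1 s)":   100_000_000 * 64 * 1,       # 64-bit flops
    "midi-computer (1 min)": 1_000_000   * 16 * 60,      # 16-bit adds
    "microcomputer (1 hr)":  500_000     * 8  * 3600,    # 8-bit instructions
    "BASIC PC (1 day)":      100         * 8  * 86_400,  # assumed 8-bit
}
for name, bit_ops in sorted(machines.items(), key=lambda kv: -kv[1]):
    print(f"{name:>21}: {bit_ops:.2e} bit-operations")
```

The first three cluster within about an order of magnitude of each other, while the BASIC machine trails them by a factor of a hundred or more.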
#3. Suppose the world is slowed down by a factor of a million. How long does it take for your computer to execute an instruction? How long does it take for you to type your name?
It is much easier to understand such numbers if we turn them into scientific notation.
A computer that usually takes 1 microsecond to execute an instruction will now take a second. (Remember, 1 microsecond = 10^-6 seconds)
And if you type your name in 2 seconds, it will now take you almost a month! (2 * 10^6 seconds is about 23 days.)
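The two conversions above can be checked in a few lines:

```python
# Scale everyday times by the slowdown factor of one million.
SLOWDOWN = 1_000_000

instruction_time = 1e-6 * SLOWDOWN        # 1 microsecond, slowed down
typing_time_s = 2 * SLOWDOWN              # 2 seconds of typing, slowed down
typing_time_days = typing_time_s / 86_400 # 86,400 seconds in a day

print(f"One instruction now takes {instruction_time:.0f} second(s)")
print(f"Typing your name now takes about {typing_time_days:.0f} days")
# → One instruction now takes 1 second(s)
# → Typing your name now takes about 23 days
```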