From Under the San Francisco Sky ~Kyle's monthly topics~ (English)

Vol.6 [Final Installment] Thank you and goodbye!

Estimated reading time: about 6.5 minutes

I know that "all good things must come to an end," but still it is with a very heavy heart that I write this, my last column for Software Design. It has been a very good run, about twenty years writing for Gihyo, and I am truly fortunate to have been able to steadily contribute, even in such a small way, to Japanese computer culture for such a long period of time. Few people, almost none actually, ever have that chance, and it was really a matter of fortuitous chance that I did. As I mentioned a few months ago, it was an attendee at a Macintosh user group I organized who introduced himself as the editor of a new magazine about the Macintosh and said he wanted to translate the articles I was writing in the somewhat amateurish newsletter my friend and I put together each month for our fellow Mac enthusiasts. I could not have imagined that what began in May of 1989 would continue until 2009, nor could I have dreamed how much our computers would change over the decades. I remember the first time I attached a LaserWriter to a Mac and looked, really looked, at the output. By today's standards, of course, 300 x 300 dpi is primitive. My Canon PIXMA iP3000 Photo Printer prints full color at 4,800 x 1,200 dpi, and despite its age the color is still beautiful to me. Even more stunning is the fact that today's printers can output a full-color page at 4,800 x 2,400 dpi in about 90 seconds, all for just a few hundred dollars. When it was introduced in January 1985, the Apple LaserWriter used a 12MHz Motorola 68000 with 1.5MB RAM and 512K ROM, and it cost US$6,995. It was one of those moments when history changed.

The LaserWriter (actually, any printer really) added a completely new dimension to personal computing, but so did the modem. It was the Bell 103A dataset standard, introduced in 1962, that eventually made it possible for Hayes to create its Smartmodem in 1981. While the 103A provided full-duplex service at 300 baud over normal phone lines, it was the Hayes command set that made it easy for a computer to control its interaction with the phone line. How well I remember typing ATDT and then a phone number into my copy of Scott Watson's Red Ryder terminal emulator. It was such a thrill to call a friend's computer and have what was really a private Instant Messenger system between us. And now, just consider for a moment what we have: Gigabit Ethernet. 24x7 100Mbps fiber connections. Video and voice streamed point-to-point and multicast.
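For readers too young to have dialed a modem by hand: the beauty of the Hayes command set was that it was nothing more than plain ASCII strings written to the serial port, with the modem answering in plain ASCII result codes like CONNECT or BUSY. The helper functions below are purely illustrative, not part of any real modem library, but they sketch the idea:

```python
# A minimal sketch of the Hayes command set as plain ASCII strings.
# A terminal program like Red Ryder simply wrote commands such as
# "ATDT..." to the serial port and read back result codes like
# "CONNECT 300" or "BUSY". The function names here are hypothetical.

def dial_command(number: str, tone: bool = True) -> str:
    """Build a Hayes dial command: ATDT for tone dialing, ATDP for pulse.

    Non-digit characters (hyphens, spaces) are stripped, and the
    command is terminated with a carriage return, as the modem expects.
    """
    digits = "".join(ch for ch in number if ch.isdigit())
    return ("ATDT" if tone else "ATDP") + digits + "\r"

def connected(result_line: str) -> bool:
    """Return True if a modem result code indicates a live connection."""
    return result_line.strip().upper().startswith("CONNECT")

# e.g. dial_command("415-555-1212") -> "ATDT4155551212\r"
# e.g. connected("CONNECT 300")     -> True
```

Writing `dial_command("415-555-1212")` to the serial line and waiting for a line satisfying `connected(...)` was, in essence, the whole protocol; everything else (hang up with ATH, reset with ATZ) was equally simple.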

Such advances are stunning, and I think it is worth stepping back from time to time and realizing just how far we have advanced from the first days of the personal computer. Consider the airplane. The Wright brothers, Wilbur and Orville, conducted the first successful powered, piloted flight in history on December 17, 1903. Orville flew only 12 seconds that day, and Wilbur flew more than five minutes for the first time only on November 9, 1904. But on July 21, 1969, two men walked on the Moon. Just think about that for a moment. In less than a lifetime, we went from the Wright Flyer to the Saturn V. Fifty-six years from ENIAC brought us to 2002, and it was at WWDC 2002 where Steve Jobs placed the venerable MacOS 9 in a coffin and we watched it slowly sink down into the stage.

But while we rejoice in our technological marvels, we also need to maintain a healthy amount of skepticism. I, for example, am very skeptical about using computers in our school classrooms, and the younger the child, the more skeptical I become. I fear that too many teachers allow themselves to be dazzled by the wonders of the Internet and overlook how distracting computers can be, that they can crowd out far more important knowledge that our children must learn. For example, I moved to Japan in 1987 to teach English, as many native English speakers do. I, however, had a computer lab of 40 Macintosh Pluses, and my title was "Computer Assisted Instruction Specialist" or something along those lines. I was skilled with the Mac but not with CAI, and so I learned as I went, building "courseware" using Course of Action from Authorware. It was an alpha version, something like 0.4, but unfortunately, although I could design instructional workflows with it, that version could not compile a working application. I eventually, clandestinely, obtained an earlier version from the developer, 0.333, that did compile usable applications, and I could then deploy them to my students, but it was the design work that was the hardest. I see that Adobe ended up owning Course of Action, and has decided to discontinue it (http://en.wikipedia.org/wiki/Macromedia_Authorware). A good albeit sad illustration of the ephemeral nature of computer software. That experience still influences my skepticism. My employer had bought 40 Macs but had not invested in upgrading their memory or buying second floppy drives for them. Even worse, they weren't even networked. The administration wouldn't consider spending any more money on them. After all, they had bought the machines, hadn't they? That should be enough. I even had to buy my own development machine, a Mac II (well, I would have bought it anyway).
So I fear that too many administrators, and teachers too, simply dump machines into their classrooms with little thought about how they will really be used, and that ends up hurting everyone. If you are a teacher who effectively uses computers to teach young students, I congratulate you!

Another area where we all need to be skeptical, or even fearful, is how computing technology is used against us, to deny us our liberties and intrude on our privacy. "Identity theft" is all too common, costing billions of dollars in stolen property and lost time. It can take years to fully repair the damage done when a thief steals one's identity and opens credit card accounts, buys expensive items, and so on. Of course, computers aren't necessary for identity theft to occur. I remember a scene in the excellent movie マルサの女 ("A Taxing Woman") when the police raided a gang office and found hankos hidden in many different places. Stealing or counterfeiting a hanko is a form of identity theft that surprises most Westerners, and the electronic form is just as effective, if not more so. And what do you say when the government promises to protect your private information? Just consider the identification card that the British government issues to foreigners. A British newspaper hired cryptography experts to alter the data on an example card, which they accomplished in 12 minutes using a laptop and a cell phone. The British government's response? "We are satisfied the personal data on the chip cannot be changed or modified and there is no evidence this has happened." Perhaps, but perhaps not. The best option, I think, is to be skeptical of claims of complete security, of assurances that "of course" your privacy will be protected, whether the promises come from companies or governments. Guard your identity, your privacy, your liberty.

But by all means, do enjoy what our computers and the Internet can offer. Apple seems close to releasing a tablet-style computer, and I'm hoping for something newly revolutionary. The iPhone is the best example I've seen of a computer disguised within a communication/media/gaming device, and it seems likely that Apple's tablet will be even more so. It won't be like Apple's computing concept video from quite a few years ago--I think around 1987, just search for "Future Shock (Apple Concept)" on YouTube--but the pieces are in place for it within MacOS X: voice and handwriting recognition, ubiquitous cameras, VoIP and streaming video, even computer-generated voices that sound almost 100% human (so far distant from the MacinTalk software running on the Mac when it introduced itself in 1984). Funny that the remodeled home interior shown in that concept video was only a wireframe; today, of course, it would be a fully 3D rendering with complete navigational interactivity, far more advanced than the version of DynaPerspective (the Mac version of Dynapers) released back in the early 1990s that rendered 3D buildings one slow vertex at a time. We don't yet have a 3D gesture interface, but Apple has been doing some development work in that area (just read United States Patent Application number 20080307360, dated December 11, 2008, from Apple, entitled "Multi-Dimensional Desktop" for some hints). About the only thing seen in the video but missing today is the rather creepy interactive conversation with the computer. It reminds me of HAL 9000. Let's hope that whatever Apple ships in years to come will avoid the tendencies of that particular computer.

So the future is bright, I think, and we should look forward to it not with fear but with cautious optimism. I hope that I've communicated at least a bit of my enthusiasm to you, and I hope that it has helped inspire you to approach technology with positive thoughts, to control it and not let it control you. I've heaped scorn on Windows because it seems so limited in its approach, so awkward in its design. I've praised alternatives like the Mac and BeOS and BSD because they reflect optimism and inspire us to go beyond the staid and pedestrian. I've frequently talked about the history of computers because, as George Santayana observed in volume I of his epic "The Life of Reason," published just over 100 years ago, "Progress, far from consisting in change, depends on retentiveness. When change is absolute there remains no being to improve and no direction is set for possible improvement: and when experience is not retained, as among savages, infancy is perpetual. Those who cannot remember the past are condemned to repeat it." Apple and Steve Jobs are students of history, and being so, the company continues to make tremendous progress in pushing forward the computer and now the telephone industries. Knowing where we came from has certainly helped me appreciate that much more just where we are now and where we might go in the future.

Mac enthusiasts are often accused of being fanatics, but Santayana's definition of fanaticism is fully applicable here: it "consists in redoubling your effort when you have forgotten your aim." Proponents of alternative technologies certainly haven't forgotten their aim, and Apple is an excellent example. Its aim remains laser-sharp and equally bright.

I have many people I should thank here, both inside and outside Gihyo, starting with Mr. Katsumi Yamada, who brought me on board MacJapan, and my editors there over the years. Professor Naomi Kuratani, who brought me to Japan straight out of graduate school. Professor Hitoshi Kobayashi, who introduced me to Mr. Hisashi Sakamaki of Canon, who in turn hired me to join Canon's NeXT Computer Group. And of course the friends I have maintained over the years, who visit my tiny computer museum here in San Francisco during Apple's developer conferences, like 佐藤 徹 (Sato-san), 田畑 英和 (Tabata-san), 竹尾 哲也 (Takeo-san), 宮一 正人 (Masato-san), and 大久保 丞 (Okubo-san), and the occasional reader who has contacted me directly. And finally, my translator for many, many, many years, Mr. Shimazaki Masaki, who no doubt has scratched his head over my sometimes peculiar English.

Please do keep in touch with me (my email address is above), and I will continue to comment on computers at http://meisoukuukan.wordpress.com/. Writing for Gihyo was a tremendous experience for me, a deeply important part of my life, for it gave me a precious and unique opportunity to express myself to people who share my passion for a quirky, equally unique computer and truly personal computing, and I am grateful to you for allowing me into your mind.