For Computerworld's 2,000th issue: A look back

May 22, 2006
Computerworld published a 455-page edition on Nov. 3, 1986, to commemorate its 1,000th issue and the 40th anniversary of electronic computing. What's happened since then? Here, in Computerworld's 2,000th issue, we look at the 10 biggest IT happenings chronicled in our pages over the past 20 years.

The Internet goes commercial

The Internet scene in the mid-1980s was dominated by discussions of acceptable-use policies, through which government and academic users sought to restrict Internet access to, well, government and academic users. Unacceptable uses of the Internet, such as porn and spam, hadn't been thought of yet; in those days, "unacceptable" meant commercial. Today, billions of dollars in transactions flow through the Net every month.

Fortunately, Computerworld never ran a story with the headline, "Al Gore Invents Internet." The real inventors wrote a seminal report for the National Research Council in 1988 titled "Towards a National Research Network," which spurred the development of interconnecting high-speed networks and encouraged IT vendors to build TCP/IP into their products. In 1989, Tim Berners-Lee wrote a paper describing "a distributed hyper-text system," which would become the World Wide Web.

E-commerce became an obsession when the dot-com bubble started to inflate in 1997. Even after the bubble popped in 2000, however, corporate enthusiasm for the Internet hardly slowed. Today, some of the hottest ideas in computerdom -- Web services, VOIP, service-oriented architectures and utility computing -- are grounded in the Internet.

Monopoly musical chairs

IBM dominated computing until the late 1980s. But its 1981 release of the IBM PC and the acceptance of PC clones, which were packed with Microsoft Corp.'s software, created a desktop computing market that changed the face of IT and put Microsoft at the center of power in the industry. Software developers flocked to DOS, and later Windows, to create thousands of applications, helping propel Microsoft's desktop operating system market share to more than 90 percent in the 1990s.

The government's concern about a Microsoft monopoly started with a 1991 investigation and culminated in 2000, when a federal district court judge ruled that the company had violated the Sherman Antitrust Act. Microsoft now faces threats from Linux, Google and Europe's antitrust regulators.

The Y2K 'problem'

Nowadays, when you're prompted to enter a date, you'll see something like "mm/dd/yyyy." Quite an innovation, that four-digit year.
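The trouble, of course, was the two-digit year. A minimal sketch (in Python, purely for illustration -- the legacy systems in question were mostly Cobol) shows how "YY" arithmetic goes wrong at the century boundary:

```python
# Sketch of the underlying bug: systems that stored only the last two
# digits of the year would compare and subtract dates incorrectly once
# the century rolled over.

def years_until(expiry_yy: int, current_yy: int) -> int:
    """Naive two-digit-year arithmetic, as many legacy systems did it."""
    return expiry_yy - current_yy

# In 1999, a card expiring in 2001 looks fine with four-digit years...
print(2001 - 1999)         # 2 years left
# ...but with two-digit years the same card appears long expired.
print(years_until(1, 99))  # -98
```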

The first printed mention of a Y2K Armageddon appeared in Computerworld in 1984. In 1993, we printed Peter de Jager's estimate that Y2K repairs would cost $100 billion. As hysteria mounted, cost estimates soared to close to $1 trillion.

On Jan. 2, 2000, the whole thing was dismissed as a bad dream and promptly forgotten. IBM said the average large company spent up to 400 man-years on the problem. Was that effort wasted? No -- how else could we have justified scrapping those old Cobol systems?

The new foreign face of outsourcing

The practice of IT outsourcing stretches back to 1949 with ADP's mission to be the payroll service for the world. In 1962, Ross Perot started Electronic Data Systems Corp. to be a general-purpose IT outsourcing shop. And when Lou Gerstner took over IBM in 1993, his turnaround strategy was largely based on pushing IBM's outsourcing services. But outsourcing became a contentious labor and political issue early this century when U.S. corporations stepped up sending IT work offshore during an economic downturn. India's offshoring revenues in fiscal 2005 skyrocketed 34.5 percent to $17.2 billion, with more than 1 million Indian IT workers serving overseas customers.

The rise of personal computing

"The concept of 'a PC on every desk' has gone from being a gleam in the bespectacled eyes of a young Bill Gates to a near campaign promise by H. Ross Perot," Computerworld wrote in 1992. But then we went on to suggest that the real impact of personal computing wouldn't be felt until well into the next millennium.

The desktop computer proliferated much faster than that, of course, and personal computing soon spread beyond it -- to laptops, PDAs, cell phones and other devices. Personal computing is just one dimension of the epochal movement of computing away from centralized mainframes to client/server computing, multitier distributed computing, grids and more.

But unlike the emergence of the minicomputer and the server, the rise of the PC had special meaning for IT managers: It meant they were no longer in control. That Lotus 1-2-3 spreadsheet user was programming, whether the IT shop liked it or not.

Open-source rides into data centers

The emergence of open-source software predates Linux. Sendmail, originally written in 1983 for Unix, was the first open-source program to be widely adopted by IT departments. Today, it transfers about 70 percent of the Internet's e-mail.

By 1996, the open-source Apache project had become the most popular Web server software on the Internet; as of last month, Apache was running more than 80 million sites, for a 62 percent market share. But it is Linux -- backed first by IBM, then by Hewlett-Packard, Oracle and others -- that has become the face of open-source. Today, the operating system's share of the server market is close to 30 percent.

Security: from nuisance to all-out assault

In 1988, Robert Morris' worm crippled 6,000 machines on the Internet. Today's Web-tethered companies have built defenses that would have stopped the primitive Morris worm in its tracks.

However, the relentless attacks on IT's security barriers are no longer spearheaded by troublemakers like Morris or famed hacker Kevin Mitnick. Incursions now come from organized criminals using sophisticated tools to steal information for blackmail, corporate espionage or identity-theft schemes. Last year, the FBI reported that 95 percent of companies it surveyed had seen their network perimeters battered by online criminals.

Privacy concerns have spawned numerous IT-related regulations, moving information security from a separate practice to a central but seemingly unsolvable part of information management.

The rise of client/server computing

Client/server computing, one of the biggest IT topics of the 1980s and 1990s, has followed a familiar path for technology: from hype, to disillusionment, to maturity, to decline. Computerworld's Jan. 1, 1990, Forecast issue said: "Get ready for a teeth-gnashing, roller-coaster decade as users make a painful transition to a networked, client/server computing environment." Indeed, early client/server rollouts were costly and unreliable. In 1993, we quoted an IDC analyst saying, "Folks who went early into client/server development are 'strategically realigning.' That means they are in full retreat." The mainframe wasn't dead, after all.

But standards and tools evolved, and so did skills, and companies began to reap the advantages of client/server -- scalability, flexibility and ease of application development.

More recently, developers have discovered that still more tiers can bring better performance, flexibility and scalability. Applications can be broken into presentation, business logic, data access and data storage layers, each residing where it works best. Stir in Web-based clients and Web services, and those advantages are magnified. Apparently, client/server was just a steppingstone.
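A minimal sketch (hypothetical Python, not from any particular product) of that layering: each tier knows only about the one below it, so any tier can be moved to whichever machine -- browser, application server, database server -- suits it best.

```python
# Hypothetical multitier split: presentation, business logic and data
# access are separate layers; the dict stands in for the storage tier.

class OrderStore:                        # data-access layer
    def __init__(self):
        self._orders = {}                # stand-in for the data-storage layer

    def save(self, order_id: str, amount: float) -> None:
        self._orders[order_id] = amount

    def total(self) -> float:
        return sum(self._orders.values())


class OrderService:                      # business-logic layer
    def __init__(self, store: OrderStore):
        self._store = store

    def place_order(self, order_id: str, amount: float) -> None:
        if amount <= 0:
            raise ValueError("order amount must be positive")
        self._store.save(order_id, amount)


def render_summary(total: float) -> str:  # presentation layer
    return f"Total ordered: ${total:,.2f}"


store = OrderStore()
service = OrderService(store)
service.place_order("A-100", 250.0)
print(render_summary(store.total()))     # Total ordered: $250.00
```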

Elusive software quality

Since the 1980s, a profusion of development methodologies has gained and lost favor. Computer-aided software engineering, object-oriented programming, the Capability Maturity Model, extreme programming and other ideas have each been touted as the most promising way yet to build high-quality software.

The results have been mixed. In 2004, the Standish Group estimated that 18 percent of IT development projects were canceled, down from 31 percent in 1994. However, "challenged projects," ones that failed in part, held steady at 53 percent over the decade. Even the vaunted peer-group quality-control system of open-source can produce bad code. Research funded by the U.S. Department of Homeland Security found that Linux 2.6 had over 68,000 lines of flawed code.

The geeks go to business school

First you were manager of data processing, then director of MIS, then CIO. Over the years, the focus of IT managers shifted from running payroll systems to automating management information to optimizing business processes.

As the CIO emerged in the 1980s, IT managers were advised to "align IT with the business," "think strategically" and "get a seat at the business (or boardroom) table."

Many CIOs have done just that. But that's not the end. Some of the pundits in Computerworld's "Future of IT" stories a year ago predicted that IT managers as we know them today will eventually disappear, subsumed into the ranks of general business management.