Web Servers at eBay

The online auction site eBay is very popular. Indeed, it is so popular that its Web servers deliver hundreds of millions of pages per day. These pages are a combination of static HTML pages and dynamically generated Web pages. The dynamic pages are created from queries run against eBay's Oracle database, which holds information about every auction that is under way or has closed within the most recent 30 days. With millions of auctions under way at any moment, this database is extremely large. The combination of a large database and a high transaction volume makes eBay's Web server operation an important part of the company's success and a potential contributor to its failure.

The servers at eBay failed more than 15 times during the first five years (1995-2000) of the company's life. The worst series of failures occurred during May and June of 2000, when the site went down four times. One of those failures kept the site offline for more than a day, at an estimated cost to eBay of $5 million. The company's stock fell 20 percent in the days that followed. At that point, eBay decided it needed to make major changes in its approach to Web server configuration.

Many of eBay's original technology staff had backgrounds at Oracle, a company with a tradition of selling large databases that run on equally large servers. Further, because the nature of eBay's business meant that any visitor might want to view information about any auction at any time, eBay management initially implemented a centralized architecture, with one large database residing on a few large database server computers. It also made sense to use similar hardware to serve the Web pages generated from that database. In mid-2000, following the worst site failure in its history, eBay decided to move to a decentralized architecture.
This was a tremendous challenge because it meant that the single large auction information database had to be replicated across groups, or clusters, of Web and database servers. However, eBay had realized that relying on just a few large servers made it too vulnerable to the failure of those machines. Once eBay completed the move to decentralization, it found that adding capacity was easier. Instead of installing and configuring a large server that might have represented 15 percent or more of the site's total capacity, eBay could add clusters of six or seven smaller machines that represented less than one percent of the site's capacity. Routine periodic maintenance on the servers also became easier to schedule.

The lesson from eBay's Web server troubles is that the architecture should be carefully chosen to meet the needs of the site. Web server architecture choices can have a significant effect on the stability, reliability, and, ultimately, the profitability of an electronic commerce Web site.
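The capacity argument above can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only: the unit counts are assumptions chosen to match the percentages in the text (a large server at roughly 15 percent of capacity versus a cluster at under one percent), and it assumes capacity is spread evenly across identical units.

```python
def capacity_lost_on_failure(total_units: int) -> float:
    """Fraction of total site capacity lost when one unit fails,
    assuming capacity is spread evenly across identical units."""
    return 1.0 / total_units

# Centralized architecture: a handful of large servers, so one
# failed server can take out 15 percent or more of capacity.
# (6 units is an illustrative assumption, not an eBay figure.)
centralized_loss = capacity_lost_on_failure(total_units=6)

# Decentralized architecture: many small clusters, each carrying
# less than one percent of capacity. (150 clusters is likewise
# an illustrative assumption.)
decentralized_loss = capacity_lost_on_failure(total_units=150)

print(f"centralized loss per failure:   {centralized_loss:.1%}")
print(f"decentralized loss per failure: {decentralized_loss:.1%}")
```

The point of the comparison is that the same total capacity, split into many small independent units, turns a site-threatening outage into a minor degradation, and it also lets capacity grow (or be taken down for maintenance) in much smaller increments.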