Musings on personal and enterprise technology (of potential interest to professional technoids and others)

Tuesday, June 30, 2009

Data Center Overload, Tom Vanderbilt, NYTimes 8-Jun-2009

Photo: Simon Norfolk/NB Pictures, for The New York Times. Known for its bean and spearmint fields, Quincy, Wash., is also home to rows of servers in a 500,000-square-foot data center that Microsoft built in 2006.

For those of us already keenly aware of the business and environmental benefits of reducing data center power consumption, Tom Vanderbilt's article is a must-read. Published in the New York Times Magazine's recent Architecture Issue, it offers a refreshing and enlightening look at why we should continue to care deeply about these issues and follow through with practical energy-saving measures in managing technology and other infrastructure. This is one of the best articles on the topic technosurfer has seen in quite some time :)

The Architecture Issue - Data Center Overload - NYTimes.com: "...Much of the daily material of our lives is now dematerialized and outsourced to a far-flung, unseen network. The stack of letters becomes the e-mail database on the computer, which gives way to Hotmail or Gmail. The clipping sent to a friend becomes the attached PDF file, which becomes a set of shared bookmarks, hosted offsite. The photos in a box are replaced by JPEGs on a hard drive, then a hosted sharing service like Snapfish. The tilting CD tower gives way to the MP3-laden hard drive which itself yields to a service like Pandora, music that is always “there,” waiting to be heard.

But where is “there,” and what does it look like?"

“There” is nowadays likely to be increasingly large, powerful, energy-intensive, always-on and essentially out-of-sight data centers. These centers run enormously scaled software applications with millions of users. To appreciate the scope of this phenomenon, and its crushing demands on storage capacity, let me sketch just the iceberg’s tip of one average individual’s digital presence: my own. I have photos on Flickr (which is owned by Yahoo, so they reside in a Yahoo data center, probably the one in Wenatchee, Wash.); the Wikipedia entry about me dwells on a database in Tampa, Fla.; the video on YouTube of a talk I delivered at Google’s headquarters might dwell in any one of Google’s data centers, from The Dalles in Oregon to Lenoir, N.C.; my LinkedIn profile most likely sits in an Equinix-run data center in Elk Grove Village, Ill.; and my blog lives at Modwest’s headquarters in Missoula, Mont. If one of these sites happened to be down, I might have Twittered a complaint, my tweet paying a virtual visit to (most likely) NTT America’s data center in Sterling, Va. And in each of these cases, there would be at least one mirror data center somewhere else — the built-environment equivalent of an external hard drive, backing things up.

Small wonder that this vast, dispersed network of interdependent data systems has lately come to be referred to by an appropriately atmospheric — and vaporous — metaphor: the cloud. Trying to chart the cloud’s geography can be daunting, a task that is further complicated by security concerns. “It’s like ‘Fight Club,’ ” says Rich Miller, whose Web site, Data Center Knowledge, tracks the industry. “The first rule of data centers is: Don’t talk about data centers.”

Yet as data centers increasingly become the nerve centers of business and society — even the storehouses of our fleeting cultural memory (that dancing cockatoo on YouTube!) — the demand for bigger and better ones increases: there is a growing need to produce the most computing power per square foot at the lowest possible cost in energy and resources. All of which is bringing a new level of attention, and challenges, to a once rather hidden phenomenon. Call it the architecture of search: the tens of thousands of square feet of machinery, humming away 24/7, 365 days a year — often built on, say, a former bean field — that lie behind your Internet queries.

[etc...] As James Hamilton of Amazon Web Services observed recently at a Google-hosted data-center-efficiency summit, there is no Moore’s Law for power — while servers become progressively more powerful (and cheaper to deploy) and software boosts server productivity, the cost of energy (as well as water, needed for cooling) stays constant or rises. Uptime’s Brill notes that while it once took 30 to 50 years for electricity costs to match the cost of the server itself, the electricity used by a low-end server will now exceed the cost of the server itself in less than four years — which is why the geography of the cloud has migrated to lower-rate areas.
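
To get a feel for Brill's "less than four years" figure, here is a back-of-the-envelope sketch in Python. All of the inputs below (server price, power draw, facility overhead, electricity rate) are my own illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope: years until cumulative electricity spend
# matches the purchase price of a low-end server.
# All inputs are illustrative assumptions, not figures from the article.

server_price_usd = 1500.0   # assumed low-end server price
power_draw_kw = 0.25        # assumed average draw (250 W)
pue = 2.0                   # assumed facility overhead (cooling, distribution)
rate_usd_per_kwh = 0.10     # assumed commercial electricity rate

annual_kwh = power_draw_kw * pue * 24 * 365
annual_cost_usd = annual_kwh * rate_usd_per_kwh

print(f"Annual electricity cost: ${annual_cost_usd:,.0f}")
print(f"Years to match server price: {server_price_usd / annual_cost_usd:.1f}")
```

With these assumptions the crossover lands at roughly 3.4 years, consistent with Brill's claim; halve the electricity rate and it stretches toward seven years, which is exactly the economics pushing the cloud toward lower-rate areas.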

The huge power draws have spurred innovation in the form factor of the data center itself. For its Chicago center, Microsoft is outfitting half the building with shipping containers packed with servers. “Imagine a data center that’s about 30 megawatts, with standard industry average density numbers you can probably fit 25,000 to 30,000 servers in a facility like that,” says Microsoft’s Chrapaty. “You can do 10 times that in a container-based facility, because the containers offer power density numbers that are very hard to realize in a standard rack-mount environment.”
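
Chrapaty's figures imply a per-server power budget that is easy to check. The quick division below simply restates the numbers in the quote; note the quote does not say whether total facility power scales up in the container build-out, so the second figure is only a rough reading:

```python
# Per-server facility power implied by Chrapaty's quoted figures.
facility_watts = 30e6                          # "about 30 megawatts"
conventional_servers = 30_000                  # upper end of the quoted range
container_servers = conventional_servers * 10  # "10 times that"

print(f"Conventional: ~{facility_watts / conventional_servers:,.0f} W per server")
print(f"Container:    ~{facility_watts / container_servers:,.0f} W per server")
```

The conventional figure, about 1 kW of facility power per server, includes cooling and power-distribution overhead, which is exactly the overhead container designs aim to compress.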

The containers — which are pre-equipped with racks of servers and thus are essentially what is known in the trade as plug-and-play — are shipped by truck direct from the original equipment manufacturer and attached to a central spine. “You can literally walk into that building on the first floor and you’d be hard pressed to tell that building apart from a truck-logistics depot,” says Manos, who has since left Microsoft to join Digital Realty Trust. “Once the containers get on site, we plug in power, water, network connectivity, and the boxes inside wake up, figure out which property group they belong to and start imaging themselves. There’s very little need for people.”
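
Manos's description of nodes that wake up, discover their role and image themselves suggests a simple boot-time provisioning loop. The sketch below is purely illustrative: the endpoint, payload and field names are invented for this post, since Microsoft's actual tooling is not public:

```python
# Purely illustrative sketch of the "wake up and image yourself" flow
# Manos describes. The endpoint and field names are invented for this
# post; Microsoft's actual provisioning tooling is not public.

import json
import urllib.request

PROVISIONING_URL = "http://provisioning.example.internal/register"  # hypothetical

def self_provision(container_id: str, node_serial: str) -> None:
    # 1. A freshly powered-on node announces itself to the provisioning service.
    payload = json.dumps({"container": container_id, "serial": node_serial}).encode()
    req = urllib.request.Request(
        PROVISIONING_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        assignment = json.load(resp)

    # 2. The service replies with the node's "property group" (search,
    #    mail, etc.) and the OS image it should run.
    group = assignment["property_group"]
    image_url = assignment["image_url"]

    # 3. Fetch and apply the image, then join the group's fleet.
    print(f"Node {node_serial}: joining {group}, imaging from {image_url}")
    # fetch_and_apply_image(image_url)  # placeholder for the actual imaging step

if __name__ == "__main__":
    self_provision("container-042", "SN-0001")
```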

“Our perspective long term is: It’s not a building, it’s a piece of equipment,” says Daniel Costello, Microsoft’s director of data-center research, “and the enclosure is not there to protect human occupancy; it’s there to protect the equipment.”

From here, it is easy to imagine gradually doing away with the building itself, and its cooling requirements, which is, in part, what Microsoft is doing next, with its Gen 4 data center in Dublin. One section of the facility consists of a series of containers, essentially parked and stacked amid other modular equipment — with no roof or walls. It will use outside air for cooling. On our drive to Tukwila, Manos gestured to an electrical substation, a collection of transformers grouped behind a chain-link fence. “We’re at the beginning of the information utility,” he said. “The past is big monolithic buildings. The future looks more like a substation — the data center represents the information substation of tomorrow.”


Tom Vanderbilt is the author of Traffic: Why We Drive the Way We Do (and What It Says About Us).

Thursday, June 25, 2009

The doctor in your pocket & Health in the palm of your hand

Neil Seeman is the director and primary investigator of the Health Strategy Innovation Cell, based at Massey College at the University of Toronto. He has recently authored two excellent articles in the National Post which, IMHO, suggest some very important and relatively simple uses of mobile technology that can achieve quite meaningful healthcare improvements around the world.

Just one excellent example among the many mentioned:

"Last summer, the Winnipeg Fire Paramedic Service started a protocol to transmit electrocardiogram readings of their patients to cardiologists' mobile phones. The approach--Winnipeg EMS is part of the Strategic Reperfusion Early After Myocardial Infarction, or "STREAM" study -- allows doctors to work closely with paramedics through life-saving decisions as patients are transported by ambulance. It is reportedly lowering mortality rates and speeding up treatment times."

You can read more great ideas from Seeman and his colleagues (and share your own!) on the new myhealthinnovation.com website, designed for those who wish to "share, vote, and say thanks for low-cost, low-tech health ideas".

Sunday, June 7, 2009

Safire: Who is a 'Straw Man'? vs. Brooks: 'Pilot System'

William Safire has penned an excellent piece on the phrase "Straw Man" in his New York Times "On Language" column, On Language - Who is a 'Straw Man'? - NYTimes.com:

"Accepting the Democratic nomination in a huge football stadium way back in the presidential campaign of ’08, Senator Barack Obama displayed his oratorical talent by using one of his favorite tried-and-true devices in argument: “Don’t tell me that Democrats won’t defend this country!”

Who was telling him that? To be sure, his opponents were claiming that a Republican administration would be stronger on defense, but nobody was telling him or the voters that Democrats preferred abject surrender. At the time, reviewing that speech, I noted the rhetorical technique: “By escalating criticism, he knocked down a straw man, the oldest speechifying trick in the book.”..."

Safire explains the fascinating origins of "Straw Man" in the context of political speeches. Those of us involved in technology and other business projects know that a "Straw Man" is also used in a business context, typically to refer to a first attempt at documenting a project's initial charter and/or requirements, or even to a preliminary prototype of a proposed new or enhanced system's user interface.

As explained nicely on mindtools.com:

"When you begin a project or start looking into a problem, you often have incomplete information to work with. So you can spend time gathering facts and data until you are ready to build a really strong argument or plan, or, you can get going straight away and jump in with a not-so-complete solution, with the intention of finding a much better one, as you learn more and more.

That's the premise behind building a straw man - creating a first draft for criticism and testing, and then using the feedback you receive to develop a final outcome that is rock solid..."

A similar, and very concise, definition of "Straw Man" can be found in J. LeRoy Ward's excellent Dictionary of Project Management Terms:

"Working draft copy circulated for comments or suggested changes."

Finally, I would like to suggest a connection between the idea behind a "Straw Man" in business and technology and the very similar concept in software engineering, explained brilliantly in Fred Brooks' classic work, The Mythical Man-Month: Essays on Software Engineering, Anniversary Edition (2nd Edition):

"The management question, therefore, is not whether to build a pilot system and throw it away. You will do that [anyway]."

Thus, Brooks suggests that one may as well "plan to throw one away," since you quite possibly will anyhow. By building this "pilot system" into the plan as an explicit milestone, communication becomes simpler and expectations are more realistic from the start.

So although not identical, Safire's "straw man" and Brooks' "pilot system" both capture the same notion: that we sometimes need to start with something we know is imperfect in order to proceed from there to a truly excellent result.