Tuesday, February 27, 2007

Cooling Options

As I've mentioned in previous posts, I am a fan of overhead cooling and do not care for raised floors. An article at searchdatacenter.com discusses a study done by two IBM researchers. While the title of the article is "Raised Floor Bests Overhead Cooling, IBM finds", I like it because it is a balanced discussion. It talks about where raised floors work best and where they don't -- it all depends on the load and the configuration of the data center. Check out the article here.

Monday, February 26, 2007

Hosting.com Adds New Data Center Facility

Hosting.com has announced a second 16,500 sq. ft. data center to be built in Louisville, KY. Beyond BCP and DR reasons, Hosting.com said it evaluated facilities all over the U.S. before selecting Louisville. The new facility will be an impressive Tier III facility with N+1 cooling, N+2 power redundancy and DC power capabilities.

Saturday, February 24, 2007

Disaster Planning: Pandemic

Chris DeVoney has a very interesting article at Enterprise Systems Journal where he interviews Dan Lohrmann, CISO for the state of Michigan. They discuss the issue of pandemics and other disasters that aren't typically on the radar when you think of DR or BCP.

I always enjoy hearing about scenarios that government or the private sector come up with to test their systems and procedures. Dan was the 2006 Information Security Executive of the Year (congratulations, Dan!). Data centers tend to focus on how resilient the facility is, but what if the facility and network are still there and there are no people to operate them?! Read the complete interview here.

Thursday, February 22, 2007

The Planet Expands Texas Data Centers

Dedicated hosting company The Planet plans to expand its Dallas and Houston data centers throughout 2007. After both sites are complete, its total data center footprint will be 175,000 sq. ft.
Check out thehostingnews.com article here.

Monday, February 19, 2007

Raised Floor

I just wanted to post a quick follow-up on the raised floor discussion. An article over at SearchDatacenter.com prompted me to do so. APC CTO Neil Rasmussen states the following at the close of the interview:

"The floor is the problem. I can make an air conditioner any size I want, but the problem is getting the air through the tiles. If I try to move 25kW of air through a tile, it's going to be 120 mph coming through the floor tile. It's also very inefficient to push all that air over a distance. It takes a tremendous amount of horsepower to move it around. It's not uncommon to find just the fan taking more power than the servers in data centers."

He goes on to comment that the APC strategy is to get the cooling closer to the cabinet. When you are in a small "computer room" I suppose I can see this. When you have hundreds and hundreds of cabinets, though, this just seems like their way to make even more money on the deal. I am by no means an expert, but I believe there are plenty of cooling solutions that work (even for a 25 kW cabinet) without requiring cooling at the cabinet level.
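Out of curiosity, I ran a back-of-the-envelope version of Rasmussen's numbers in Python. The tile size, open area and delta-T below are my own assumptions, not figures from the interview; even these fairly generous assumptions give highway speeds through the tile, and a tighter delta-T or smaller open area pushes the result toward the 120 mph he quotes.

```python
# Back-of-the-envelope: air velocity through one perforated floor tile
# when it has to deliver 25 kW of cooling. Assumptions (mine, not the
# article's): a 2 ft x 2 ft tile with 25% open area, 11 C delta-T.

AIR_DENSITY = 1.2   # kg/m^3, near sea level
AIR_CP = 1005.0     # J/(kg*K), specific heat of air

heat_load_w = 25_000.0  # a 25 kW cabinet
delta_t_k = 11.0        # assumed server inlet/outlet temperature rise

# Mass and volume of air per second needed to absorb the heat load.
mass_flow = heat_load_w / (AIR_CP * delta_t_k)   # kg/s
volume_flow = mass_flow / AIR_DENSITY            # m^3/s

tile_area = 0.6 * 0.6    # m^2, a nominal 2 ft x 2 ft tile
open_fraction = 0.25     # assumed perforation open area
velocity = volume_flow / (tile_area * open_fraction)  # m/s

print(f"Airflow needed: {volume_flow:.2f} m^3/s "
      f"({volume_flow * 2118.9:.0f} CFM)")
print(f"Velocity through the tile: {velocity:.1f} m/s "
      f"({velocity * 2.23694:.0f} mph)")
```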

Saturday, February 17, 2007

Network Computing: Data Center Design

Network Computing has a nice article on data center design. It's an outsourcer's notebook of lessons learned from a 15-month project to design and build a 10k sq. ft. data center. Although it covers only the basics of design and construction, it is a good overall article.

Side rant -- once again, a raised floor is listed as a requirement!! WHY???!!!???

Net Neutrality and the Carrier Hotel

Just a quick link over to an EzineArticles.com article by John Savageau on Net Neutrality and the Carrier Hotel. I found it to be a very interesting read.

Friday, February 16, 2007

NASA & Google - Part 2

I have always been a Google fan, but this latest announcement with NASA is very cool. NASA is teaming up with Google (again) to use its massive amounts of computing power and storage.

With the amount of data that NASA is generating, it is no wonder they turned to Google, which has well over 10,000 servers (some estimates say 450k) across 13 data centers and spent $1.5 billion on property and equipment in the first nine months of 2006. NASA's Columbia supercomputer recently received an upgrade that added 600 terabytes of storage, 20 StorageTek libraries and more. Columbia is connected to 1.1 petabytes of storage via an SGI SAN (I love SGI equipment, by the way).

I think Google will be alone for a while in claiming to have exabytes of data in its possession. I wanted to link to a definition of an exabyte for those who didn't know how much storage that is, and came across an interesting 2003 CIO magazine article that states:

"It estimated that in 1999, the total of all human knowledge, music, images and words amounted to about 12 exabytes. About 1.5 of those exabytes were generated during 1999 alone. "

Since I probably won't get anyone at Google to bite on locating a data center where I live, I'll put a different request out to them: divulge what you use for storage infrastructure, management software and the technologies that connect your storage equipment (their Bigtable publication is an interesting read, by the way). It would make an amazing lessons-learned story or white paper: what they use, how they implemented it and what they learned in the process.

Ok...enough suspense, here is the article at Byteandswitch.com.

Wednesday, February 14, 2007

Study: U.S. Data Centers Consume As Much Electricity as State of Utah

"A new report from the Alliance to Save Energy says computer data centers offer an important area for increasing the nation’s energy efficiency and notes policies and measures that could help mitigate the energy used by this emerging sector"

This article at EnvironmentalLeader.com covers the report, which was funded by AMD and the Alliance to Save Energy. It also discusses H.R. 5646, the study to promote the use of energy-efficient computer servers. The big quote, of course, is that the total consumption of U.S. data centers equals that of the entire state of Utah. I say, just throw a Google and a Microsoft data center in Utah and that statistic will become irrelevant. :)
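For fun, here is a rough sanity check on that claim. The 5 GW average draw below is my own illustrative assumption, loosely in line with published estimates of U.S. server and cooling power use around 2005-2006; it is not a figure from the article.

```python
# Rough sanity check: what a continuous multi-gigawatt draw means
# in annual consumption. The 5 GW figure is an assumption for
# illustration, not a number from the report.

avg_draw_gw = 5.0
hours_per_year = 8760

annual_twh = avg_draw_gw * hours_per_year / 1000  # GWh -> TWh
print(f"{avg_draw_gw:.0f} GW continuous ~= {annual_twh:.0f} TWh/year")
# ~44 TWh/year -- the same order of magnitude as the total annual
# electricity consumption of a small state.
```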

Tuesday, February 13, 2007

Hey Google -- Build Here!

With Google building mega complexes all over and everyone on the edge of their seat over where the next one will be, it's no wonder we find articles like this one at the Technology Evangelist blog. I have to give them credit -- they build a very good and very creative case.

I would also like to make the plea to Google to build a Data Center in my neck of the woods. I would personally do just about anything for the opportunity. We all think our own city is the best and while I won't go back into my location selection rant, I will say that at least the Technology Evangelist site location is closer to my area and a decent choice.

Monday, February 12, 2007

Googlegate

I had been seeing this story all over the net recently and wasn't going to post on it. That was until I ran across the Nicholas Carr blog post titled Googlegate in North Carolina. I just like the title. :)

It is interesting, though; the post discusses disclosures that "Google was granted as much as a quarter billion dollars in secret tax breaks for a plant expected to employ approximately 200 workers."

Of the articles I have read on the subject, this one is very good; it is thorough and covers all of the details.

Digital Realty Buys Dublin Data Center

Digital Realty Trust continues to grow worldwide! They have now purchased a data center in Dublin, Ireland. Yahoo news story here.

Sunday, February 11, 2007

Cincinnati Riverfront Park

I like the stories about technology parks that seem to be going up just about everywhere. Here is a link to a story on a 10k sq. ft. Tier III data center being built with the possibility of growing into a 90k sq. ft. facility. It looks like Sara Lee Corp. is an early anchor tenant.

Saturday, February 10, 2007

Disaster: Network Outage Graphs

Todd Underwood recently posted a really neat article on the Renesys blog. It shows network graphs surrounding events such as the 2003 Northeast power outage, Hurricane Katrina and the 2006 Taiwan earthquakes. It's pretty cool to see the graphs and his analysis of network patterns as the events were happening. Check it out here.

Monday, February 05, 2007

Amazon's EC2 Stories

Just a quick link to some pretty interesting success stories about Amazon's EC2 service. I am really intrigued by EC2 and can't wait to hear more stories -- and perhaps to see someone else launch a similar or identical offering.

Sunday, February 04, 2007

WYSIWYB and John's 2007 Prediction

In 1999 a terabyte was a lot of storage. Today I can have a terabyte on a single disk in my system. In 1999 I upgraded my 56k modem to a screaming-fast cable modem. Today.....well, I still have my cable modem. I know it is February, but I would like to make my prediction for 2007: I think 2007 will be the year of the home bandwidth upgrade. I think FiOS and other FTTP technologies will gain a lot of momentum as people demand better pipes (or tubes) to their homes.

A Robert Cringely column got me thinking about all of this. His recent column on WYSIWYB was another excellent read, covering an exciting new technology from NuMetra. It may be a dream (at this point), but NuMetra's technology would move the typical ISP over-provisioning ratio of 20:1 down to a true What You See Is What You Buy (WYSIWYB) model. Hey Robert -- watch the cracks about Vinton Cerf! :)
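To make that 20:1 ratio concrete, here is a quick sketch. The uplink capacity, subscriber count and sold rate are made-up illustrative numbers, not NuMetra's or any ISP's:

```python
# What a 20:1 oversubscription ratio means. All numbers below are
# illustrative assumptions, not figures from Cringely or NuMetra.

uplink_mbps = 1000.0   # assumed shared upstream capacity
subscribers = 2000     # assumed subscribers behind that uplink
sold_mbps = 10.0       # assumed rate each subscriber "buys"

ratio = (subscribers * sold_mbps) / uplink_mbps
worst_case = uplink_mbps / subscribers

print(f"Oversubscription ratio: {ratio:.0f}:1")
print(f"If everyone transmits at once: {worst_case:.1f} Mbps each, "
      f"vs. the {sold_mbps:.0f} Mbps they bought")
```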

I almost wrote a post a while back on yet another Cringely column, an excellent piece giving his thoughts on the Google monopoly. I thought I had an opinion on the article until I started reading all of the comments (270 in all). Afterwards I'm not sure what I think. :)

Thursday, February 01, 2007

Water-Cooled Systems

I have always been a little leery of water-based cooling. Water that close to massive amounts of computer equipment just didn't make sense to me. With all of the water-cooled systems that have come out in recent years I have become more curious, but I still wonder whether it has a place in the data center. Given the extreme density that some racks can reach, I suppose there is some practical purpose for it.
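To see why extreme density pushes people toward water, here is a rough comparison of the flow needed to carry 25 kW away with water versus air. The load and delta-T are my own assumptions:

```python
# Rough flow comparison for removing 25 kW with water vs. air.
# The 25 kW load and 10 C delta-T are assumptions for illustration.

heat_w = 25_000.0
delta_t = 10.0

# Water: cp ~4186 J/(kg*K), ~1 kg per liter
water_l_s = heat_w / (4186 * delta_t)

# Air: cp ~1005 J/(kg*K), density ~1.2 kg/m^3
air_m3_s = heat_w / (1005 * delta_t) / 1.2

print(f"Water: {water_l_s:.2f} liters/second")
print(f"Air:   {air_m3_s:.2f} m^3/second "
      f"({air_m3_s * 1000 / water_l_s:,.0f}x the volume flow of water)")
```

The numbers explain the appeal: a trickle of water does the work of a gale of air.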

Jonathan Heiliger, CEO of Aperture Development, explains that cost also plays a large role, depending on the density being cooled. In this Processor article he states, “Generally, it’s cheaper to spread systems across the data center floor or build additional data center space than deploy water cooling today.”

No matter what your opinion on water is, it is worth the extra effort to prepare for it if you are building a data center today. Check out the rest of what I found to be a very good article here at Processor.