Thursday, November 29, 2012
I'm attending the periodic one-day "MiniMonsters" meeting in San Francisco today. Great discussion with some of the best infrastructure minds in the online (and offline) world about topics like scalability, CM, finding and hiring great technical talent, and business continuity.
I've been thinking about doing a longer post on the WebMonsters group. It's a fantastic professional organization that has historically flown well below the radar of the mainstream but has lately shifted toward more openness and transparency. Check out the WebMonsters website and I'll try to post more details at a later point.
Sunday, November 4, 2012
Project Ginger Renderings
Just wanted to send a quick update with the latest renderings from our datacenter project in Chandler, Arizona. It's scheduled for phase I completion in about 4 weeks, and I couldn't be more excited.
Thursday, October 18, 2012
Adventures in Site Selection: Coach Class Canines and Ritual Blood Sacrifice.
Much has been written about the datacenter site selection
process, as it is the most important phase in the life of any mission critical
project. I’ve always wanted to share more about what it’s really like to go to
the far reaches of the earth in search of that one perfect slice of datacenter
heaven that we’re all seeking. Recently I’ve had one of my more memorable site
selection trips and I wanted to share it with you.
As I always say, there are many things you can change about
an existing datacenter – you can alter the next phase design, you can always
work to improve the efficiency numbers, you can choose alternate vendors for most
electrical and mechanical systems. But there is one thing you cannot change,
and that is the site that you’ve selected. In other words, it’s hard to pick
‘em up and move ‘em once they’re built.
That is why it is muy importante that you get it right the first
time. It’s such a huge financial commitment that you get one chance at it; get
it wrong and it might be your last site selection. You need to pick the right piece of dirt,
with the right size for growth, at the right price, in the right utility
district, with adequate amounts and qualities of energy, fiber, water, sewer,
with the right neighbors, the right political landscape, the right permitting
assurances, and 40 other factors all aligned as close to perfectly as
possible.
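Those "40 other factors" can be imagined as a weighted scoring sheet. Here's a minimal sketch in Python; the criteria, weights, and scores below are all hypothetical, chosen only to illustrate the idea, not taken from any real project.

```python
# Hypothetical weighted-scoring sketch of a site selection "checklist".
# Criteria, weights, and scores are illustrative assumptions.

CRITERIA = {  # weight = relative importance (weights sum to 1.0)
    "power_cost_and_capacity": 0.25,
    "fiber_connectivity": 0.20,
    "land_size_and_expandability": 0.15,
    "water_and_sewer": 0.10,
    "permitting_and_political_climate": 0.15,
    "incentives": 0.15,
}

def score_site(scores: dict) -> float:
    """Weighted average of per-criterion scores (each rated 0-10)."""
    return sum(CRITERIA[name] * scores[name] for name in CRITERIA)

# One candidate site, rated 0-10 on each criterion:
site_a = {
    "power_cost_and_capacity": 9,
    "fiber_connectivity": 7,
    "land_size_and_expandability": 8,
    "water_and_sewer": 6,
    "permitting_and_political_climate": 5,
    "incentives": 7,
}

print(f"Site A composite score: {score_site(site_a):.2f} / 10")
```

Of course, as the next paragraph argues, the spreadsheet is only half the story.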
It all sounds like a tidy checklist that can be filled out
and validated, doesn’t it? I wish it were that straightforward. From my
experience there’s so much more to it than that. There’s the establishment of
relationships to ensure project success. The kissing of babies. The drinking of
the local beer, the eating of the local delicacies. The building of trust
between you and your new prospective suppliers and business partners. In my opinion these relationships end up being
far more important factors in the success of a datacenter project than all
other things combined. Do it right and you’ll have a community of people that
are cheering you on, or disregard the relationships and face headwinds for the
entire lifetime of your datacenter.
In order to establish these relationships, there’s no
substitute for spending time in the community and immersing yourself in the
local culture. Boots on the ground, literally. This is especially important in
radically different locales outside the USA. Learn enough of the local language
to get by (“please”, “thank you”, “good morning/evening”, “do you speak
English” and the ability to count to 10 will get you amazingly far).
Once in a while, actually more often than not, I have such
extraordinary experiences on these site selection trips that I think they bear
sharing.
It started before I had even arrived, on the long
international flight. A colleague of mine has taught me many tips and tricks,
and one of them is to try to be the last one to board and to visit the ticket
counter just before heading down the ramp to ask if there’s 2 or 3 seats
together that aren’t assigned so that presumably you could stretch out and get
some rest en route. Which I did. And to my surprise, the ticket agent
said yes, there was a pair of seats all the way in the back which would
certainly not be filled at this late date, so I’d be able to settle into them
for the 10-hour flight. So I trot down the jetway, smiling at my good fortune
and feeling sorry for all those poor souls who will be stranded next to sweaty
strangers in center seats while I’ll be lounging in relative coach class
luxury. I find my pair of seats and immediately spread out. I’m set. The flight
attendant announces that the door has been closed and all electronics with an
off switch…etc etc etc. This is gonna be good.
Then she appears. I
see her huffing down the aisle all the way from the front. She looks like she
just ran all the way through the terminal and made this flight by the skin of
her teeth. She’s walking down the aisle carrying a black bag slung over her
back, with a frenzied look caused by the battles she’s just endured. And I’m
thinking – “no, no! What are the chances?” And of course you guessed it, she
comes all the way to the back of the plane and stands there looking at me. I
have a seatmate.
And as I’m asking her if I can help her stow her bag, she
unzips it and out jumps her 15-pound dog. On my lap. The trip just got a lot
cozier.
And it just got weirder from there.
The next morning the team arrives at the first of our
prospective sites. Without being too specific about the exact location, it’s
safe to say that this particular site is in what I’d call the jungle. It’s
actually fairly close to the metropolitan area that we’re aiming to serve, but I’d still describe it as the jungle. All the basics have been checked out. Great
power availability, awesome accessibility and physical security, water,
network, sewer, land quality, it all checks out and looks very promising.
I always walk a potential site. It’s admittedly unscientific
and hard to quantify, but there’s something that I get from actually walking
the length and breadth of a plot of land. It just gives me a better sense of
the slope, drainage, soil composition and compaction, and just an overall feel for the site.
So as is my custom, I start walking the site with other team
members. This is one large piece of land, so we visit several spots that are
most likely our best site targets. The first two parcels look fantastic. Then
we go to the far corner of the site where one of the fiber entrances is
located.
As I’m climbing a small hill, I see it but have no idea what
it is. It looks like someone has left some trash. But as we get closer it
appears that it is two large dirty bowls and what looks like a red cloth of
some sort. Closer yet and we see that there’s a big carving knife (!!!) stuck
into the ground between them. And then it becomes clear that there’s some kind
of food in one bowl and there’s FRESH BLOOD ALL OVER THE KNIFE AND IN THE OTHER
BOWL (!!!!!!!!).
Macumba sacrifice
I’m immediately and profoundly fascinated and freaked out
all at the same time. I can’t help but think – where do we put this in our
site selection spreadsheet??? We don’t have a column for the existence of
sacrificial rituals. At least we didn't.
Our local guides tell us that this is a sign of a ritual performed by some of the more devout local practitioners of the ancient religion Macumba. The blood? Most likely from a chicken (hopefully), and yes –
consumed by the participants. Further research shows that the ritual could be
used for both positive and negative purposes – an effort to bring wealth or to
punish a wrongdoer.
Nothing in my research whatsoever about it being a good or
bad omen for a datacenter project. I checked.
Later over beers I was relating the story of our discovery to some colleagues at a local watering hole. They didn't know exactly what I was talking about (think Lost in Translation) and so I just showed them the picture.
Their faces turned pale and they literally did the sign of the cross in unison. Repeatedly. While spinning around and making this funny spitting sound. Not sure exactly what that means but if I was a betting man I'd say they were advising that we find another piece of land.
All in all, just another memorable site selection trip.
Wednesday, September 12, 2012
7x24 Exchange
I'm very excited to report that we've been selected to present a session on our newest facility at the Fall 7x24 Exchange in Phoenix. If you've never been to a 7x24 conference, I highly recommend it as the preeminent gathering of mission critical professionals in the entire industry. We're fortunate that we'll be able to give a presentation about our Chandler, AZ datacenter to the membership, as they'll be a mere 20 miles or so down the road from our newest creation.
Please look me up if you're going to attend. We're already prepping for the session and it promises to be a great time with new and old colleagues.
kev
Monday, July 23, 2012
Project Ginger
In the heat of the desert sun there’s a very special facility being raised in Chandler, Arizona. It’s our latest massively modular CyrusOne datacenter, nicknamed “Ginger”, and it promises to be the most highly efficient and most innovative colocation facility in the state of Arizona, if not the entire Southwest.
During my last visit to the site I was struck by the magnitude of the effort we’re undertaking there. We are building a HUGE facility unlike anything in the datacenter industry today on top of what was once 60 acres of alfalfa and hay, where flocks of sheep were grazing only a few months ago.
The crews of men and women are hard at work making this a reality in some tremendously challenging conditions. I was onsite during the recent heat wave that affected the entire country, and as I pulled up to the construction trailer my rental car thermometer registered a toasty 126 degrees outside. In the words of present-day philosopher Paris Hilton – “that’s hot!”. The construction crews start the day in the wee hours of the morning and usually knock off by about 1 or 2pm to avoid the worst of the heat. A tip of my construction helmet to them – they’re out there every day making our vision a reality in conditions that most of us would find intolerable.
“But it’s a dry heat”
So some of you are wondering how a datacenter can be located in the desert and still claim impressive efficiency numbers. The bottom line is that the dry environment in Chandler is a great fit for our indirect evaporative cooling (IDEC) technology. Initial studies are telling us that we’ll be on full economizer (full evaporative cooling) mode for a significant percentage of the year and on partial economizer for the remainder of the year. That means dramatically lower PUE numbers, less environmental impact, and ultimately lower cost for our customers. Winner, winner, chicken dinner for everyone.
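To see why economizer hours matter so much for efficiency, here's a back-of-the-envelope blended-PUE model in Python. The overhead fractions and mode-hour splits are assumed round numbers for a dry climate, not actual Chandler or CyrusOne figures.

```python
# Back-of-the-envelope blended-PUE estimate. All numbers below are
# illustrative assumptions, not measured facility data.

IT_LOAD_KW = 1000.0  # critical IT load

# Cooling + electrical overhead as a fraction of IT load in each
# operating mode (assumed values):
OVERHEAD = {
    "full_economizer": 0.15,     # evaporative cooling only
    "partial_economizer": 0.30,  # evaporative plus some mechanical trim
    "mechanical": 0.60,          # compressor-based cooling (worst case)
}

# Assumed fraction of the year spent in each mode for a dry climate:
MODE_FRACTION = {
    "full_economizer": 0.70,
    "partial_economizer": 0.25,
    "mechanical": 0.05,
}

def blended_pue() -> float:
    """Hours-weighted average PUE = total facility power / IT power."""
    total_kw = sum(MODE_FRACTION[m] * IT_LOAD_KW * (1 + OVERHEAD[m])
                   for m in MODE_FRACTION)
    return total_kw / IT_LOAD_KW

print(f"Estimated annualized PUE: {blended_pue():.2f}")
```

The takeaway: the more hours the climate lets you stay on the economizer, the closer the annualized PUE gets to the full-economizer floor.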
the roof, the roof, the roof is on fire!
We’ve sweated the design details on every aspect of this project, including the roof. It’s an innovative marvel in itself. It is peaked at the perimeter with a valley through the center spine, exactly the opposite of every other roof you’ve seen. This roof shape allows for the natural convection of hot return air from across the mission critical space back to the perimeter of the building where cooling units are installed. At the same time, the center valley collects rainwater and routes it away from the building to catch basins where it is reused for site cooling and landscaping needs.
Size matters
We decided to take a more innovative approach in the overall site layout as well. Instead of a single traditional rectilinear warehouse design we have specifically sized several separate structures to precisely match the performance of the cooling units mounted to the perimeter of the facilities. The steel trusses over the individual data halls have been engineered and sized to support the entire span without the need for intermediate support columns. This means that the full interior of the building is open for customer installations, free from the encumbrances and clearances required by the support columns in most large structures. In other words, it’s all tasty electrical and mechanical infrastructure on the outside, and all chocolaty compute and storage goodness on the inside.
Ginger or Mary Ann?
Why Project Ginger? The C1 design/construction team are all Gilligan’s Island fans (who isn't?), and we were sitting around discussing how the front structure was going to be an awesome Class A office building and the rear building was going to be the business end of things. That quickly devolved into the age-old dilemma that all red-blooded baby-boomer boys have faced – Ginger or Mary Ann? It was determined that the very sleek and sexy administration building would be nicknamed Ginger, and the mission critical facility in the back would henceforth be referred to as Mary Ann. Many other comparisons emerged over a team happy hour, but those are left to the reader's imagination. Bottom line is that Project Ginger is like a reverse mullet - all party in the front, all business in the back.
And don’t think we’ve left out Mrs. Howell – she’s the C1 construction trailer.
Here’s a conceptual design video of Project Ginger. It pretty accurately reflects what you will see in just a few short months. Check it out and let me know what you think.
Wednesday, June 20, 2012
Datacenter Knowledge post
A really nice post by the folks over at DatacenterKnowledge on our design methodology at CyrusOne.
Let me know what you think!
kev
Friday, June 8, 2012
7x24 Exchange
I'll be at the 7x24 Exchange in Orlando next week, where I look forward to meeting up with old friends and colleagues. A few sessions I'm particularly interested in attending:
ASHRAE New Environmental Guidelines & Data Center Energy Efficiency / Future-Proofing
Data centers consume an increasing share of the total energy used by commercial facilities. The power required and the heat dissipated by computing equipment have grown to the point that it is becoming very difficult to power and cool these systems in data centers or telecommunication rooms. This seminar includes ASHRAE’s just-published equipment trends through 2020, which have a big impact on right-sizing, future-proofing, and data center energy efficiency.
ASHRAE Class Changes Expand the Use of Chillerless Data Centers and More!
The first vendor-neutral temperature standards were published in 2004 by ASHRAE. Prior to that, temperatures were based more on anecdotal knowledge and worst-case scenarios (often 68 °F / 20 °C). In 2004, ASHRAE established a recommended range of 68 to 77 °F (20 to 25 °C).
It is hard to believe, but now, in less than 10 years, the recommended temperature range has widened to 64 to 81 °F (18 to 27 °C). Further, there are allowable ranges that go as wide as 41 to 113 °F (5 to 45 °C). There are also radical changes in the humidity ranges.
Seawater and Deep Lake Water for Data Center Cooling
Low-temperature water from seawater and deep lakes is becoming an option as a data center heat sink. The first system of this kind in the U.S., at Ithaca, NY, which serves about 51 megawatts of cooling, will be studied as the fresh water example. A data center in Hamina, Finland will be reviewed for its use of seawater. Each case will examine the potential advantages, disadvantages, and lessons learned from the unexpected. For data centers, the additional requirement to meet higher reliability standards will also be reviewed.
Let me know if anyone would like to meet up in Orlando.
kev
Thursday, June 7, 2012
Datacenter Guys
People call me a “datacenter
guy”. Collectively, they call us all “datacenter guys”. Some of the best
datacenter guys I’ve known are women, but they’re still “datacenter guys”. I’ve
been thinking a lot lately during my travels as to what it means to be a datacenter
guy. Here’s what I’ve come up with so far.
First, some background on
what I do. Datacenter guys locate, design, build, or operate facilities that
house computers and electronic storage. This once niche specialty profession
has emerged to become very much in demand in recent years with the increasing
popularity of online goods and services. It’s good to be us! We get the rare
privilege of straddling the worlds of the internet and mission critical
facilities. Truly blessed.
For the uninitiated, some
datacenters are small rooms, the size of a closet. Most companies of any size have at
least one of these in some form or fashion on premises. Some datacenters are
quite large, spanning several hundred thousand square feet in size. These
facilities can hold hundreds of thousands of individual computers. So what is
so unique about a datacenter that makes it different from any other building?
Each one of those computers requires two critical things: 1) power
(electricity) to run and 2) a way to deal with the resultant heat that is
generated. Put 100,000 computers
in one confined space and you can imagine how much power you need to deliver to
the systems and how much heat they produce. If you’ve never been in a
datacenter, think about hundreds of shelves filled with running hair dryers and
you’ll get a pretty accurate picture of a production datacenter environment.
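The hair dryer picture holds up to some rough arithmetic. Here's a quick sketch, assuming a round 300 W average draw per server (a hypothetical figure, since real averages vary widely by hardware generation and load):

```python
# Rough power-and-heat arithmetic for a 100,000-server datacenter.
# The per-server wattage is an assumed round number for illustration.

SERVERS = 100_000
WATTS_PER_SERVER = 300  # assumed average draw per machine

it_load_mw = SERVERS * WATTS_PER_SERVER / 1e6
print(f"IT load: {it_load_mw:.0f} MW of electricity delivered")

# Essentially every watt delivered to a server leaves it as heat, so
# the cooling plant must continuously reject the same load:
heat_mw = it_load_mw
print(f"Heat to reject: {heat_mw:.0f} MW")

# For scale: a hand-held hair dryer draws roughly 1,500 W.
hair_dryer_equivalents = SERVERS * WATTS_PER_SERVER / 1500
print(f"Equivalent hair dryers: {hair_dryer_equivalents:,.0f}")
```

Twenty thousand hair dryers' worth of heat in one building is why power delivery and cooling, not the computers themselves, define the datacenter problem.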
Datacenter guys are odd
birds. We are a mix of several professions – internet engineer, network and
server architect, real estate, IT, construction, negotiator, soils engineer, political
strategist, tax expert, psychologist. Perhaps that’s why it’s so hard to
characterize us – why we have our own designation. We’re datacenter guys.
One of the reasons I love
being a datacenter guy is because people generally smile when they say it.
“He’s a datacenter guy”, followed by the obligatory smirk. It’s kinda like
being a gynecologist. “What do you do for a living?” “I’m a gynecologist”. Similar
reaction, I’d think. Sometimes I wonder what it means. It could mean – poor
bastard, he’s a datacenter guy. Imagine what he does every day. Or it could
mean – geez I wish I were a datacenter guy. In any case it’s nice to get the
smiles. It sure beats saying “I’m a coroner”. They probably get a different
reaction, I’m guessing.
Datacenter guys travel a lot.
We have to. There’s not many datacenters being built in our neighborhood. If
there were, we’d want to live somewhere else. Datacenters are built in faraway
places in the neighborhoods with all the auto body shops and distribution
warehouses. Some of the best ones are built in the middle of nowhere, in
alfalfa fields or abandoned aluminum smelters or in Iceland. And to complicate
things, the best datacenter guys are notoriously cheap because we know all
travel costs are going against the project. So we end up doing wacky things to
save money, like staying inside the unfinished datacenter to save on hotel
costs. Or the local Convent for $60/night (yes, seriously).
Needless to say, we have to
leave our families to go to where the action is, and this makes it tough on the
loved ones that we leave behind. We’re far from the only profession that has to
log significant flight miles, but in the realm of the internet when most things
can be done via cell/txt/skype/email/irc we stand out as dinosaurs who still
have to go onsite to get things done. The
most fortunate of us have spouses and kids that understand why we aren’t home.
It’s because we’re datacenter guys.
Datacenter guys know there’s
something infectious, almost spiritual, in bringing a datacenter out of the
ground. I suppose it’s the same with any kind of major construction project,
like a skyscraper or hospital. I call it “the drug”. It’s the buzz, the energy
around a project when you design a datacenter from the ground up and you watch
it become a living, breathing hunk of reality before your very eyes. That
thrill of “topping out” steel framing with an evergreen tree if you’ve not
killed anyone so far in the process. You can tell when someone is on “the drug”.
They will leave family and friends, forego food and water, they will even leave
their current job, to go join a company that is designing and building. They’re
on the drug, poor bastards. They’re addicted.
People who have built their share
of datacenters are unique in the internet disciplines in that they have
probably experienced something firsthand that the vast majority of their other
colleagues typically don’t see – they have had someone on their project team maimed
or killed. I was at an internet industry dinner a few weeks ago when someone
asked a question of everyone at the table.
“Tell me about the most
memorable operational incident you had to handle at work.”
The answers all had some
variation on reactions to service outages.
First guy – “I remember our site
cratering just minutes before we were to stream the very first online
Victoria’s Secret fashion show.” Agreed, this sounded truly tragic.
Another – “I had 3 team
members resign right in the middle of a crucial maintenance window.” Admittedly
stressful.
Me – “I know of a father of
two who fell off a steel structure to his death, and several guys who went home
after their shift with fewer fingers or toes than when they started.”
Silence.
Get any semi-seasoned
datacenter guy to talking about the topic of danger on the job and we’ll regale
you with tales of horrendous incidents. There’s the story of the guy who
incinerated himself by leaning over a battery string with an adjustable wrench
in his pocket. I had another guy inches from sure death when a clutch exploded on a generator. Another fell off some scaffolding almost 2 stories tall and
fractured his back and pelvis. Needless to say, it’s a dangerous job building
and operating these facilities. Next time you’re touring a production quality
datacenter, notice that lifeguard’s crook that is hanging innocuously in the
electrical room. Strange, there’s no pool nearby. It’s called the “meat hook”,
and it’s used to pry someone off the switchgear as they’re being electrocuted
without actually having to touch them and becoming yet another organic
conductor.
Another unique thing good
datacenter guys do is to find that elusive perfect location for their next
datacenter. In the industry we call it “site selection”. An integral part of
any good site selection process is the negotiation of incentive packages with
states, counties, and local municipalities. I’m convinced that this is an art
form that is born, not bred into some of us. It involves meeting with everyone
from Mary Jane at the local Chamber of Commerce to the Governor to the Senator.
In order to get the very best incentive packages, you need to please everybody.
Mary Jane wants you to sponsor the German Shepherd for the town’s first police
K-9 unit. And you can’t imagine how expensive they are to acquire and train. The
Governor wants you to commit to a certain number of jobs and capital spend that
you’ll bring to his or her state. And oh, by the way please site your datacenter over
at the Technology Park that I pushed through the legislature because it’s gone
nowhere since. And of course the Senator demands to be the first speaker at
your groundbreaking so that he’ll be able to take credit for all of the
aforementioned. Site selection is a
three ring circus but it is one of my favorite aspects of a new datacenter
project.
All in all, I think those of
us in the datacenter industry consider ourselves very fortunate. Our
professions require us to build grand things, dabble in a wide variety of
technologies, meet all kinds of interesting characters, and to see the world.
Nothing could be better.
Welcome to Fresh Air!
Welcome to my semi-personal blog. My name is Kevin Timmons, and I'm a datacenter guy with a passion for all things mission critical. I hope to use this space to discuss datacenter technologies, designs, and to generally geek out about what I'm doing as the CTO of an aggressive, emerging datacenter company called CyrusOne. Full disclosure - I'll be prominently featuring tidbits about my current projects and interests here at C1, although I'll try my level best to keep from turning this blog into a billboard advertisement for our products and services. But filter my content accordingly - you've been warned.
I anticipate that I'll be posting at best sporadically, really just as I come by what I think is worthy material that others might be interested in. I call it a semi-personal blog because I'll also be throwing in some personal tidbits from time to time. The goal is to create something that is not your typical datacenter blog - there's enough of those out there already.
Welcome to Fresh Air!
kev