MALE SPEAKER 1: A data centre's the brains of the Internet.

MALE SPEAKER 2: The engine of the Internet.

FEMALE SPEAKER 1: It is a giant building with a lot of power, a lot of cooling and a lot of computers.

MALE SPEAKER 3: It's row, upon row, upon row of machines, all working together to provide the services that make Google function.

JOE KAVA: I love building and operating data centres. I'm Joe Kava, Vice-President of Data Centres at Google. I'm responsible for managing the teams globally that design, build and operate Google's data centres. We're also responsible for the environmental health and safety, sustainability and carbon offsets for our data centres.

This data centre, here in South Carolina, is one node in a larger network of data centres all over the world. Of all the employees at Google, only a very, very small percentage are authorised to even enter a data centre campus. The men and women who run these data centres and keep them up 24 hours a day, seven days a week, are incredibly passionate about what they're doing.

MALE SPEAKER 2: In layman's terms, what do I do here?

FEMALE SPEAKER 1: I typically refer to myself as the herder of cats.

MALE SPEAKER 4: I'm an engineer.

MALE SPEAKER 3: Hardware site operations manager.

MALE SPEAKER 2: We keep the lights on.

MALE SPEAKER 1: And we enjoy doing it.

JOE KAVA: And they work very hard, so we like to provide them with a fun environment where they can play hard as well.

FEMALE SPEAKER 2: We just went past the three-million-man-hour mark for zero lost-time incidents. Three million man-hours is a really long time, and with the number of people we have on site, that is an amazing accomplishment.

JOE KAVA: I think that the Google data centres really can offer a level of security that almost no other company can match. We have an information security team that is truly second to none. You have the expression, "they wrote the book on that." Well, many of our information security team members really have written the books on best practices in information security.

Protecting the security and the privacy of our users' information is our foremost design criterion. We use layers of progressively higher security the closer you get to the centre of the campus. So, just to enter this campus, my badge had to be on a pre-authorised access list. Then, to come into the building, that was another level of security. To get into the secure corridor that leads to the data centre, that's a higher level of security. And the data centre and the networking rooms have the highest level of security. The technologies that we use are different too. For instance, in our highest-level areas we even use underfloor intrusion detection via laser beams.

So, I'm going to demonstrate going into the secure corridor now. One, my badge has to be on the authorised list. And then two, I use a biometric iris scanner to verify that it truly is me.

OK, here we are on the data centre floor. The first thing that I notice is that it's a little warm in here. It's about 80 degrees Fahrenheit. Google runs our data centres warmer than most because it helps with the efficiency.

You'll notice that we have overhead power distribution. Coming from the yard outside, we bring in the high-voltage power and distribute it across the bus bars to all of the customised bus taps, which are basically the plugs where we plug in all the extension cords.

Our racks don't really look like traditional server racks. These are custom designed and built for Google so that we can optimise the servers for hyper-efficiency and high-performance computing.
It's true that sometimes drives fail and we have to replace them, or upgrade them because maybe they're no longer efficient to run. We have a very thorough end-to-end chain-of-custody process for managing those drives, from the time that they're checked out from the server till they're brought to an ultra-secure cage, where they're erased and crushed if necessary. So any drive that can't be verified as 100% clean, we crush it first, and then we take it to an industrial wood chipper, where it's shredded into little pieces like this.

In the time that I've been at Google – almost six and a half years now – we have changed our cooling technologies at least five times. Most data centres have air-conditioning units along the perimeter walls that force cold air under the floor. It then rises up in front of the servers and cools them. With our solution, we take the server racks and butt them right up against our air-conditioning unit. We just use cool water flowing through those copper coils that you see there.

So the hot air from the servers is contained in that hot aisle. It rises up and passes across those coils, where the heat from the air transfers to the water in the coils. That warm water is then brought outside the data centre to our cooling plant, where it is cooled down through our cooling towers and returned.
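As a rough illustration of the water-side loop Kava describes, the heat the water can carry away scales with its flow rate and the temperature rise across the coils (Q = mass flow x specific heat x delta T). The sketch below uses assumed, round numbers purely for illustration; none of the figures are Google's.

# Illustrative back-of-the-envelope estimate of how much server heat a
# water loop like the one described above can carry away. All numbers
# here are assumptions for the sake of the example, not Google's figures.

WATER_SPECIFIC_HEAT = 4186.0   # J/(kg*K), specific heat of water
WATER_DENSITY = 997.0          # kg/m^3

def heat_removed_kw(flow_litres_per_s: float, delta_t_c: float) -> float:
    """Heat carried by the water loop: Q = m_dot * c_p * delta_T, in kW."""
    mass_flow = flow_litres_per_s / 1000.0 * WATER_DENSITY  # kg/s
    watts = mass_flow * WATER_SPECIFIC_HEAT * delta_t_c
    return watts / 1000.0

# Assumed example: 10 litres per second of water warmed by 10 degrees C
# across the coils carries a bit over 400 kW of server heat out to the
# cooling plant.
print(f"{heat_removed_kw(10.0, 10.0):.0f} kW")

The same relationship explains why running the floor warmer helps efficiency: a larger temperature difference between the hot aisle and the water means each litre of water does more cooling work.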