Facebook is opening its first data center in Asia

Facebook is opening its first data center in Asia. The company announced today that it is planning an 11-story building in Singapore that will help its services run faster and more efficiently. The development will cost SG$1.4 billion, or around US$1 billion, the company confirmed.

The social networking firm said it anticipates that the building will be powered entirely by renewable energy. It also said the facility will use a new ‘StatePoint Liquid Cooling’ system, which the firm claims minimizes water and power consumption.

Facebook said that the project will create hundreds of jobs and “form part of our growing presence in Singapore and across Asia.”

A render of what Facebook anticipates its data center in Singapore will look like

Asia Pacific accounts for 894 million monthly users, or 40 percent of Facebook’s total user base, making it the company’s largest region by users. When it comes to actually making money, however, the region lags. Asia Pacific brought in total sales of $2.3 billion in Facebook’s most recent quarter, just 18 percent of total revenue and less than half of the revenue generated from the U.S. during the same period. Enabling more efficient services is one step toward closing that revenue gap.
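To put that gap in per-user terms, here is a quick back-of-the-envelope calculation. It uses only the figures quoted above, so treat the derived numbers as rough implications rather than anything Facebook has reported:

```python
# Back-of-the-envelope math using only the figures quoted in this piece;
# the input numbers are the article's, the derivations are illustrative.
apac_users = 894e6            # monthly users in Asia Pacific
apac_user_share = 0.40        # APAC share of the total user base
apac_revenue = 2.3e9          # APAC revenue, most recent quarter (USD)
apac_revenue_share = 0.18     # APAC share of total quarterly revenue

total_users = apac_users / apac_user_share          # ~2.24 billion
total_revenue = apac_revenue / apac_revenue_share   # ~$12.8 billion

apac_arpu = apac_revenue / apac_users                                    # ~$2.57
rest_arpu = (total_revenue - apac_revenue) / (total_users - apac_users)  # ~$7.82

print(f"Implied total user base: {total_users / 1e9:.2f}B")
print(f"APAC revenue per user:   ${apac_arpu:.2f}")
print(f"Rest-of-world per user:  ${rest_arpu:.2f}")
```

By that math, a user outside Asia Pacific generates roughly three times the quarterly revenue of one inside it, which is the gap faster, more reliable local infrastructure is meant to help narrow.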

Facebook isn’t the only global tech firm that’s been investing in data centers in Asia lately. Google recently revealed that it plans to develop a third data center in Singapore, and it also serves Asia from a data center in Taiwan.

Google gives its AI the reins over its data center cooling systems

The inside of a data center is loud and hot, and keeping servers from overheating is a major factor in the cost of running them. It’s no surprise, then, that the big players in this space, including Facebook, Microsoft and Google, all look for different ways to cut their cooling costs. Facebook uses cool outside air when possible, Microsoft is experimenting with underwater data centers, and Google is being Google and looking to its AI models for some extra savings.

A few years ago, Google, through its DeepMind affiliate, started looking into how it could use machine learning to provide its operators some additional guidance on how to best cool its data centers. At the time, though, the system only made recommendations and the human operators decided whether to implement them. Those humans can now take longer naps during the afternoon, because the team has decided the models are now good enough to give the AI-powered system full control over the cooling system. Operators can still intervene, of course, but as long as the AI doesn’t decide to burn the place down, the system runs autonomously.

The new cooling system is now in place in a number of Google’s data centers. Every five minutes, the system polls thousands of sensors inside the data center and chooses the optimal actions based on this information. There are all kinds of checks and balances here, of course, so the chances of one of Google’s data centers going up in flames because of this are low.
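Google hasn’t published the controller itself, but the behavior described here (poll sensors on a fixed interval, let a model propose an action, apply it only once it clears hard safety checks, keep a human override) maps onto a classic supervisory control loop. A minimal sketch of that pattern, with every name, value and threshold hypothetical rather than anything Google has disclosed:

```python
import random
import time

POLL_INTERVAL_S = 5 * 60        # the article: the system acts every five minutes
SAFE_SETPOINT_C = (18.0, 27.0)  # hypothetical operator-defined hard limits

def read_sensors():
    """Stand-in for polling thousands of real sensors; here, a single
    fake aggregate inlet temperature in degrees Celsius."""
    return {"inlet_temp_c": random.uniform(20.0, 30.0)}

def model_setpoint(snapshot):
    """Stand-in for the learned model: propose the chilled-water
    setpoint it predicts will be most energy efficient."""
    return snapshot["inlet_temp_c"] - 2.0

def within_safe_envelope(setpoint):
    """The 'checks and balances': reject anything outside hard limits."""
    low, high = SAFE_SETPOINT_C
    return low <= setpoint <= high

def apply_setpoint(setpoint):
    """Stand-in for actuating pumps and chillers."""
    print(f"applying chilled-water setpoint: {setpoint:.1f} C")

while True:
    snapshot = read_sensors()
    proposal = model_setpoint(snapshot)
    if not within_safe_envelope(proposal):
        proposal = 22.0  # conservative rule-based fallback: fail safe, not fail open
    apply_setpoint(proposal)  # operators can still override out of band
    time.sleep(POLL_INTERVAL_S)
```

The important design choice in a setup like this is that a rejected proposal falls back to a boring rule-based value instead of being applied anyway; that, plus the standing human override, is roughly what “checks and balances” amounts to in practice.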

Like most machine learning models, this one also became better as it gathered more data. It’s now delivering energy savings of 30 percent on average, compared to the data centers’ historical energy usage.

One thing that’s worth noting here is that Google is obviously trying to save a few bucks, but in many ways, the company is also looking at this as a way of promoting its own machine learning services. What works in a data center, after all, should also work in a large office building. “In the long term, we think there’s potential to apply this technology in other industrial settings and help tackle climate change on an even grander scale,” DeepMind writes in today’s announcement.

Google adds new Singapore data center as Southeast Asia reaches 330M internet users

Google is adding a third data center to its presence in Singapore in response to continued internet growth in Southeast Asia.

It’s been three years since it added a second center in Singapore, and during that time the company estimates that something in the region of 70 million people across Southeast Asia have come online for the first time. That takes the region to over 330 million internet users, but with a population of over 650 million, there’s plenty more to come.

The local data centers don’t exclusively serve their immediate proximity — Asia data centers can handle U.S. traffic, and vice versa — but adding more local capacity does help Google services, and the companies that run their businesses on Google’s cloud, run faster for internet users in that specific region. So not only is it good for locals, it’s also important for Google’s business, which counts the likes of Singapore Airlines, Ninjavan, Wego, Go-Jek and Carousell as notable cloud customers.

The search giant also operates a data center in Taiwan. The company had planned to augment Taiwan and Singapore with a center in Hong Kong, but that project was canned in 2013 due to challenges in securing real estate.

Google opened its first Singapore data center in 2011, and this newest facility will take it to around $850 million spent in Singapore to date, the company confirmed, and to over $1 billion when including Taiwan.

Why Microsoft wants to put data centers at the bottom of the ocean

Earlier this week, Microsoft announced the second phase of Project Natick, a research experiment that aims to understand the benefits and challenges of deploying large-scale data centers under water. In this second phase, the team sank a tank the size of a shipping container, packed with server racks, off the coast of the Orkney Islands, and plans to keep it there for a few years to see if this is a viable way of deploying data centers in the future.

Computers and water famously don’t mix, as anyone who has ever spilled a cup of water over a laptop knows, so putting server racks under water sure seems like an odd idea. But as Microsoft Research’s Ben Cutler told me, there are good reasons why the bottom of the ocean may be a good place for setting up servers.

The vast majority of people live within 200 kilometers of the ocean, Cutler noted, and Microsoft’s cloud strategy has long been about putting its data centers close to major population centers. So with large offshore wind farms potentially providing renewable power and the obvious cooling benefits of being under water (and cooling is a major cost factor for data centers), trying an experiment like this makes sense.

“Within Microsoft, we’ve spent an enormous amount of energy and time on cloud — and obviously money,” Cutler explained when I asked him about the genesis of this project. “So we’re always looking for new ways that we can innovate. And this idea sort of gelled originally with one of our employees who worked on a U.S. Navy submarine and knew something about this technology, and that this could maybe be applied to data centers.”

So back in 2015, the team launched phase one and dropped a small pressure vessel with a few servers into the waters of the Pacific Ocean. That experiment worked out pretty well. Even the local sea life seemed to appreciate it. The team found that the water immediately around the vessel was only a few thousandths of a degree Celsius warmer than the water a few feet away, and the noise it produced was pretty much negligible. “We found that once we were a few meters away from the vessel, we were drowned out by background noise, which is things like snapping shrimp, which is actually the predominant sound of the ocean,” Cutler told me, and stressed that the team’s job is to measure all of this, as the ocean is obviously a very sensitive environment. “What we found was that we’re very well received by wildlife and we’re very quickly colonized by crabs and octopus and other things that were in the area.”

For this second phase, the team decided on a location off the coast of Scotland because it’s also home to the European Marine Energy Centre, so the infrastructure for powering the vessel with renewable energy from onshore and offshore sources was already in place.

Once the vessel is in the ocean, maintenance is pretty much impossible. The idea here is to accept that things will fail and can’t be replaced. Then, after a few years, the plan is to retrieve the vessel, refurbish it with new machines and deploy it again.

But as part of this experiment, the team also thought about how best to make these servers last as long as possible. Because nobody can go in to replace a broken hard drive inside the vessel, the team decided, for example, to fill it with a nitrogen atmosphere to prevent corrosion. To measure the impact of that experiment, Microsoft also maintains a similar vessel on land, so it can compare how well that system fares over time.

Cutler stressed that nothing here is cutting-edge technology: there are no exotic servers, and both underwater cabling and building vessels like this are well understood at this point.

Over time, Cutler envisions a factory that can prefabricate these vessels and ship them to wherever they are needed. That’s why the vessel is about the size of a shipping container; the team actually had it fabricated in France, loaded it on a truck and shipped it to Scotland to test this logistics chain.

Whether that comes to pass remains to be seen, of course. The team is studying the economics of Natick for the time being, and then it’s up to Microsoft’s Azure team to take this out of the research labs and put it into more widespread production. “Our goal here is to drive this to a point where we understand that the economics make sense and that it has the characteristics that we wanted it to, and then it becomes a tool for that product group to decide whether and where to use it,” said Cutler.

Google expands its Cloud Platform region in the Netherlands

Google today announced that it has expanded its recently launched Cloud Platform region in the Netherlands with an additional zone. The investment, which is worth a reported 500 million euros, expands the existing Netherlands region from two zones to three. With this, all four of Google’s European Cloud Platform regions now feature three zones (which are akin to what AWS would call “availability zones”), allowing developers to build highly available services across multiple data centers.
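To make the zone idea concrete, here’s a hypothetical sketch of placing one replica in each zone of the Netherlands region (europe-west4 in Google’s naming, assuming the usual -a/-b/-c zone suffixes) so the service survives the loss of any single data center. The instance names and machine type are placeholders, and it assumes an installed, authenticated gcloud CLI:

```python
import subprocess

# Hypothetical: one replica per zone in the Netherlands region, so the
# loss of any single zone (i.e. data center) leaves two replicas running.
ZONES = ["europe-west4-a", "europe-west4-b", "europe-west4-c"]

for zone in ZONES:
    subprocess.run(
        ["gcloud", "compute", "instances", "create", f"web-{zone}",
         "--zone", zone,
         "--machine-type", "n1-standard-1"],
        check=True,  # abort if any create call fails
    )
```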

Google typically aims to have at least three zones in every region, so today’s announcement that it is expanding its region in the Dutch province of Groningen doesn’t come as a major surprise.

With this move, Google is also making Cloud Spanner, Cloud Bigtable, Managed Instance Groups and Cloud SQL available in the region.
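Managed Instance Groups, for their part, automate exactly the kind of spread sketched above: a regional group distributes its instances across the region’s zones and recreates any that die. Another hedged sketch along the same lines, with the group and template names invented for illustration and an instance template assumed to exist already:

```python
import subprocess

# Hypothetical: a regional managed instance group in europe-west4.
# The group keeps 3 instances running, spread across the region's zones,
# built from a pre-created instance template named "web-template".
subprocess.run(
    ["gcloud", "compute", "instance-groups", "managed", "create", "web-mig",
     "--region", "europe-west4",
     "--template", "web-template",
     "--size", "3"],
    check=True,
)
```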

Over the course of the last two years, Google has worked hard to expand its global data center footprint. While it still can’t compete on raw region count with the likes of AWS and Azure (the latter currently offers more regions than any of its competitors), the company now has enough of a presence to be competitive in most markets.

In the near future, Google also plans to open regions in Los Angeles, Finland, Osaka and Hong Kong. The major blank spots on its current map remain Africa, China (for rather obvious reasons) and Eastern Europe, including Russia.