Google gives its AI the reins over its data center cooling systems

Data centers are loud and hot inside — and keeping servers from overheating is a major factor in the cost of running them. It’s no surprise, then, that the big players in this space, including Facebook, Microsoft and Google, are all looking for ways to cut cooling costs. Facebook uses cool outside air when possible, Microsoft is experimenting with underwater data centers and Google is being Google and looking to its AI models for some extra savings.

A few years ago, Google, through its DeepMind affiliate, started looking into how it could use machine learning to provide its operators some additional guidance on how to best cool its data centers. At the time, though, the system only made recommendations and the human operators decided whether to implement them. Those humans can now take longer naps during the afternoon, because the team has decided the models are now good enough to give the AI-powered system full control over the cooling system. Operators can still intervene, of course, but as long as the AI doesn’t decide to burn the place down, the system runs autonomously.

The new cooling system is now in place in a number of Google’s data centers. Every five minutes, the system polls thousands of sensors inside the data center and chooses the optimal actions based on this information. There are all kinds of checks and balances here, of course, so the chances of one of Google’s data centers going up in flames because of this are low.
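The basic shape of such a system is a sense-decide-act loop with hard guardrails between the model and the equipment. Google hasn't published the internals, so the sketch below is purely illustrative: every name, number and the toy "policy" are assumptions, not Google's design. The point it makes is that the model only ever *proposes* an action, and a separate, vetted safety layer decides whether it can be applied.

```python
import random

SAFE_SETPOINT_RANGE = (18.0, 27.0)  # allowed cooling setpoints in deg C (made up)

def read_sensors(n=1000):
    """Stand-in for polling thousands of temperature/load sensors."""
    return [random.uniform(20.0, 35.0) for _ in range(n)]

def model_choose_setpoint(readings):
    """Stand-in for the ML model: nudge the setpoint against the mean temperature."""
    mean_temp = sum(readings) / len(readings)
    return 45.0 - mean_temp * 0.8  # deliberately naive policy

def apply_guardrails(proposed):
    """The 'checks and balances': never act outside the vetted safe range."""
    lo, hi = SAFE_SETPOINT_RANGE
    return min(max(proposed, lo), hi)

def control_cycle():
    """One five-minute cycle: sense, let the model decide, clamp, act."""
    readings = read_sensors()
    proposed = model_choose_setpoint(readings)
    return apply_guardrails(proposed)

setpoint = control_cycle()
print(f"applying setpoint: {setpoint:.1f} C")
```

In a real deployment the guardrail layer would be far richer — rate limits, operator overrides, fallback to a conventional controller — but the separation of model output from actuation is the key idea.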

Like most machine learning models, this one also became better as it gathered more data. It’s now delivering energy savings of 30 percent on average, compared to the data centers’ historical energy usage.

One thing that’s worth noting here is that Google is obviously trying to save a few bucks, but in many ways, the company is also looking at this as a way of promoting its own machine learning services. What works in a data center, after all, should also work in a large office building. “In the long term, we think there’s potential to apply this technology in other industrial settings and help tackle climate change on an even grander scale,” DeepMind writes in today’s announcement.

Google Firebase adds in-app messaging, JIRA integration, new reports and more

Firebase is now Google’s default platform for app developers, and in the four years since Google acquired it, the service has greatly expanded its feature set and its integrations with Google services. Today, it’s rolling out yet another batch of updates that bring new features, deeper integrations and a few design updates to the service.

The highlight of this release is the launch of in-app messaging, which lets developers send targeted, contextual messages to users as they use the app. Developers can customize the look and feel of these in-app notifications, which are rolling out today, but what’s maybe even more important is that the feature is integrated with Firebase Predictions and Google Analytics for Firebase, so developers can react not just to current behavior but also to Firebase’s predictions of how likely a user is to spend additional money or stop using the app.

Developers who use Atlassian’s JIRA will also be happy to hear that Firebase is launching an integration with this tool. Firebase users can now create JIRA issues based on crash reports in Firebase. This integration will roll out in the next few weeks.

Another new integration is a deeper connection to Crashlytics, which Google acquired from Twitter in 2017 (together with Fabric). Firebase will now let you export this data to BigQuery to analyze it — and then visualize it in Google’s Data Studio. And once it’s in BigQuery, it’s your data, so you’re not dependent on Firebase’s retention and deletion defaults.

Speaking of reports, Firebase Cloud Messaging is getting a new reporting dashboard, and the Firebase Console’s Project Overview page has received a full design overhaul that lets you see the health and status of your apps on a single page. The Latest Release section now also features live data. These features start rolling out today and should become available to everybody in the next few weeks.

Firebase Hosting, the service’s web hosting product, is also getting a small update: it now allows you to host multiple websites within one project. And when you push an update, Firebase Hosting only uploads the files that have changed between releases, which should speed up that process quite a bit.

Oracle launches autonomous database for online transaction processing

Oracle executive chairman and CTO Larry Ellison first introduced the company’s autonomous database at Oracle OpenWorld last year. The company later launched an autonomous data warehouse. Today, it announced the next step with the launch of the Oracle Autonomous Transaction Processing (ATP) service.

This latest autonomous database tool promises the same level of autonomy: self-repairing, automated updates and security patches, and minutes or less of downtime a month. Juan Loaiza, SVP for Oracle Systems at the database giant, says the ATP cloud service is a modernized extension of the online transaction processing (OLTP) databases the company has been building for decades. It has machine learning and automation underpinnings, but it should feel familiar to customers, he says.

“Most of the major companies in the world are running thousands of Oracle databases today. So one simple differentiation for us is that you can just pick up your on-premises database that you’ve had for however many years, and you can easily move it to an autonomous database in the cloud,” Loaiza told TechCrunch.

He says that companies already running OLTP databases include airlines, big banks, financial services companies, online retailers and other mega companies that can’t afford even a half hour of downtime a month. He claims that with Oracle’s autonomous database, downtime tops out at 2.5 minutes per month, and the goal is to get much lower, to basically nothing.

Carl Olofson, an IDC analyst who manages the firm’s database management practice, says the product promises much lower operational costs and could give Oracle a leg up in the Database as a Service market. “What Oracle offers that is most significant here is the fact that patches are applied without any operational disruption, and that the database is self-tuning and, to a large degree, self-healing. Given the highly variable nature of OLTP database issues that can arise, that’s quite something,” he said.

Adam Ronthal, an analyst at Gartner who focuses on the database market, says the autonomous database product set will be an important part of Oracle’s push to the cloud moving forward. “These announcements are more cloud announcements than database announcements. They are Oracle coming out to the world with products that are built and architected for cloud and everything that implies — scalability, elasticity and a low operational footprint. Make no mistake, Oracle still has to prove themselves in the cloud. They are behind AWS and Azure and even GCP in breadth and scope of offerings. ATP helps close that gap, at least in the data management space,” he said.

Oracle certainly needs a cloud win, as its cloud business has been heading in the wrong direction over the last couple of earnings reports, to the point that the company stopped breaking out its cloud numbers in the June report.

Ronthal says Oracle needs to gain some traction quickly with existing customers if it’s going to be successful here. “Oracle needs to build some solid early successes in their cloud, and these successes are going to come from the existing customer base who are already strategically committed to Oracle databases and are not interested in moving. (This is not all of the customer base, of course.) Once they demonstrate solid successes there, they will be able to expand to net new customers,” he says.

Regardless of how it works out for Oracle, the ATP database service is available as of today.

India may become next restricted market for U.S. cloud providers

Data sovereignty is on the rise across the world. Laws and regulations increasingly require that citizen data be stored in local data centers, and often restrict the movement of that data outside a country’s borders. The European Union’s GDPR is one example, although it’s relatively porous. China’s relatively new cloud computing law is much stricter; it forced Apple to turn over its Chinese citizens’ iCloud data to local providers and Amazon to sell off data center assets in the country.

Now, it appears that India will join this policy movement. According to Aditya Kalra of Reuters, an influential cloud policy panel has recommended that India mandate data localization in the country, for investigative and national security reasons, in a draft report set to be released later this year. That panel is headed by well-known local entrepreneur Kris Gopalakrishnan, who founded Infosys, the IT giant.

That report would match other policy statements from the Indian political establishment in recent months. The government’s draft National Digital Communications Policy this year said that data sovereignty is a top mission for the country. The report called for the government by 2022 to “Establish a comprehensive data protection regime for digital communications that safeguards the privacy, autonomy and choice of individuals and facilitates India’s effective participation in the global digital economy.”

It’s that last line that is increasingly the objective of governments around the world. While privacy and security are certainly top priorities, governments now recognize that the economics of data are going to be crucial for future innovation and growth. Maintaining local control of data — through whatever means necessary — ensures that cloud providers and other services have to spend locally, even in a global digital economy.

India is both a crucial and an ironic manifestation of this pattern. It is crucial because of the size of its economy: public cloud revenues in the country are expected to hit $2.5 billion this year, according to Gartner’s estimates, an annual growth rate of 37.5%. It is ironic because much of the historical success of India’s IT industry has been its ability to offer offshoring and data IT services across borders.

Indian Prime Minister Narendra Modi has made development and rapid economic growth a top priority of his government. (Krisztian Bocsi/Bloomberg via Getty Images)

India is certainly no stranger to localization demands. In areas as diverse as education and ecommerce, the country maintains strict rules around local ownership and investment. While those rules have been opening up slowly since the 1990s, the explosion of interest in cloud computing has made the gap in regulations around cloud much more apparent.

If the draft report and its various recommendations become law in India, it would have significant effects on public cloud providers like Microsoft, Google, Amazon, and Alibaba, all of whom have cloud operations in the country. In order to comply with the regulations, they would almost certainly have to expend significant resources to build additional data centers locally, and also enforce data governance mechanisms to ensure that data didn’t flow from a domestic to a foreign data center accidentally or programmatically.

I’ve written before that these data sovereignty regulations ultimately benefit the largest service providers, since they’re the only ones with the scale to be able to competently handle the thicket of constantly changing regulations that govern this space.

In the India case though, the expense may well be warranted. Given the phenomenal growth of the Indian cloud IT sector, it’s highly likely that the major cloud providers are already planning a massive expansion to handle the increasing storage and computing loads required by local customers. Depending on how simply the regulations are written, complying with them may add relatively little cost.

One question will involve what level of foreign ownership will be allowed for public cloud providers. Given that several foreign companies already operate in the marketplace, it might be hard to eliminate them entirely in favor of local competitors. Yet the large providers will have their work cut out for them to ensure the market stays open to all.

The real costs though would be borne by other companies, such as startups who rely on customer datasets to power artificial intelligence. Can Indian datasets be used to train an AI model that is used globally? Will the economics be required to stay local, or will the regulations be robust enough to handle global startup innovation? It would be a shame if the very law designed to encourage growth in the IT sector was the one that put a dampener on it.

India’s chief objective is to ensure that Indian data benefits Indian citizens. That’s a laudable goal on the surface, but deeply complicated when it comes time to write these sorts of regulations. Ultimately, consumers should have the right to park their data wherever they want — with a local provider or a foreign one. Data portability should be key to data sovereignty, since it is consumers who will drive innovation through their demand for best-in-class services.

The Istio service mesh hits version 1.0

Istio, the service mesh for microservices from Google, IBM, Lyft, Red Hat and many other players in the open-source community, launched version 1.0 of its tools today.

If you’re not into service meshes, that’s understandable. Few people are. But Istio is probably one of the most important new open-source projects out there right now. It sits at the intersection of a number of industry trends, like containers, microservices and serverless computing, and makes it easier for enterprises to embrace them. Istio now has more than 200 contributors and the code has seen more than 4,000 check-ins since the launch of version 0.1.

Istio, at its core, handles the routing, load balancing, flow control and security needs of microservices. It sits on top of existing distributed applications and basically helps them talk to each other securely, while also providing logging, telemetry and the necessary policies that keep things under control (and secure). It also features support for canary releases, which allow developers to test updates with a few users before launching them to a wider audience, something that Google and other webscale companies have long done internally.
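A canary release, at its core, is just a weighted traffic split: a small, fixed fraction of requests goes to the new version while the rest stays on the stable one, and the weight is dialed up as confidence grows. Istio expresses this declaratively in its routing configuration; the little sketch below (all names and the 5 percent weight are made up for illustration) only shows the underlying idea in plain Python.

```python
import random

def pick_version(canary_weight=0.05):
    """Route roughly 5% of requests to the canary build, the rest to stable."""
    return "v2-canary" if random.random() < canary_weight else "v1-stable"

# Simulate 10,000 incoming requests and see where they land.
random.seed(42)
sample = [pick_version() for _ in range(10_000)]
canary_share = sample.count("v2-canary") / len(sample)
print(f"canary received {canary_share:.1%} of traffic")
```

In a real mesh the split is enforced at the proxy layer rather than in application code, which is what lets operators shift the weight without redeploying anything.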

“In the area of microservices, things are moving so quickly,” Google product manager Jennifer Lin told me. “And with the success of Kubernetes and the abstraction around container orchestration, Istio was formed as an open-source project to really take the next step in terms of a substrate for microservice development as well as a path for VM-based workloads to move into more of a service management layer. So it’s really focused around the right level of abstractions for services and creating a consistent environment for managing that.”

Even before the 1.0 release, a number of companies had already adopted Istio in production, including the likes of eBay and Auto Trader UK. Lin argues that this is a sign that Istio solves a problem a lot of businesses are facing today as they adopt microservices. “A number of more sophisticated customers tried to build their own service management layer and while we hadn’t yet declared 1.0, we heard a number of customers — including a surprising number of large enterprise customers — say, ‘you know, even though you’re not 1.0, I’m very comfortable putting this in production because what I’m comparing it to is much more raw.’”

IBM Fellow and VP of Cloud Jason McGee agrees with this and notes that “our mission since Istio’s launch has been to enable everyone to succeed with microservices, especially in the enterprise. This is why we’ve focused the community around improving security and scale, and heavily leaned our contributions on what we’ve learned from building agile cloud architectures for companies of all sizes.”

A lot of the large cloud players now support Istio directly, too. IBM supports it on top of its Kubernetes Service, for example, and Google even announced a managed Istio service for its Google Cloud users, as well as some additional open-source tooling for serverless applications built on top of Kubernetes and Istio.

Two names missing from today’s party are Microsoft and Amazon. I think that’ll change over time, though, assuming the project keeps its momentum.

Istio also isn’t part of any major open-source foundation yet. The Cloud Native Computing Foundation (CNCF), the home of Kubernetes, is backing linkerd, a project that isn’t all that dissimilar from Istio. Once a 1.0 release of these kinds of projects rolls around, the maintainers often start looking for a foundation that can shepherd the development of the project over time. I’m guessing it’s only a matter of time before we hear more about where Istio will land.

The cloud continues to grow in leaps and bounds, but it’s still AWS’s world

With the big cloud companies reporting recently, we can be sure of a couple of things: the market continues to expand rapidly and AWS is going to be hard to catch. Depending on whose numbers you look at, the market grew around 50 percent as it continues its unprecedented expansion.

Let’s start with the market leader, Amazon Web Services. Canalys puts its share at 31 percent of the market, while Synergy Research puts it at 34 percent; the two estimates are close enough to tell the same story. As Synergy’s John Dinsdale points out, AWS is so dominant that, in spite of mega growth numbers from other vendors, it is still bigger than its next four competitors combined, even after all these years.

Those competitors, by the way, are no slouches by any means. They include Microsoft, Google, IBM and Alibaba, a pretty elite group of enterprise players. As we’ve noted in past analyses, one of the primary issues for all of these competitors is how late they were to the market. They gave Amazon a massive head start, and Amazon shows no signs of ceding that lead any time soon.


Of course, AWS isn’t standing still either. It grew 48 percent last quarter by Canalys’ estimate, while Synergy has AWS’s market share up a tick to 34 percent.

Interestingly, Synergy finds that this competitor growth did not cut into Amazon’s market share at all; it was the result of continued growth in the overall market, as companies continue to shift workloads to the cloud. “The rapid growth of Microsoft, Google and Alibaba sees them all increase their market shares too, but it is not at the expense of AWS,” Synergy’s John Dinsdale pointed out in a statement.

Microsoft and Google still growing fast

That’s not to say Microsoft and Google aren’t growing too. In fact, Canalys had Microsoft growing at an 89 percent clip last quarter, while Google grew an amazing 108 percent. It’s always important to point out that it’s easier to grow from a small number to a bigger number than from a big number to a bigger one. Yet AWS continues to defy that idea and grow anyway, although not quite at the rate of its competitors.

Synergy reports these market share percentages for the competitors: Microsoft 14 percent, IBM 8 percent, Google 6 percent and Alibaba 4 percent. Canalys shows Microsoft with 18 percent and Google with 8 percent; it did not report on IBM or Alibaba.


While these growth rates have to come down at some point, the market could keep expanding for the next several years as large companies get more comfortable with the cloud and move increasing percentages of their workloads there.

Of course, even then it’s not a zero sum game. As we see increasing use of data-intensive workloads involving internet of things, blockchain and artificial intelligence, it’s entirely possible that the market will continue to grow even with fewer workloads moving from private data centers.

For now, even with their eye-popping growth numbers, the competition continues to chase AWS. Even as these companies find ways to differentiate themselves with different approaches, offerings and services, the market dynamics are hardening and catching AWS seems less and less likely.

It also seems increasingly less likely that some small upstart can come in and undermine the top players, as it just takes too much investment to keep up with them and their scale. “In a large and strategically vital market that is growing at exceptional rates, [the market leaders] are throwing the gauntlet down to their smaller competitors by continuing to invest enormous amounts in their data center infrastructure and operations. Their increased market share is clear evidence that their strategies are working,” Synergy’s Dinsdale said in a statement.

What the competitors need to do now is continue to focus on customer requirements and what they can offer in terms of price and service to continue to take advantage of their own unique strengths. There’s plenty of room in this space for everyone to thrive, but some will thrive more than others. That’s just the nature of the market.

Dropbox add-on makes it easier to manage Gmail attachments

When Dropbox announced it was integrating its storage product with G Suite in March, it was more of a heads up that the two companies were working together. Today at Google Next, Dropbox announced a new add-on for managing Gmail attachments in Dropbox.

Ketan Nayak, a product manager at Dropbox, says this is the first concrete piece to come out of that earlier announcement. “Back in March, we announced a broader partnership with Google to bring about integrations and product initiatives across a range of different Google Cloud products. And what we wanted to share with you today was that we’re bringing one of the first [pieces] in this product partnership, the Dropbox add-on for Gmail, to GA,” he said.

The partnership makes sense for the two companies, as they share lots of overlapping users: more than 50 percent of Dropbox users also use G Suite. Being able to access Dropbox without leaving Gmail or other G Suite tools could save users the time and effort spent copying, pasting and switching programs.

Instead, there is now a direct integration that displays attachments in a side panel, from which you can save them directly to your Dropbox if you choose. The experience is the same in the mobile app and on the web, Nayak explained.

Dropbox displays the attachments in the email in a side panel for easy access. Photo: Dropbox

“We created this cross-browser, cross-platform solution that doesn’t exist today, especially on mobile, where a lot of our users live and work across these different tools. It’s been really hard for users to navigate in and out of different apps, and we really think of this add-on as a first step that enables users across our two platforms to start working more seamlessly,” Nayak explained.

Indeed, other integrations between products are already in the works including one that will allow users to insert a link to a file stored in Dropbox in an email without leaving the program. “Users can share and generate links to Dropbox content while composing an email,” he said. While that particular functionality isn’t ready yet, the company was demonstrating it on stage at Google Next today and it should be available soon.

Nayak says these announcements are really just a starting point for what the companies hope will be a much more comprehensive set of integrations between their products in the future.

Virtru teams up with Google to bring its end-to-end encryption service to Google Drive

Virtru, which is best known for its email encryption service for both enterprises and consumers, is announcing a partnership with Google today that will bring the company’s encryption technology to Google Drive.

Only a few years ago, the company was still bolting its solution on top of Gmail without Google’s blessing, but these days, Google is fully on board with Virtru’s plans.

Its new Data Protection for Google Drive extends its Gmail service to Google’s online file storage service. Files are encrypted before upload, so they remain protected even when they are shared outside an organization. The customer retains full control of the encryption keys, meaning Google, too, has no access to the files, and admins can set and manage access policies by document, folder and team drive.

Virtru’s service uses the Trusted Data Format, an open standard the company’s CTO Will Ackerly developed at the NSA.

While it started as a hack, Virtru is Google’s only data protection partner for G Suite today, and its CEO John Ackerly tells me Google now gets what he and his team are trying to achieve. Indeed, Virtru now has a team of engineers that works with Google. As John Ackerly also noted, GDPR and the renewed discussion around data privacy are helping the company gain traction with many businesses, especially in Europe, where it is opening new offices to support its customers there. In total, about 8,000 organizations now use its services.

It’s worth noting that while Virtru is announcing this new Google partnership today, the company also supports email encryption in Microsoft’s Office 365 suite.

Google is baking machine learning into its BigQuery data warehouse

There are still a lot of obstacles to building machine learning models, and one of them is that developers often have to move a lot of data back and forth between their data warehouses and wherever they build their models. Google is now making this part of the process a bit easier for the developers and data scientists in its ecosystem with BigQuery ML, a new feature that builds machine learning functionality right into its BigQuery data warehouse.

Using BigQuery ML, developers can build models using linear and logistic regression right inside their data warehouse, without having to transfer data back and forth as they build and fine-tune their models. And all they have to do to build these models and get predictions is write a bit of SQL.

Moving data doesn’t sound like it should be a big issue, but developers often spend a lot of their time on this kind of grunt work — time that would be better spent on actually working on their models.

BigQuery ML also promises to make it easier to build these models, even for developers who don’t have a lot of experience with machine learning. To get started, developers can use what’s basically a variant of standard SQL to say what kind of model they are trying to build and what the input data is supposed to be. From there, BigQuery ML then builds the model and allows developers to almost immediately generate predictions based on it. And they won’t even have to write any code in R or Python.
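The workflow described above boils down to two SQL statements: CREATE MODEL to train, and ML.PREDICT to score. The dataset, table and column names in this sketch are hypothetical, but the statement shapes follow BigQuery ML's documented `CREATE MODEL ... OPTIONS(model_type=...)` and `ML.PREDICT` syntax; composing them as strings keeps the example runnable without a BigQuery account.

```python
def create_model_sql(dataset, model, label_col, feature_table):
    """Train a logistic regression model inside the warehouse (hypothetical names)."""
    return f"""
    CREATE MODEL `{dataset}.{model}`
    OPTIONS(model_type='logistic_reg', input_label_cols=['{label_col}'])
    AS SELECT * FROM `{feature_table}`
    """

def predict_sql(dataset, model, feature_table):
    """Score fresh rows with the trained model, again in plain SQL."""
    return f"""
    SELECT * FROM ML.PREDICT(
      MODEL `{dataset}.{model}`,
      (SELECT * FROM `{feature_table}`))
    """

train = create_model_sql("app_events", "churn_model", "churned", "app_events.features")
score = predict_sql("app_events", "churn_model", "app_events.new_users")
print(train)
print(score)
```

In practice these statements would be submitted through the BigQuery console or a client library; the point is that both training and prediction happen where the data already lives.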

These new features are now available in beta.

Google wants Go to become the go-to language for writing cloud apps

The Google-incubated Go language is one of the fastest-growing programming languages today, with about one million active developers using it worldwide. But the company believes it can still accelerate that growth, especially when it comes to the language’s role in writing cloud applications. To do so, it today announced Go Cloud, a new open-source library and set of tools that make it easier to build cloud apps with Go.

While Go is highly popular among developers, Google argues that the language was missing a standard library for interfacing with cloud services. Today, developers often have to essentially write their own libraries to use each cloud’s features, even though organizations increasingly want to be able to move their workloads between clouds.

What Go Cloud gives these developers is a set of open generic cloud APIs for accessing blob storage, MySQL databases and runtime configuration, as well as an HTTP server with built-in logging, tracing and health checking. Right now, the focus is on AWS and Google Cloud Platform. Over time, Google plans to add more features to Go Cloud and support for more cloud providers (and those providers can, of course, build their own support, too).

This, Google argues, allows developer teams to build applications that can easily run on any supported cloud without having to re-architect large parts of their applications.

As Google VP of developer relations Adam Seligman told me, the company hopes this move will kick off an explosion of libraries around Go — and, of course, that it will accelerate Go’s growth as a language for the cloud.