SAN FRANCISCO – Google continued to woo enterprise customers at its Google Cloud Next conference during several keynote presentations here on July 25.
Security is the top concern of enterprise customers, according to Garrick Toubassi, Google’s vice president of engineering for its G Suite offering. To that end, he said Google is using its significant investment in machine learning and artificial intelligence to protect customers.
“We can respond to new threats using AI instantaneously and more broadly than anyone,” said Toubassi. In the case of Gmail, he said Google blocks more than 10 million “bad messages” every minute.
But when Google isn’t sure whether a message is dangerous, as with potential phishing attempts, Gmail will display a warning in bold lettering that reads “This Message Seems Dangerous.” Google is also adding a confidential mode that lets users place restrictions on email, such as preventing a message from being printed or setting a date on which the message will expire.
New Enterprise Version of Google Voice
Google also announced an enterprise version of Google Voice that will be integrated with G Suite. This edition lets admins manage users, provision and port phone numbers, access detailed reports and set up call routing. Companies will also be able to use Voice to deploy phone numbers to employees, or even entire departments, at once, and to assign numbers that aren’t tied to a specific device. Voice also includes several AI-powered features, such as voicemail transcription and spam filtering.
Another enterprise feature is the availability of G Suite data regions. Google announced that enterprise customers can now designate where primary data for select G Suite apps is stored at rest: globally, in the U.S. or in Europe. Brad Calder, Google’s vice president of cloud infrastructure, said the company now has data centers in 17 regions, with Finland and Hong Kong set to be added this year.
Google also had a surprising IoT hardware announcement. The company’s new Edge TPU (Tensor Processing Unit) is an AI accelerator application-specific integrated circuit (ASIC) it developed specifically for neural network machine learning.
“The new Edge TPU is so small four of them can fit on top of a penny and can fit in your smallest sensors,” said Injong Rhee, Google’s vice president of IoT Cloud. “We designed this to be highly focused on performance per dollar and performance per watt. It brings a brain to your endpoint devices at an extremely low cost.”
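Google did not walk through the developer workflow on stage, but judging from how TensorFlow Lite delegates work, running a model on an Edge TPU might look roughly like the following sketch. The model file name is a placeholder, and the delegate library name is an assumption borrowed from the Coral tooling Google later shipped.

```python
# Hypothetical sketch: running a compiled TensorFlow Lite model on an Edge TPU.
# The model file name is a placeholder; the delegate library name follows the
# convention used by Google's later Coral tooling.
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

interpreter = Interpreter(
    model_path="mobilenet_v1_edgetpu.tflite",                 # placeholder model
    experimental_delegates=[load_delegate("libedgetpu.so.1")]  # offload ops to the TPU
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy uint8 image of the shape the model expects.
dummy_image = np.zeros(input_details[0]["shape"], dtype=np.uint8)
interpreter.set_tensor(input_details[0]["index"], dummy_image)

interpreter.invoke()  # inference runs on the Edge TPU via the delegate
scores = interpreter.get_tensor(output_details[0]["index"])
print("Top class index:", int(np.argmax(scores)))
```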
Edge TPUs Tailor-Made for Cloud IoT Core
The new Edge TPUs will work with Google’s Cloud IoT Core, a system that lets developers connect millions of IoT devices around the world to the Google Cloud. The Edge TPU is a way to bring more processing power to the “edge,” closer to where data is generated. Rhee gave several examples of how Edge TPUs might be deployed.
“You could, for example, have thousands of these in a city in traffic cameras connected to the Google Cloud Platform that could be used to analyze traffic,” Rhee said.
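For a sense of what connecting one of those devices involves, Cloud IoT Core’s MQTT bridge is documented to accept connections along the lines of the rough sketch below; the project, registry and device names are placeholders.

```python
# Rough sketch of a device connecting to Cloud IoT Core's MQTT bridge.
# Project, registry and device names are placeholders; authentication uses a
# JSON Web Token signed with the device's private key, sent as the password.
import datetime
import jwt                      # PyJWT
import paho.mqtt.client as mqtt

PROJECT_ID = "my-project"           # placeholder
CLOUD_REGION = "us-central1"
REGISTRY_ID = "traffic-cameras"     # placeholder
DEVICE_ID = "camera-001"            # placeholder
PRIVATE_KEY_FILE = "rsa_private.pem"

def create_jwt():
    """Build a short-lived JWT; Cloud IoT Core expects the project ID as audience."""
    now = datetime.datetime.utcnow()
    claims = {"iat": now, "exp": now + datetime.timedelta(minutes=60), "aud": PROJECT_ID}
    with open(PRIVATE_KEY_FILE, "r") as f:
        return jwt.encode(claims, f.read(), algorithm="RS256")

# The MQTT bridge requires this exact client ID format.
client_id = (f"projects/{PROJECT_ID}/locations/{CLOUD_REGION}"
             f"/registries/{REGISTRY_ID}/devices/{DEVICE_ID}")

client = mqtt.Client(client_id=client_id)
client.username_pw_set(username="unused", password=create_jwt())  # username is ignored
client.tls_set()                                                   # TLS is required
client.connect("mqtt.googleapis.com", 8883)
client.loop_start()

# Publish a telemetry event, e.g. a traffic count computed on the device.
client.publish(f"/devices/{DEVICE_ID}/events", '{"vehicles_per_minute": 42}', qos=1)
```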
While Google (6 percent) is far behind cloud leaders Amazon Web Services (33 percent of the market at last count) and Microsoft’s Azure (13 percent) in market share, it is making steady progress. The head of Google’s cloud computing efforts, Diane Greene, reported in February that the company is pulling in $1 billion per quarter in cloud revenue.
At Google Cloud Next, a number of customers detailed their reasons for going with the Google Cloud Platform. Twitter is among the customers with particularly demanding requirements.
Twitter Comes Over to GCP
“We operate at massive scale with a relatively small team,” said Twitter CTO Parag Agrawal. Even though the social media giant has built much of its own infrastructure, Agrawal said it turned to Google’s cloud to help its data scientists quickly make sense, at scale, of the petabytes of data the company collects.
Agrawal cited GCP’s performance as the key reason for the move to Google from AWS.
“We did an extensive technical evaluation, and GCP performed best for ad hoc analysis, and the network performance is so good that it’s a huge advantage for us to be able to do storage and compute separately,” Agrawal said.
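Agrawal did not name specific services in the keynote, but as a hypothetical illustration of the storage-and-compute separation he describes, an ad hoc query against a GCP service like BigQuery, which decouples storage from query compute by design, might look like this minimal sketch.

```python
# Minimal ad hoc query with the google-cloud-bigquery client; BigQuery runs the
# query on Google's compute while the data stays in managed storage.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials and project

query = """
    SELECT word, SUM(word_count) AS total
    FROM `bigquery-public-data.samples.shakespeare`
    GROUP BY word
    ORDER BY total DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(row.word, row.total)
```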