Last week Alphabet, Google’s parent company, published its quarterly financial results. But this time it was different: for the first time it disclosed details on its Google Cloud business. Well, it’s about time, given that AWS and Microsoft have been reporting their cloud business financials for a good few years now. I’m no finance guy, but there’s lots to learn from what’s shared (as well as from what’s NOT shared) in these reports.
Alphabet is a strong player. Just last month Alphabet became the fourth US company to reach a $1 trillion market cap, joining Apple, Amazon and Microsoft. But how strong is it in the cloud business, against those same players?
Show me the figures
According to the report, Google Cloud generated $8.9 billion in revenue in 2019, which accounts for 5.5% of the company’s overall revenue. That’s an impressive 53% year-over-year increase in sales. Still, it’s a modest figure when you consider that AWS reported $9.95 billion in the last quarter alone, and Microsoft reported $11.87 billion.
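As a quick sanity check, the reported figures let us back out a couple of implied totals. A rough sketch, rounding aside:

```python
# Figures from the report (in $ billions)
cloud_revenue_2019 = 8.9   # Google Cloud 2019 revenue
cloud_share = 0.055        # 5.5% of Alphabet's overall revenue
yoy_growth = 0.53          # 53% year-over-year growth

# Implied Alphabet total revenue for 2019
alphabet_total = cloud_revenue_2019 / cloud_share
print(f"Implied Alphabet 2019 revenue: ${alphabet_total:.1f}B")  # ~$161.8B

# Implied Google Cloud revenue for 2018
cloud_revenue_2018 = cloud_revenue_2019 / (1 + yoy_growth)
print(f"Implied Google Cloud 2018 revenue: ${cloud_revenue_2018:.1f}B")  # ~$5.8B
```

So even without the segment being broken out in prior years, the disclosed percentages pin down roughly where the business came from.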
What does “Google Cloud” mean here, anyway?
“Google Cloud” in this report doesn’t refer just to GCP (Google Cloud Platform, the rough equivalent of Amazon’s AWS offering), but also lumps in G Suite (Gmail, Drive, Docs etc.), where the business is more established. So we can’t really know GCP’s share of the business, though Alphabet stated:
The growth rate of GCP was meaningfully higher than that of Cloud overall.
Microsoft pulled off the same trick when it started reporting, lumping together Azure, Office 365 and other SaaS services, and has been doing so ever since in different combinations. So when you see that $11.87 billion figure for Microsoft’s “Intelligent Cloud”, keep in mind it’s not just Azure but also SQL Server, Visual Studio and a host of other products and services.
How profitable is it?
Google reported only revenue, not operating income, so we can’t really tell how profitable Google Cloud is. Given Google’s aggressive efforts to gain market share, I wouldn’t be surprised if the customer acquisition costs and sales force headcount don’t leave much of that revenue. Just for comparison, AWS reported $2.60 billion in quarterly operating income, accounting for 67% of Amazon’s total operating income. Now THAT’S a profitable cloud business!
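A quick back-of-the-envelope calculation from those AWS figures implies Amazon’s total operating income for the quarter:

```python
# Figures from Amazon's report (in $ billions)
aws_operating_income = 2.60  # AWS quarterly operating income
aws_share_of_total = 0.67    # 67% of Amazon's total operating income

# Implied total operating income for Amazon that quarter
amazon_total = aws_operating_income / aws_share_of_total
print(f"Implied Amazon total operating income: ${amazon_total:.2f}B")  # ~$3.88B
```

In other words, the retail giant’s profits lean heavily on its cloud arm.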
Where does GCP shine?
The report indicates GCP’s growth was led by its infrastructure offerings and its data and analytics platform. Google also claims to be seeing “strong uptake of our multi-Cloud Anthos offering”, which I’d assume refers to hybrid-cloud use cases where organizations migrate parts of their on-prem workloads to Google Cloud. Are there also significant multi-cloud use cases in production, combining other private/public cloud vendors? It would be interesting to see.
Google is putting a lot of effort into analytics and AI (artificial intelligence) to gain leadership there. Another strategy is targeting vertical solutions that combine GCP capabilities with other Google and Alphabet capabilities. Google gave some examples around healthcare, combining cloud and AI to manage medical records while helping diagnose disease; and automotive, combining Cloud, Android Auto and, in some cases, Waymo. And let’s not forget Google’s advertising ecosystem, enabling ad monetization around multiple use cases.
Alphabet’s recent report shows that Google remains the challenger in the cloud domain, with figures lagging behind Amazon and Microsoft. But some interesting figures, an aggressive go-to-market and growth engines show Google plans on putting up a serious fight.
Last week Clayton Christensen passed away. If the name doesn’t ring a bell, you may recognize his famous book “The Innovator’s Dilemma“. Christensen was the man who coined the term “disruptive technology” and dedicated his life to researching it.
I’d always been skeptical about academia’s ability to study innovation. After all, I’m a Start-Up Nation native, born and raised in Israel, where we suckle innovation and entrepreneurship with the milk and honey. Who can teach that secret sauce? But then I came across Christensen’s work. It was a clear and practical analysis of the tech market’s dynamics, with observations that were counter-intuitive and at the same time made perfect sense. He showed how even the most outstanding companies can do everything right, yet still lose market leadership.
His theory influenced industry leaders such as Steve Jobs, Andy Grove, and Reed Hastings.
Christensen provided valuable insights into how the market works, the dynamics of companies, and much more. But one practical lesson came to mind when I heard of Christensen’s passing: a valuable tool I‘ve carried with me throughout my long journey in the hi-tech sphere, most notably when leading new products. A tool for understanding your customer.
It’s simply called “Jobs to be done“. In a nutshell, it says that customers have jobs to be done, and they “hire” products to do those jobs. You should ask yourself:
why should they hire my product for the job?
The answer will lead you to the right product to answer the need and to drive customer action. Whether you’re an entrepreneur, an innovator, a CEO, a CTO or even a mere product manager, this is a tool you should cherish. Here it is from Christensen himself, served with a milkshake.
Thank you, Christensen, for showing the relevance of academia in this era of rapid tech revolution, and for helping us become better innovators. Rest in peace, Professor.
Business Intelligence (BI) tools are hot again. Just look at the past few weeks, with Salesforce buying Tableau, Google buying Looker, and Logi Analytics buying Zoomdata. This joins other M&A deals in recent months, such as Logi Analytics’ additional acquisition (yes, two in a row!) of Jinfonet in February, and Sisense acquiring Periscope Data last month.
Traditional BI is about sharp analysts who master the art and science of digging into data, intelligently querying and analyzing it to find signals amid the noise and dig up insights.
But the world has changed.
We live in the world of Big Data, where the sheer amount of data, the diversity of data sources and the need for real-time response render traditional human analysis largely impractical.
It is the age of Big Data analytics, utilizing clever artificial intelligence and machine learning (AI/ML) algorithms to find the needle in the data haystack and surface insights. AI’s superiority was symbolically demonstrated when Google’s AlphaGo AI defeated Go world champion Lee Sedol in 2016, some 20 years after IBM’s Deep Blue defeated chess grandmaster Kasparov.
Big players such as Google, Amazon and Facebook have been mastering their markets for many years by using AI/ML internally. In the past few years cloud vendors have also started exposing those capabilities as cloud services for everyone to use, thereby leveling the (data) playing field and boosting AI/ML adoption.
This change renders older technologies irrelevant and forces consolidation.
I’ve given the example of Hadoop big data technology before: last year the leading vendors Cloudera and Hortonworks had to merge to survive, and even that didn’t help them, judging by the CEO’s departure this month after poor reports. Their arch-competitor MapR is doing even worse, now desperately looking for a buyer to avoid shutting down.
BI tools face the same consolidation. They need to find their place among the new AI/ML data analytics to remain relevant.
Do traditional BI tools have room in the new world?
While machines can do a fine job crunching massive amounts of data and surfacing insights, humans are still needed in the process to go through those insights; to rank, filter and prioritize them; or simply to monitor and take action accordingly. That requires good data visualization, and that’s where BI tools fit in. Taking complex data and insights and presenting them in a simple, intuitive and self-service user experience is the skill BI tools have been honing for many years.
What should we expect next?
With so many BI tools out there, this consolidation will continue: the leading tools will merge with leading analytics platforms to offer an end-to-end data management experience, and smaller players will simply disappear or retreat into niche areas. The cloud vendors will keep leading this battle: Google will build Looker into its AI suite, Microsoft will push its established Power BI and tighten its integration with the other Azure data services, and Amazon – the leading cloud vendor, though not necessarily on this front – will probably step up its data visualization layer.
Follow Horovits on Twitter!
Amazon’s re:Invent yearly conference last week was packed as ever, with people and with announcements. But one announcement caught my attention, and you wouldn’t like to miss it either:
Amazon is officially stepping into hybrid cloud.
AWS Outposts is the name of Amazon’s hybrid cloud service, offering customers the ability to run compute and storage on-premises using the same native AWS APIs they know and love from the cloud. In Amazon’s own words:
AWS Outposts bring native AWS services, infrastructure, and operating models to virtually any data center, co-location space, or on-premises facility for a truly consistent and seamless hybrid cloud.
This is a groundbreaking change for AWS, the dogmatic crusader of the public cloud. Back in 2016 I pointed out that hybrid cloud strategies were gaining traction with enterprises, one of the most lucrative sectors, whose needs and workloads cannot be met by a public-cloud pure play. A similar need for hybrid cloud arises in industrial automation, telecom network virtualization and other sectors. Now it seems Amazon has finally embraced it.
The first crack in Amazon’s public cloud dogma was made in late 2016 when Amazon made a strategic partnership with VMware, which enabled Amazon to step into hybrid cloud without “getting its hands dirty”. This alliance is receiving a boost now with AWS Outposts: alongside the native AWS APIs on-prem offering, Outposts will come in another variant running VMware Cloud on AWS service.
The second crack in Amazon’s public cloud dogma was made in 2017 when the Internet-of-Things drove Amazon to offer an Edge Computing service named Greengrass, which enabled customers for the first time to run their favorite AWS services locally.
AWS is well behind in embracing hybrid cloud: Amazon’s primary cloud rival Microsoft announced its Azure Stack back in 2015. Microsoft has also been pushing enterprise adoption of Azure Stack through partnerships with established manufacturers and enterprise incumbents such as Dell EMC, HPE, Huawei and Lenovo, which provide the hyperconverged hardware units and therefore have a business incentive to promote the joint solution.
Though details are scarce, and Outposts isn’t due for general availability until the second half of 2019, it seems from the announcement that Amazon is taking a different strategy, offering the software, the hardware and the support all from AWS, in an attempt to keep every end of the business in-house and under tight control:
Outposts infrastructure is fully managed, maintained, and supported by AWS, and its hardware and software components can be updated to deliver access to the latest AWS services.
Many other major players that tried launching public cloud offerings in the past few years discovered Amazon’s domination of this arena and ultimately changed strategy, shut down their offerings and turned to hybrid cloud, with an emphasis on serving enterprise workloads. Just read the stories of major companies such as HP, Cisco, Verizon and Rackspace.
The popular containers movement also accelerates the hybrid cloud approach, by enabling the same deployment constructs across on-prem and cloud, and easy mobility between them.
The hybrid cloud model is significant for many industries and workloads, given various constraints such as data privacy, geo-location and latency. This need requires more mature solutions, and Amazon weighing in will surely push the industry forward, as it has done for public cloud, serverless computing and other domains.
In 2011 Marc Andreessen explained in his monumental essay “why software is eating the world“. This originally controversial statement is by now a well-established fact of life. So what’s the next step in the evolution? I argue it’s open source.
Open Source Is Eating The World
Yes, you heard that right. Open source is no longer just for computer geeks running techie experiments and bootstrapped startups looking to save a few bucks. Open source has become mainstream. You don’t have to take my word for it; just look at this year’s multi-billion-dollar mergers and acquisitions (M&A):
- IBM is acquiring Red Hat, the first open source company to cross $1B revenues, for $34 billion.
- Cloudera and Hortonworks, the vendors behind open source Hadoop that started the Big Data Analytics wave, are merging in a $5.2 billion deal.
- Microsoft acquired GitHub, the popular code hosting platform, for $7.5 billion.
- Salesforce acquired MuleSoft, a popular open source messaging and integration middleware platform, for $6.5 billion in May. Interestingly enough, last week Salesforce made a strategic investment in Docker, another popular open source container vendor, alongside MuleSoft’s announced partnership with Docker. Could that mark Salesforce’s next major open source move?
Mergers and acquisitions, however, are not enough to make such a paradigm shift. As a young engineer I grew up programming in Java, using Solaris Unix and OpenOffice, all of which were open source software from Sun Microsystems. Then Oracle acquired Sun in 2009. Has Oracle changed its core DNA following that acquisition? Not exactly.
It takes more than just commercial M&A.
The big players need to embrace and commit to open source culture.
And indeed I see that happening at an increasing pace:
Microsoft is a great example. As I pointed out back in 2015, Microsoft has made a strategic choice to embrace open source. Just last month Corporate Vice President Nat Friedman, who was recently appointed GitHub CEO, made waves when he tweeted Microsoft’s pledge of its patent portfolio to open source:
Microsoft is pledging our massive patent portfolio – over 60,000 patents – to Linux and open source by joining OIN this morning. If you’re looking for signs that we are serious about being the world’s largest open source company, look no further.
Another example from last month is IBM, which pledged its commitment to open source as an integral part of Red Hat’s acquisition:
With this acquisition, IBM will remain committed to Red Hat’s open governance, open source contributions, participation in the open source community and development model, and fostering its widespread developer ecosystem. In addition, IBM and Red Hat will remain committed to the continued freedom of open source, via such efforts as Patent Promise, GPL Cooperation Commitment, the Open Invention Network and the LOT Network.
The modern giants, born into the open source era, started off with open source as an integral part of their development culture. For example, Kubernetes, the leading container orchestration platform and the second most active project on GitHub, originated in a project that Google open sourced to the Linux Foundation. Now you’ll find even Google’s bitter rivals collaborating there. That’s the beauty of the open source movement.
Even the traditional telecommunications industry, which tends to lag behind IT and rely heavily on standardization bodies, has come to embrace open source as its new and agile way forward, with dozens of open source projects under the Linux Foundation alone. A prime example is the Open Network Automation Platform (ONAP), itself a unification of open source projects from east and west.
The open source movement is not limited to software. I won’t delve into it in this post, but there are great examples of open source hardware, such as the Open Compute Project, initiated by Facebook and backed by Google and others.
Open source is here. Leading vendors embrace it; enterprises and governments use it; cloud providers and system integrators offer services for it; communities innovate on it.
Open Source Is Eating The World.
Yesterday IBM announced its intent to acquire Red Hat in a deal worth approximately $34 billion. This is IBM’s biggest acquisition so far, and the third-biggest in the history of US technology (the biggest being the Dell-EMC merger). With this acquisition IBM is trying to position itself in the cloud market, with a special focus on its enterprise base (can anyone remember the days back in 2010 when IBM’s CEO Palmisano stated “You can’t do what we’re doing in a cloud“?).
Today’s public cloud market is largely dominated by Amazon, which according to Gartner’s 2017 IaaS survey holds over half of the market share (51.8%). Another quarter of the market is shared by Microsoft (13.3%), Alibaba (4.6%), Google (3.3%) and IBM (1.9%).
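Summing up those Gartner shares is a quick way to see how top-heavy the market is. A back-of-the-envelope check, using the figures as quoted:

```python
# Gartner 2017 IaaS market shares (percent), as quoted above
shares = {
    "Amazon": 51.8,
    "Microsoft": 13.3,
    "Alibaba": 4.6,
    "Google": 3.3,
    "IBM": 1.9,
}

top5 = sum(shares.values())
others = 100 - top5
print(f"Top 5 combined: {top5:.1f}%")  # 74.9%
print(f"All others:    {others:.1f}%")  # 25.1%
```

So the top five vendors hold roughly three quarters of the market, with a long tail of smaller players splitting the remaining quarter.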
As I pointed out before, there are in principle three strategies vendors can pursue:
- Multi-cloud model: If you can’t beat them, join them. Support Amazon, Microsoft, Google public clouds. If done via a good generic platform, it can help avoid vendor lock-in.
- Hybrid model: mix public cloud support with support for private cloud and bare metal to offer a public-private-hosted hybrid approach.
- Private model: concentrate strictly on private cloud. The popular open source project OpenStack is a leading candidate for this strategy. This approach suits customers insisting on running things on their own premises.
IBM chose a hybrid- and multi-cloud strategy. As Ginni Rometty, IBM Chairman, President and Chief Executive Officer, wrote:
IBM will become the world’s #1 hybrid cloud provider, offering companies the only open cloud solution that will unlock the full value of the cloud for their businesses.
IBM’s hybrid cloud business is already worth $19 billion, according to the announcement. Red Hat will join IBM’s Hybrid Cloud team, but will operate as a distinct unit, preserving its independence and neutrality, as well as Red Hat’s open source development heritage and commitment. Following the multi-cloud model, IBM stated it will continue to build and enhance Red Hat partnerships, including those with major cloud providers, such as Amazon Web Services, Microsoft Azure, Google Cloud, Alibaba and more, in addition to the IBM Cloud.
Just last month Red Hat, IBM and Hortonworks announced a joint project aimed to “enable big data workloads to run in a hybrid manner across on-premises, multi-cloud and edge architectures”. In interesting timing, this month Hortonworks announced a merger with Cloudera; both are well-established open source companies built around Hadoop and other big data projects.
Others are also pursuing a hybrid cloud strategy: just last month Google and Cisco rolled out a new hybrid cloud offering; Amazon teamed up with VMware; Microsoft offers Azure Stack – to name just a few. IBM needs to be lean and move fast to gain market share. The current acquisition and open source play should give it a serious boost, but will it be enough?