Tag Archives: cloud

Business Intelligence Reimagined in the Age of Artificial Intelligence

Business Intelligence (BI) tools are hot again. Just look at the past few weeks, with Salesforce buying Tableau, Google buying Looker, and Logi Analytics buying Zoomdata. These deals join other recent M&A moves, such as Logi Analytics’ earlier acquisition (yes, two in a row!) of Jinfonet in February, and Sisense acquiring Periscope Data last month.

Traditional BI is about sharp analysts who master the art and science of digging into data, intelligently querying and analyzing it to find signals amid the noise and surface insights.

But the world has changed.
We live in the world of Big Data, where the sheer volume of data, the diversity of data sources and the need for real-time response render traditional human analysis largely impractical.

It is the age of Big Data Analytics, which uses clever Artificial Intelligence and Machine Learning (AI/ML) algorithms to find the needle in the data haystack and surface insights. AI’s superiority was symbolically demonstrated when Google’s AlphaGo defeated Go champion Lee Sedol in 2016, some 20 years after IBM’s Deep Blue defeated chess grandmaster Garry Kasparov.
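
To make “finding the needle in the haystack” concrete, here is a minimal, hypothetical sketch of unsupervised anomaly detection using scikit-learn’s IsolationForest; the data and parameters are purely illustrative, not taken from any vendor mentioned here:

```python
# Minimal sketch: surface rare outliers ("needles") in a pile of routine
# metrics, the kind of insight-surfacing AI/ML analytics automates at a
# scale human analysts cannot match. Data is synthetic for illustration.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
normal = rng.normal(loc=100.0, scale=5.0, size=(10_000, 1))  # routine readings
needles = np.array([[160.0], [38.0]])                        # planted outliers
data = np.vstack([normal, needles])

# Flag roughly the most isolated 0.1% of points as anomalies.
model = IsolationForest(contamination=0.001, random_state=42).fit(data)
labels = model.predict(data)       # -1 marks anomalies, 1 marks normal
print(data[labels == -1].ravel())  # flagged points, including the needles
```

A human still decides which of the flagged points matter; the machine merely narrows millions of rows down to a handful worth looking at.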

Big players such as Google, Amazon and Facebook have been mastering their markets for years by using AI/ML internally. In the past few years cloud vendors have also started exposing those capabilities as cloud services for everyone to use, thereby leveling the (data) playing field and boosting AI/ML adoption.

This change renders older technologies irrelevant and forces consolidation.
I’ve given the example before of Hadoop big data technology, where last year the leading vendors Cloudera and Hortonworks had to merge to survive. Even that didn’t help them, judging by the CEO’s departure this month following poor earnings reports, and their arch-competitor MapR is doing even worse, now desperately looking for a buyer to avoid shutting down.
The same consolidation now faces BI tools. They need to find their place among the new AI/ML data analytics to remain relevant.

Do traditional BI tools have room in the new world?

While machines can do a fine job crunching massive amounts of data and surfacing insights, there is still a need for humans in the process to go through those insights and rank, filter and prioritize them, or simply monitor them and take action accordingly. That requires good data visualization, and that’s where BI tools can fit in. Taking complex data and insights and presenting them in a simple, intuitive, self-service user experience is the skill that BI tools have been honing for many years.

What should we expect next?

With so many BI tools out there, the consolidation will continue: the leading tools will merge with leading analytics platforms to offer an end-to-end data management experience, while smaller players will disappear or retreat into niche areas. The cloud vendors will keep leading this battle: Google will build Looker into its AI suite, Microsoft will push its established Power BI and tighten its integration with the other Azure data services, and Amazon (the leading cloud vendor, though not necessarily on this front) will probably step up its data visualization layer.

Follow Horovits on Twitter!

Filed under Big Data

The Next Chapter of Big Data Analytics: What’s Behind The Cloudera-Hortonworks Merger?

Over a decade ago, a new open source project called Hadoop came to life and started the era of Big Data Analytics. It was a novel project with a scale-out, shared-nothing distributed architecture, which promised to handle the big data challenges that the standard relational databases of the time could not manage. However, it was notoriously hard to install and operate in production.
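
To make the programming model concrete, here is a minimal sketch of the classic word-count job written for Hadoop Streaming (a hypothetical, standalone illustration; the file name and paths are placeholders). The mapper and reducer are plain scripts that Hadoop runs in parallel across the nodes holding the data, which is the essence of the scale-out, shared-nothing model:

```python
#!/usr/bin/env python
# wordcount.py: a minimal Hadoop Streaming word count. Map emits (word, 1)
# pairs; Hadoop shuffles and sorts them by key across the cluster; reduce
# sums the counts per word.
import sys

def mapper():
    # Each mapper instance processes one split of the input, in parallel.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Hadoop delivers lines sorted by key, so each word's counts arrive
    # contiguously and can be summed in a single streaming pass.
    current, count = None, 0
    for line in sys.stdin:
        word, n = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```

Submitting it would look roughly like `hadoop jar hadoop-streaming.jar -files wordcount.py -mapper "wordcount.py map" -reducer "wordcount.py reduce" -input /data/in -output /data/out`. The operational pain described above came from running the cluster underneath, not from writing jobs like this one.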

Two brands became synonymous with commercial Hadoop – Cloudera and Hortonworks – with the promise of helping enterprises leverage Hadoop. These companies, built by talent from the likes of Google, Facebook and Yahoo, grew into impressive public companies: their combined equity value after the merger is estimated at $5.2 billion (give or take stock price changes).

But while they were battling each other (together with MapR and a few others), the Big Data market drastically changed, and the emerging challenges drove these arch-rivals to join forces. Cloudera and Hortonworks announced a “merger of equals”, though on closer look it seems more of a “first among equals” arrangement, with Cloudera’s stockholders owning a 60% stake and the new company being called “Cloudera Inc.”.

From on-prem to the cloud

In the age of Cloud Computing, enterprises are no longer inclined to build enormous Hadoop clusters in their data centers to crunch the data (especially given Hadoop’s high upfront storage costs and painful upgrades). And so the traditional players found themselves increasingly competing against the public cloud giants Amazon, Google and Microsoft and their big data services. On Gartner’s 2018 Magic Quadrant for Data Management Solutions for Analytics, the main cloud vendors shine in the Leaders quadrant (with Google a hairline away), while Hortonworks, Cloudera and MapR lag behind in the Niche Players quadrant.

From data to insights

Since the emergence of Hadoop there has been a big (data) bang, spawning a range of tools, platforms and services aimed at different aspects of analytics: from data lakes to data warehouses, from batch to stream, from time series to graphs, and from structured to unstructured. Most disruptive, though, are the data-science-driven methods of artificial intelligence and machine learning. This shuffled the competitive landscape, introducing specialized vendors such as KNIME and H2O.ai (both open source), which lead Gartner’s 2018 Magic Quadrant for Data Science and Machine Learning Platforms and keep pushing back even established players such as IBM, Microsoft, SAS and Teradata. Hortonworks and Cloudera are not even on that chart.

From the data center to the edge with IoT

The Internet of Things (IoT) brings a massive surge of data streaming from dispersed locations, ranging from connected cars to industrial automation. This calls for different big data analytics solutions that meet regulatory, security, performance and geo-location needs, combining both cloud and edge computing. You can read more on that in this blog post.

Joining forces towards the next chapter of big data analytics

Hortonworks and Cloudera made some moves in the right direction even before the merger, adapting their strategies to these market changes: they formed partnerships with Amazon, Google, IBM and Microsoft and started operating on their clouds, while maintaining a hybrid cloud differentiation; and they launched advanced analytics and IoT offerings such as Hortonworks DataFlow for streaming and IoT workloads, and Cloudera Data Science Workbench.

But it will require more than these individual efforts to keep them competitive. Therefore, joining forces is an important strategic move. Combined with the strong open source DNA shared by both companies, they will be well positioned to take a leadership role in the next chapter of big data analytics.

Follow Horovits on Twitter!

Filed under Big Data

Is Dell Going to Sell To VMware In A Reverse Merger?

In 2015 Dell made the biggest tech takeover of all time when it acquired EMC for $67 billion. One of the main assets of that acquisition was VMware, which EMC had acquired in 2004. With the acquisition, Dell came to own around 80% of VMware.

But now the plot thickens, as Dell is considering selling itself to the smaller VMware in a massive reverse merger that may very well be the biggest deal ever. The reason behind this move is Dell’s interest in returning to the public markets, after Michael Dell took the company private in 2013. A reverse merger would let Dell trade publicly without going through the formal listing process of a traditional IPO.

But the reason may also be found in the disruptive technology of cloud computing. Dell traditionally had a strong hold on the hardware side of the enterprise IT market, and VMware a similar hold on enterprise IT virtualization. But the market has been disrupted by the entrance of public cloud vendors such as Amazon AWS and Microsoft Azure, which enable enterprises to avoid owning and managing their own data centers and instead run their workloads in the cloud. Traditional incumbents such as Verizon, HP and Dell tried to fight back by launching their own public cloud offerings, but had to pull back after failing to compete. VMware, too, had to team up with Amazon to sustain its cloud strategy. Shifting to a more software-driven approach targeting cloud-native solutions may be part of Dell’s strategy to regain its position with enterprises.

Follow Horovits on Twitter!

Filed under Cloud, technology

Edge Computing Draws Startups And Venture Capital

Edge computing is the new hype; some see it as the next big thing after cloud computing. It is drawing attention from the major cloud vendors, as well as from Tier 1 telecom carriers, standards bodies and consortia.

But the big guys are not alone here. As with anything hot and innovative, edge computing has also drawn the attention of the lean and mean: the startups.

Vapor IO is an interesting US-based startup providing an edge computing platform that offers a simple way to deploy and manage cloud servers. Vapor IO provides both the hardware and the software to remotely administer, manage and monitor the distributed environment. Its main focus is helping telecom carriers and wireless base-station landowners offer cloud compute capabilities in close proximity to the Radio Access Network (RAN). In June Vapor IO launched Project Volutus, with the ambitious mission statement:

Project Volutus seeks to build the world’s largest network of distributed edge data centers by placing thousands of Vapor Chambers at the base of cell towers and directly cross-connecting them to the wireless networks. This will make it possible to push true cloud capabilities to within yards of the end device or application, one hop from the wireless network.

Vapor IO backs its ambitious statement with a strategic investor: Crown Castle, the largest wireless tower company in the US, which leases towers to all the top wireless carriers, including Verizon, AT&T and T-Mobile. With Vapor IO tapping into Crown Castle’s existing network of 40,000 cell towers and 60,000 miles of fiber optic lines in metropolitan areas, the startup seems up to fulfilling its vision. Vapor IO is also among the founding members of Open19, an open foundation formed by LinkedIn together with HPE and GE to establish standards for a truly open, innovative platform for data centers and the edge.

Another interesting US-based startup is Packet, with its bare-metal distributed micro data centers. In July Packet announced expansion to Ashburn, Atlanta, Chicago, Dallas, Los Angeles and Seattle, along with new international locations in Frankfurt, Toronto, Hong Kong, Singapore and Sydney, bringing it to 15 global locations to date. Packet’s technology is based on the hottest industry trends: cloud and containers. It has also partnered with major new-age technology players such as Docker, Mesosphere and Cloud66. Packet’s vision is well backed too, with its latest funding round of $9.4M led by telecom and internet giant SoftBank. In its customer base you’ll find Cisco, the industry leader in networking.

Follow Horovits on Twitter!

Filed under Edge Computing

New ‘Cloud Native Computing Foundation’ Trying to Standardize on Cloud and Containers

The Cloud Native Computing Foundation (CNCF) is a new open standardization initiative recently formed under the Linux Foundation, with the mission of providing a standard reference architecture for cloud-native applications and services based on open source software (OSS). The first such project is Google’s Kubernetes, which was released as v1.0 the same day and donated by Google to the foundation.

Google is one of the 22 founding members, together with big names such as IBM, Intel, Red Hat, VMware, AT&T, Cisco and Twitter, as well as important names in the containers realm such as Docker, Mesosphere, CoreOS and Joyent.

The announcement of the new foundation came only a few weeks after the announcement of the Open Container Initiative (OCI), also formed under the Linux Foundation. It is even more interesting to note that almost half of the founding companies of CNCF are among the founders of OCI. According to the founders, the two initiatives are complementary: while OCI focuses on standardizing the image and runtime format for containers, CNCF targets the bigger picture of how to assemble components to address a comprehensive set of container application infrastructure needs, starting at the orchestration level, based on Kubernetes. These are the same bottom-up dynamics we see in most other initiatives and projects: standardize the infrastructure first, then continue upwards. Cloud computing evolved the same way from IaaS to PaaS to SaaS, and Network Functions Virtualization (NFV) evolved from the NFV Infrastructure to Management and Orchestration (MANO).
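
To give a feel for the orchestration level CNCF is standardizing on, here is a minimal sketch that declares a desired state (three replicas of a web container) and lets Kubernetes converge the cluster to it. It uses today’s official Kubernetes Python client rather than the v1.0-era API, so treat the object names as illustrative:

```python
# Minimal sketch: declare a Deployment of 3 nginx replicas; the Kubernetes
# control plane schedules them and replaces any that die.
# Assumes `pip install kubernetes` and a valid ~/.kube/config.
from kubernetes import client, config

config.load_kube_config()  # load cluster credentials from kubeconfig

container = client.V1Container(
    name="web",
    image="nginx:1.9",
    ports=[client.V1ContainerPort(container_port=80)],
)
template = client.V1PodTemplateSpec(
    metadata=client.V1ObjectMeta(labels={"app": "web"}),
    spec=client.V1PodSpec(containers=[container]),
)
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="web"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "web"}),
        template=template,
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```

This declarative, desired-state style of orchestration is exactly the layer that sits above OCI’s image and runtime formats.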

An open strategy has become the name of the game, and all the big companies realize that in order to take the technology out of infancy and enable its adoption in large-scale enterprise production deployments, they need to take the lead on the open field. Google’s Kubernetes and its recent contribution to CNCF is one example. Now we’ll wait to see which other open source ingredients will be incorporated, which blueprint will emerge, and how well it meets the industry’s varying use cases.

Follow Dotan on Twitter!

Filed under Cloud, cloud automation, DevOps

Industry Standardizing on Containers with Open Container Project

* Update: the foundation has decided to rename from Open Container Project (OCP) to Open Container Initiative (OCI)

 

“Open” is not just about providing open source software. It’s also, and perhaps more importantly, about open standards, enabling the community to converge on a single path and work together on improving it. The absence of such agreement drives a community into wars for domination, especially in emerging fields. We see that with the Internet of Things, with cloud computing and with network virtualization.

The containers community, headed by Docker, was no different. Docker’s success drew the attention of every major player in the cloud and DevOps world, and competing standards emerged that threatened to drag everyone into battles for domination. But there’s good news. This week at DockerCon 2015 these players joined forces to form the Open Container Project (OCP). The new governance body, formed under The Linux Foundation, aims to create standards around container format and runtime. And though under Linux Foundation governance, it certainly targets other operating systems as well, with Microsoft pushing Windows.

The Open Container Project includes all the major cloud players, including Amazon, Google (which promotes Kubernetes), Microsoft, HP and IBM. It also includes players from the DevOps scene such as Docker itself, CoreOS (which offers a prominent competing container specification called appc), Mesosphere, Rancher Labs, Red Hat and VMware/EMC.

It seems Docker will be leading the way, writing the first draft of the format specification and using Docker’s implementation as the baseline. Docker’s first contribution is runC, which is already available on the project’s GitHub page. But that’s only the beginning; the true test will be adoption within enterprises, which have been struggling to adopt the technology.

Follow Dotan on Twitter!

Filed under Containers, DevOps

Virtual networking picks up at OpenStack

Virtual networking was a key theme at this week’s OpenStack Summit in Paris. We saw keynotes addressing it, panels with leading telco experts discussing it, and dedicated sessions on emerging open standards such as OpenNFV.

Telcos inherently possess more challenging environments and networking needs, with elaborate inter-connectivity and service chaining, which the Neutron project has not yet adequately addressed. We also see open standards emerging in the industry around SDN and NFV, most notably OpenDaylight, which the OpenStack Foundation still hasn’t decided how to address in a collaborative and complementary fashion. It becomes even trickier in light of competing open standards such as ON.Lab’s Open Network Operating System (ONOS), which was announced just this week.
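
For context on what Neutron does handle well, tenant networks and subnets are a few API calls away. Here is a minimal, hypothetical sketch using the openstacksdk Python client (a more recent library than the era’s python-neutronclient; the cloud name is a placeholder):

```python
# Minimal sketch: create a tenant network and subnet through Neutron's API.
# Assumes `pip install openstacksdk` and a clouds.yaml entry named "mycloud".
import openstack

conn = openstack.connect(cloud="mycloud")  # credentials come from clouds.yaml

network = conn.network.create_network(name="tenant-net")
subnet = conn.network.create_subnet(
    network_id=network.id,
    name="tenant-subnet",
    ip_version=4,
    cidr="10.0.0.0/24",
)
print(f"created {network.name} with subnet {subnet.cidr}")
```

Telco-grade service chaining and elaborate inter-connectivity go well beyond such primitives, and that is the gap vendors are now racing to fill.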

This lack of SDN & NFV standardization for OpenStack presents an opportunity for vendors to offer an open source solution in an attempt to take the lead in that area, similar to the way Ceph took the lead and ultimately became the de-facto standard for OpenStack block storage. At this week’s summit we saw two announcements tackling this gap of SDN for OpenStack: both Akanda and Midokura announced open source products compatible with OpenStack.

Midokura decided to open source its core asset MidoNet, which provides a Layer-2 overlay aiming to replace OpenStack’s default OVS plugin. Midokura is targeting the OpenStack community, making its source code available as part of Ubuntu’s OpenStack Interoperability Lab (OIL). OpenStack is also clearly targeted in its announcement:

MidoNet is a highly distributed, de-centralized, multi-layer software-defined virtual network solution and the industry’s first truly open vendor-agnostic network virtualization solution available today for the OpenStack Community.

Akanda, on the other hand, was an open source project from the beginning. Akanda focuses on Layer-3 virtual routing on top of VMware NSX’s Layer-2 overlay, with support for OpenDaylight and OpenStack. In fact, Akanda is a sort of spin-out of DreamHost, the company that spun out Inktank and brought about Ceph (acquired by Red Hat in April). Will they be able to achieve the same success with Akanda in networking as they did with Ceph in storage?

Telecom operators such as AT&T and Vodafone, along with vendors such as Huawei, are pushing the OpenStack Foundation and community to address the needs of the telecommunications domain and industry. The OpenStack framework has reached enough maturity in its core projects and ecosystem to address the more complex networking challenges and requirements. Backed by the network operators and network equipment providers (NEPs), and with the right collaboration with other open source projects in the SDN and NFV domains, I expect it to be on the right path to offering a leading virtualization platform for telcos and enterprises alike.

Follow Dotan on Twitter!

Filed under NFV, OpenStack, SDN