Will the Internet of Things talk Googlish?

Things definitely change fast in the Internet of Things landscape. In my last blog post, less than two weeks ago, I discussed standardization efforts in IoT and covered the announcement of a new consortium called the Open Interconnect Consortium (OIC), led by Samsung, Intel, Dell and others.

And just a week later a new heavy gun entered the field: Google announced, through its recently acquired company Nest, a new industry group called Thread, together with Samsung, ARM Holdings and others, to define a communications standard for the smart home. The new standard is said to solve reliability, security, power and compatibility issues for connecting products around the home.

[Image: Thread Group]

This announcement joins Microsoft's announcement from the beginning of this month about joining the AllSeen Alliance as its 51st member, which was followed by last week's announcement of seven more new members, making the AllSeen Alliance 58 members strong to date (in my last blog post earlier this month they were only 51; just think about it…).

Google's new consortium joins other industry consortia. How do these different initiatives relate to one another? This question becomes even more interesting when noting that Samsung is a member of both OIC and Thread, and that Apple's list of HomeKit partners includes Broadcom (another member of OIC) and Haier (a member of the AllSeen Alliance).

It may be that in these early stages organizations are reluctant to bet on a single horse and prefer to distribute the risk across different consortia. It may also be that some of these initiatives are not really competitive but rather complementary. Reading through the statement of the new Thread Group, it seems they target a new networking protocol for IoT (to supersede Wi-Fi, Bluetooth and the like) that is more energy-efficient and scalable, which may be complementary to the mandate declared by OIC, which seems to deal with higher layers. But as these statements are very high-level and tend to change, we will have to wait patiently and see how it plays out.


Filed under InternetOfThings, IoT

The common language of the Internet of Things

In my last post I described the chronology of the Internet of Things (IoT) from the early 1990s to the present, and the drivers of its evolution. In that post we established the motivation and the money to invest in IoT. So what is the next step?

Since the Internet of Things is all about enabling devices of various kinds to talk to each other, we need a common language for these devices to talk. This becomes even more acute when involving devices from different vendors and providers.

Do you remember the Internet in its early days? Back then there were islands of isolated networks, and we lacked a common language to let them talk to one another. In the Internet's case the solution was the invention of the Internet Protocol (IP), which standardized the communication "language" and addressing system (together with higher-level standards that followed, such as TCP, UDP and HTTP). These open standards paved the way for the mass adoption of the Internet and its worldwide spread. In fact, IP was caught up by its own success, with IPv4 (the 4th version of the protocol) nearing exhaustion of its address space and forcing the inevitable switch to IPv6, through a sometimes-painful migration process (as many old systems were hard-wired to the IPv4 format).
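
To put that exhaustion in perspective, here is a back-of-the-envelope comparison of the two address spaces (a quick illustrative sketch in Scala, not part of the original standards discussion):

object AddressSpace extends App {
  // IPv4 uses 32-bit addresses, IPv6 uses 128-bit addresses
  val ipv4 = BigInt(2).pow(32)   // roughly 4.3 billion addresses
  val ipv6 = BigInt(2).pow(128)  // roughly 3.4 * 10^38 addresses
  println(s"IPv4 address space: $ipv4")
  println(s"IPv6 address space: $ipv6")
  println(s"IPv6 is ${ipv6 / ipv4} times larger than IPv4")
}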

[Image: IPv4 vs. IPv6 chart]

What is the "Internet Protocol" of the Internet of Things?
This question drove the formation of a new standardization body, announced yesterday and backed by a consortium of industry leaders such as Samsung, Intel and Dell. The new consortium, called the Open Interconnect Consortium (OIC), is said to focus on:

… delivering a specification, an open source implementation, and a certification program for wirelessly connecting devices.

[Image: OIC diagram]

This initiative joins other initiatives in the area, most notably the AllSeen Alliance, which was founded last December and already has 51 members, including LG Electronics, Panasonic, Qualcomm, Cisco and others, and the newest member, Microsoft, which joined just earlier this month. During its Worldwide Developers Conference last month, Apple also announced a similar initiative, called HomeKit, based on iOS devices. These initiatives are all aimed at standardizing the language in which devices will talk in the Internet of Things.

Standardization is an important step in the maturing of any technology. Furthermore, open-source standards, APIs and reference implementations have become a predominant part of the IT industry, and emerging technologies are now expected to provide them. An excellent example is cloud computing, which grew the open-source projects OpenStack and CloudStack, backed by impressive communities of industry-leading corporations, system integrators and individual contributors worldwide. Reviewing the recently released statement of work by the OIC shows it is well aware of this expectation and is set to meet it.

Open standards and open-source implementations are the right step for IoT. We can only hope that the communities grow and establish good collaboration among their members and between the different alliances (avoiding needless politics), so that they can put forth the right standards to meet the needs and, more importantly, adapt them to changing needs in an agile manner for the benefit of all.


Filed under InternetOfThings, IoT

How IBM is using big data to fix Beijing’s pollution crisis

horovits:

A fascinating way to leverage big data to help the world

Originally posted on Quartz:

Of China's major cities, Beijing's pollution problem is probably the worst, causing thousands of premature deaths every year. Its residents are fed up. The growing outrage has forced leaders to declare a "war on pollution," including the goal of slashing Beijing's PM2.5—the concentration of the particles that pose the greatest risk to human health—by 25% by 2017. The Beijing municipal government will earmark nearly 1 trillion yuan ($160 billion) to meet that target.


Why, then, are the city’s own government officials skeptical about hitting that 2017 goal? Perhaps because Beijing’s pollution woes are unusually complicated. The city is flanked on three sides by smog-trapping mountain ranges. There are numerous sources of foul air, and a multitude of subtle ways the chemicals interact with each other, which make it hard to identify what problems need fixing.

IBM thinks it can change that outlook. On Monday, the company will unveil a 10-year initiative launched in partnership with the Beijing Municipal Government…



Filed under Uncategorized

Facebook outage reported now worldwide

Facebook is down. Trying to access the site shows a page with the laconic message that "something went wrong".

[Image: Facebook outage error message]

According to downdetector.com, the outage started at 3:55 a.m. EDT.

[Image: Facebook outage report statistics from downdetector.com]

Reports are flooding the net. The outage seems to be worldwide.

[Image: Facebook outage Twitter responses]

So far no explanation from Facebook.

Stay tuned.



Filed under Uncategorized

The Internet of Things: Vision and Execution

The Internet of Things (IoT) is the hot buzzword these days. Everyone's talking about it, there's a proliferation of ventures around it, and the market around it is measured in trillions of dollars.

According to new research published by IDC last week:

a transformation is underway that will see the worldwide market for IoT solutions grow from $1.9 trillion in 2013 to $7.1 trillion in 2020.

Although IoT seems like a relatively new hype, the concept has in fact been discussed since the early 1990s, and the term was coined back in 1999 by Kevin Ashton. Around the same year Microsoft released a concept video (highlighted last week by Gizmodo) sharing its futuristic vision for the smart home.

Microsoft may not have used the term IoT in that video, but it certainly preached it effectively, talking about things such as opening the door using our voice or an eye scan, locating our spouse in the family car via the car's computer, sending our children messages on their "Pocket PC", or having the trash bin add groceries to the shopping list.

What seemed like science fiction only 15 years ago is now becoming reality. With wireless and mobile internet readily available everywhere, smartphones (anyone say "Pocket PC"?) being a commodity even for children, GPS positioning available from satellites straight to our devices, telematics prevailing in vehicles and other devices, and much more, the infrastructure is in place to connect everything together: not just PCs and smartphones but literally every device, from our toaster to our car. According to Gartner's research:

The Internet of Things (IoT), which excludes PCs, tablets and smartphones, will grow to 26 billion units installed in 2020 representing an almost 30-fold increase from 0.9 billion in 2009, according to Gartner, Inc. Gartner said that IoT product and service suppliers will generate incremental revenue exceeding $300 billion, mostly in services, in 2020. It will result in $1.9 trillion in global economic value-add through sales into diverse end markets.

With such economic drive it's no wonder that everyone is investing in IoT: from startups to Apple and Google, from voice commands (e.g. Apple Siri) to location-based tracking apps (e.g. FindMe), from smart TVs (e.g. Amazon Fire TV) to self-learning thermostats (e.g. Nest, a startup recently acquired by Google). Even humans are instrumented nowadays (see yesterday's rumors that Apple will launch its iWatch at an October event, with a multitude of health and fitness sensors).

Microsoft's absence from this field is particularly evident, in light of its clear vision 15 years ago as well as its complete domination of the software market at the time. Was it the company's size? Was it a lack of agility? Whatever the reason may be, it is clear that while the vision was good, the execution did not follow, leaving Microsoft to now chase the other major players.

According to cosmology, the early universe underwent exponential expansion, in what is known as the inflationary model. Similarly, we are now in the early stages of the Internet of Things, and with more and more devices instrumented and connected every day, we are witnessing the inflationary phase of the Internet of Things. So hold on for the ride, and make sure not to be left behind on vision and execution.


Filed under InternetOfThings, IoT

Scaling All The Way: Welcoming Scala as a First-Level Citizen in GigaSpaces XAP

Scala is a hot topic in the programming world. I've been following Scala for quite a while, and about two years ago I ventured into programming in Scala on top of GigaSpaces XAP (disclaimer: I'm a solution architect for GigaSpaces). XAP had no built-in support for Scala, but I leveraged XAP's native support for Java together with Scala's compatibility with Java.

And it worked like a charm!

You can read the full details in my previous blog post. I concluded that post by saying that

… programming in Scala on top of XAP is a viable notion that deserves further investigation.

However, I also added a disclaimer that

… XAP platform offers a vast array of features, and that the Scala language offers a vast array of constructs, very few of which have been covered on this experiment. Similarly, I should also state that Scala is not officially supported by the XAP product, which means that there is no official support or test coverage of Scala in the product.

Further exploration of more advanced Scala usage, together with concrete customer use cases, showed that although it is possible, the resulting code is less intuitive for Scala users and does not fully utilize the elegant constructs of the Scala language.

Two years went by, and now we have decided to take our relationship with Scala to the next level and make Scala programming on XAP much more intuitive, with things such as better support for immutable objects, functional querying using predicates, Scala script execution and an enhanced REPL shell. XAP now also exposes some of the platform's powerful mechanisms for distributed and scalable processing, such as remote script execution and the Map/Reduce pattern, in native Scala. These goodies have just been unveiled as part of the latest release of XAP (XAP 9.6). Let me give you a taste of some of them.

Predicate-based queries

You can now run queries based on Scala predicates, just as you're used to in functional programming:

// Obtain a predicate-enabled view of the GigaSpace proxy
val pGigaSpace = gigaSpace.predicate
// Read a single Person entry matching the Scala predicate
val person = pGigaSpace.read { person: Person =>
  person.age > 25 || person.name == personName }

This is compiled into XAP's native SQL query mechanism, so there is no runtime overhead and you get all the optimizations available in the platform's SQL query engine and indexing.
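
For comparison, the predicate read above is conceptually equivalent to an explicit SQLQuery; here is a rough, illustrative sketch (the query XAP actually generates may differ, and the parameter binding is shown only for clarity):

import com.j_spaces.core.client.SQLQuery

// Illustrative explicit form of the predicate query above
val query = new SQLQuery(classOf[Person], "age > ? OR name = ?", Integer.valueOf(25), personName)
val person = gigaSpace.read(query)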

Scatter/Gather and Map/Reduce patterns

XAP contains a mechanism called Task Execution which provides easy implementation of the Scatter/Gather and Map/Reduce patterns.

Scatter:

[Image: distributed task execution, phase 1 (scatter)]

Gather:

[Image: distributed task execution, phase 2 (gather)]

You can now define both the scatter and the gather as Scala functions and dispatch them onto a cluster of nodes with a single invocation:

val asyncFuture2 = gigaSpace.execute(
  // Scatter: runs on each partition, reading a Data entry and extracting its payload
  { gigaSpace: GigaSpace => gigaSpace.read(Data()).data },
  // Gather: runs on the client, concatenating the partial results into a single string
  { results: Seq[AsyncResult[String]] => results.map { _.getResult() } mkString } )

Remote and parallel execution of Scala scripts over a cluster

What if you want to execute your Scala script across a cluster of compute nodes (à la compute grid)? Maybe even colocated with the data on those nodes (à la in-memory data grid)? This is now easily achievable using XAP's Dynamic Language Tasks (which also support other dynamic languages such as JavaScript and Groovy):

// Build a statically typed Scala script and bind its parameters
val script = new ScalaTypedStaticScript("myScript", "scala", code)
  .parameter("someNumber", 1)
  .parameter("someString", "str")
  .parameter("someSet", Set(1,2,3), classOf[Set[_]])
// Dispatch the script for execution over the space ('executor' is a scripting executor proxy obtained elsewhere)
val result = executor.execute(script)

Final words

XAP is a multi-language and multi-interface platform. You can write in Java, .NET or C++; you can use standards such as SQL, JPA and Spring; you can use it as a key-value store; you may even choose to store your data in document format instead of objects, to support a semi-structured model. So enhancing XAP to support Scala was but a natural move.

[Image: XAP multi-API support]

In my blog post two years ago I concluded by saying that

… [Scala] is an exciting option worth exploring, and who knows, if Scala becomes predominant it may one day become an official feature of the product.

Finally the day has come, and Scala has taken its first steps toward becoming a first-level citizen in GigaSpaces XAP. What I described above is just part of it; you can read the full listing here. There are still many things to do in order to fully expose the XAP platform's rich functionality through the Scala language, such as full support for immutable types. Now it's time for the user community to check it out. So go ahead, play with it, and let us know what you need to make your application scale all the way with XAP and Scala.


Filed under Programming Languages

Enterprises Taking Off to the Cloud(s)

Cloud Deployment: the enterprise angle

The cloud is no longer the exclusive realm of young, small startups. Enterprises are now joining the game and examining how to migrate their application ecosystems to the cloud. A recent survey conducted by the research firm MeriTalk showed that one-third of respondents plan to move some mission-critical applications to the cloud in the next year. Within two years, the IT managers said, they will move 26 percent of their mission-critical apps to the cloud, and in five years they expect 44 percent of their mission-critical apps to run in the cloud. Similar results arise from surveys conducted by HP, Cisco and others.

SaaS on the rise in enterprises

Enterprises are replacing their legacy applications with SaaS-based applications. A comprehensive survey published by Gartner last week, covering nearly 600 respondents in over 10 countries, shows that

Companies are not only buying into SaaS (software as a service) more than ever, they are also ripping out legacy on-premises applications and replacing them with SaaS

IaaS providers see the potential of enterprises migrating to the cloud and are adapting their offerings. Amazon, having spearheaded cloud infrastructure, leads in on-boarding enterprise applications to its AWS cloud. Only a couple of weeks ago Amazon announced that AWS is now certified to run SAP Business Suite (SAP's CRM, ERP, SCM, PLM) for production applications. That joins Microsoft SharePoint and other widely adopted enterprise business applications now supported by AWS, which makes it easier than ever for enterprises to migrate their IT to AWS.

Mission-critical apps call for PaaS

Running your CRM or ERP as SaaS in the cloud is very useful. But what about your enterprise's mission-critical applications? Whether in the telco, financial services, healthcare or other domains, the core business of the organization's IT usually lies in a complex ecosystem of hundreds of interacting applications. How can we on-board the entire ecosystem to the cloud in a simple and consistent manner? One approach gaining steam for such enterprise ecosystems is PaaS, with Gartner predicting PaaS adoption will increase from "three percent to 43 percent of all enterprises by 2015".

Running your ecosystem of applications on a cloud-based platform provides a good way to build applications for the cloud in a consistent and unified manner. But what about legacy applications? Many of the mission-critical applications in enterprises have been around for quite some time, were not designed for the cloud, and are not supported by any cloud provider. Migrating such applications to the cloud often seems to call for a major overhaul, as stated in MeriTalk's report on the Federal market:

Federal IT managers see the benefits of moving mission-critical applications to the cloud, but they say many of those applications require major re-engineering to modernize them for the cloud

The more veteran PaaS vendors, such as Google App Engine and Heroku, provide great productivity for developing new applications, but do not provide an answer for such legacy applications, which gets us back to square one: having to do the cloud migration ourselves. This migration work seems too daunting for most enterprises to even attempt, and it is one of the main inhibitors of cloud adoption despite the incentives.

It is only recently that organizations have started examining PaaS for mission-critical applications and functions. According to a recent survey conducted by Engine Yard among some 162 management and technical professionals from various companies:

PaaS is now seen as a way to boost agility, improve operational efficiency, and increase the performance, scalability, and reliability of mission-critical applications.

What IT organizations are looking for is a way to on-board their existing application ecosystem to the cloud in a consistent manner, as provided by PaaS, while keeping IaaS-like low-level control over the environment and the application life cycle. IT organizations seek to keep doing things the way they are used to in the data center, even when moving to the cloud. A new class of PaaS products has emerged over the past couple of years to answer this need, with products such as OpenShift, Cloud Foundry and Cloudify. In my MySQL example discussion I demonstrated how the classic MySQL relational database can be on-boarded to the cloud using Cloudify, without the need to re-engineer MySQL and without locking into any specific IaaS vendor API.

Summary

Enterprises are migrating their applications to the cloud at an increasing rate. Some applications are easily migrated using existing SaaS offerings. But mission-critical applications are complex and call for PaaS to on-board them to the cloud. If a mission-critical application contains legacy systems or requires low-level control of the OS and other environment configuration, then not every PaaS will fit the job. There are many cloud technologies, infrastructures, platforms, tools and vendors out there, and the right choice is not trivial. It is important to make a proper assessment of the enterprise system at hand and choose the right tool for the job, to ensure a smooth migration, avoid re-engineering as much as possible, and stay flexible to accommodate future evolution of the application.

If you are interested in consulting around assessment of your application’s on-boarding to the cloud, feel free to contact me directly or email ps@gigaspaces.com

Follow Dotan on Twitter!


Filed under cloud deployment, IaaS, PaaS