Cloud + Machine-to-Machine = Disruption of Things: Part 1

Editor’s note: This is the first in a two-part series on the advantages that cloud computing brings to the machine-to-machine space. It was first published as a white paper by Ken Fromm. Fromm is VP of Business Development at Appoxy, a Web app development company building high-scale applications on Amazon Web Services. He can be found on Twitter at @frommww.

The use of cloud infrastructure and cloud services provides a low-cost means to create highly scalable applications. Even better, the cloud dramatically improves development speed and agility: applications can be developed in much less time with much smaller teams. As these benefits extend into the machine-to-machine (M2M) space, companies creating M2M applications will see a dramatic reduction in the cost of developing applications and provisioning services.

Articles on the Internet of Things (or Web of Things) are increasingly finding their way into mainstream news. Executives of large companies (such as the CEO of Sprint) and even government officials (such as the Chinese Premier) are speaking about the possibilities and opportunities of having ubiquitous sensors connected to the Internet.

Almost every major electronic device, vehicle, building component, and piece of equipment has the ability to become “smart” by connecting sensors to it – and most devices already do. The difference is that moving the data to the cloud, where it can be processed in infinite combinations, provides new capabilities in very low-cost, transparent ways.

M2M Business Transformation

The case for what the Internet of Things might entail has been eloquently made here, here, and here. When devices and machines can send data to the cloud and have dashboards and interfaces on Web browsers, HDTV wallboards, mobile phones, and iPads, the impact becomes large.

This potential will affect almost every industry – just as the Internet, email, websites, e-commerce, and now Web 2.0 are touching every industry and every business process. The impact will be most noticeable at non-Web companies.

The change here will be dramatic – from a world where every device stands alone or is controlled through a local device to one where every device can be accessed anywhere (by authenticated users), where data streams can be “followed,” and where interfaces and dashboards can be improved on the fly to provide new views and new forms of device control. Does the concept of “following” a jet engine or a pneumatic thermostat have appeal to equipment makers and airlines or building owners? You bet it does.

Equipment, automobile, and device manufacturers need to begin positioning themselves to gather real-time data on the performance of each product and to use cloud processing and data storage to do it. Using this approach, they’ll be able to rapidly improve their products, build direct connections with customers, and get ahead of customer and product issues. They’ll also be able to offer services and develop new revenue sources. Services will become a part of every product – some as ways to improve customer support and customer connections, others as revenue sources in and of themselves.

Want a quick diagnosis on your transmission? Go to CloudAutoDiagnostics.com and check in with your car’s data feed. It will compare the data from your transmission sensors against others with similar transmissions. Yup, there’s an issue but nothing serious. Want a 10% coupon for the service shop around the corner?
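
A minimal sketch of how such a diagnosis-by-comparison might work, assuming a fleet baseline already aggregated in the cloud. CloudAutoDiagnostics.com is a hypothetical service, and every name, field, and number below is illustrative:

```python
# Hypothetical sketch only: compare one car's transmission readings against
# a fleet baseline aggregated in the cloud. Names and thresholds are
# illustrative, not a real service's API.
from statistics import mean

def diagnose(readings, fleet_baseline):
    """Compare a car's average sensor value to cars with the same transmission."""
    avg = mean(readings)
    base = fleet_baseline["mean_temp_c"]
    sd = fleet_baseline["stdev_temp_c"]
    if avg > base + 2 * sd:
        return "Serious issue: running %.1f C above the fleet average" % (avg - base)
    if avg > base + sd:
        return "Minor issue: worth a service check, but nothing serious"
    return "Normal for this transmission"

# The car's last few transmission-temperature readings (Celsius)...
car_feed = [92.1, 93.4, 95.0, 96.2, 97.8]
# ...and the cloud-side baseline for similar transmissions.
fleet = {"mean_temp_c": 91.0, "stdev_temp_c": 2.5}
print(diagnose(car_feed, fleet))  # -> "Minor issue: ..."
```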

Below is a short list of areas where the Internet of Things and M2M in the cloud will matter, although it could really be a single line that says: anywhere there is a sensor, an electronic device, or a machine.

  • Personal Health and Fitness
  • Medical Devices
  • Automobiles
  • Shipping and Transportation
  • Smart Grid and Smart Buildings
  • Retail
  • Architecture
  • Agriculture
  • Mining
  • Natural Resource Management

The use of the cloud – in combination with the advent of low-cost sensors and high-availability M2M data transmission – will transform old industries and modify many business models. As is the case in each disruptive tech cycle, new companies will arise, existing market share will be threatened, and the separation between industries and channels will become blurred. Those in the M2M space who take advantage of what the cloud offers will not only be able to anticipate these changes but will lead the way into these new opportunities.

Key M2M Cloud Patterns

The goal of this paper is not to convince readers of what the future will look like or even to go through what new devices might look like. ReadWriteWeb and other tech publications will do a far better job there. The goal here is to list the advantages that cloud computing brings to M2M applications.

Separation of Data Collection, Processing, Interface, and Control

The use of cloud computing means that data collection, processing, interface, and control can be separated and distributed to the most appropriate resource and device. Current M2M implementations combine all four: either chips in the sensor bodies or an onsite laptop or desktop PC tied into a local mesh network performs the data processing and determines the control logic.
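
As a minimal sketch of that separation, the device below only samples and transmits; processing, interface, and control all live in the cloud. The endpoint URL, device ID, and payload fields are hypothetical, not any particular vendor’s API:

```python
# Minimal sketch: the device only samples and transmits. Analysis, display,
# and control logic live cloud-side. Endpoint and fields are hypothetical.
import json
import time
import urllib.request

ENDPOINT = "https://example.com/ingest"  # hypothetical cloud ingest endpoint

def read_sensor():
    # Stand-in for reading a real sensor over I2C, serial, etc.
    return 21.7

def transmit(value):
    payload = json.dumps({
        "device_id": "sensor-0042",
        "timestamp": time.time(),
        "value": value,
    }).encode("utf-8")
    req = urllib.request.Request(ENDPOINT, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)  # fire and forget; no local control logic

while True:
    transmit(read_sensor())  # one reading a minute; buffering and analysis
    time.sleep(60)           # are now the cloud's problem, not the chip's
```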

Once data collection and processing move to the cloud, however, most current limitations disappear. One of the more obvious ones is that data limits go away. Storing data in the cloud means that the data buffers within devices (whether it’s 500 or even 5,000 data points) no longer matter. Cloud storage is near limitless, and so historic data can be saved for as long as it’s deemed valuable.

The data can be used to show readings, performance, or status for the last day, week, month, or even year. Individual nodes can be inspected as well as grouped together with similar data from other devices. Analysis can be performed quickly on specific groupings and filters – whether it’s product line, region, demographic, or application use. The consumer Web understands the value of data and the many permutations that analysis can take. Once M2M data moves to the cloud, M2M companies begin to have the same realizations.
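
A sketch of the kind of query that becomes trivial once history lives in the cloud: filter readings by time window, then group by any attribute. In-memory lists stand in for a real datastore (which would do this server-side), and the field names are illustrative:

```python
# Filter sensor history by time window, then group by any attribute.
from collections import defaultdict
from datetime import datetime, timedelta

# (timestamp, device_id, region, value) tuples from many devices.
readings = [
    (datetime(2011, 3, 1, 9, 0), "therm-01", "west", 20.5),
    (datetime(2011, 3, 1, 9, 5), "therm-02", "east", 22.1),
    (datetime(2011, 3, 2, 9, 0), "therm-01", "west", 21.0),
]

def window(rows, since):
    """Keep only readings newer than `since` (last day, week, month...)."""
    return [r for r in rows if r[0] >= since]

def group_avg(rows, key_index):
    """Average the readings, grouped by one column (region, device...)."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[key_index]].append(row[3])
    return {k: sum(v) / len(v) for k, v in groups.items()}

last_week = window(readings, datetime(2011, 3, 2) - timedelta(days=7))
print(group_avg(last_week, 2))  # average reading per region
```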

Applications are also not restricted by tiny processors, low power budgets, and special-purpose programming languages. Processing in the cloud brings best-of-breed programming capabilities, including widely popular programming languages and frameworks, flexible data structures, and extensive algorithm libraries.

And not only is there access to super-fast processors; if there are server or storage bottlenecks, they can be addressed by launching more servers on demand or scaling storage horizontally. Using the cloud and dynamic languages and frameworks, the cost of ownership goes way down, the limitations go away, and the speed of product development increases enormously.

Lastly, interfaces can move to Web browsers, wallboards, mobile phones, and tablets, eliminating the need to make a screen part of every device or a local computer a permanent part of every installation. Medical devices no longer have to come with their own monitors. Separating the data input from the processing and from the screen readout not only means lower costs (fewer components in each device) but also easier upgrading of diagnostics and far better visualization capabilities.

An MRI or sonogram sent to the cloud, digitally refined using the latest algorithms distributed across multiple machines, and presented on an iPad or an HDTV screen is going to look a lot better and be more insightful than if displayed on existing monitors, no matter how new the device is. Separating the components and putting the data and processing in the cloud allows devices to keep getting smarter while not necessarily becoming more complex.

Data Virtualization

Data storage is one of the biggest advantages of using the cloud for M2M applications. The cloud not only offers simple and virtual ways to run applications, it also offers simple and virtual ways to store data. Cloud infrastructure companies increasingly offer simple-to-use services to provision and maintain databases. These services even extend to databases as a service – expandable data storage at the end of an IP address, all the while masking or eliminating the management of servers, disks, backups, and other operational issues. Examples include Amazon’s SimpleDB and RDS services and Salesforce’s Database.com offering.

Once transmitted to the cloud, data can be stored, retrieved and processed without having to address many of the underlying computing resources and processes traditionally associated with databases. For M2M applications, this type of virtualized data storage service is ideal.
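
A generic sketch of “expandable data storage at the end of an IP address” follows. This is not SimpleDB’s or Database.com’s actual API, just the shape of the interaction: HTTP in, HTTP out, no servers or disks to manage. The base URL, item IDs, and fields are all hypothetical:

```python
# Generic sketch of a hosted datastore behind an HTTP endpoint. This is NOT
# a real provider's API; every URL and field is hypothetical.
import json
import urllib.request

BASE = "https://db.example.com/sensors"  # hypothetical hosted datastore

def put(item_id, attributes):
    body = json.dumps(attributes).encode("utf-8")
    req = urllib.request.Request("%s/%s" % (BASE, item_id), data=body,
                                 headers={"Content-Type": "application/json"},
                                 method="PUT")
    urllib.request.urlopen(req)

def get(item_id):
    with urllib.request.urlopen("%s/%s" % (BASE, item_id)) as resp:
        return json.loads(resp.read())

put("adapter-17:2011-03-01T09:00", {"throughput_mbps": 85.2, "link": "ok"})
print(get("adapter-17:2011-03-01T09:00"))
```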

Being able to seamlessly handle continuous streams of structured data from sensor sources is one of the more fundamental requirements for any distributed M2M application. As an example, Appoxy processes the data streams from network adapters made by Plaster Networks, a fast-growing leader in the IP-over-powerline space. Status inputs are sent by the adapters continuously to Plaster, which runs its applications on one of the top cloud infrastructure providers. Appoxy processes these for use with the user dashboard running on the Web.

This console provides insights to users on the status and performance of the adapters and their networks (allowing users to see whether they are getting optimal performance out of their devices and networks). The data also provides valuable diagnostic information to Plaster, dramatically reducing support calls and improving device usage and customer satisfaction. The information is also invaluable for product development. Data on the performance of new products and features can be assessed in real-time, providing insights that would otherwise be unattainable from devices in the field.
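
An illustrative pipeline in the spirit of this example (it is not Plaster Networks’ actual system): consume a stream of adapter status messages, keep rolling per-device stats for the dashboard, and flag devices running well below their expected throughput. All names and thresholds are assumptions:

```python
# Illustrative stream handler: rolling per-device stats feed the dashboard,
# low-throughput alerts feed support. Names and thresholds are assumptions.
from collections import defaultdict, deque

EXPECTED_MBPS = 80.0   # assumed nominal throughput for this adapter model
WINDOW = 20            # rolling window of recent samples per device

recent = defaultdict(lambda: deque(maxlen=WINDOW))

def handle_status(msg):
    """msg: {'device_id': ..., 'throughput_mbps': ...} sent by an adapter."""
    samples = recent[msg["device_id"]]
    samples.append(msg["throughput_mbps"])
    avg = sum(samples) / len(samples)
    if len(samples) == WINDOW and avg < 0.5 * EXPECTED_MBPS:
        alert(msg["device_id"], avg)  # feeds support and diagnostics
    return avg                        # feeds the user dashboard

def alert(device_id, avg):
    print("diagnostic: %s averaging %.1f Mbps, check wiring" % (device_id, avg))
```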

This type of smart device marks the beginning of the trend. It’s a fair bet that all network devices will become smart and cloud-aware, followed by all vehicles, manufacturing equipment, and almost all machines of any substance.

The types of data stores available include SQL, NoSQL, and block or file storage. SQL is useful for many application needs, but the massive, parallel, and continuous streams of M2M data lend themselves well to NoSQL approaches. These data stores operate by using key-value associations, which allow for a flatter, non-relational form of association. NoSQL databases can work without fixed table schemas, which makes it easy to store different data formats as well as evolve and expand formats over time.
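
A sketch of why schema-free key-value storage suits M2M data: two device generations report different fields, and both are stored side by side with no table migration. An in-memory dict stands in for a NoSQL store, and the field names are illustrative:

```python
# Schema-free storage sketch: records with different shapes coexist.
store = {}

def save(key, doc):
    store[key] = doc  # no fixed schema: any fields, any types

# First-generation adapters report two fields...
save("adapter-17:t1", {"throughput_mbps": 85.2, "link": "ok"})
# ...and a later firmware adds fields without touching earlier records.
save("adapter-92:t1", {"throughput_mbps": 78.9, "link": "ok",
                       "noise_db": -62.5, "firmware": "2.1"})

# Readers simply tolerate fields that older records lack.
for key, doc in store.items():
    print(key, doc.get("noise_db", "n/a"))
```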

NoSQL databases are also easy to scale horizontally. Data is distributed across many servers and disks. Indexing is performed by keys that route each query to the datastore holding the range that serves that key. This means different clusters respond to requests independently of other clusters, greatly increasing throughput and response times. Growth can be accommodated by quickly adding new servers, database instances, and disks and changing the ranges of keys.

The NoSQL approach plays well in M2M applications. Data for sensors or groups of sensors can be clustered together and accessed by an ever-expanding set of processes without adversely affecting performance. If a datastore gets too large or has too many requests, it can be split into smaller chunks or “shards.” If there are many requests on the same data, it can be replicated into multiple data sets, with each process hitting different shards, lessening the likelihood of request collisions.
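
A sketch of the key-range routing and shard splitting just described, assuming illustrative shard boundaries and a split policy that no specific database is claimed to use:

```python
# Key-range routing plus shard splitting. Boundaries and split policy are
# illustrative, not any specific database's behavior.
import bisect

# Sorted upper bounds of each shard's key range, mapped to shard names.
# '~' sorts after all lowercase letters, so it catches everything else.
bounds = ["h", "p", "~"]
shards = {"h": "shard-1", "p": "shard-2", "~": "shard-3"}

def route(key):
    """Route a key to the shard whose range covers its first character."""
    return shards[bounds[bisect.bisect_left(bounds, key[0])]]

def split(new_bound, new_shard):
    """Split a hot shard by adding a boundary; keys below it move over."""
    bisect.insort(bounds, new_bound)
    shards[new_bound] = new_shard

print(route("adapter-17"))   # -> shard-1 ('a' <= 'h')
print(route("sensor-0042"))  # -> shard-3 ('s' > 'p')

split("d", "shard-1b")       # shard-1 got hot: keys up to 'd' move over
print(route("adapter-17"))   # -> shard-1b now
```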
