
Why it’s Time to Move to an Event-Driven Architecture

Real-time computing and IoT have modernized application development, but “the laws of physics still apply.” As a guest speaker early in my career, I’d tell audiences that the fundamental insights gained from traditional application development still apply to modern development. Here is why it’s time to move to an event-driven architecture.

Development experiences teach valuable lessons.

Some 25 years after I first gave that presentation, I still believe that development experience teaches valuable lessons. For instance, we should know that databases don’t run any faster in an application for the Internet of Things (IoT) than they do in a typical customer service application built using traditional methods.

Yet I still see too many instances of IoT developers ignoring the limits of traditional databases. These databases cannot handle the enormous demands of analyzing massive amounts of data, yet developers wind up trying to build applications that push thousands of updates a second at them. They should know from the get-go that it’s not going to work.

In the IoT world, solutions depend on streaming data.

Solutions depend on streaming data. But most application developers still do not have a good grasp of the best way to process that data. They usually go with: “I get some data. I stick it in the database and then I go run queries.”

The process of sticking the data in a database and running queries works when you’re building traditional applications for transaction processing or business intelligence: data rates are moderate and there is no need for real-time responses.
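
As a rough sketch of that traditional pattern, assuming a hypothetical SQLite table of sensor readings (the file, schema, and function names here are illustrative, not from any particular product), it looks something like this:

    # Traditional pattern: write every reading to a database, analyze later.
    # Minimal sketch; "readings.db" and the schema are hypothetical.
    import sqlite3

    conn = sqlite3.connect("readings.db")
    conn.execute("CREATE TABLE IF NOT EXISTS readings (sensor_id TEXT, temp REAL, ts REAL)")

    def store_reading(sensor_id, temp, ts):
        # One INSERT (and commit) per reading: fine at moderate rates,
        # a bottleneck at tens of thousands of events per second.
        conn.execute("INSERT INTO readings VALUES (?, ?, ?)", (sensor_id, temp, ts))
        conn.commit()

    def average_temperature(sensor_id):
        # Analysis happens later, whenever someone runs the query.
        row = conn.execute(
            "SELECT AVG(temp) FROM readings WHERE sensor_id = ?", (sensor_id,)
        ).fetchone()
        return row[0]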

But that’s not going to work when you have massive streams of data coming in each second that need immediate analysis.

For instance, ask a developer about the speed of their database and they may tell you it can do 5,000 updates a second. So why then are they trying to build an IoT application that must perform 50,000 updates a second? It won’t work. They should already know that from experience.

Let’s step back for a moment to understand why this happens.

Real-Time Applications and the Database

For decades, databases have been used to store information. Once the data was there, you could always return at your convenience and query the database further to determine what was of interest.

But with the advent of real-time systems, databases are an albatross. The entire point of real-time systems is to analyze and react to an event in the moment. If you can’t analyze the data in real-time, you’re severely constrained — particularly with security or safety applications.

Most application developers are more accustomed to situations where they input data into a database and then run their queries. But the input/run model doesn’t work when the applications stream tons of data per second that require an immediate response.

A further challenge: how to display real-time data in some sort of dashboard.

The standard approach is to run queries against the database to get the data. But with lots of data streaming in, running big queries every second to keep a real-time display current kills resources.
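
To make the contrast concrete, here is one alternative, sketched under my own assumptions rather than taken from any specific product: keep a small running summary in memory, update it as each event arrives, and push that snapshot to the dashboard instead of re-running a large query every second. The publish callback below is a stand-in for whatever transport a real dashboard would use.

    # Sketch: update an in-memory summary per event and push it to the
    # dashboard, rather than polling the database with big queries.
    from collections import defaultdict

    class LiveSummary:
        def __init__(self, publish):
            self.publish = publish            # stand-in for a WebSocket/broker push
            self.count = defaultdict(int)
            self.total = defaultdict(float)

        def on_event(self, sensor_id, temp):
            # Constant work per event; no query over stored history.
            self.count[sensor_id] += 1
            self.total[sensor_id] += temp
            avg = self.total[sensor_id] / self.count[sensor_id]
            self.publish({"sensor": sensor_id, "avg_temp": round(avg, 2)})

    summary = LiveSummary(publish=print)      # print stands in for a real push channel
    summary.on_event("s-17", 21.4)
    summary.on_event("s-17", 21.8)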

Except for a handful of specialists steeped in this technology, most of us aren’t prepared to handle high volumes of streaming data.

Consider a sensor tracking ambient temperature that generates a new reading once every second. Ambient temperatures don’t change that rapidly, so a few sensors may be manageable. Now imagine the massive amount of data generated by 10,000 sensors spitting out readings simultaneously.

Similarly, consider the example of a power company gathering billions of data points that get fed directly into a database. It’s just not possible to dump all of that data into a system at one time and expect to process everything instantly. You can’t update a database 100,000 times a second.

Nor is it cost-effective or efficient to throw all of this data into a database at once and then do nothing for a day until the next batch arrives.

Imagine the hardware you’d need to handle the spike. The situation is asking for trouble. In fact, most developers haven’t ever built these kinds of applications before, and when they do try, they’re likely to run into errors or get frustrated by slow speeds.

Handling spikes like these requires finding ways to process the data in memory rather than trying to do it all in the database.
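
A minimal sketch of that idea, under my own assumptions: aggregate raw readings in memory over a short window and persist only the per-window summary, so the database sees one row per sensor per minute instead of one row per reading. The window length and the store_summary callback are hypothetical, standing in for whatever persistence the application actually needs.

    # Sketch: aggregate in memory, persist only window summaries.
    # WINDOW_SECONDS and store_summary are illustrative assumptions.
    import time
    from collections import defaultdict

    WINDOW_SECONDS = 60
    window = defaultdict(list)       # sensor_id -> readings in the current window
    window_start = time.time()

    def on_reading(sensor_id, temp, store_summary):
        """Handle one raw reading; flush a compact summary once per window."""
        global window_start
        window[sensor_id].append(temp)
        if time.time() - window_start >= WINDOW_SECONDS:
            for sid, temps in window.items():
                store_summary(sid, min(temps), max(temps), sum(temps) / len(temps))
            window.clear()
            window_start = time.time()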

New Times, New Development Model

These spikes, and the hardware they demand, explain why we’re still struggling to put in place a workable, scalable architecture that can support the promise of IoT.

Think about the challenges that municipalities encounter trying to manage “smart roads.” If you’re going to avoid accidents, you need data instantaneously. But when the data streams that measure traffic are slow to arrive at central headquarters, that’s a big roadblock (pardon the pun).

What about systems based on event-driven architecture?

With the adoption of systems based on an event-driven architecture (EDA), that scenario need not happen. While EDA is relatively new, many industries already use the approach.

It’s common on assembly lines and in financial transaction processing, where operations would suffer from delays in getting crucial data for decision-making.

Until now, the software development model has relied on storing large volumes of information in databases for subsequent processing and analysis. But with EDA applications, systems analyze data as events occur, across a distributed event mesh.

The crucial data gets delivered where and when it’s needed.

In these scenarios, the processing and analysis of data moves closer to, or even onto, the sensors and devices that actually generate it.

High-volume data must be analyzed in memory to achieve the necessary response times. The upshot: applications that act in real-time and respond to tens of thousands, or even millions, of events per second when required.

Instead of relying upon traditional database-centric techniques, we must apply an event-driven architecture.

When we apply an event-driven architecture, data can be analyzed by real-time systems, and high-volume event streams can be processed faster and more efficiently than traditional databases allow.
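
In miniature, the shape of that pattern looks something like the sketch below: handlers subscribe to event types and react the moment an event arrives, with no database write in the hot path. A real deployment would distribute this across an event mesh or broker rather than a single process, and the class and handler names here are mine, not a reference to any specific EDA product.

    # Sketch of the event-driven pattern: subscribers react as events occur.
    from collections import defaultdict

    class EventBus:
        def __init__(self):
            self.handlers = defaultdict(list)

        def subscribe(self, event_type, handler):
            self.handlers[event_type].append(handler)

        def emit(self, event_type, payload):
            # Analysis happens now, as the event occurs, not after a batch load.
            for handler in self.handlers[event_type]:
                handler(payload)

    bus = EventBus()

    def check_overheat(reading):
        if reading["temp"] > 90:
            print(f"ALERT: sensor {reading['sensor']} overheating")

    bus.subscribe("temperature", check_overheat)
    bus.emit("temperature", {"sensor": "s-42", "temp": 95.1})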

The contours of where technology is heading have rarely been clearer.

Paul Butterworth is Co-Founder/CTO at VANTIQ.
