2019 Technology Predictions

19 For 19: Technology Predictions For 2019 and Beyond

Striim’s 2019 Technology Predictions article was originally published on Forbes.

With 2018 out the door, it's important to take a look at where we've been over these past twelve months before we embrace the possibilities of what's ahead this year. It has been a fast-moving year in enterprise technology. Modern data management has been a primary objective for most enterprise companies in 2018, evidenced by the dramatic increase in cloud adoption, strategic mergers and acquisitions, and the rise of artificial intelligence (AI) and other emerging technologies.

Continuing from my predictions for 2018, let's take out the crystal ball and imagine what could be happening technology-wise in 2019.

2019 Technology Predictions for Cloud

• The center of gravity for enterprise data centers will shift faster towards cloud as enterprise companies continue to expand their reliance on the cloud for more critical, high-value workloads, especially for cloud-bursting and analytics applications.

• Technologies that enable real-time data distribution between different cloud and on-premises systems will become increasingly important for almost all cloud use-cases.

• With the acquisition of Red Hat, IBM may not directly challenge the top providers but will play an essential role through the use of Red Hat technologies across these clouds, private clouds and on-premises data centers in increasingly hybrid models.

• Portable applications and serverless computing will accelerate the move to multi-cloud and hybrid models utilizing containers, Kubernetes, cloud and multi-cloud management, with more and more automation provided by a growing number of startups and established players.

• As more open-source technologies mature in the big data and analytics space, they will be turned into scalable managed cloud services, cannibalizing the revenue of commercial companies built to support them.

2019 Technology Predictions for Big Data

• Despite consolidation in the big data space, as evidenced by the Cloudera/Hortonworks merger, enterprise investment in big data infrastructure will wane as more companies move to the cloud for storage and analytics. (Full disclosure: Cloudera is a partner of Striim.)

• As 5G begins to make its way to market, data will be generated at even faster speeds, requiring enterprise companies to seriously consider modernizing their architecture to work natively with streaming data and in-memory processing.

• Lambda and Kappa architectures, which combine streaming and batch processing and analytics, will continue to grow in popularity, driven by technologies that can work with both real-time and long-term storage sources and targets. Such mixed-use architectures will be essential in driving machine learning operationalization.

• Data processing components of streaming and batch big data analytics will widely adopt variants of the SQL language, enabling self-service processing and analytics by the users who know the data best, rather than by developers working through APIs (see the sketch after this list).

• As more organizations operate in real time, fast, scalable SQL-based architectures like Snowflake and Apache Kudu will become more popular than traditional big data environments, driven by the need for continual up-to-date information.
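
To make the streaming-SQL prediction concrete, here is a minimal Python sketch, under illustrative assumptions, of what such an engine evaluates: the continuous query shown in the comment is implemented as a tumbling-window aggregation over an unbounded event stream. The `Reading` type and `tumbling_avg` function are hypothetical names, not any vendor's API.

```python
# A minimal sketch, not any vendor's API, of what a streaming SQL engine
# evaluates: a continuous query such as
#   SELECT sensor_id, AVG(temperature)
#   FROM readings GROUP BY sensor_id WINDOW TUMBLING (60 SECONDS)
# implemented as a tumbling-window aggregation over an unbounded stream.
from collections import defaultdict
from dataclasses import dataclass
from typing import Dict, Iterable, Iterator

@dataclass
class Reading:                     # illustrative event type
    sensor_id: str
    temperature: float
    timestamp: float               # seconds since epoch

def tumbling_avg(events: Iterable[Reading],
                 window: float = 60.0) -> Iterator[Dict[str, float]]:
    """Emit per-sensor average temperature once per window."""
    window_end = None
    sums: Dict[str, float] = defaultdict(float)
    counts: Dict[str, int] = defaultdict(int)
    for e in events:
        if window_end is None:
            window_end = e.timestamp + window
        if e.timestamp >= window_end:         # window closed: emit and reset
            yield {s: sums[s] / counts[s] for s in sums}
            sums.clear(); counts.clear()
            window_end = e.timestamp + window
        sums[e.sensor_id] += e.temperature
        counts[e.sensor_id] += 1
```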

2019 Technology Predictions for Machine Learning/Artificial Intelligence

• AI and machine learning will no longer be considered a specialty and will permeate business on a deeper level. By adopting centralized cross-functional AI departments, organizations will be able to produce, share and reuse AI models and solutions to realize rapid return on investment (ROI).

• The biggest benefits of AI will be achieved through integration of machine learning models with other essential new technologies. The convergence of AI with internet of things (IoT), blockchain and cloud investments will provide the greatest synergies with ground-breaking results.

• Data scientists will become part of DevOps in order to achieve rapid machine learning operationalization. Instead of being handed raw data, data scientists will move upstream and work with IT specialists to determine how to source, process and model data. This will enable models to be quickly integrated with real-time data flows, and to be continually evaluated, tested and updated to ensure efficacy.

2019 Technology Predictions for Security

• The nature of threats will shift from many small actors to larger, stronger, possibly state-sponsored adversaries, with industrial rather than consumer data being the target. The sophistication of these attacks will require more comprehensive real-time threat detection, integrated with AI, to adapt to ever-changing approaches.

• As more organizations move to cloud analytics, security and regulatory requirements will drastically increase the need for in-flight masking, obfuscation and encryption technologies, especially around personally identifiable information (PII) and other sensitive data.
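
As a rough illustration of the in-flight masking mentioned in the last prediction, the sketch below hashes sensitive fields as events pass through a pipeline. The field names, salt handling and `mask_event` helper are illustrative assumptions, not a description of any product's masking feature.

```python
# A minimal sketch of in-flight masking, assuming dict-shaped events.
# Field names and the salted-hash rule are illustrative assumptions.
import hashlib

SENSITIVE_FIELDS = {"ssn", "credit_card", "email"}

def mask_event(event: dict, salt: str = "per-deployment-secret") -> dict:
    """Replace sensitive values with a salted one-way hash before the event
    leaves the trusted network; analytics can still group and join on the
    stable token, but the original value is unrecoverable downstream."""
    masked = dict(event)
    for field in SENSITIVE_FIELDS & masked.keys():
        token = hashlib.sha256((salt + str(masked[field])).encode()).hexdigest()
        masked[field] = token[:12]
    return masked

print(mask_event({"user": "jdoe", "ssn": "123-45-6789"}))  # ssn now a token
```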

2019 Technology Predictions for IoT

• IoT, especially sensors coupled with location data, will undergo extreme growth, but will not be purchased directly by major enterprises. Instead, device makers and supporting real-time processing technologies will be combined by integrators using edge processing and cloud-based systems to provide complete IoT-based solutions across multiple industries.

• The increased variety of IoT devices, gateways and supporting technologies will lead to standardization efforts around protocols, data collection, formatting, canonical models and security requirements.

2019 Technology Predictions for Blockchain

• The adoption of blockchain-based digital ledger technologies will become more widespread, driven by easy-to-operate, easy-to-manage cloud offerings on Amazon Web Services (AWS) and Azure. This will provide enterprises with a way to rapidly prototype supply chain and digital contract implementations. (Full disclosure: AWS and Azure are partners of Striim.)

• Innovative new secure algorithms, coupled with computing power advances, will speed up the processing time of digital ledger transactions from seconds to milliseconds or microseconds in the next few years, enabling high-velocity streaming applications to work with blockchain.

Whether or not any of these 2019 technology predictions come to pass, we can be sure this year will bring a mix of steady movement towards enterprise modernization, continued investment in cloud, streaming architecture and machine learning, and a smattering of unexpected twists and new innovations that will enable enterprises to think — and act — nimbly.

Any thoughts or feedback on my 2019 technology predictions? Please share on Steve's LinkedIn page: https://www.linkedin.com/in/stevewilkes/. For more information on Striim's solutions in the areas of Cloud, Big Data, Security and IoT, please visit our Solutions page, or schedule a brief demo with one of our lead technologists.

Striim Introduces Solution at IoT World to Address Three Primary IoT Data Challenges

The Striim Team is excited to introduce a data management solution to address the key data challenges of the Internet of Things (IoT) at IoT World, booth #218, from May 16-18, 2017, in Santa Clara, CA.

IoT World will bring together thought leaders from several different industries to discuss the growing impact and potential that the Internet of Things has in both the enterprise and consumer marketplace, and we’re thrilled to be able to share our knowledge and expertise in dealing with the volumes of data the IoT is generating.

At IoT World, there will be countless vendors showcasing their ability to analyze IoT data. However, Striim will be the only vendor demoing a streaming data management solution capable of addressing three of the primary challenges in handling IoT data:

  • Managing the tsunami of data generated by IoT devices – because by 2023, we’ll only be able to store a small fraction of the data being generated annually (IDC).
  • Integrating IoT data with the enterprise (e.g., data from enterprise databases, log files, message and event queues) and analyzing it in real time – because business operations cannot make decisions based on IoT data alone.
  • Addressing IoT security issues associated with the explosion of connected devices – because devices are becoming more business-critical, therefore increasing the exposure of companies to cyberattacks.

At the speed at which businesses operate today, companies need a solution that pre-processes and analyzes IoT data alongside all necessary enterprise data, allowing them to make informed decisions in time and in context to improve operational efficiency, drive intelligent innovation and make the IoT-enabled enterprise more secure.

For more information, read the related press release: Striim Introduces Enterprise-Grade Data Management Solution to Address Data Challenges of the Internet of Things (IoT).

Striim Presentations at IoT World:

Along with hosting theater presentations in Striim’s booth, Steve Wilkes, CTO and founder of Striim, will deliver two track speaking sessions during the conference:

Leveraging Streaming Analytics to Combat Cyberattacks at the Edge
Wednesday, May 17
2:20 p.m. PST

Architecting for Continuous IoT Analysis
Thursday, May 18
1:40 p.m. PST

Preview of the Striim Edge Controller

As a key component of Striim for IoT, Striim is pleased to preview Striim Edge Controller, a new management and monitoring framework that allows users to easily manage hundreds of edge servers in a scalable fashion.

Stop by Striim booth #218 on Tuesday evening and grab a beer in a light-up Striim pilsner glass. Or try your key at any time in our Gadget Giveaway for your chance to win your choice of prize in our gadget box.

See you at IoT World!

Striim Joins Microsoft, Statistica, Fujitsu, Dell at Hannover Messe

The Striim Team is excited to join with key partners including Microsoft, Statistica, Dell and Fujitsu at Hannover Messe 2017. Through joint demos, presentations and interactive experiences, Striim is showcasing a wide variety of real-time IoT integration and analysis solutions to address the needs of Industrie 4.0.

Taking place April 24-28, 2017 in Hannover, Germany, Hannover Messe is the world's leading industrial trade show. This year's lead theme is Integrated Industry, with over 500 Industrie 4.0 solutions on display. Look for Striim at the Striim + Statistica Booth – Digital Factory, Hall 6, Booth G52.

Participation with Microsoft

We’ve joined with Microsoft to highlight a demo of our integrated solution enabling the continuous exchange and analysis of IoT data across all levels of an IoT infrastructure. This solution, which provides an edge-to-cloud smart data architecture, helps fulfill the Industrie 4.0 promise of enabling industries to be more intelligent, efficient and secure. To learn more, please click on the following links to watch a short video and read the related press release. Or stop by the Microsoft booth in the Digital Factory, Hall 7, Booth C40 to see the demo.

For more information regarding integration of the Striim platform with Microsoft Azure IoT technologies, check out the Striim solutions on the Microsoft Azure Marketplace.

Presentation in Microsoft Booth

Steve Wilkes, founder and CTO of Striim, will present the following session in Microsoft’s booth:

Ensure Manufacturing Quality, Safety and Security
Through Digital Transformation at the Edge
Wednesday, April 27
10:00am local time (GMT+2)
Digital Factory, Hall 7, Microsoft Booth C40

Participation with Microsoft, Statistica, Dell

Striim, Statistica, Microsoft and Dell have combined their IoT hardware and software to enable digital transformation through IoT, integrating machines, devices, sensors and people. A live and interactive demo at the Striim booth will feature a fully functional, model-sized factory floor that provides true-to-life sensor readings and events, feeding an end-to-end solution for real-time data processing, analytics, visualization and statistical analysis/model building. Stop by the Striim + Statistica booth in the Digital Factory, Hall 6, Booth G52 to experience first-hand the relationship between a factory's IoT systems and the real-time integration and analysis of the IoT data. To learn more, click here to view a short video.

Participation with Fujitsu

Furthermore, we've joined forces with Fujitsu to deliver an advanced security appliance for discrete manufacturing companies using IoT edge analytics. The Striim platform, powered by Fujitsu servers, analyzes machine log data alongside sensor data from physical devices for reliable and timely assessment of any potential breach affecting the factory floor. To learn more about the Striim Fujitsu Cybersecurity Appliance, click here to watch a short video, or stop by the Striim + Statistica booth.

Please reach out if you are interested in scheduling a demo or a briefing during Hannover Messe, or would like additional materials to learn more.

You may also wish to download Gartner’s 2017 Market Guide to In-Memory Computing Technologies, and see why Striim is one of only a few vendors to address 4 out of 5 areas of In-Memory Computing, and the only vendor to do so in a single, end-to-end platform.

IoT Edge Processing with Intel and Striim

With the power of Intel X86 technology, and the speed and flexibility of the Striim platform, users are able to quickly leverage their IoT data to deliver unprecedented insights, visibility, and actions in real time.

The data explosion arising from the growth of the Internet of Things (IoT) will outgrow the infrastructure and technical capabilities to transport, store and analyze that data. Moreover, transmitting all data to a central location or the cloud for analysis would be cost-prohibitive.

At the same time, businesses need to respond to events as they happen, with low or no latency, while mission-critical applications need to continue functioning even in situations with little or no connectivity. These considerations increase the urgency to move data extraction and processing as close as possible to the source of the information, emphasizing the need for edge computing in IoT applications.

The edge processing power of the Striim platform, running on a gateway based on the Intel® X86 architecture, offers efficiency and expediency to the development and deployment of IoT applications. With the power of Intel® X86 technology, the Striim agent can seamlessly ingest, process and analyze data in real time, generating insights in milliseconds. By circumventing the need to transmit data for processing and analysis, edge processing with Striim on the Intel-based gateway achieves peak performance at a fraction of the cost and latency of conventional architectures.

On top of that, the Striim platform makes it easy to build new streaming edge applications and create visualization dashboards through an easy-to-use, drag-and-drop UI and SQL-like programming language, expediting development and deployment of new IoT applications.
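
To illustrate the edge pattern described above, here is a minimal, hypothetical Python sketch of the filter-at-the-gateway idea: readings are evaluated locally and only out-of-range events are forwarded upstream. The `publish_to_cloud` function and thresholds are assumptions for illustration, not Striim or Intel APIs.

```python
# A minimal sketch of edge filtering: evaluate readings locally on the
# gateway and forward only significant events upstream.
# publish_to_cloud is a hypothetical stand-in for an MQTT/HTTPS publish.
from typing import Iterable

def publish_to_cloud(event: dict) -> None:
    print("forwarding:", event)            # placeholder transport

def edge_filter(readings: Iterable[dict],
                low: float = 10.0, high: float = 80.0) -> None:
    """Drop in-range readings at the edge; forward only out-of-range ones,
    cutting bandwidth while preserving the events that matter."""
    for r in readings:
        if not (low <= r["value"] <= high):
            publish_to_cloud(r)

edge_filter([{"sensor": "t1", "value": 22.5},
             {"sensor": "t1", "value": 95.1}])   # only 95.1 is forwarded
```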

IoT applications do not exist in a vacuum. IoT solutions not only need to incorporate a variety of IoT devices, but must also be integrated with the existing enterprise infrastructure. The Striim platform is built to handle data from a variety of sources – sensors, edge devices, databases, data warehouses, operational technologies and enterprise IT systems – integrating data from both legacy and new systems to deliver a comprehensive, robust solution catered to the demands of the enterprises of the future.

Programming: A Shortfall of Legacy Event Processing

Guest post on event processing by Opher Etzion

With the onset of the Internet of Things, our universe is now flooded with events that require event processing. These events are introduced to the digital world by all the wonders of IoT and wearable computing, which report on anything and everything that happens. But our ability to take advantage of the power of these events is currently quite limited due to several barriers. Here, we'll discuss one of the main barriers: the difficulty of event processing programming.

What Is Event Processing?

It’s important first to understand what event processing is and how it’s relevant to the challenge we have. Let’s take an example: Say we would like to build a system that detects that a car is being stolen. We can use a motion sensor, camera and velocimeter inside the car, as well as location sensors in the cell phones of all authorized drivers.

Next, we need to decide what pattern of reported events indicates car theft. In this case, the answer could be:

  1. A person enters the car (reported by the camera and motion sensors).
  2. The person does not look similar to any of the authorized drivers (using their archived pictures).
  3. None of the authorized drivers are currently at the car’s location.
  4. The car starts moving (reported by the velocimeter).

There are various types of patterns, including logical patterns of events (conjunction, disjunction, absence of events); temporal patterns (sequence, finding trends in events); looking for thresholds on a series of events; and more.

Indeed, much of the programming with events involves the filtering, transformation, aggregation and pattern detection of events in real time. We are typically not interested in a single event, but in the interpretation of a detected pattern – in this case, that the car is being stolen.
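
As a rough sketch of how the car-theft pattern above could be evaluated continuously, the following state machine consumes one car's event stream and fires only when all four conditions hold in order. The event names and fields are illustrative assumptions, not a real sensor schema.

```python
# A minimal sketch of the car-theft pattern, written as a small state
# machine over one car's event stream. Event names/fields are illustrative.
def detect_theft(events) -> bool:
    """Fire when all four conditions hold: someone entered, the face check
    failed, no authorized driver is nearby, and the car starts moving."""
    entered = unrecognized = owners_away = False
    for e in events:
        kind = e["type"]
        if kind == "person_entered":                 # camera + motion sensor
            entered = True
        elif kind == "face_check" and entered:       # archived-picture match
            unrecognized = not e["matches_authorized"]
        elif kind == "driver_proximity":             # phones' location sensors
            owners_away = e["nearest_authorized_m"] > 50
        elif kind == "car_moving":                   # velocimeter
            if entered and unrecognized and owners_away:
                return True                          # pattern complete: alert
    return False
```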

Programming Challenges

Now, let’s look at the programming aspects of working with events.

The way that most programmers have approached event processing is by inserting all events into a database, and then executing ad-hoc or periodic queries to detect patterns. This is consistent with the “request-response” paradigm of thinking in most programming.

However, this approach is neither effective nor efficient. Periodic queries may miss patterns that complete and disappear between polls, while most of the queries are issued in vain, returning nothing. Besides, event thinking is different: the required abstractions relate to time and space, and are not present in current programming languages.
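
A toy contrast of the two styles, under illustrative assumptions, makes the difference concrete: the polling version issues mostly empty queries and can miss a pattern that completes between polls, while the event-driven version evaluates each event the moment it arrives.

```python
# A toy contrast of the two styles; all names are illustrative.
import time

def handle(result: dict) -> None:
    print("detected:", result)

def polling_loop(query_store, period_s: float = 60.0) -> None:
    """Request-response style: store everything, then ask periodically."""
    while True:
        for hit in query_store():        # usually returns nothing
            handle(hit)
        time.sleep(period_s)             # detections wait up to period_s

def on_event(state: dict, event: dict) -> None:
    """Event-driven style: update in-memory state and fire immediately."""
    state["count"] = state.get("count", 0) + 1
    if state["count"] >= 3:              # e.g., a simple threshold pattern
        handle({"alert": "threshold reached", "last": event})
        state["count"] = 0
```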

Another challenge in programming event processing applications is that it's hard to get data into and out of these systems, and the syntax is often hard to learn.

Furthermore, in many cases, the logic of the system must be personalized. Consider a collection of sensors installed in a home with elderly residents in order to provide alerts and reminders. In this case, such alerts are highly personalized, since different people have different needs. The inability to easily control event processing logic is a major challenge in making the power of events more accessible and more pervasive.

Just as the Internet succeeded when browsing the web became possible for everybody, the Internet of Things will bloom when everybody can control an application's logic by creating and modifying patterns based on multiple events.

8 Streaming Data Myths, Busted

Streaming data and analytics is catching hold across today’s markets, providing powerful tools for organizations to sense, understand and respond to what is happening across markets and geographies in real time. However, many executives lack a clear understanding of the technology and applications.

Here are some of the leading myths that hold back successful streaming data and analytics efforts.

Myth #1: Streaming data is only for media/telecom companies.

Media, broadcast and telecom companies are currently capitalizing on the power of streaming in a big way. However, the range of potential applications extends far beyond streaming media. Industrial companies, for example, can monitor product performance long after the sale. Marketers can track customer preferences. The possibilities are only going to grow.

Myth #2: There’s a lot of data that needs to be captured.

Actually, the key to business value is being able to identify and capture the right bits of data that are of material importance to the organization. The challenge is determining in real time what data is of value, and filtering out the irrelevant data from the firehose.

Myth #3: IoT devices in the field will provide real-time data streams on-demand.

Unfortunately, it’s a very diverse world when it comes to devices and sensors, built by many different manufacturers, adhering to many different standards. As a result, IoT devices operate on different networks and have differing latencies. Analytics is required to pull together and rationalize data from across a number of devices. As such, response times might extend to accommodate the highest latency streams.

Myth #4: Some data cannot be streamed.

Actually, all data is streamable, regardless of origin or intent. It could be data coming from a relational database application, or from an IoT device. It's up to the business to decide if this data would be of value streamed in real time to applications, decision-makers or customers.

Myth #5: Streaming data and analytics is a new concept.

Streaming solutions have been around for many years, incorporated into offerings such as complex event processing engines. Devices with real-time operating systems have been pumping out data, but until recently, this data has never been captured for analysis. Lately, these solutions are becoming mainstream, being adopted for a range of business purposes.

Myth #6: A streaming data and analytics solution is expensive to implement.

At one time, anything connected to real-time processing was more expensive than batch-oriented solutions. However, with the rise of cloud solutions and networks, robust streaming platforms can be quickly adopted. The cloud model, with pay-per-usage plans, offers avenues by which organizations can rapidly embrace streaming capabilities at a relatively low startup cost.

Myth #7: Streaming data and analytics requires specialized skillsets.

The skillsets required to build and manage big data sets, frameworks, IoT, and analytical tools will readily incorporate streaming capabilities. Yes, it may be a challenge to staff a data-driven organization, but these same skills are applicable to streaming data and analytics initiatives. The emerging range of cloud solutions also relieves organizations of the need for full-functioning technology staffs to manage the opportunity.

Myth #8: Streaming and batch data do not go together.

The value of streaming data will be realized when it’s used in conjunction with traditional batch data. The great opportunity is to integrate legacy systems with new streaming platforms or services.

Striim Included in Vendors Providing Real-World Solutions for IoT Analytics

Fern Halper of Upside.com has written a great article about the Internet of Things (IoT) and compiled 12 real-world solutions from a number of vendors.

According to Fern, “The Internet of Things — a network of connected devices that can send and receive data over the Internet — is a hot market topic.” The growth of these devices, say analysts, will reach “tens of billions” in the next few years. They’re going to be everywhere you go, everywhere you work, and on everything you wear, and “these devices are generating a lot of data.”

The issue, she states, is “…collecting IoT data by itself is not a path to value. Data must be analyzed and acted upon to drive real benefit.”

And that’s where we came in. We provided a description of how we can help with IoT analytics and real-world use-cases dealing with public transportation management and auto-scaling of content distribution infrastructure.

You can read the full article here and read about how Striim helps with IoT on this page.

The Tipping Point – Data Stream Analytics Meets NonStop Transaction Processing in Real Time

Are we comfortable yet with data streams? Are we considering tapping them for greater insight into changes of behavior, and do we acknowledge their potential contribution to all aspects of our mission-critical transaction processing? Or are we prepared to let the streams burble on by for a while longer? In discussing big data analytics with the NonStop community, and in particular the move to real-time data stream analytics, there is clearly a diversity of opinions, which is not totally unexpected. However, there are some early signs of a growing appreciation among NonStop users and vendors alike of the intersection between data stream analytics and transaction processing.

Across the business world there's often talk about tipping points and lynchpin occurrences. Often the fodder for novels and movies, lynchpin occurrences refer to the first domino that falls which, in turn, triggers an unstoppable cascade of dominoes that eventually leads to completely changed circumstances. A tipping point, on the other hand, “is that subtle juncture where a certain idea, product or behavior suddenly catches on, or ‘tips,’ and establishes a whole new reality on the ground,” as one source I turned to noted. What lynchpins and tipping points have in common is the irreversible impact they have on everything they touch – a kind of course correction where there's no coming back.

Race car drivers are always fine-tuning their race cars, but even as they make adjustments they ensure that they can return to a baseline set-up should changes prove to be a step backwards. That's the nature of the sport – simply put, if you are not advancing, then in standing still you are falling behind. When it comes to IT, improvements came slowly at first, with systems even carrying nomenclature tied to a decade – the System/360 followed by the System/370 and so on, as we saw IBM coming out with new mainframes. Fortunately, we have seen vendors drop such classifications as change continues to accelerate. Recent changes are opening our eyes to just how big a contribution technology is making to business – as I pointed out in my recent presentation on IoT and IoT Analytics (IoTA) to the NonStop community at the 2015 NonStop Technical Boot Camp (Boot Camp), by quoting in my opening slides Meg Whitman, CEO of Hewlett Packard Enterprise: “IT strategy and business strategy are no longer separate, they have become inseparable … every business is a technology business today.”

The prospect of reaching a tipping point came as a result of a discussion with one vendor attending my presentation at Boot Camp. Central to the conversation was the business acceptance of hybrids – in particular, where hybrids weren't just a reference to clouds, whether public, private and/or managed, but to the technology collective that now makes up a data center. Of course, the development of the shared infrastructure chassis from HPE that packages NonStop together with Linux and/or Windows sharing a common InfiniBand infrastructure was part of the conversation, but the picture is a lot bigger than just one vendor's platform. Hybrids are indeed catching on and, from my perspective, will prove to be a tipping point when it comes to grappling with understanding all that is transpiring within a data center.

Big changes taking place, such as the deployment of hybrids, are often referenced as disruptive technology, and they highlight that we need new tools and new processes – we cannot simply have data center operations try to comprehend all that's happening around them. There are just too many data streams of events and alerts, presented in different forms, for any real understanding or consensus to materialize without additional assistance. What I like about the model behind Striim is how it can be turned on to look at disparate data streams and make sense of a lot of data that lacks any uniform structure. It's as if the processing Striim performs ensures a steady supply of consumable – or, to put it another way, drinkable – water.

Striim (pronounced “stream”) reduces the babble that arises from data streams as they pass by – the noise that otherwise would be incomprehensible – and turns it into actionable data. In my presentation at Boot Camp I referenced the blog post of March 9, 2015, “In a Realtime World, NonStop Customer Experience Can Change in an Instant!” In the real-time IT world, systems, platforms, operating systems, middleware and applications are all providing updates about their operational status and yet, I noted, even as we mix in other systems, it all becomes noise! Picking just one example, I added in that post how this constant barrage of data makes tracking the performance of an application difficult; who can tell whether basic SLA metrics are being met?

The intersection between data stream analytics and transaction processing has indeed arrived, and it represents a course change that everyone in the data center is coming to appreciate. Even the most hardened of NonStop system managers is aware of the need to integrate the data being generated by the execution of adjacent applications. Are databases truly in sync? Are networks and firewalls really functioning for all users? Are the processes running actually conforming to the SLAs in place? It's all too hard to do without an additional fabric and yes, the tipping point has been reached. The more I talked with the NonStop community following my presentation (jointly given with Justin Simonds, Master Technologist at Hewlett Packard Enterprise), the more I came to appreciate the full potential on offer with Striim, and from my perspective, it may very well prove to be the tipping point for every data center where NonStop systems reside.

How Will the Internet of Things Impact Streaming Data?

In a thought-provoking post on Forbes this week, Don't Underestimate The Impact of the Internet of Things, Mike Kavis warns that the Internet of Things will have a bigger impact on life than most people understand. As the post starts out, many people hear the term “Internet of Things” and envision a connected refrigerator ordering milk or a Fitbit telling the world how many steps you took today, but the IoT is far more comprehensive.

The IoT represents advances in sensing technology, streaming data, and technology cost reduction that will fundamentally impact how all industries function. A few examples are:

  • Manufacturing: Sensors on manufacturing lines measuring temperature, humidity, torque, and light and alerting of anomalies in real-time
  • Healthcare: Sensors that help elderly citizens live longer in their own homes before moving to assisted living facilities
  • Transportation: GE smart jet engines transmitting over one terabyte of data per flight, so that mechanics know an engine requires maintenance as soon as it lands
  • Security: Detecting hazardous conditions in real-time

The post points to sensor company Libelium, which lists 50 Internet of Things use cases on its site, including: smart cities, smart water, smart metering, security & emergencies, retail, logistics, industrial control, smart agriculture, smart animal farming, domotics & home automation, and eHealth. Mike concludes: “We are heading towards a world where everything is connected and better decisions can be made in real time.”

This sentiment is aligned with our view that we are moving away from a software world of “query/response” (ask a question, get an answer) and toward the sensing enterprise, in which software monitors your streams of data and acts autonomously on specific correlated events in those streams. The IoT's drastic increase in the number of connected devices will dramatically increase streaming data volume, velocity and variety. The next logical question is: how will you handle all of the streaming data from the Internet of Things? The WebAction Real-time App Platform is designed to manage streaming data in an efficient and easy-to-use manner. As an end-to-end platform, from acquisition to processing to delivery of your big data records, WebAction is a one-stop shop for real-time. To see how the Internet of Things will impact your business, set up a free needs assessment with one of our real-time streaming data experts.

The Economist on the Cyber-security Dangers from the Internet of Things

The Internet of Things Means You Now Have Less Time to ID Threats

In a July 12 cyber-security brief, “The internet of things (to be hacked),” The Economist discussed the coming explosion of connected devices sharing data in what has commonly become called the Internet of Things (IoT), or as some now call it, the Internet of Everything (IoE). Either term gets you to a place where, 18 months from now, you have far too much data coming at you to store it all now and process “later”. “Later” will never come, and every one of those devices introduces a new potential security threat to your enterprise. The Economist notes:

“There have already been instances of nefarious types taking control of webcams, televisions and even a fridge, which was roped into a network of computers pumping out e-mail spam.”

In this hyper-connected world, you need to continuously monitor all of the devices and traffic on your networks to spot interesting correlated events across your infrastructure. WebAction Security Event Processing Data Driven Apps give you unique insight across your data streams.

Wireless Networks Will Become Saturated

By the nature of IoT devices, wireless is the preferred method of communication. As the number of devices grows, so do the wireless chatter and noise. The chatter is building over all wireless communication channels: WiFi, cellular, Bluetooth, near field communication (NFC) and others. There is no end in sight to the expected growth of wireless connected devices. Networks will need to be fortified, and new methods of managing wireless traffic are being considered. Enriching wireless network traffic with rich context and history allows for dynamic traffic prioritization based on the profiles of your customers. Make sure that your most important customers always get the best quality of service, and know immediately when quality degrades.
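
As a rough sketch of that enrichment step, the snippet below joins each traffic event against an in-memory customer-profile cache so a per-event prioritization decision can be made. The profile table, field names and priority mapping are illustrative assumptions, not WebAction's actual implementation.

```python
# A minimal sketch of stream enrichment for QoS prioritization: each raw
# traffic event is joined against cached customer context on the fly.
CUSTOMER_PROFILES = {
    "dev-42": {"tier": "platinum", "sla_ms": 50},
    "dev-77": {"tier": "standard", "sla_ms": 500},
}

def enrich(event: dict) -> dict:
    """Attach tier and SLA from the profile cache to the raw event."""
    profile = CUSTOMER_PROFILES.get(event["device_id"],
                                    {"tier": "unknown", "sla_ms": 1000})
    return {**event, **profile}

def priority(event: dict) -> int:
    """Lower number = higher queueing priority for the traffic scheduler."""
    return {"platinum": 0, "gold": 1, "standard": 2}.get(event["tier"], 3)

e = enrich({"device_id": "dev-42", "latency_ms": 180})
print(priority(e), e)  # platinum traffic jumps the queue; SLA breach visible
```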

Insights Available in Your IoT Streams

The Economist fears that, with the loose regulation of connected devices, we will see more incidents of hackers working their way into your refrigerator and thermostat. The brief concludes with: “Who needs a smart fridge anyway?” That is an interesting question, but rather than resisting progress and change (which we know doesn't work in the long run), we suggest finding novel ways to immediately identify and neutralize security threats arising from the Internet of Things.

All of those connected devices are reporting their streams back to home base, and home base needs to make some snappy decisions about what to do with the data streams flowing in. At the same time, every stream rides on your wireless networks and contains potential threats and useful data signatures. That's where the WebAction Real-time App Platform shines: monitoring streams to identify patterns in-memory, enabling immediate (and informed) action downstream. On the fly, your real-time data is correlated across streams, filtered, and enriched with history and context to create highly actionable Big Data Records.

The Internet of Things comes with some very significant blue sky ahead of it and the WebAction Real-time App Platform enables you to take advantage of this new frontier. Request a demo of the WebAction Platform.
