In this video, Steve Wilkes presents “Addressing the Fundamental Challenges to IoT Data Management” at the 2nd Intel Global IoT DevFest.

Up until now, much of the perceived innovation of IoT has been centered on devices and the new use cases they create. However, the true value of IoT is in the data. According to a recent IDC white paper, the amount of real-time data within the next 8 years will increase to 38 zettabytes; of this, 95% will originate from the world of IoT.

Needless to say, properly managing IoT data will be critical for enterprise companies, so how can they best prepare to deal with these extreme volumes of IoT data?

In this presentation, participants will learn how to address the three primary challenges to IoT data management:

  • Managing IoT Data – Collect and process volumes of IoT data through in-flight filtering, transformation and aggregation at the edge, and only store and act upon relevant IoT data.
  • Integrating IoT Data with Enterprise Data – IoT data cannot be siloed, but needs to be integrated with all other enterprise data in real time to gain insights from the data before it loses operational value.
  • Addressing IoT-related Security Issues – Correlate events and identify potential breaches across IoT and non-IoT systems throughout the enterprise.

Visit our IoT solution page to learn how Striim can help you get timely and rich operational intelligence from device and sensor data.

To learn more about the Striim platform, visit the platform overview page.


Unedited Transcript:

Hi, good afternoon everyone, or evening, or even, for some of you in the world, good morning, and thank you for attending my presentation.

You’ve probably heard a lot already today and yesterday about IoT devices, IoT device management, security, how you integrate blockchain, integration of machine learning, a whole bunch of different areas of IoT. In this presentation I’m going to focus on IoT data and IoT data management, and how you handle the large amounts of IoT data. So let me share the presentation, and you’ll see me again when we get to the questions. To start with, IoT is not a silo; it’s not something that should be treated separately from everything else. IoT is a part of enterprise data assets, so it needs to be treated as part of a connected ecosystem. The IoT data is part of your data assets as an enterprise. It goes with your enterprise data assets and anything you might have in the cloud, and you get the best value out of IoT data when it can be integrated with, correlated with, and joined with the other data assets that you have.

And this is crucial to digital transformation. If you want to start doing things like predictive maintenance or parts management, or you start thinking about cybersecurity or predictive monitoring and automated customer engagement, all of these different things require you to integrate the data you’re getting from IoT with data that you may have elsewhere in the enterprise. And this is true across every industry. There are innovations happening with IoT across so many different industries, and it’s going to affect everyone at some point. Getting the best value out of IoT really involves dealing with the IoT data. The challenge today is that there are so many different technologies. You have data that’s being generated through websites, applications, network traffic, and devices you might have in the enterprise. And then you have the new wave of IoT devices that deliver data to the cloud automatically. They can be very varied, and the data content can be very varied.

And so there are several different technologies that you may have already: some legacy databases, data lakes, messaging systems, ETL and batch systems, and all of these need to work together in order to fulfill business needs, to better serve customers, to deal with orders, and to ensure security. All these things need to work together in order to seamlessly provide the best value out of IoT data. The other thing is data volumes. Think about IoT: there was this study that came out from IDC a few months ago, and they basically stated that today we’re generating around 16 zettabytes of data annually, so every year another 16 zettabytes of data. By 2025, their estimate is that this will increase to 160 zettabytes of new data generated every year. Anyone with a math background will recognize that curve. It’s an exponential curve, and that means that roughly every two years we’re generating more data than has ever been generated in the whole history of mankind before that.

That’s a lot of data. Now, of that data today, around 5% of it is generated in a way that needs to be dealt with in real time, immediately. By 2025, they’re estimating that 25% of all data generated will be real-time in nature, so it will be generated and have to be dealt with in real time. And of that, 95% will be generated by IoT. Now, this is the kicker: you have all of this data being generated, but the yellow line on the chart is the available storage. Only a small percentage of all the data being generated can ever be stored. There are literally, physically, not enough hard drives, flash drives, magnetic tapes, or newfangled crystal structures to store all of the data that’s being generated. It’s just something that you cannot do. So if you can’t store all the data, in fact if you can only store a small fraction of it, you’re left with the conclusion that you have to process and analyze the data in memory, before it ever hits storage, in a streaming fashion. And this kind of streaming approach helps you move to digital transformation, where you can integrate and join data in real time, and you can do real-time alerting and monitoring. You can also use this for integration, so you can combine data together and land it somewhere, whether that’s in the cloud or file systems or databases, et cetera. But really it’s about moving to a unified technology that incorporates IoT data with the rest of your enterprise data in a scalable fashion, using an inherently streaming, in-memory architecture.
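As a sketch of that idea (illustrative Python only, not Striim’s API; the function name and record shape are made up), processing readings in memory means only per-window summaries ever reach storage:

```python
# Sketch: stream readings through memory and emit one summary per
# device per time window, so only aggregates are ever stored.
from collections import defaultdict

def aggregate_stream(readings, window=60):
    """Group (device_id, timestamp, value) readings into fixed windows
    and emit one min/max/avg summary per device per window."""
    windows = defaultdict(list)
    for device_id, ts, value in readings:
        windows[(device_id, ts // window)].append(value)
    for (device_id, w), values in sorted(windows.items()):
        yield {
            "device": device_id,
            "window_start": w * window,
            "min": min(values),
            "max": max(values),
            "avg": sum(values) / len(values),
            "count": len(values),
        }
```

A real streaming engine would do this incrementally over unbounded input; the point here is just that the raw points are discarded once the in-memory aggregate is computed.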

It’s not just IoT data that is enormous. People don’t normally think of things like security data as IoT, but security data can be enormous. Think about network devices, routers, right? You can have NetFlow data, which generates a lot of data, but if you’re doing packet capture on a network, that’s an immense amount of data. So if you think about not just IoT but other sources of data, you can see where these data volumes are coming from. You have to think about how you incorporate all of this together and answer questions like: how do you avoid losing or ignoring the value of that data while still only storing the minimum? And that kind of gets to the notion of the difference between data and information. Anyone with an information science background will know there’s a difference: large amounts of data can have very small information content.

If you think about measuring the temperature in the room, and you’re doing that once a second, that’s over three and a half thousand data points in an hour. But if the temperature stayed the same, 70 degrees, for that whole hour, how much information is that? That’s just one piece of information: it was 70 degrees for the whole hour. So, other than kind of keep-alive information from individual devices, from a storage perspective, if you want to do machine learning, or if you want to do further analytics or reporting, you don’t need the thousands of data points. You just need to know that between this time period, the temperature was 70 degrees. So you can start to think about how you convert the data content into information content. And that’s really what we mean by only storing the minimum: you’re storing the information.
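The temperature example can be sketched in a few lines (a hypothetical helper, not any product API): a run of identical readings collapses into one span, which is the information content.

```python
def compress_to_information(samples):
    """Collapse runs of identical (timestamp, value) readings into
    (start, end, value) spans, so an hour of '70 degrees' sampled once
    a second becomes one record instead of 3,600."""
    spans = []
    for ts, value in samples:
        if spans and spans[-1][2] == value:
            spans[-1][1] = ts          # extend the current run
        else:
            spans.append([ts, ts, value])
    return [tuple(s) for s in spans]
```

For a steady hour the output is a single `(0, 3599, 70)` span; the raw samples never need to be stored.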

And how do you also, at the same time, correlate events so that you can do immediate responses, so that you can react in real time? So, with our temperature scenario, if the temperature suddenly, over a period of less than a minute, increased to 200 degrees, there’s a good chance the room’s on fire. You can react immediately to that. As soon as you see the temperature start to rise rapidly, you need to be able to react immediately to it. And there are obviously much more critical situations that you have to respond to even faster. You know, you’re measuring the speed of a turbine; if it suddenly drops, you need to be able to react in milliseconds. So those are the things we’re talking about here: how do you work on the data so that you store only the minimum, while at the same time being able to react immediately and give proactive responses when the situation actually needs it?
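That rate-of-change check is simple to express. A minimal sketch (hypothetical function, illustrative thresholds): flag any jump bigger than a threshold within a short interval, which covers both the fire scenario and the turbine drop.

```python
def detect_rapid_change(samples, threshold, interval):
    """Flag consecutive (timestamp, value) readings where the value
    moved by at least `threshold` within `interval` seconds,
    e.g. a room temperature spiking toward 200 degrees."""
    alerts = []
    prev_ts, prev_val = None, None
    for ts, value in samples:
        if prev_ts is not None and ts - prev_ts <= interval \
                and abs(value - prev_val) >= threshold:
            alerts.append((ts, prev_val, value))
        prev_ts, prev_val = ts, value
    return alerts
```

In a streaming engine the same rule would run continuously, per device, instead of over a finished list.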

And you do this, obviously, for operational efficiency, and then also to better serve customers, protect your reputation, and be competitive. Innovation is really what drives the competitive nature of society, and if you can out-innovate people, you can come up with new ways of doing things. You can disrupt and you can change things, and that’s the way competition works. Now, it’s not just IoT data that is streaming. The reason I mention this is that you’re getting device data as data streams; we already talked about that. You need to get the data in a streaming fashion because you can’t store all of it, so you’re basically getting that data as data streams. Then, in order to make the best use of it, in order to correlate it with other stuff and make instant decisions based on correlations between IoT data and what’s going on in the enterprise, in databases, and what’s going on in machines that’s written to logs,

all of that needs to be streaming as well. It’s just a historical thing that we deal with the rest of that data in batches. If you think about why we have batch processing of data, it’s really because historically storage was cheaper than CPU and memory. In fact, having enough memory to do, say, a daily batch job was just out of the question, whereas you could store that data, process it from disk, and write back to disk. So the notion of batches is really an artifact of previous technology limitations. But if you have sufficient CPU and memory (and both CPU and memory are getting much, much cheaper), then you can start to think about, instead of doing queries against the database, doing change data capture and turning that into a data stream. Instead of processing machine logs at the end of the day, you read from the end of the log file and stream it in real time.
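Tailing a log file as a stream can be sketched like this (a simplified, hypothetical helper; real collectors also handle rotation and encodings). It reads from a remembered offset and emits only complete new lines as events, so re-polling it as the file grows replaces the end-of-day batch job:

```python
import io  # used below only for the in-memory demonstration

def tail_stream(fh, offset=0):
    """Read a log from `offset`, emit complete lines as events, and
    return the new offset so the caller can poll again as the file grows."""
    fh.seek(offset)
    events = []
    while True:
        line = fh.readline()
        if not line:
            break
        if line.endswith("\n"):
            events.append(line.rstrip("\n"))
            offset = fh.tell()
        else:
            break  # partial line: wait for the writer to finish it
    return events, offset
```

Change data capture on a database follows the same shape: remember a position in the change log and stream everything written after it.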

So that gives you the notion of all of your data in the enterprise, not just IoT data, being streaming, which then means that stream processing is a major infrastructure requirement. And by stream processing, I mean the in-memory processing and analytics of data, before it ever hits disk, in a structured way, in order to create some business value. This is part of an overall data modernization. We’re seeing that across a lot of our customers, and it’s definitely evident across the whole industry. There’s this notion that there’s a lot of value in all the data in legacy data stores, and legacy can mean three years old in some instances. There’s a lot of value in that data, and you can’t just rip out everything old and replace it with something new. For example, if you’re a manufacturing plant, manufacturing equipment doesn’t work on Internet time.

It’s not like the iPhone, where some people replace it every year. It’s something that maybe has a 10- or 20-year lifespan, and you can’t just rip all of that out and replace it with something new because it sounds like a good idea or because it’s trending; it has to be done in a methodical way. And typically, honestly, these things age. So how do you access that data, work with it as part of your IoT assets, and incorporate it with some of the newer data you have? You can have an older manufacturing plant that has robots and things in it, and maybe that data is being written to a historian, which is just a database, and now you can stream that data out by doing change data capture on that database. And then you have live data from that.

And maybe you want to augment that by slapping new sensors onto the motors on the robots that will measure vibration and temperature and movement and all of these things, and correlate the old data with the new data in order to make sure the robot is behaving properly. So when you think about data modernization, it’s not about replacing old castles with shiny new skyscrapers. It’s about how you incorporate the new technologies with the old technologies. And some of the things we hear from our customers: the things that they have today, the legacy systems, either can’t keep up, or they expect that in the future the data volumes are going to massively increase and they need to be prepared for that and able to deal with it. Or the latency in their applications is too high.

Maybe they’re writing everything to a database and doing batch processing on it afterwards, which is going to give them, you know, maybe two- or three-hour time frames before they know something, and they want to move to more like five- or ten-second time frames. And then in addition to that, there’s a lot of pressure on the people within organizations that do analytics to produce new applications and new ways of viewing things. A lot of this comes down to our exposure as consumers to things like smartphones. We are now used to having information at our fingertips, being able to receive instant messages and instant video and instant news. Everything is just there. And the interfaces, the ways that we access these things, are also very easy. We have very easy access to real-time information. As consumers we’re used to that, but when you move to an enterprise scenario and you want the same out of your business systems or your manufacturing systems or whatever, it’s much harder to get, because things maybe weren’t architected that way.

And when you’re asked to produce new outputs, new applications, new views, new reports on that, you need to be nimble as an analyst, or an analytics department, or a data science department, to be able to keep up with all the demands you get. So we need to think about architectures that can handle all of those things without necessarily ripping and replacing all the existing systems. And thinking about manufacturing, for example: there was a very hierarchical view of how manufacturing looks, all the way from the individual sensors and devices [inaudible] kind of control levels, then manufacturing execution systems, all the way through to ERP, which basically manages the supply chain. And that kind of hierarchical, rather rigid architecture, with specific things at specific places and processing done in particular areas, is being replaced with a more general architecture that has devices on one end and applications on the other end, with processing and analytics and storage happening wherever it makes sense.

And this may be termed the fog; you may call some pieces of it fog servers. It’s really however you want to term it. But at the end of the day, it’s an architecture that incorporates the processing, analytics, storage, and movement of data in a very flexible way that enables lots of different types of applications to be produced, that can minimize the amount of data that you need to store, so you can manage huge amounts of data, and that can also react immediately. And that kind of smart data architecture can be quite varied, and I’ll give you some different flavors of ways of piecing together this architecture that can solve different scenarios. So you have to think about the requirements: you’ve got to collect and process data, perform analytics, be able to take actions on the data and on the results, visualize results, and do this at the edge, in the cloud, on premise, and wherever it makes sense for a particular application. Let’s start with the simplest kind of IoT architecture, and this is what a lot of people are going to think about with IoT: you have devices, they send data to the cloud, and that’s where you do your processing and analysis. And you have a lot of examples of that. But there are also a lot of issues with that architecture. First off, not every device you’re interested in is Internet-enabled; it just doesn’t speak Internet, right? There are a lot of legacy devices that you want to be able to connect into this type of architecture. But how do you manage that? How do you turn them into Internet-enabled things?

The other thing is, maybe you need to react fast. You’re not going to be able to react fast if all of your processing is in the cloud. So two of the things we’re trying to solve here aren’t solved by this architecture.

We can solve that, and there’s a lot of discussion and momentum around protocol translation gateways. A protocol translation gateway is like C-3PO from Star Wars, who knows how to talk to both humans and robots like R2-D2, and also the moisture vaporators working out in the desert. So you need to be able to talk to your manufacturing equipment, old-school things that are wired into boxes speaking BACnet and Modbus and Zigbee and some of those other protocols, and integrate that with the new devices that talk MQTT or AMQP over TCP. So the protocol translation gateways work with that. Microsoft has one with IoT Edge. There’s work going on within a consortium organized by Dell, the EdgeX Foundry, whose goal is to produce a modular protocol translation gateway. But it’s not enough just to do the protocol translation at the edge, because if you’re just doing that and you’re sending everything up into the cloud, you’re still missing out on things.

You’re not doing your analysis quickly enough to be able to respond rapidly. You don’t have this flexible architecture, and you’re sending huge amounts of data over network infrastructure that you maybe don’t even need to. So that’s where you start to have to think about doing edge processing and analytics within the gateway as well. And so you can start to do things like change detection. Instead of sending all the data into the cloud, you only send the data when it changes, with, you know, pings and heartbeats to let you know the device is still alive. And so instead of sending the three and a half thousand data points an hour from your thermostat, or the data points a second from your turbine, you just send it when it changes, plus the occasional heartbeat to say that it’s still alive. So you can do that kind of thing at the edge.

And you can also do analytics. You can look for unusual patterns or anomalies, then alert on that, and maybe respond and change and modify devices, stop things from happening, sound alarms, that kind of thing, and do that at the edge as well. So you can get this really fast response. And that type of thing often involves machine learning. So, a model that you can use here is: you take not the data but the information content, the results of the processing and analytics, and you move that into the cloud. You do that across a large number of areas. Now the cloud is accumulating information, not the raw data but the information, which can be used to perform machine learning. And once you’ve performed the machine learning, you’ve built some model, and you’ve tested that model and it looks like it’s valid, you can then move that model to the edge.

With the models at the edge, they can be used to make predictions in real time, close to where the data is being generated. They can also be used for anomaly detection, to look for unusual behavior, and a whole bunch of other things that you can imagine machine learning could be useful for. So this is kind of a general reference architecture for how you would incorporate machine learning into IoT. And obviously that can then be scaled up. You can add multiple additional nodes within a particular site, you can add additional sites, you can accumulate all of that information in the cloud, perform the machine learning, export the models, and do your edge analytics. So now you’ve done a lot of the things we talked about: you’ve reduced the amount of data down, because you’re sending only information to the cloud, and you’re also able to do processing and analytics at the edge, which enables you to respond rapidly to issues,
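The train-in-the-cloud, score-at-the-edge loop can be sketched very simply (hypothetical functions; the "model" here is just a mean/standard-deviation baseline standing in for whatever you actually train):

```python
import json
import math

def train_model(values):
    """'Cloud' side: learn a simple normal-behavior baseline from
    accumulated information and export it as portable JSON."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return json.dumps({"mean": mean, "std": math.sqrt(var)})

def score_at_edge(model_json, value, z_limit=3.0):
    """'Edge' side: load the exported model and flag anomalies in real
    time, close to where the data is being generated."""
    m = json.loads(model_json)
    if m["std"] == 0:
        return value != m["mean"]
    return abs(value - m["mean"]) / m["std"] > z_limit
```

The serialized model is what moves between the two tiers; retraining just means re-running `train_model` on the newly accumulated information and shipping a fresh JSON blob to the edge.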

send alerts, and maybe send the results into the cloud as well, so you can have a visual view of everything that’s happening across all of your sites. So this kind of architecture works really well for a lot of different use cases where you have information in a lot of different sites that you need to be able to view in a single place. But in a number of different scenarios, you can imagine that you also want to add in on-site processing and analytics. And this could be for a number of reasons. It could be because you need to be able to manage an individual facility or an individual production line, and do that separately from a lot of other things. That requires not just very specialized algorithms and specialized views into things; you also need to be able to react more quickly than if the data had to make it all the way into the cloud.

And you may want to repeat that across multiple sites that all have their own autonomy. But then you also have other situations. Imagine this isn’t a factory anymore; instead it’s healthcare. In a healthcare scenario, you may not be allowed to send a lot of this data into the cloud in the form that it is on premise. You may have patient data in there that is protected by HIPAA and other regulations, and it may not be permissible to send that into a cloud. But what you may be able to do is anonymize that data. And so if, within a hospital, you can view everything that’s happening, based on this architecture with kind of localized processing and analytics for an individual hospital, but the data that is sent up into the cloud is anonymized, with the identifying patient information removed, then you can additionally do machine learning in the cloud.
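A minimal sketch of that anonymization step (illustrative only; the field list is hypothetical, and a salted hash alone is not full HIPAA de-identification, just the basic mechanism): strip direct identifiers and replace the patient ID with a salted hash so cloud-side records can still be correlated per patient.

```python
import hashlib

PHI_FIELDS = {"name", "address", "date_of_birth"}  # illustrative field list

def anonymize(record, salt="site-secret"):
    """Drop direct identifiers and replace the patient ID with a salted
    hash, so records stay correlatable in the cloud without revealing
    who the patient is."""
    out = {k: v for k, v in record.items() if k not in PHI_FIELDS}
    out["patient_id"] = hashlib.sha256(
        (salt + str(record["patient_id"])).encode()
    ).hexdigest()[:16]
    return out
```

Because the hash is deterministic per site, all of one patient’s measurements map to the same opaque ID, which is what lets the cloud-side machine learning look for trends across readings.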

You can also view, across a whole bunch of different hospitals, status and where you are with things like inventory and other stuff. But what you can do with the machine learning is maybe look at trends and look for correlations between information about patients, in an anonymous fashion, maybe looking at combinations of different measurements on a patient and what their symptoms are, and look for relationships between those that you may not have spotted before, and then apply that machine learning to help patients. So you can start to think of a healthcare scenario using anonymized data produced by a huge number of hospitals that will help you save patients’ lives, because you can now learn things about patients that you wouldn’t have been able to before.

So if you have an architecture like this, it’s enabling you to connect to anything, because you have the Internet-enabled IoT devices, which you can already talk to using MQTT, AMQP, TCP, HTTP, whatever protocol they’re using. But you also have the protocol translation gateway, with modules that allow you to speak BACnet, Modbus, OPC UA, and all these other things, so it talks to more and more devices. It reacts immediately, because you’re doing the edge processing and analytics, and it also limits the data sent to the cloud, turning that data into information before it ever makes it there, to make everything much more efficient.

It obviously will scale as required, because you are moving a lot of the processing to the edge devices, and edge devices can inherently scale by adding more of them. You can add more on-site processing and analytics as necessary, and scale out in the cloud; that’s what the cloud is designed to do. And you have the ability to control everything centrally through a cloud interface, so you can not just visualize everything but work with everything centrally through a cloud-side interface. IoT, as I mentioned, is not a separate thing. So to get the full picture, you have to join together not just the IoT data; you have to also bring in other enterprise data. So you start bringing in things like finance information, customer information, supply chain, inventory, all of those things. And the biggest value that you’re going to get from IoT is when you can join that data with your IoT data.

So a very good example would be: you have a sensor on a motor, and that sensor is just sending out data, say over MQTT, and the data it’s sending is device ID XYZ, value three. That doesn’t mean much from an analytics perspective. You know it’s a sensor, you know it’s sending out value three, but you’re not quite sure what that means, or what that sensor is, or where it is. But if you can join that sensor information with information that you have in your asset database, with information you have in your ERP system, then you find that this is a sensor measuring vibration levels on a motor that was bought three years ago, with a warranty of three years, and with an average lifespan, that you’ve seen in your dealings with this type of motor, of 3.2 years. That’s going to start to give you much more information for your analytics, and that information wasn’t present in the IoT side of the world; it was present in the other enterprise assets that you have. Now, that’s just a very simple example, but I’m sure you can think of hundreds more examples where, by augmenting IoT data with other data, it becomes so much more valuable.
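That enrichment join can be sketched in a couple of lines (the asset records and field names below are invented for illustration, not a real schema):

```python
# Hypothetical asset/ERP lookup keyed by device ID.
ASSET_DB = {
    "XYZ": {"measures": "vibration", "asset": "motor-7",
            "purchased": "2014-06-01", "warranty_years": 3,
            "avg_lifespan_years": 3.2},
}

def enrich(event, asset_db=ASSET_DB):
    """Join a bare MQTT-style reading ('device XYZ, value 3') with asset
    context so analytics can reason about warranty and expected lifespan."""
    context = asset_db.get(event["device_id"], {})
    return {**event, **context}
```

In a streaming platform this lookup would typically be an in-memory cache of the asset database, refreshed by change data capture, so every event is enriched without a per-event database query.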

I mentioned machine learning. The way that you really integrate machine learning is that you need to be able to work with the data, first of all. A lot of the machine learning algorithms are kind of sensitive and picky about the format of the data that you give them to train on; they tend to work best when you present all variables together in a single row, so that you can train the model based on the correlations and relationships between the variables. Then there are algorithms that work with time-series data, receiving events over time. Those are more complex, but the key thing with machine learning is that you need to process and prepare data and get it ready for the learning first. And what you want to try to do is make sure the same data preparation that you use for machine learning, to train the model, can in future be used to perform real-time scoring.

Inference, predictions, anomaly detection, classification, or whatever it is you’re doing can be done on the same data, right? So if you take a look at this flow: you take the data, wherever it’s coming from (we have IoT and maybe some enterprise data), join that together, prepare it, get it into the right format, write that out to files, and then have the data scientists run their magic against the files and build a machine learning model. Obviously you test that, validate it against the data, ensure that it’s correct, export that model, and then utilize it within a streaming architecture. So now the same data that is being accumulated for training can be passed into the in-memory streaming path, pushed through the model, and you can see what the model tells you about it. And that would enable you to then, immediately, on real-time data, see if an anomaly was occurring, or predict when something was going to fail, or classify something in some way based on the data you’re getting; maybe it’s image classification. So all these things can be done if you have this kind of architecture that enables you to train the machine learning model and to do real-time scoring. And of course, if you’re seeing that the model isn’t working well, because you continually accumulate the training files, you can then also retrain the model, re-export it, and send it back into the streaming side, and now work with a new, updated model. So that makes the whole process much easier.
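The key discipline in that flow, one preparation step shared by training and scoring, can be sketched like this (hypothetical field names; the "model" is a toy linear threshold standing in for whatever gets exported):

```python
def prepare_row(iot_event, enterprise_record):
    """One preparation step used for BOTH training and real-time scoring:
    present all variables together in a single, consistently ordered row."""
    return [
        iot_event["vibration"],
        iot_event["temperature"],
        enterprise_record["age_years"],
    ]

def score(model, row):
    """Toy stand-in for an exported model: a linear threshold rule."""
    weights, threshold = model
    return sum(w * x for w, x in zip(weights, row)) > threshold
```

Because training files and live events both pass through `prepare_row`, the exported model always sees features in the same shape and order, which is exactly why reusing the preparation pipeline matters.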

The other thing that I mentioned, and this is something that we see customers doing: if you have on-premise manufacturing equipment or other equipment that is not IoT, it’s just things, but they’re not already hooked up to the Internet, maybe they’re writing data into a historian database. Then, by utilizing change data capture, and doing that within the framework of this architecture that I’ve just talked about, you can treat historian data as if it were real-time IoT data being sensed in any other way. So it’s a way of modernizing your existing investments in manufacturing, or in other equipment that writes to a database (medical equipment that writes to a database, for example). By using change data capture and streaming what’s being written to the database in real time, you now have a stream of your IoT data as well. Now let’s talk about some use cases of these architectures. One obvious one is actually cybersecurity.
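Conceptually, change data capture against a historian can be sketched as remembering a position and emitting only rows written since the last poll (a deliberately simplified model; real CDC reads the database’s transaction log rather than polling, and `seq` is an invented column):

```python
def capture_changes(table, last_seq):
    """Minimal change-data-capture sketch: given a historian table
    (a list of dicts with a monotonically increasing 'seq' column),
    emit only rows written since the last poll, as a stream."""
    new_rows = [row for row in table if row["seq"] > last_seq]
    new_last = max((row["seq"] for row in new_rows), default=last_seq)
    return new_rows, new_last
```

Calling this repeatedly with the returned position turns a write-only historian into a live event stream without touching the equipment that writes to it.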

A lot of new approaches to cybersecurity do incorporate machine learning. It’s about looking at what is normal behavior, classifying that as normal behavior, and then being able to spot unusual behavior really quickly. And the best way of doing that is by collecting and correlating as much data as you can. It’s easier to fake a single piece of data than it is to fake a whole collection of data from a lot of different sources. Think about Stuxnet. Stuxnet was a virus that infected the control systems of centrifuges that were making nuclear material, and it did two things. In a simplistic description, it made the centrifuges spin faster so they would destroy themselves, and it also told the people monitoring the centrifuges that everything was okay, so it gave the centrifuges time to destroy themselves. Now, if you were also measuring a whole bunch of other information about those things, like vibration levels and power draw, maybe video imagery of them, and correlating that with what you were being told, you would see that what you’d been told isn’t actually what’s happening. And so by utilizing machine learning and incorporating IoT data with other data that you may have (maybe network data, seeing if there’s been a security breach there), all of this gets you a much bigger picture of the security around IoT. This works for enterprise security as well, but in the context of IoT security it is essential to correlate as much information as possible.

Another example where machine learning can be very important is in production quality, where you collect and analyze device data and you predict with machine learning what you expect the quality of the end result to be, based on what you’ve seen in the past. And so, instead of a widget making it all the way down the production line only to go in the reject box, you may spot very early on, based on what you’ve learned in your machine learning models, that if the parameters of the widget are outside a certain range at this point in the production line, it’s going to fail anyway. So by incorporating machine learning and IoT data, and the architecture that I mentioned with edge analytics based on machine learning, you can start to get a better picture of production quality and its issues, with a massive ROI, because you’re not wasting anywhere near as much raw material, or as many of the pieces that you’re building and moving down the production line.

It's a similar story with healthcare monitoring. It's one thing to be able to measure certain aspects and parameters of a patient, but it's far more powerful if you can join that with data from multiple medical devices and with patient data, potentially utilizing machine learning on device data, to enable you to look for anomalies or potential issues. For example, if you had additional information about a diabetes patient in addition to glucose measurements, when they last ate, whether they've been out for a walk recently, what their heart rate was, and you could combine and correlate all of this together and compare it against a model, you would be able to tell immediately whether that patient was at risk. So by joining together lots of different patient information and running it through models, you can have immediate insight into patients and react immediately. And if you can somehow get everyone to collaborate, you can get large-scale data.
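As a purely illustrative sketch of the correlation idea (the thresholds and the simple rule below are invented for this example and are in no way medical guidance), contextual signals can suppress alerts that a single measurement alone would have triggered:

```python
# Illustrative only: decide whether an elevated glucose reading should
# raise an alert, given contextual signals. All thresholds are invented.

def glucose_alert(glucose_mg_dl, minutes_since_meal, recent_exercise):
    """Alert only when an elevated reading isn't explained by context."""
    if glucose_mg_dl < 180:       # illustrative "elevated" threshold
        return False
    if minutes_since_meal < 90:   # a post-meal spike is expected
        return False
    if recent_exercise:           # recent activity can perturb readings
        return False
    return True

print(glucose_alert(210, 30, False))   # -> False (recent meal explains it)
print(glucose_alert(210, 240, False))  # -> True  (unexplained elevation)
```

A real system would replace these hand-written rules with a model trained on correlated patient data, but the structure is the point: more joined-up context means fewer false alarms and faster detection of genuine risk.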

Again, anonymized, you can look at trends and so spot outbreaks and things like that very quickly. A final example, again using a similar architecture, is location tracking. Location is a really important aspect of IoT, and it's also a really important reason why you need stream processing: things move around a lot, they move around quickly, and you want to be able to deal with them quickly. We have an example with a partner building an airport monitoring system. That involves monitoring the locations of thousands of passengers and staff in real time, identifying multiple different zones, and looking for when people walk in and out of zones and how long they were waiting there. The purpose of this was that if too many people were waiting in one particular area, maybe in ticketing, then you could automate sending more staff to that area to deal with them. And if you saw people walk into an area they weren't supposed to go to, into a secure area, then you could spot that immediately and send someone there as well.
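The zone-monitoring logic described above can be sketched as a simple event handler: keep a live count of people per zone from a stream of enter/exit events, and raise an alert on overcrowding or on entry into a restricted zone. The zone names and capacity limits here are made up for illustration.

```python
# Simplified sketch of zone monitoring over a stream of location events.
# Zone names, limits, and the restricted set are illustrative.

from collections import defaultdict

ZONE_LIMITS = {"ticketing": 3}   # max occupancy before sending staff
RESTRICTED = {"secure_area"}     # zones the public must not enter

occupancy = defaultdict(int)

def handle_event(person_id, zone, action):
    """Process one enter/exit event; return an alert string or None."""
    occupancy[zone] += 1 if action == "enter" else -1
    if zone in RESTRICTED and action == "enter":
        return f"ALERT: {person_id} entered restricted zone {zone}"
    if occupancy[zone] > ZONE_LIMITS.get(zone, float("inf")):
        return f"ALERT: {zone} over capacity ({occupancy[zone]} people)"
    return None

for pid in ["p1", "p2", "p3", "p4"]:
    alert = handle_event(pid, "ticketing", "enter")
    if alert:
        print(alert)  # fires on the fourth entry
```

In a production system this state would live in a distributed stream processor rather than a single dictionary, but the per-event logic is the same.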

But location tracking is generally a useful thing, and a lot of people are used to real time when it comes to location. If you're driving, you're probably used to using Waze, which gives you real-time insights into accidents around you. Real time is essential, and streaming is essential, when it comes to location monitoring of any kind, and it has benefits in manufacturing, healthcare, retail, aviation, et cetera. A question I like to ask people is: what do airports and hospitals have in common? The answer is problems with wheelchairs. Not knowing where essential equipment like a wheelchair is at all times can delay planes and can delay patient treatment in hospitals. So being able to monitor these things in real time is essential. Location tracking, I think, will see a lot more of this type of application as real-time analytics and processing really starts to pick up steam.

So just a little bit about us before I finish off the presentation and open it up for questions. The Striim platform is a complete end-to-end platform that supports streaming integration and analytics across enterprise data and IoT. We have a flexible architecture that allows you to deploy data flows across on-premise and the cloud, with processing at the edge. We do continuous data collection, not just from IoT devices, sensors, and message queues, but also from files, by reading them as they are written, and from databases via change data capture. And then we allow you to do stream processing in memory, with real-time filtering, transformation, aggregation, and enrichment, and do all of this through a SQL-based language that makes it easy for anyone to work with.
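Striim's own SQL-based language isn't reproduced here; instead, as a generic Python sketch of the same in-flight processing idea, the following filters a stream of readings and aggregates them over fixed-size tumbling windows. The field values and window size are invented for illustration.

```python
# Generic sketch of in-flight stream processing: filter out invalid
# readings, then emit the average of each consecutive fixed-size window.
# Values and window size are illustrative, not from any real feed.

def windowed_averages(readings, window_size=3, min_valid=0.0):
    """Yield the mean of each full tumbling window of valid readings."""
    valid = [r for r in readings if r >= min_valid]   # in-flight filter
    for i in range(0, len(valid) - window_size + 1, window_size):
        window = valid[i:i + window_size]
        yield sum(window) / window_size               # aggregation

stream = [21.0, -999.0, 22.0, 23.0, 24.0, 25.0, 26.0]  # -999.0: sensor error
print(list(windowed_averages(stream)))  # -> [22.0, 25.0]
```

The point of doing this in flight is volume reduction: only the aggregates, not every raw reading, need to be stored or acted upon downstream.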

You can also do streaming analytics: correlation of multiple data streams, complex event processing, statistical analysis, and integration of machine learning models. You can then build visualizations and alerts, trigger external systems, and deliver the results of the collection, the processing, and the analytics anywhere: to databases, files, Hadoop/NoSQL, Kafka, cloud, etc. We integrate with a lot of existing enterprise software, open source as well as proprietary things like databases. And we do all of this in an enterprise-grade fashion: the platform is inherently clustered, distributed, scalable, reliable, and secure. So we are a streaming integration and analytics platform that supports IoT, and we specifically support the integration of IoT data with all the rest of the data that you need to give value to it. When you're building applications on our platform, you do so with data flows: you start with sources, you do processing through SQL, and you end up triggering something or writing data somewhere.

You can also build dashboards, live streaming visualizations, by dragging and dropping visualizations that give you a view into the back-end processing you're doing. So you can see how you can build IoT applications really easily: starting off with IoT data sources, MQTT, AMQP, whatever you have already. We have examples of working with devices, working with model factories and things like that, which are easy to get your hands around: delivering device data over MQTT, processing that data, building a whole dashboard and visualization around it, and even controlling the devices. We integrate with a whole bunch of stuff; this is an eye chart, and you're welcome to come and look at it in the downloads of the presentation and the videos.

And we really differentiate ourselves by being an end-to-end platform, with everything that you need, so you can start evaluating your data immediately. It's easy to use, so you can build applications really quickly using our platform, and it's easy to deploy them as well, without any coding necessary. It is an enterprise-grade platform, so it's inherently distributed, scalable, secure, and reliable. And we solve a lot of the integration aspects: we work with most of the data sources that you have. If you want to know more about us, here are all the links that you need. And with that, I'll open it up to any questions. I don't see any in the chat window right now, but if you don't have any questions I can always ask myself some. We'll just wait a short while and see if anyone comes up with anything. Let me just go back to the webcam.

There we go. And let’s just wait and see if anyone’s going to ask any questions. So, um, while people are thinking of really intelligent questions to ask me, um, I’ll ask myself the question. So yeah, all the, you know, lots of Iot platforms out there already. That was another open source iot platform price crediting platforms, um, that gets the data. Well, yes sir. All right. Um, and I’ll say as I mentioned in the presentation tread say time and time again, uh, a lot of the existing iot platforms focus on Iot data and just being Elsa access collect, they’ve deliver the Iot data to the cloud. Um, or if you’re lucky, being able to do some processing and analytics on that Iot data. Now it’s you want to get value out of the Iot data. Typically you’re going to need to join that with other data as well.

And a lot of that other data is locked up in other places within the enterprise, and you need specific ways of getting to it, whether that's reading from files, parsing logs, accessing network information, or getting data from databases in real time. You need specific tools to be able to do that. So what we've done in our platform is to look at IoT as another source of data, and yes, be able to do things like edge processing. But edge processing is also useful for security data, not just IoT data; that whole architecture is useful for a lot of different scenarios, not necessarily just IoT data, but any situation where you have huge volumes of data coming through. So I think I've exhausted myself with questions; I could ask myself a lot more, but I'm not going to do that. I'll give you guys a break and a little bit of your time back. I think we probably have 15 minutes left, so you get 15 minutes more of your day. And unless anyone has any last-minute things they want to ask me... five, four, three, two, one. I will end it there. Thank you.