What is streaming data? Event stream processing explained
Streaming data, also called event stream processing, is usually discussed in the context of big data. It is data that is generated continuously, often by thousands of data sources, such as sensors or server logs. Streaming data records are often small, perhaps a few kilobytes each, but there are many of them, and in many cases the stream goes on and on without ever stopping.
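To make that shape concrete, here is a minimal Python sketch of an unbounded stream of small sensor records. The sensor IDs, fields, and one-second cadence are illustrative assumptions, not part of any particular product; a real source would be actual sensors, server logs, or a message broker.

```python
import random
import time
from datetime import datetime, timezone

def sensor_event_stream(sensor_ids):
    """Yield small event records indefinitely, one per sensor reading.

    This simulates the shape of streaming data: many small records,
    produced continuously, with no natural end.
    """
    while True:                               # the stream never stops
        for sensor_id in sensor_ids:
            yield {
                "sensor_id": sensor_id,
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "temperature_c": round(random.uniform(15.0, 30.0), 2),
            }
        time.sleep(1)                         # new readings arrive every second

# Consume the stream record by record; this loop runs until interrupted.
for event in sensor_event_stream(["sensor-1", "sensor-2"]):
    print(event)
```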
Historical data, on the other hand, normally goes through a batch ETL (extract, transform, and load) process before going into an analysis database, such as a data warehouse, data lake, or data lakehouse. That’s fine if you’re not in a hurry. Streaming data, by contrast, usually needs to be processed quickly so that you can act on the results in as close to real time as possible.
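The sketch below contrasts the two approaches under some simplifying assumptions: the batch side reads a completed daily log file (assumed to be CSV with the temperature in the third column) and loads one summary row, while the streaming side updates a running result and reacts to each event as it arrives, for example an event dict produced by the generator above. The print calls stand in for a real warehouse load, alerting system, or dashboard update.

```python
import statistics

def load_into_warehouse(row):
    # Placeholder for a real warehouse load (e.g., an INSERT or bulk COPY).
    print("loaded:", row)

def batch_etl(daily_log_path):
    """Batch ETL: wait for the whole file, then transform and load once."""
    with open(daily_log_path) as f:
        # Assumed layout: sensor_id,timestamp,temperature_c
        temps = [float(line.split(",")[2]) for line in f if line.strip()]
    load_into_warehouse({"metric": "avg_temp_c", "value": statistics.mean(temps)})

def process_stream(events):
    """Streaming: update results and act on each event as it arrives."""
    count, total = 0, 0.0
    for event in events:                      # the stream may never end
        count += 1
        total += event["temperature_c"]
        if event["temperature_c"] > 28.0:     # act in near real time
            print("alert:", event)
        print("running avg:", round(total / count, 2))
```

With batch ETL, results only exist after the whole batch has been extracted, transformed, and loaded; with the streaming loop, each record updates the result and can trigger an action the moment it arrives.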