Kinesis Data Generation

Allan MacInnis shows off a new data generation tool for Amazon Kinesis:

Amazon Kinesis Streams and Amazon Kinesis Firehose enable you to continuously capture and store terabytes of data per hour from hundreds of thousands of sources. Amazon Kinesis Analytics gives you the ability to use standard SQL to analyze and aggregate this data in real-time. It’s easy to create an Amazon Kinesis stream or Firehose delivery stream with just a few clicks in the AWS Management Console (or a few commands using the AWS CLI or Amazon Kinesis API). However, to generate a continuous stream of test data, you must write a custom process or script that runs continuously, using the AWS SDK or CLI to send test records to Amazon Kinesis. Although this task is necessary to adequately test your solution, it means more complexity and longer development and testing times.

Wouldn’t it be great if there were a user-friendly tool to generate test data and send it to Amazon Kinesis? Well, now there is—the Amazon Kinesis Data Generator (KDG).
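For context, the "custom process or script" the quote refers to is often just a small loop over the AWS SDK. Here is a minimal sketch using boto3's put_record call; the stream name, region, and payload shape are hypothetical placeholders, not anything from the KDG itself:

```python
import json
import random
import time
import uuid

import boto3  # AWS SDK for Python

# Hypothetical stream name and region; adjust to match your own setup.
STREAM_NAME = "test-stream"

kinesis = boto3.client("kinesis", region_name="us-east-1")


def fake_record():
    """Build a small JSON payload resembling a sensor reading."""
    return {
        "sensor_id": str(uuid.uuid4()),
        "temperature": round(random.uniform(15.0, 35.0), 2),
        "timestamp": int(time.time() * 1000),
    }


while True:
    record = fake_record()
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=record["sensor_id"],  # spreads records across shards
    )
    time.sleep(0.1)  # roughly 10 records per second
```

Writing, running, and babysitting something like this for every load test is exactly the overhead the KDG is meant to remove.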

Check it out if you’re using Kinesis and need to do some load testing.
