AutoMQ x RisingWave: Build Event-driven Data Stack with Kafka Ecosystem

May 29, 2024
AutoMQ Team
3 min read

RisingWave is a distributed streaming database that offers a standard SQL interface and is fully compatible with the PostgreSQL ecosystem, so it integrates into existing stacks without code changes. RisingWave treats streams as tables, letting users run complex queries over both streaming and historical data. With RisingWave, users can concentrate on query logic without having to learn Java or the low-level APIs of different systems.

This article will detail the process of importing data from AutoMQ into the RisingWave database using RisingWave Cloud.

Prepare AutoMQ and test data

Follow the Stand-alone Deployment guide to deploy AutoMQ, ensuring network connectivity between AutoMQ and RisingWave.

Quickly create a topic named example_topic in AutoMQ and write a test JSON message by following the steps below.

Create Topic

Use the Apache Kafka command-line tools to create the topic, making sure you have access to a Kafka environment and that the Kafka service is running. Here is an example command to create the topic:

bash
./kafka-topics.sh --create --topic example_topic --bootstrap-server 10.0.96.4:9092 --partitions 1 --replication-factor 1

Once the topic has been created, use the following command to confirm its successful creation.

bash
./kafka-topics.sh --describe --topic example_topic --bootstrap-server 10.0.96.4:9092

Generating test data

Generate JSON-formatted test data containing the fields that will later be mapped to columns in RisingWave.

json
{ "id": 1, "name": "testuser", "timestamp": "2023-11-10T12:00:00", "status": "active" }

Writing test data

Test data can be written to a topic named "example_topic" using Kafka's command-line tools or programmatically. Here's an example using command-line tools:

bash
echo '{"id": 1, "name": "testuser", "timestamp": "2023-11-10T12:00:00", "status": "active"}' | ./kafka-console-producer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic
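For the programmatic route, the sketch below builds the same JSON record and sends it with the kafka-python client. The broker address, topic name, and the `build_record`/`send_record` helpers are assumptions for illustration; the client works against AutoMQ because AutoMQ speaks the Kafka protocol.

```python
import json

# Broker address and topic from the example above (adjust for your deployment).
BOOTSTRAP_SERVERS = "10.0.96.4:9092"
TOPIC = "example_topic"


def build_record(user_id, name, timestamp, status):
    """Serialize one test record into the JSON layout used in this article."""
    return json.dumps({
        "id": user_id,
        "name": name,
        "timestamp": timestamp,
        "status": status,
    }).encode("utf-8")


def send_record(payload):
    """Send one record to the topic. Requires: pip install kafka-python"""
    from kafka import KafkaProducer

    producer = KafkaProducer(bootstrap_servers=BOOTSTRAP_SERVERS)
    producer.send(TOPIC, payload)
    producer.flush()
    producer.close()


# Example call (needs a reachable broker):
# send_record(build_record(1, "testuser", "2023-11-10T12:00:00", "active"))
```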

To view the data recently written to the topic, use the following command:

bash
./kafka-console-consumer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic --from-beginning

Create an AutoMQ source on RisingWave Cloud

  1. Navigate to Clusters on RisingWave Cloud to create a cluster.

  2. Go to Source on RisingWave Cloud to create a source.

  3. Specify the cluster and database, and log into the database.

  4. AutoMQ is fully compatible with Apache Kafka®, so just click on Create source and choose Kafka.

  5. Follow the guide on RisingWave Cloud to configure the connector, set up source information, and define the schema.

  6. Review the generated SQL statement, click Confirm to finalize the source creation.
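The statement RisingWave Cloud generates will look roughly like the sketch below. The source name, column types, and broker address here are assumptions based on the test record from earlier; your generated SQL will reflect the values you entered in the guide.

```sql
CREATE SOURCE your_source_name (
  id INT,
  name VARCHAR,
  "timestamp" TIMESTAMP,
  status VARCHAR
) WITH (
  connector = 'kafka',
  topic = 'example_topic',
  properties.bootstrap.server = '10.0.96.4:9092',
  scan.startup.mode = 'earliest'
) FORMAT PLAIN ENCODE JSON;
```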

Query data

  1. Go to RisingWave Cloud Console and log into the cluster.

  2. Run the following SQL statement to access the imported data, replacing the variable `your_source_name` with the custom name specified when creating the source.

sql
SELECT * FROM {your_source_name} LIMIT 1;
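Because RisingWave treats streams as tables, you can also define a continuously maintained query over the source. The sketch below is a hypothetical example (the view name, filter, and `your_source_name` are assumptions) showing a materialized view that tracks only active users as new events arrive:

```sql
CREATE MATERIALIZED VIEW active_users AS
SELECT id, name, "timestamp"
FROM your_source_name
WHERE status = 'active';
```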
