Table Topic Integration with AWS S3 Table
AutoMQ Table Topic integrates with Apache Iceberg to enable streaming data lake analysis and querying, eliminating the need to configure and maintain ETL pipelines. This article describes how to configure Table Topic and integrate it with an AWS S3 Table Bucket in an AWS environment.
Prerequisites
To use AutoMQ Table Topic features in an AWS environment, the following conditions must be met:
- Version Constraints: The AutoMQ instance version must be >= 1.4.1.
- Instance Constraints: The Table Topic feature must be enabled when the AutoMQ instance is created; it cannot be enabled after the instance has been created.
- Resource Requirements: On AWS, you can use either AWS Glue or an AWS S3 Table Bucket as the Data Catalog.
Operating Steps
Step 1: Create S3 Table Bucket
Integrating AutoMQ with S3 Table Bucket requires pre-creation of a Table Bucket in the AWS S3 console. Select the same deployment region as AutoMQ to create a Table Bucket for configuration in Step 2.
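If you prefer the command line, the Table Bucket can also be created with the AWS CLI. This is a sketch assuming AWS CLI v2 with the `s3tables` commands available; the bucket name and region below are placeholders, and the region must match the AutoMQ deployment region:

```shell
# Create an S3 Table Bucket in the same region as the AutoMQ deployment.
# "my-table-bucket" and "us-east-1" are placeholders; replace them with
# your own bucket name and region.
aws s3tables create-table-bucket \
  --name my-table-bucket \
  --region us-east-1
```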

Step 2: Create S3 Table Catalog Integration
To use Table Topic, go to the AutoMQ console and create an S3 Table Catalog integration to record the Catalog information. Follow these steps:
- Log in to the AutoMQ console and click the Integration menu.
- Select Create S3 Table Catalog Integration and provide the following details:
  - Name: Enter a unique name for the integration configuration.
  - Deployment Configuration: Select the deployment configuration for the integration. Ensure it matches the configuration of any instances you create afterward.
  - Warehouse: Specify the S3 bucket used as the data lake warehouse, which is intended for long-term data storage.
- After the warehouse parameter is set, AutoMQ generates the IAM policy required to access the bucket and displays the IAM role used by the AutoMQ instance. Go to the cloud provider's IAM console and create the authorization based on this policy.
- Once the authorization is created, you can complete the S3 Table Catalog integration.
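The exact policy is generated by the console. For a warehouse bucket it will typically resemble a standard S3 read/write policy like the sketch below; the bucket name is a hypothetical placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::my-warehouse-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::my-warehouse-bucket"
    }
  ]
}
```

Always use the policy shown in the AutoMQ console rather than this sketch, since the generated policy is scoped to your actual warehouse bucket.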
Step 3: Create an AutoMQ Instance and Enable the Table Topic Feature
To use the AutoMQ Table Topic feature, enable it during instance creation and configure the instance as described below:
Note:
Enabling Table Topic on an AutoMQ instance does not enable table conversion for every Topic by default. You must configure it on a per-Topic basis to stream data into the lake as needed.
To use Table Topic, you must enable Table Topic when creating the instance. Once the instance is created, this configuration cannot be changed.

Step 4: Create a Topic and Configure the Stream Table
Once the Table Topic feature is enabled for an AutoMQ instance, you can configure the stream table on a per-Topic basis during Topic creation. The specific operations are as follows:
- Enter the instance created in Step 3, go to the Topic list, and click Create Topic.
- During Topic configuration, enable Table Topic conversion and configure the following parameters:
  - Namespace: Isolates different Iceberg tables and corresponds to the database in the Data Catalog. Set a value that reflects the business domain the Topic belongs to.
  - Schema Constraint Type: Controls whether Topic messages must adhere to a schema. If you select 'Schema', schema constraints are enabled: you register the message schema with AutoMQ's built-in Schema Registry, messages must strictly follow it, and the Table Topic uses the schema's fields to populate the Iceberg table. If you select 'Schemaless', messages carry no explicit schema constraints; the message key and value are written as whole fields to the Iceberg table.
- Click Confirm to create a Topic with Table Topic conversion enabled.
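The practical difference between the two schema constraint types can be sketched in a few lines of Python. The field names and JSON payload below are hypothetical; the actual column layout is determined by AutoMQ:

```python
import json

def schemaless_row(key: bytes, value: bytes) -> dict:
    # Schemaless: the message key and value are stored as whole
    # fields of the Iceberg row.
    return {"key": key.decode(), "value": value.decode()}

def schema_row(value: bytes) -> dict:
    # Schema: the message must match a registered schema, and its
    # individual fields become Iceberg columns. A JSON payload is
    # used here purely for illustration.
    return json.loads(value)

msg_key = b"user-1001"
msg_value = b'{"user_id": 1001, "action": "login", "ts": 1718000000}'

print(schemaless_row(msg_key, msg_value))  # two wide columns: key, value
print(schema_row(msg_value))               # one column per schema field
```

With 'Schema', downstream queries can filter and project on individual fields; with 'Schemaless', the value typically has to be parsed at query time.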
Step 5: Produce Messages and Query Iceberg Table Data in Real-time
Once the AutoMQ instance configuration and Table Topic creation are completed, you can test producing data and query it from the Iceberg table.
- Click to enter the Topic details; on the Produce Messages tab, enter a test Message Key and Message Value, then send the message.
- Go to the AWS S3 console to view the Iceberg database and table written by AutoMQ.
- Click Query Table from Athena to open AWS Athena and query the table data in the Table Bucket. You can see AutoMQ converting Kafka messages into corresponding data records in real time. Other query engines can also be used for analysis and computation.
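For reference, the kind of query you would run in Athena can be sketched as below. The namespace and table names are hypothetical placeholders for the values configured in Step 4; in Athena, Iceberg tables created by the Table Topic are addressed as namespace.table:

```python
# Build the Athena SQL for querying the Iceberg table written by a
# Table Topic. "my_namespace" and "orders" are hypothetical placeholders
# for the Namespace and Topic configured in Step 4.
namespace = "my_namespace"
table = "orders"

query = f'SELECT * FROM "{namespace}"."{table}" LIMIT 10'
print(query)
# prints: SELECT * FROM "my_namespace"."orders" LIMIT 10
```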
