Kafka Integration
2 hours trial then $10.00/month - No credit card required now
Apify to Kafka Integration
Overview
This integration facilitates the processing of data from an Apify dataset and its delivery to a Kafka topic. It is configured via a schema that includes details about the dataset, Kafka configuration, and batch processing size.
Input Schema
The input schema defines the configuration required to run the integration effectively. Below are the details of the schema and its properties:
Schema Details
- Title: Apify to Kafka Input Schema
- Type: Object
- Schema Version: 1
Properties
1. Default Dataset ID
- Title: Default Dataset ID
- Type: String
- Description: The ID of the Apify dataset to process.
- Default: `default`
- Editor: Textfield
2. Kafka Configuration
- Title: Kafka Configuration
- Type: Object
- Description: Configuration settings for Kafka connection.
- Editor: JSON
- Default:
```json
{
  "clientId": "apify-kafka-producer",
  "brokers": ["localhost:9092"],
  "topic": "test-topic",
  "ssl": false
}
```
- Properties:
- Client ID:
- Title: Client ID
- Type: String
- Description: Kafka client identifier.
- Default: `apify-kafka-producer`
- Brokers:
- Title: Brokers
- Type: Array of strings
- Description: Array of Kafka broker addresses.
- Default: `["localhost:9092"]`
- Topic:
- Title: Topic
- Type: String
- Description: Kafka topic name.
- Default: `test-topic`
- SSL:
- Title: SSL
- Type: Boolean
- Description: Enable/disable SSL connection.
- Default: `false`
- SASL (Optional):
- Title: SASL
- Type: Object
- Description: SASL configuration for Kafka connection.
- Properties:
- Username:
- Title: Username
- Type: String
- Description: Kafka SASL username.
- Password:
- Title: Password
- Type: String
- Description: Kafka SASL password.
- Mechanism:
- Title: Mechanism
- Type: String
- Description: Kafka SASL mechanism.
- Default: `plain`
3. Batch Size
- Title: Batch Size
- Type: Integer
- Description: Number of messages to process in each batch.
- Default: `2`
- Minimum: `1`
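As an illustration of how `batchSize` might govern chunking, here is a minimal sketch. The batching logic is internal to the actor, and `toBatches` is a hypothetical helper name, so this is an assumption about the mechanism rather than the actor's actual code:

```javascript
// Split dataset items into batches of `batchSize` (default 2, minimum 1),
// mirroring the schema's constraints. Hypothetical helper for illustration.
function toBatches(items, batchSize = 2) {
  if (!Number.isInteger(batchSize) || batchSize < 1) {
    throw new RangeError('batchSize must be an integer >= 1');
  }
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

// Five items with batchSize 2 yield batches of 2, 2, and 1.
const batches = toBatches([1, 2, 3, 4, 5], 2);
console.log(batches.length); // 3
```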
Required Properties
- `defaultDatasetId`
- `kafkaConfig`
Example Configuration
```json
{
  "defaultDatasetId": "my-dataset-id",
  "kafkaConfig": {
    "clientId": "my-kafka-client",
    "brokers": ["kafka-broker1:9092", "kafka-broker2:9092"],
    "topic": "my-kafka-topic",
    "ssl": true,
    "sasl": {
      "username": "my-username",
      "password": "my-password",
      "mechanism": "plain"
    }
  },
  "batchSize": 5
}
```
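Before running the integration, it can help to check the two required properties up front. The following is a minimal sketch, assuming a hypothetical `validateInput` helper that is not part of the actor's published API:

```javascript
// Validate the required schema properties (defaultDatasetId, kafkaConfig).
// Hypothetical helper for illustration only.
function validateInput(input) {
  const errors = [];
  if (typeof input.defaultDatasetId !== 'string' || input.defaultDatasetId === '') {
    errors.push('defaultDatasetId is required and must be a non-empty string');
  }
  const kc = input.kafkaConfig;
  if (typeof kc !== 'object' || kc === null) {
    errors.push('kafkaConfig is required and must be an object');
  } else if (!Array.isArray(kc.brokers) || kc.brokers.length === 0) {
    errors.push('kafkaConfig.brokers must be a non-empty array of broker addresses');
  }
  return errors; // empty array means the input passed validation
}

const errors = validateInput({
  defaultDatasetId: 'my-dataset-id',
  kafkaConfig: { brokers: ['kafka-broker1:9092'], topic: 'my-kafka-topic' },
});
console.log(errors.length); // 0
```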
How to Use
1. Define the Input: Provide the necessary configuration in JSON format as per the schema.
2. Run the Integration: Pass the configuration to the Apify actor or script responsible for processing and delivering the dataset to Kafka.
3. Monitor Outputs: Check your Kafka topic for incoming messages based on the processed dataset.
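The steps above can be sketched as follows. This assumes the actor serializes each dataset item as a JSON-valued message in the shape accepted by the kafkajs client's `producer.send()`; the actor's actual implementation and helper names may differ:

```javascript
// Turn dataset items into Kafka producer messages with JSON-serialized
// values. Hypothetical helper for illustration.
function toKafkaMessages(items) {
  return items.map((item) => ({ value: JSON.stringify(item) }));
}

const items = [{ url: 'https://example.com', title: 'Example' }];

// kafkajs's producer.send() accepts a payload of this shape:
const payload = { topic: 'test-topic', messages: toKafkaMessages(items) };
// With a connected kafkajs producer: await producer.send(payload);

console.log(payload.messages[0].value);
```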
Notes
- Ensure the Kafka broker addresses are reachable from the environment where the integration is executed.
- SASL configuration is optional, but it is required when connecting to brokers that enforce SASL authentication.
- Modify the batch size according to the volume of data and processing capacity.
Actor Metrics
- 1 monthly user
- 1 star
- >99% of runs succeeded
- Created in Nov 2024
- Modified a month ago