
MySQL Insert
Pricing
Pay per usage

This actor takes a crawler execution and inserts its results into a remote MySQL database.
Last modified: 2 years ago
mysql-insert
Apify actor for inserting crawler results into a remote MySQL table.
This actor fetches all results from a specified dataset (or as raw data) and inserts them into a table in a remote MySQL database.
Input
Input is a JSON object with the following properties:
{
    "datasetId": "your_dataset_id",
    "connection": {
        "host": "host_name",
        "user": "user_name",
        "password": "user_password",
        "database": "database_name", // optional
        "port": 3306 // optional
    },
    "table": "table_name",
    // Optionally, you can pass raw data here instead of datasetId
    "rows": [
        {"column_1": "value_1", "column_2": "value_2"},
        {"column_1": "value_3", "column_2": "value_4"}
    ]
}
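To illustrate what an insert of the rows above amounts to, here is a small sketch of how such rows can be turned into a single parameterized INSERT statement. This is a hypothetical helper for illustration, not the actor's actual code; column names are taken from the first row, and `?` is the standard MySQL placeholder syntax used by Node.js MySQL clients.

```javascript
// Hypothetical helper: build one parameterized INSERT from an array of
// row objects. Column names come from the first row.
function buildInsert(table, rows) {
  const columns = Object.keys(rows[0]);
  // One "(?, ?, ...)" group per row.
  const placeholders = rows
    .map(() => `(${columns.map(() => '?').join(', ')})`)
    .join(', ');
  const sql = `INSERT INTO \`${table}\` (${columns
    .map((c) => `\`${c}\``)
    .join(', ')}) VALUES ${placeholders}`;
  // Flatten row values in column order to match the placeholders.
  const values = rows.flatMap((row) => columns.map((c) => row[c]));
  return { sql, values };
}

// With the example rows from the input above:
const { sql, values } = buildInsert('table_name', [
  { column_1: 'value_1', column_2: 'value_2' },
  { column_1: 'value_3', column_2: 'value_4' },
]);
// sql:    INSERT INTO `table_name` (`column_1`, `column_2`) VALUES (?, ?), (?, ?)
// values: ['value_1', 'value_2', 'value_3', 'value_4']
```

Passing `sql` and `values` separately to the database driver keeps the values parameterized rather than string-concatenated into the query.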
Webhooks
Very often you want to run mysql-insert after every run of your scraping/automation actor. Webhooks are the solution for this. The default datasetId of the triggering run is passed automatically to this actor's run, so you don't need to set it in the payload template (internally, the actor transforms the resource.defaultDatasetId field from the webhook payload into the datasetId field of its own input).
I strongly recommend creating a task from this actor with a predefined input that does not change between runs; usually the only changing part is datasetId. You then don't need to fill in the payload template at all, and your webhook URL will look like:
https://api.apify.com/v2/actor-tasks/<YOUR-TASK-ID>/runs?token=<YOUR_API_TOKEN>
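The payload transformation described above can be sketched as a small pure function. This is a hypothetical illustration of the behavior, not the actor's actual code; the only assumption is that the default Apify webhook payload carries the run under a resource object with a defaultDatasetId field.

```javascript
// Hypothetical sketch: map a default Apify webhook payload onto this
// actor's input by lifting resource.defaultDatasetId into datasetId.
function webhookPayloadToInput(payload) {
  const input = { ...payload };
  if (input.resource && input.resource.defaultDatasetId) {
    input.datasetId = input.resource.defaultDatasetId;
    // The rest of the resource object is not part of this actor's input.
    delete input.resource;
  }
  return input;
}

// A default webhook payload includes the finished run as "resource":
const input = webhookPayloadToInput({
  resource: { defaultDatasetId: 'abc123' },
});
// input.datasetId === 'abc123'
```

So the dataset of whichever run fired the webhook becomes the dataset this actor reads from, without any manual payload template.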
Pricing
Pricing model: Pay per usage
This Actor is paid per platform usage. The Actor is free to use, and you only pay for the Apify platform usage.