Snowflake Uploader
This actor uploads Apify datasets to Snowflake tables. You can use it in combination with webhooks to integrate your scrapers with Snowflake.
Dataset ID
datasetId
stringOptional
ID of the dataset to download the data from. If you set up a webhook, the dataset ID is included in the default payload.
Account name
account
stringRequired
Account name. You can get it from the URL; it's usually a set of letters and is not the same as your username.
Warehouse
warehouse
stringRequired
Supply a custom warehouse name if you want. The default works fine as well.
Default value of this property is "COMPUTE_WH"
Stage
stage
stringOptional
Supply the name of the stage where the file with the data will be uploaded (with PUT). If you don't specify this, your table's default stage will be used.
Flatten JSON
flattenJson
booleanOptional
If you select this option, nested objects are flattened: instead of {a: {b: 1, c: [1, 2]}} you'll get {a.b: 1, a.c.0: 1, a.c.1: 2}
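The flattening behavior described above can be sketched as follows. This is an illustrative implementation inferred from the example in the description, not the actor's actual code:

```javascript
// Recursively flatten nested objects and arrays into dot-separated keys,
// e.g. { a: { b: 1, c: [1, 2] } } -> { 'a.b': 1, 'a.c.0': 1, 'a.c.1': 2 }.
function flatten(obj, prefix = '') {
  const out = {};
  for (const [key, val] of Object.entries(obj)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (val !== null && typeof val === 'object') {
      // Objects and arrays are descended into; array indices become key parts.
      Object.assign(out, flatten(val, path));
    } else {
      out[path] = val;
    }
  }
  return out;
}

console.log(flatten({ a: { b: 1, c: [1, 2] } }));
// → { 'a.b': 1, 'a.c.0': 1, 'a.c.1': 2 }
```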
Default value of this property is false
Transform JSON key function
transformJsonKeyFunction
stringOptional
Runs after flattening JSON. Enter the body of a function that will transform the JSON keys of dataset objects. You have access to a 'key' argument and must return a string. Example: 'return key.toLowerCase()'
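Since you supply only the function body, the actor presumably wraps it in a function that receives each key. A sketch of how the example body would be applied (the wrapping mechanism is an assumption):

```javascript
// Build a key-transform function from a user-supplied body string.
// The body has access to a single `key` argument and must return a string.
const body = "return key.toLowerCase()";
const transformKey = new Function('key', body);

console.log(transformKey('UserName')); // → 'username'
```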
Transform JSON data function
transformJsonDataFunction
stringOptional
Runs after transforming keys. Enter the body of a function that will transform dataset objects. You have access to a 'value' argument, which is a JSON object from the dataset, and must return a (possibly modified) value. Example: value.name += '/'; return value
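Analogously to the key transform, the value transform body is likely wrapped in a function that receives each dataset item. A hedged sketch using the example body from the description:

```javascript
// Build a value-transform function from a user-supplied body string.
// The body has access to a single `value` argument (one dataset item)
// and must return the (possibly modified) item.
const body = "value.name += '/'; return value;";
const transformValue = new Function('value', body);

console.log(transformValue({ name: 'item' })); // → { name: 'item/' }
```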
Limit rows
limit
integerOptional
Limit the number of rows to be uploaded (e.g. for testing on large datasets, since the whole dataset must be transformed before anything is pushed to the database). If you don't specify this, all rows will be uploaded.
Overwrite table
overwrite
booleanOptional
Whether to drop the table's data before pushing dataset data into it.
Default value of this property is false
Database schema columns (SETTING THIS OPTION WILL DROP YOUR TABLE)
synchronizeSchema
arrayOptional
If your destination table has a schema incompatible with the dataset, you can provide a list of column names and their types; the table will then be synchronized with your definitions by dropping and recreating it. The key is the column name, the value is the column type (as in an SQL query, e.g. VARCHAR). This is optional if your table already has the desired schema.
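A hypothetical input fragment illustrating the key/value pairs described above. The exact item shape of the array is an assumption; the column names and types shown are examples, not part of the actor's documentation:

```json
{
  "synchronizeSchema": [
    { "key": "url", "value": "VARCHAR" },
    { "key": "price", "value": "NUMBER" }
  ]
}
```

Remember that setting this option drops and recreates the table, so any existing data in it is lost.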
Actor Metrics
2 monthly users
2 stars
>99% runs succeeded
Created in Jan 2023
Modified a year ago