Snowflake Uploader

svpetrenko/snowflake-uploader

This actor uploads Apify datasets to Snowflake tables. You can use it in combination with webhooks to integrate your scrapers with Snowflake.
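For example, you can trigger the upload from your own code once a scraper run finishes. Below is a minimal sketch using the Apify JavaScript client; the input field names match the schema documented on this page, while the token, credentials, and table name are placeholders you need to replace.

```js
// Sketch: run the uploader from Node.js with the Apify client.
// All credential values and the table name are placeholders.
const { ApifyClient } = require('apify-client');

const client = new ApifyClient({ token: 'YOUR_APIFY_TOKEN' });

async function uploadDataset(datasetId) {
    await client.actor('svpetrenko/snowflake-uploader').call({
        datasetId,
        tableName: 'MY_DB.PUBLIC.ITEMS', // DATABASE.SCHEMA.TABLENAME
        username: 'MY_SNOWFLAKE_USER',
        account: 'ab12345',              // from your Snowflake URL
        password: 'MY_PASSWORD',
        database: 'MY_DB',
        warehouse: 'COMPUTE_WH',
    });
}

uploadDataset('YOUR_DATASET_ID').catch(console.error);
```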

Dataset ID

datasetId · string · Optional

ID of the dataset to download the data from. If you set up a webhook, the dataset ID is included in the default payload.

Fully-qualified table name

tableName · string · Required

Table name in the format DATABASE.SCHEMA.TABLENAME, e.g. MY_DB.PUBLIC.ITEMS.

Connection username

username · string · Required

Your account's username

Account name

account · string · Required

Account name. You can get it from your Snowflake URL: for example, in ab12345.snowflakecomputing.com the account name is ab12345. It's usually a short string of letters and digits and is not the same as your username.

Password

password · string · Required

Your account's password

Database name

database · string · Required

Your database's name.

Warehouse

warehouse · string · Required

Supply a custom warehouse name if you want; the default works fine as well.

Default value of this property is "COMPUTE_WH"

Stage

stage · string · Optional

Supply the name of the stage where the file with data will be uploaded (with PUT). If you don't specify this, your table's default stage will be used.

Flatten JSON

flattenJson · boolean · Optional

If you select this option, instead of {a: {b: 1, c: [1, 2]}} you'll get {"a.b": 1, "a.c.0": 1, "a.c.1": 2}.

Default value of this property is false
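The actor's exact implementation isn't shown here, but the flattening described above behaves roughly like this sketch:

```js
// Sketch of the flattening behavior (not the actor's actual code):
// nested objects and arrays collapse into dot-separated keys.
function flatten(obj, prefix = '', out = {}) {
    for (const [key, val] of Object.entries(obj)) {
        const path = prefix ? `${prefix}.${key}` : key;
        if (val !== null && typeof val === 'object') {
            flatten(val, path, out); // array indices become key segments
        } else {
            out[path] = val;
        }
    }
    return out;
}

console.log(flatten({ a: { b: 1, c: [1, 2] } }));
// => { 'a.b': 1, 'a.c.0': 1, 'a.c.1': 2 }
```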

Transform JSON key function

transformJsonKeyFunction · string · Optional

Runs after flattening JSON. Enter the body of a function that will transform the JSON keys of dataset objects. You have access to a 'key' argument and must return a string. Example: return key.toLowerCase()
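For instance, a body like the following (assuming, as described above, that only a key argument is in scope) lowercases keys and replaces the dots produced by flattening with underscores:

```js
// Example function body; `key` is provided by the actor.
return key.toLowerCase().replace(/\./g, '_');
```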

Transform JSON data function

transformJsonDataFunction · string · Optional

Runs after transforming keys. Enter the body of a function that will transform dataset objects. You have access to a 'value' argument, which is a JSON object from the dataset, and must return the (possibly modified) value. Example: value.name += '/'; return value
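A slightly longer body might drop a field and stamp each row before upload; the field names here are hypothetical:

```js
// Example function body; `value` is one dataset object.
delete value.rawHtml;                         // hypothetical noisy field
value.uploadedAt = new Date().toISOString();  // hypothetical added column
return value;
```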

Limit rows

limit · integer · Optional

Limit the number of rows to be uploaded, e.g. for testing on large datasets (the whole dataset is transformed before anything is pushed to the database, so full runs can take a while). If you don't specify this, all rows will be uploaded.

Overwrite table

overwrite · boolean · Optional

Whether to drop the table's data before pushing dataset data into it

Default value of this property is false

Database schema columns (SETTING THIS OPTION WILL DROP YOUR TABLE)

synchronizeSchema · array · Optional

If your destination table has a schema incompatible with the dataset, you can provide a list of column names and their types; the table will then be synchronized with your definitions by dropping and recreating it. The key is the column name, the value is the column type (as it would appear in an SQL query, e.g. VARCHAR). This is optional if your table already has the desired schema.
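Based on the description ("key is the column name, value is the column type"), the array presumably looks like the fragment below; verify the exact shape in the actor's input editor before relying on it, since setting this option drops your table. Column names and types are examples only.

```js
// Assumed shape of the synchronizeSchema input field:
"synchronizeSchema": [
    { "TITLE": "VARCHAR" },
    { "PRICE": "NUMBER(10,2)" },
    { "URL": "VARCHAR" }
]
```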

Confirm data loss

dataLossConfirmation · boolean · Optional

If you choose to synchronize your schema or overwrite previous data, check this checkbox to confirm that you agree to lose your previous data

Default value of this property is false

File upload retries

fileUploadRetries · integer · Optional

How many times to retry uploading the file to Snowflake. This setting is useful because this last operation sometimes fails for large datasets due to Snowflake's driver.

Default value of this property is 5

Developer
Maintained by Community

Actor Metrics

  • 2 monthly users

  • 2 stars

  • >99% runs succeeded

  • Created in Jan 2023

  • Modified a year ago