Schedule Webhooks

I would like the ability to schedule webhooks. When populating a database with webhook data, you have to keep the connection to the database open constantly, since the webhook is live. Other solutions I've used let you schedule webhooks: they hold the webhook data in storage until the scheduled run.

For example, I bring data into Snowflake, but I don't use webhooks to Snowflake because we would eat through our credits by constantly having the warehouse running.

Comments

6 comments
  • Hi Jeffery Hill!

    Just to clarify: since we can't schedule webhooks per se (the "client" side controls when events are issued), are you envisioning something more like integrator.io "collecting" a batch of records from those webhook calls and sending them to Snowflake as a single batch, either once the batch reaches a certain size or at a set frequency (such as once per day), instead of record by record as they arrive?
  • Hi Courtney,

    Yes, I envision you all collecting the events in storage and then having the flow run on a schedule to pull everything that has accumulated, in one batch. Webhooks are great since you don't have to go fetch the data, but if the data doesn't need to reach other systems in real time, processing it record by record puts unnecessary load on the connection.
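
    Roughly what I have in mind, as a sketch (all names hypothetical: a Flask endpoint stands in for the webhook listener, a local file stands in for the staged storage, and the actual warehouse load is stubbed out):

        # Sketch of the requested pattern: the listener stages incoming events
        # instead of loading them immediately, and a scheduled flush processes
        # everything accumulated since the last run. All names are illustrative.
        import json
        import threading
        from pathlib import Path

        from flask import Flask, request

        app = Flask(__name__)
        BUFFER = Path("webhook_buffer.jsonl")  # stand-in for managed staging storage
        lock = threading.Lock()

        @app.post("/webhook")
        def receive():
            # Append the raw event to the staging file; no warehouse
            # connection is needed while events arrive.
            event = request.get_json(force=True)
            with lock, BUFFER.open("a") as f:
                f.write(json.dumps(event) + "\n")
            return "", 204

        def flush_batch():
            # Run on a schedule (e.g. hourly). The warehouse only has to be
            # awake for the duration of this one batch load.
            with lock:
                if not BUFFER.exists():
                    return
                events = [json.loads(line) for line in BUFFER.read_text().splitlines()]
                BUFFER.unlink()
            load_to_warehouse(events)

        def load_to_warehouse(events):
            # Placeholder for a single bulk load (e.g. one COPY into Snowflake).
            print(f"loading {len(events)} events in one batch")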

  • Hi Jeffery Hill

    Does Snowflake not suspend the data warehouse when it's idle?

    Would you consider staging the data you receive from a webhook in a database, and then kicking off another flow that loads it from staging into Snowflake?

  • Snowflake does suspend the warehouse, but if the webhook constantly has data coming in, the warehouse never gets a chance to suspend. Also, I don't think I should have to stage the data in a separate database and then import it into Snowflake. This is an enhancement request for you all to stage the events and hold them for a scheduled run.

  • Jeffery Hill

    I acknowledge the idea put forward here. The database staging route was a suggestion until we build a proper solution to address this need.

  • I agree this is a good enhancement request for integrator.io. That said, another workaround would be to route the raw webhook data to an S3 bucket or FTP folder, and then have a second flow running on a schedule that gets all the raw files and processes them en masse. Assuming you have a secure place to store the raw webhook data, this is a trivial second flow to build (there's a sketch of it below).

    When we eventually implement this as a native feature of the integrator.io platform, we would do something very similar behind the scenes using our own internal S3 account.
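
    For what it's worth, the scheduled half of that workaround could look something like this sketch (assumptions only: boto3 against a hypothetical staging bucket, one JSON object per webhook event, and process_batch standing in for the actual load):

        # Sketch of the scheduled second flow: pull every staged webhook file
        # from S3, process the whole set in one pass, then delete the objects
        # that were processed. Bucket, prefix, and process_batch are placeholders.
        import json

        import boto3

        s3 = boto3.client("s3")
        BUCKET = "my-webhook-staging"  # hypothetical staging bucket
        PREFIX = "raw-events/"         # one object per webhook payload

        def run_scheduled_flow():
            paginator = s3.get_paginator("list_objects_v2")
            keys, events = [], []
            for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
                for obj in page.get("Contents", []):
                    body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
                    events.append(json.loads(body))
                    keys.append(obj["Key"])
            if not events:
                return
            process_batch(events)  # e.g. one bulk load into Snowflake
            # Delete only after the batch load succeeds.
            for key in keys:
                s3.delete_object(Bucket=BUCKET, Key=key)

        def process_batch(events):
            print(f"processing {len(events)} staged events in one batch")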

