Schedule Webhooks

Comments

6 comments

  • Courtney Jordan Experience Strategy & Design Director Community moderator

    Hi Jeffery Hill!

    Just to clarify, since we can't schedule webhooks per se (since that's on the "client" side and we can't control when they issue their events), are you envisioning something more like integrator.io "collecting" a batch of records from those webhook calls and sending them to Snowflake as a single batch (either once the batch reaches a certain size, or at a certain frequency, such as once per day) instead of record-by-record as they arrive?
    0
  • Jeffery Hill

    Hi Courtney,

    Yes, I envision integrator.io collecting the events in storage and then running the flow on a schedule to pull everything that has aggregated, as a single batch. Webhooks are great since you don't have to go fetch the data, but if the other systems don't need the data updated in real time, delivering record-by-record puts unnecessary load on them.

    0
  • viliandy leonardo Product Management Director

    Hi Jeffery Hill

    Does Snowflake not suspend the data warehouse when it is not in use?

    Would you consider staging the data you receive from a webhook in a database, and then kicking off another flow that loads the data from staging into Snowflake?

    0
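    The staging approach suggested above can be sketched roughly as follows. This is a hypothetical illustration, not Celigo functionality: a SQLite table stands in for the staging database, `on_webhook` and `scheduled_load` are invented names, and the actual Snowflake load is left out.

    ```python
    import json
    import sqlite3

    # Hypothetical sketch: each webhook delivery inserts its raw payload into a
    # staging table; a second, scheduled flow drains the table and hands the
    # rows to Snowflake as one batch (the Snowflake load itself is stubbed out).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE staging (id INTEGER PRIMARY KEY, payload TEXT)")

    def on_webhook(event: dict) -> None:
        """First flow, runs per webhook delivery: just persist the raw event."""
        conn.execute("INSERT INTO staging (payload) VALUES (?)",
                     (json.dumps(event),))
        conn.commit()

    def scheduled_load() -> list[dict]:
        """Second flow, runs on a schedule: drain staging and return one batch."""
        rows = conn.execute("SELECT id, payload FROM staging ORDER BY id").fetchall()
        batch = [json.loads(payload) for _id, payload in rows]
        conn.execute("DELETE FROM staging")  # clear staging after a successful read
        conn.commit()
        return batch

    # Simulate three webhook deliveries arriving between scheduled runs.
    for i in range(3):
        on_webhook({"record": i})

    print(len(scheduled_load()))  # the warehouse sees one batch, not 3 loads
    ```

    Because only the second flow touches the warehouse, Snowflake gets long idle windows between scheduled runs and can auto-suspend.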
  • Jeffery Hill

    Snowflake does suspend the warehouse, but if the webhook constantly has data coming in, the warehouse never gets a chance to suspend. Also, I don't think I should have to stage the data in a separate database and then import it to Snowflake. This is an enhancement request for you all to stage the events and hold them for scheduled delivery.

    0
  • viliandy leonardo Product Management Director

    Jeffery Hill

    I acknowledge the idea put forward here. The database staging route was a suggestion until we have built a proper solution that addresses this need.

    0
  • Scott Henderson CTO

    I agree this is a good enhancement request for integrator.io. That said, another workaround would be to simply route the raw webhook data to an S3 bucket or FTP folder, and then have a second flow running on a schedule to pick up all the raw files and process them en masse. Assuming you have a secure place to store the raw webhook data, this second flow is trivial to build.

    When we eventually implement this as a native feature of the integrator.io platform, we would do something very similar behind the scenes using our own internal S3 account.

    0
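    The file-staging workaround described above can be sketched like this. This is a hypothetical illustration of the pattern, not Celigo or AWS code: a local temp folder stands in for the S3 bucket or FTP folder, and `stage_webhook` / `process_staged` are invented names.

    ```python
    import json
    import tempfile
    import time
    import uuid
    from pathlib import Path

    # Hypothetical sketch: each webhook delivery is written as one file to a
    # staging folder (standing in for an S3 bucket or FTP folder); a second flow,
    # run on a schedule, reads every staged file and processes them in one pass.
    STAGING = Path(tempfile.mkdtemp())

    def stage_webhook(payload: dict) -> Path:
        """First flow, runs per delivery: dump the raw body to a unique file."""
        path = STAGING / f"{time.time_ns()}-{uuid.uuid4().hex}.json"
        path.write_text(json.dumps(payload))
        return path

    def process_staged() -> list[dict]:
        """Second flow, runs on a schedule: read and delete all staged files."""
        records = []
        for f in sorted(STAGING.glob("*.json")):
            records.append(json.loads(f.read_text()))
            f.unlink()  # remove each file only after a successful read
        return records
    ```

    Between scheduled runs nothing touches the warehouse, so Snowflake can auto-suspend; the scheduled flow then loads everything that accumulated as one batch.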