Sending application/octet-stream through HTTP/REST connectors

We need to be able to send data from an FTP file as an octet stream to a service. The REST endpoint appears to support only XML or JSON, and the HTTP endpoint doesn't appear to offer an octet stream option on the request. Has anyone had any success in doing that?


Comments

11 comments
  • Viliandy -

    Never mind... we were overcomplicating it.

    The more I thought about it, the more I realized that the Request Body I was creating was most likely being treated as static text. We downloaded the resulting file, and we were correct.

     

    So... we removed it, left the "BlobKeyPath" field set appropriately, updated the relative URI, and it's working like a charm. Thanks a lot for the help!
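    In cURL terms, the difference is roughly the following (endpoint URL and file name are placeholders):

        # Broken: the hand-built body is sent as literal text, so the API
        # receives the string {"content":"blobKey"} instead of the file.
        curl -X POST "https://api.example.com/upload" \
             -H "Content-Type: application/octet-stream" \
             -d '{ "content" : "blobKey" }'

        # Working: no hand-built body; the blob itself is streamed as the
        # raw request body.
        curl -X POST "https://api.example.com/upload" \
             -H "Content-Type: application/octet-stream" \
             --data-binary @payroll.txt
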
  • Hi Wade,

    When using the REST or HTTP connector, you can set a request header with this name/value pair: Content-Type: application/octet-stream.

    Thanks

  • OK.  Thank you, Viliandy.  Based on your response, I assume this will override the "Request Media Type" on the HTTP setup?

     

    Out of curiosity... we've never done this with Celigo before. The file we're picking up from FTP is plain text. All of the examples for working with this API via Postman/cURL show converting the file to a byte array and then posting it. We thought we could use the pre-map JavaScript hook to convert the file to a byte array, but we aren't quite sure how to include it in the mapping past that point.
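    For reference, the cURL form of those examples boils down to something like this (endpoint and file name are placeholders); the byte-array conversion is implicit, since --data-binary sends the file's raw bytes as the request body:

        curl -X POST "https://api.example.com/upload" \
             -H "Content-Type: application/octet-stream" \
             --data-binary @payroll.txt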

    Any thoughts/suggestions?

  • Hi Wade Shelton

    Yes, it will override the "Request Media Type" value.

    If your source is an FTP provider, why aren't you using the FTP connector to pick up the files?

    Thanks

  • Viliandy

    The source is SFTP. The destination is an API.

  • Wade Shelton 

     

    OK, how about you give the following configuration a try:

    1. On the source side, use the FTP connector and choose Output Mode = Blob Keys.

    2. On the destination side, use the REST connector, choose Input Mode = Blob Keys, set the Blob Key Path field to the "blobKey" value, and set the request header Content-Type = application/octet-stream (see the sketch below).
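    Conceptually, the two steps do the equivalent of the following (hosts, paths, and file name are placeholders); integrator.io stores the downloaded file as a blob, then streams it out as the raw request body:

        # Step 1 equivalent: the FTP export pulls the file down (stored as a blob).
        scp user@sftp.example.com:/outbound/payroll.txt payroll.txt

        # Step 2 equivalent: the REST import streams the blob to the API.
        curl -X POST "https://api.example.com/upload" \
             -H "Content-Type: application/octet-stream" \
             --data-binary @payroll.txt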

    Thanks

  • Hey Viliandy - 

     

    This has been helpful.  We've run into one final issue that I was hoping you could help with.

     

    We're able to send the file to the API, but when we do, we're getting an error that states: ERR_STREAM_WRITE_AFTER_END "write after end".

    I assume this is because the HTTP Request Body we've built is very simple and does not include a content length, i.e.:

    {
      "content" : "blobKey"
    }

    I can see on the receiving server that the file was created, but it is only 1 KB. Any thoughts on how to derive the content length from the file? I don't see a ton of documentation on blobKeys...
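    For comparison, outside the platform the length just comes from the file's size on disk (URL and file name are placeholders; cURL derives Content-Length automatically for --data-binary):

        # Derive the byte count from the file itself.
        wc -c < payroll.txt

        # With --data-binary, cURL sets Content-Length from the file size,
        # so no hand-built body or manual length header is needed.
        curl -X POST "https://api.example.com/upload" \
             -H "Content-Type: application/octet-stream" \
             --data-binary @payroll.txt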

     

  • Hi Wade Shelton

    It is hard to tell what might cause the error without the rest of the flow details.

    I suggest that you submit a support ticket so that we can look into this error. Please include a zip file of your flow so that the team can review the configuration and reproduce the error for troubleshooting.

    Thanks

  • Wade Shelton

    I am glad it worked for you! May I know which destination application you are sending the files to? I am interested in commercial business applications rather than home-grown ones. This information will feed into our product backlog for file-processing connectors.

    Thanks

  • Viliandy - 


    Sure thing.

    We have payroll data delivered from our payroll provider (ADP) via SFTP. We're feeding it into ePBCS for HR planning and budgeting. ePBCS has a built-in set of APIs that are fairly well documented (https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/prest/launch.html), and an Oracle-provided, Java-based utility called epmautomate that can be used for instrumentation/orchestration. We don't want to use epmautomate in this case, though, because it:

    • Typically runs from a Windows batch file
    • Requires the use of a stored username/password
    • Requires the use of "local" disk (could be network/S3/etc..., but still connected to the host machine)

    In this case, we have some fairly sensitive information and don't want it traversing our network, so rather than using epmautomate, we're attempting to leverage the APIs to push data into the filesystem that ePBCS uses and to run any data load/extraction rules.
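    For anyone following along, the upload itself is just an octet-stream POST with basic auth. A rough sketch (host, API version, and endpoint path are placeholders; the exact contract is in the Oracle docs linked above):

        # Hypothetical sketch of pushing a file into the ePBCS filesystem via
        # the REST API; see the Oracle EPM REST API docs for the real endpoint.
        curl -X POST \
             -u "$EPBCS_USER:$EPBCS_PASSWORD" \
             -H "Content-Type: application/octet-stream" \
             --data-binary @payroll_extract.txt \
             "https://<epbcs-host>/interop/rest/<api-version>/applicationsnapshots/payroll_extract.txt/contents"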

    As an aside, though... I think some documentation around blobKeys and how they function would go a long way...

  • Wade Shelton

    Our goal is to simplify the design so that users no longer need to understand the blobKey field; the integrator.io platform will handle it automatically. You'll see file-processing improvements in the June release (available early next week).

    Please provide us with feedback when you have a chance to try the new file-processing UI (available only in the beta version). We'll publish documentation if there are specific cases where you still need to configure the blobKey field.

    Hope this helps!

