FAQ: How do I convert a blob key to a base64 string for import?
integrator.io has no native way to convert a blob key to a base64 string. As a workaround, you can use an Amazon S3 bucket together with an AWS Lambda function. This requires two flows, not because the flows are complex, but because a single flow would pull the new file before the Lambda function has finished writing it. You'll need:
- An Amazon S3 account
- An Amazon S3 connection
- An AWS Lambda function with an Amazon S3 trigger
- An FTP account holding your file
- An FTP connection
Flow 1: Transfer file from FTP to Amazon S3
- First, create an FTP export.
- Do not parse your file. In this example, the exported file is a PNG.
- Import (transfer) your file to Amazon S3.
- Do not generate a file from your record.
- Set the Blob key path field in the Advanced settings to blobKey.
In this example, the Amazon S3 file key (file name) is a handlebars expression that sets the S3 file key to the generated blob key. For example, if your blob key is 234k, the S3 file key becomes 234k.png.
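A minimal sketch of such an expression, assuming blobKey sits at the top level of the record (adjust the path to match your record shape): {{blobKey}}.png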
Create an AWS Lambda function that is invoked by uploads to Amazon S3 storage
In AWS, use the Lambda feature to create a function that converts the uploaded file's contents to base64. The function needs to be triggered by an S3 upload. To configure this, use an Amazon S3 trigger to invoke the Lambda function.
Tips:
- Use a separate bucket for function outputs.
- Customize the trigger by setting the suffix filter to the file type you upload in the first flow. In this example, the file ends with .png, so set the suffix to png. This also keeps the function's own .json output from re-triggering it (see the sketch after this list).
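If you prefer to script the trigger instead of using the console wizard, a sketch of the equivalent boto3 call is below. The bucket name and function ARN are placeholders; note that the console wizard also grants S3 permission to invoke the function, which you would otherwise need to add yourself.

import boto3

s3 = boto3.client('s3')
# Placeholder bucket and function ARN: substitute your own values
s3.put_bucket_notification_configuration(
    Bucket='my-upload-bucket',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:123456789012:function:blob-to-base64',
            'Events': ['s3:ObjectCreated:*'],
            # Only invoke the function for .png uploads
            'Filter': {'Key': {'FilterRules': [{'Name': 'suffix', 'Value': '.png'}]}}
        }]
    }
)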
Warning: The code below is an example of converting the file. It has not been validated or endorsed by Celigo. Use this at your own risk.
import json
import base64
import boto3

def lambda_handler(event, context):
    # Read the bucket and object key of the file that triggered this invocation
    s3_bucket = event['Records'][0]['s3']['bucket']['name']
    s3_key = event['Records'][0]['s3']['object']['key']

    # Download the uploaded file from S3
    s3 = boto3.client('s3')
    response = s3.get_object(Bucket=s3_bucket, Key=s3_key)
    file_content = response['Body'].read()

    # Encode the file contents as a base64 string
    base64_content = base64.b64encode(file_content).decode('utf-8')
    print(base64_content)

    # Wrap the file name and encoded contents in a JSON document
    json_data = {
        "filename": s3_key,
        "base64encoded": base64_content
    }
    json_string = json.dumps(json_data)

    # Write the JSON file back to the bucket next to the original
    new_s3_key = s3_key + '.json'
    print(new_s3_key)
    s3.put_object(Body=json_string, Bucket=s3_bucket, Key=new_s3_key)

    # Return the new S3 file key
    return new_s3_key
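After the function runs, the bucket contains a new object whose key is the original key plus .json. With the 234k example above, 234k.png.json would contain something like {"filename": "234k.png", "base64encoded": "iVBORw0KGgo..."} (base64 string truncated).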
Flow 2: Get the new file from Amazon S3
- Create your Amazon S3 export. Parse the JSON file.
- Import your new file into any application.
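If you want to sanity-check the conversion locally before wiring up Flow 2, here is a minimal sketch, assuming you downloaded the hypothetical 234k.png.json from the bucket:

import json
import base64

# Load the JSON file produced by the Lambda function
with open('234k.png.json') as f:
    record = json.load(f)

# Decode the base64 string back into the original file bytes
data = base64.b64decode(record['base64encoded'])

# Write the bytes out; the result should match the original PNG
with open('roundtrip.png', 'wb') as f:
    f.write(data)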
Comments
Are there other ways to do this? Is there anything on the roadmap?
Hi maarten panman,
Thank you for reaching out. At present, it's not on the near-term roadmap.
Best Regards,