
And while you are being sarcastic, this is the Right Way to use queues.

Upload the file to S3 -> the S3 event triggers an SNS message (for fanout, if you need it) -> SNS delivers to one or more SQS queues -> SQS messages drive the ETL jobs.

The ETL job can then be hosted on Lambda (easiest), on ECS/Docker/Fargate (still easy, and scales on demand), or even on a set of EC2 instances that scale based on queue depth (don’t do that last one unless you have a legacy app that can’t be containerized).
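For the SQS-triggered stage, the handler has to unwrap two envelopes: each SQS record body is an SNS envelope, and the SNS Message field is the original S3 event notification. A minimal sketch, assuming this fanout topology (handler name and ETL step are illustrative):

```python
import json
from urllib.parse import unquote_plus

def handler(event, context=None):
    """Lambda entry point for the SQS-triggered ETL stage.

    Unwraps SQS -> SNS -> S3 notification layers to find the
    uploaded object(s). The ETL call itself is left as a stub.
    """
    objects = []
    for sqs_record in event["Records"]:
        sns_envelope = json.loads(sqs_record["body"])
        s3_event = json.loads(sns_envelope["Message"])
        for s3_record in s3_event["Records"]:
            bucket = s3_record["s3"]["bucket"]["name"]
            # S3 URL-encodes object keys in notifications (spaces become '+').
            key = unquote_plus(s3_record["s3"]["object"]["key"])
            objects.append((bucket, key))
            # ... run the ETL step against s3://{bucket}/{key} here ...
    return objects
```

If you skip the SNS fanout and wire S3 straight to SQS, drop the middle `json.loads` and read the S3 event from the record body directly.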

If your client only supports SFTP, AWS Transfer Family offers a managed SFTP endpoint: they send the file via SFTP and it automatically lands in an S3 bucket.

Alternatively, there are products that expose S3 as a mountable directory (s3fs, Mountpoint for Amazon S3), so they can use whatever copy commands they like on their end to drop the file into a “folder”.



If I have a user-facing upload button, why can't I simply have a webserver that receives the data and pushes it into S3 via multipart upload? Something that can be written in a framework of your choice in 10 minutes with zero setup.

For uploads under 50 MB you could also skip the multipart upload and do a plain single-request PUT without taking a significant hit.
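Worth noting: S3 enforces the multipart limits itself — parts must be at least 5 MiB (except the last) and an upload can have at most 10,000 parts — and boto3's upload_fileobj makes the single-PUT-vs-multipart decision for you via TransferConfig. A sketch of that decision using the 50 MB threshold from above (function name and threshold are illustrative):

```python
MIB = 1024 * 1024

# S3 multipart constraints: every part except the last must be at
# least 5 MiB, and one upload may contain at most 10,000 parts.
MIN_PART = 5 * MIB
MAX_PARTS = 10_000

def choose_part_size(size: int, threshold: int = 50 * MIB) -> int:
    """Return 0 for a single PUT, otherwise a valid multipart part size."""
    if size <= threshold:
        return 0  # small enough: one plain PUT is fine
    part = MIN_PART
    # Grow the part size until the object fits within 10,000 parts.
    while (size + part - 1) // part > MAX_PARTS:
        part *= 2
    return part
```

For a 100 MiB file this picks 5 MiB parts; only objects beyond ~49 GiB force the part size upward.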


You can - you generate a pre-signed S3 URL, and the client uploads the file directly to the location the URL points at.

https://fullstackdojo.medium.com/s3-upload-with-presigned-ur...

And before you cry “lock in”: S3-API-compatible services are a dime a dozen outside of AWS, including GCP and even Backblaze B2.
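Under the hood, a pre-signed browser upload is nothing exotic: the server signs a short-lived policy document with its secret key, and S3 verifies that HMAC when the client POSTs the file. This is what boto3's generate_presigned_post computes for you; a stdlib-only sketch with a hypothetical bucket name and placeholder credentials:

```python
import base64
import hashlib
import hmac
import json
from datetime import datetime, timedelta, timezone

# Placeholder credentials and bucket — substitute your own.
ACCESS_KEY = "AKIAEXAMPLEKEY"
SECRET_KEY = "examplesecret"
REGION = "us-east-1"
BUCKET = "example-uploads"

def presign_post(key: str, max_bytes: int, expires_in: int = 3600) -> dict:
    """Build a browser-uploadable POST policy: the client sends the file
    straight to S3, and S3 checks the HMAC signature on arrival."""
    now = datetime.now(timezone.utc)
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")
    date = now.strftime("%Y%m%d")
    credential = f"{ACCESS_KEY}/{date}/{REGION}/s3/aws4_request"
    policy = {
        "expiration": (now + timedelta(seconds=expires_in)).strftime("%Y-%m-%dT%H:%M:%SZ"),
        "conditions": [
            {"bucket": BUCKET},
            {"key": key},
            ["content-length-range", 0, max_bytes],  # cap the upload size
            {"x-amz-algorithm": "AWS4-HMAC-SHA256"},
            {"x-amz-credential": credential},
            {"x-amz-date": amz_date},
        ],
    }
    policy_b64 = base64.b64encode(json.dumps(policy).encode()).decode()

    def sign(key: bytes, msg: str) -> bytes:
        return hmac.new(key, msg.encode(), hashlib.sha256).digest()

    # Standard SigV4 key-derivation chain: date -> region -> service.
    k_signing = sign(sign(sign(sign(b"AWS4" + SECRET_KEY.encode(), date),
                               REGION), "s3"), "aws4_request")
    signature = hmac.new(k_signing, policy_b64.encode(), hashlib.sha256).hexdigest()

    # The browser POSTs these form fields (plus the file) to the bucket URL.
    return {
        "url": f"https://{BUCKET}.s3.{REGION}.amazonaws.com/",
        "fields": {
            "key": key,
            "policy": policy_b64,
            "x-amz-algorithm": "AWS4-HMAC-SHA256",
            "x-amz-credential": credential,
            "x-amz-date": amz_date,
            "x-amz-signature": signature,
        },
    }
```

The content-length-range condition is the piece a naive proxy-through-your-webserver approach doesn't get for free: S3 itself rejects oversized uploads before they touch your infrastructure.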



