How To Deploy And Execute A Native Linux Binary #117
-
Hi! Thank you for this great package, which takes away so much of the annoying configuration work normally needed to create Lambda functions manually! I would like to ask about something I think could be realized with Lambda and the help of this package. I have a compiled Linux binary that I can execute on my local 64-bit machine (Linux Mint 20.3 Cinnamon 64-bit; kernel 5.4.0-100-generic). It takes a file as input and produces a different output file in the folder it is executed in. Is there some way I could deploy a tool like this to Lambda using Sidecar, and how would you implement something like this? Thanks in advance to everyone who leaves some information!
Replies: 5 comments
-
Yes! We are doing this kind of thing in various ways, one of which is to use Sidecar's ability to run any container you like:

https://hammerstone.dev/sidecar/docs/main/functions/handlers-and-packages#container-images

That way you can have a complete Linux system with any installed software, called through Sidecar. You can choose which language you use for the actual event handler function in the container (we are using JS in one example, simply because we had it available, and calling our application from the JS).

For getting files in and out via S3, I would recommend AWS pre-signed URLs (aka temporary URLs in Laravel): generate them, pass them to the container, and the container can fetch the data directly. The same works in the other direction: your container can place a file on S3 and pass a URL back to your backend. Laravel does not have a built-in helper for that direction, but it can be done "manually" (your container will likely not have Laravel anyway); here is an example where the AWS SDK is called directly to do the same thing: https://beckermanyaakov.medium.com/uploading-files-directly-to-s3-digital-ocean-with-laravel-9efc30eb44c8
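To make the hand-off a bit more concrete, here is a minimal sketch of the Laravel side, under a few assumptions: a hypothetical Sidecar function class `ProcessFileFunction` wraps the container image, the input file already sits on the `s3` disk, and AWS credentials come from the normal environment configuration. The download URL uses Laravel's `temporaryUrl()`; the upload URL is built with the AWS SDK directly, in the spirit of the article linked above.

```php
<?php

// Minimal sketch of the backend side (assumptions noted above).
use Aws\S3\S3Client;
use Illuminate\Support\Facades\Storage;
use App\Sidecar\ProcessFileFunction; // hypothetical Sidecar function wrapping the container image

// Download side: Laravel's temporary URLs are pre-signed GET URLs.
$inputUrl = Storage::disk('s3')->temporaryUrl('incoming/input.dat', now()->addMinutes(15));

// Upload side: build a pre-signed PUT URL with the AWS SDK directly.
// Credentials are resolved from the default provider chain (env/config).
$s3 = new S3Client([
    'region'  => config('filesystems.disks.s3.region'),
    'version' => 'latest',
]);

$command = $s3->getCommand('PutObject', [
    'Bucket' => config('filesystems.disks.s3.bucket'),
    'Key'    => 'results/output.dat',
]);

$outputUrl = (string) $s3->createPresignedRequest($command, '+15 minutes')->getUri();

// Hand both URLs to the Lambda container as the event payload.
ProcessFileFunction::execute([
    'input_url'  => $inputUrl,
    'output_url' => $outputUrl,
]);
```

Inside the container the handler then only needs plain HTTP: GET the input URL, run the binary on the downloaded file, and PUT the result to the output URL.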
-
Thank you very much for the quick answer! It sounds very promising and I'm getting a better picture of how I will implement it.
You say 'directly', but actually it sounds rather indirect. I mean, the Lambda function lives somewhere in the AWS ecosystem, as do the files. So making them accessible to the public web with a pre-signed URL seems like a detour. Isn't it possible to reference the S3 files directly? Maybe by using some AWS library inside the Lambda handler?
-
If you install the AWS SDK in your container and provide the authentication keys via some method, then you'll be able to do any S3 operation from the container. In my case it's just a lot simpler to pass a URL to the container and then use a native HTTP fetch, in whatever language you're using. Avoiding yet another AWS authentication step is also nice.
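If your container image happens to include PHP and the aws/aws-sdk-php package, the direct-S3 variant could look roughly like this sketch; the bucket and key names are placeholders, and on Lambda the SDK picks up credentials from the function's execution role through the standard environment variables, so nothing extra has to be baked into the image.

```php
<?php

// Rough sketch of doing S3 I/O directly from inside the container (placeholders noted above).
use Aws\S3\S3Client;

$s3 = new S3Client([
    'region'  => getenv('AWS_REGION') ?: 'us-east-1',
    'version' => 'latest',
]);

// Fetch the input file into /tmp, the only writable path on Lambda.
$s3->getObject([
    'Bucket' => 'my-bucket',          // placeholder
    'Key'    => 'incoming/input.dat', // placeholder
    'SaveAs' => '/tmp/input.dat',
]);

// ... run the native binary against /tmp/input.dat, producing /tmp/output.dat ...

// Push the result back to S3.
$s3->putObject([
    'Bucket'     => 'my-bucket',           // placeholder
    'Key'        => 'results/output.dat',  // placeholder
    'SourceFile' => '/tmp/output.dat',
]);
```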
-
Thank you very much for your help! I have another related question: ideally I would like to version-control the binary file together with my app. I'm thinking about some command to build and deploy the container, similar to how I would deploy the Lambda function using Sidecar, so the complete deployment process can be done with Laravel commands. Is there some package I can use for that?
-
Not sure about that actually. I just use docker commands together with an ECR repository.
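For completeness, those docker/ECR steps could be wrapped in an artisan command so the container deploy sits next to the rest of your Laravel deploy tooling. This is only a rough sketch: the registry and repository names are placeholders, and it assumes the docker CLI and AWS CLI v2 are installed and configured wherever the command runs.

```php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Symfony\Component\Process\Process;

class DeployContainer extends Command
{
    protected $signature = 'lambda:deploy-container {tag=latest}';

    protected $description = 'Build the binary container and push it to ECR';

    public function handle(): int
    {
        $registry = '123456789012.dkr.ecr.eu-central-1.amazonaws.com'; // placeholder
        $repo     = "{$registry}/my-binary";                            // placeholder
        $tag      = $this->argument('tag');

        $steps = [
            // Log docker in to ECR using the AWS CLI.
            "aws ecr get-login-password | docker login --username AWS --password-stdin {$registry}",
            // Build from the repo root, where the binary is committed next to the Dockerfile.
            "docker build -t my-binary:{$tag} .",
            "docker tag my-binary:{$tag} {$repo}:{$tag}",
            "docker push {$repo}:{$tag}",
        ];

        foreach ($steps as $step) {
            $this->info($step);
            $process = Process::fromShellCommandline($step, base_path(), null, null, 600);
            $process->mustRun(fn ($type, $buffer) => $this->output->write($buffer));
        }

        return self::SUCCESS;
    }
}
```

With that in place, something like `php artisan lambda:deploy-container v1` builds and pushes the image, and your Sidecar function's container image URI can point at the pushed tag.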