I'm a newbie to DevOps. Is it possible to copy data generated inside a container to an S3 bucket while the container is still running? If yes, what would be the best approach that uses the least code or the fewest external plugins?
I'm currently using aws-java-sdk from my Java application to connect to the S3 bucket. I am able to copy an existing file to the S3 bucket (with a hardcoded path and file name), but I am not able to get files that are generated during container execution, so I get a "not able to find file" error in my Kubernetes environment log. The container is standalone. The entry point, defined in Helm, is `java -jar my_jar_name.jar`, which runs when the container is invoked or when a new version is deployed. The new data is generated only at execution time of the container.
Do I need to mount a volume in my Java code? Currently I'm not mounting any volume. I'm just providing access keys, making a connection to S3, and then telling it to copy a certain file from a directory to S3. That directory is supposed to contain the file at execution time. I hope you can understand my problem, but if you need more details, please let me know.
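To make the failure concrete, here is a minimal, self-contained sketch of the check that fails in my setup. The directory and file name are placeholders, and the actual aws-java-sdk `putObject` call is stubbed out with a comment so the snippet runs without the SDK; the point is that the upload can only succeed if the file already exists on disk at the moment the code runs:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class UploadCheck {
    // Returns true only if the file the container generates is present
    // on disk. In my real code this is where I'd call
    // s3.putObject(bucket, key, file) from aws-java-sdk; the SDK call is
    // omitted here so the snippet is self-contained.
    static boolean fileReadyForUpload(String dir, String fileName) {
        Path p = Paths.get(dir, fileName);
        return Files.exists(p) && Files.isRegularFile(p);
    }

    public static void main(String[] args) throws Exception {
        // Placeholder for the directory my app is supposed to write to
        // at runtime; "report.csv" is a made-up file name.
        Path tmpDir = Files.createTempDirectory("appdata");
        System.out.println(fileReadyForUpload(tmpDir.toString(), "report.csv"));
        Files.createFile(tmpDir.resolve("report.csv")); // simulate runtime generation
        System.out.println(fileReadyForUpload(tmpDir.toString(), "report.csv"));
    }
}
```

In my Kubernetes log the check effectively behaves like the first call above (file not found), even though the application should have generated the file by then.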
Any help is highly appreciated.