Hi all. I'm hoping to get some help from folks with more Linux experience than me. I'm not a Linux noob, but I'm far from an expert, and I have some huge gaps in my knowledge.
I have a Synology NAS that I'm using for media storage, and a separate Linux server that uses that data. Currently the NAS is mounted with Samba; it mounts automatically at boot via an entry in /etc/fstab. This works okay, but I don't like how Samba handles file ownership. The whole volume mounts as the user who mounts it (specified in fstab for me), and every file in the volume is owned by that user. So if I wanted two users on my server to each have their own directory, I would need a separate mount per user. That's workable in simple scenarios, but if I moved my Lemmy instance volumes to the NAS, the file ownership of the DB and pictrs volumes would be lost and the users inside the containers wouldn't be able to access their data.
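For reference, my current fstab entry looks roughly like this (the address, share name, and credentials path are placeholders, not my real ones):

```
# /etc/fstab on the server: the uid/gid options force ownership of everything on the mount
//192.168.1.50/media  /mnt/media  cifs  credentials=/etc/smb-credentials,uid=1000,gid=1000,_netdev  0  0
```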
Is there a way to configure Samba to preserve ownership? Or is there an alternative to Samba that supports this?
Edit:
Okay, so I set up NFS, and it appears to do what I want. All of the user IDs carry over when I cp -a my files.
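For anyone following along, the setup boils down to something like this (addresses and paths are examples; on the Synology the export is configured through the DSM GUI rather than by editing files directly):

```
# effective export on the NAS (note: NFS squashes root by default,
# mapping root on clients to an unprivileged user)
/volume1/media  192.168.1.0/24(rw,sync,no_subtree_check)

# /etc/fstab on the server
192.168.1.50:/volume1/media  /mnt/media  nfs  defaults,_netdev  0  0
```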
My two users can write to the directories I set up for them, which are owned by them. It seems all good on the surface. So I copied my whole lemmy folder over and tried to start up the containers, and postgres still crashes. The logs repeat "Permission denied" and "chmod operation not permitted" back and forth forever. I logged into the container to see what was going on. Inside the container, root can't access a directory, which is bizarre. The container's root user can access that same directory when I run the container from my local filesystem. As a test, I copied the whole lemmy directory from one place on my local filesystem to another (instead of from local to NFS), and that worked fine.
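For reference, this is roughly how I poked around (the container name comes from my compose setup, so yours will differ):

```
# open a shell in the running postgres container
docker exec -it lemmy-postgres-1 sh

# inside the container, even as root this fails with "Permission denied"
# (/var/lib/postgresql/data is the official postgres image's data directory)
ls /var/lib/postgresql/data
```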
I think this exact issue might be outside the scope of my original question, and I might need to make a post on [email protected] instead, since what I wanted originally has been accomplished with NFS.
If your Synology NAS supports SSH, you might want to check whether you can use sshfs. I used to use Samba and NFS on my Debian home server, but I switched to sshfs a few months ago. File transfers seem a little quicker than with Samba.
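Mounting is a one-liner once the sshfs package is installed (the host and paths here are examples):

```
# mount the NAS over SSH via FUSE
sshfs user@192.168.1.50:/volume1/media /mnt/media

# unmount when done
fusermount -u /mnt/media
```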
SSHFS has a lot of overhead from FUSE as well as the encryption. It's much better to use NFS on the LAN if you care about speed.