File Transfer Patterns
Common file transfer patterns and use cases
Moving files around usually boils down to a few standard scenarios: backing up data, sending files to a partner, or keeping two folders in sync.
Since Weik.io MFT uses rclone under the hood, the way you move files depends on the command you choose.
The core commands
When you write your MFT definition, the command field tells it how to handle files at the destination.
copy
command: copy
This is the safest option. It copies new and modified files from the source to the destination. If a file already exists at the destination, it is overwritten when the source version differs, but files that exist only at the destination are never deleted.
Best for:
- Standard backups
- Log aggregation
- Scenarios where you can’t risk losing data
sync
command: sync
Use sync when you want the destination folder to be an exact mirror of the source. If you delete a file from the source, the next sync will delete it from the destination too.
Be careful with this one. If you accidentally point a sync job at an empty source folder, it will happily wipe out your entire destination folder.
Best for:
- Mirroring a file share to the cloud
- Website deployments
move
command: move
This acts like a cut-and-paste. It copies the files to the destination, and once it verifies the copy was successful, it deletes them from the source.
Best for:
- Ingesting files into a processing pipeline (so you don’t process the same file twice)
- Archiving old data to clear up space on the source drive
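The practical difference between the three commands comes down to what happens to files that exist on only one side. Here is a minimal conceptual sketch in Python; it illustrates the semantics only, not how rclone actually implements transfers (real transfers compare size and modification time, not content directly):

```python
# Conceptual sketch of copy / sync / move semantics.
# Each side is modelled as a dict of {filename: content}.

def copy(source: dict, dest: dict) -> None:
    """Add new files and overwrite changed ones; never delete from dest."""
    for name, data in source.items():
        if dest.get(name) != data:
            dest[name] = data

def sync(source: dict, dest: dict) -> None:
    """Make dest an exact mirror: copy, then delete the extras."""
    copy(source, dest)
    for name in list(dest):
        if name not in source:
            del dest[name]  # the dangerous part

def move(source: dict, dest: dict) -> None:
    """Copy everything, then remove it from the source."""
    copy(source, dest)
    source.clear()

src = {"a.csv": "new", "b.csv": "2"}
dst = {"a.csv": "old", "c.csv": "3"}
sync(src, dst)
print(sorted(dst))  # ['a.csv', 'b.csv'] -- c.csv is gone
```

Notice that sync is the only command that deletes at the destination, and move is the only one that deletes at the source.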
Filtering what you transfer
Sometimes you don’t want the whole folder. You can use standard glob patterns in the filters field to grab exactly what you need.
filters: "*.csv" # Just CSVs
filters: "*.{csv,json,xml}" # Multiple specific types
filters: "report_*.csv" # Anything starting with "report_"
filters: "data_2025-*.csv" # Files matching a specific year pattern
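To get a feel for which filenames a pattern matches, here is a small Python approximation using fnmatch. It is only a sketch of glob behavior, not the exact filter engine: the `{a,b,c}` alternatives are handled by expanding them into separate patterns first:

```python
import re
from fnmatch import fnmatch

def expand_braces(pattern: str) -> list:
    """Expand {a,b,c} groups into separate glob patterns."""
    m = re.search(r"\{([^}]+)\}", pattern)
    if not m:
        return [pattern]
    head, tail = pattern[:m.start()], pattern[m.end():]
    return [p for alt in m.group(1).split(",")
            for p in expand_braces(head + alt + tail)]

def matches(name: str, pattern: str) -> bool:
    """True if the filename matches any expansion of the pattern."""
    return any(fnmatch(name, p) for p in expand_braces(pattern))

files = ["report_q1.csv", "data_2025-01.csv", "notes.txt", "feed.json"]
print([f for f in files if matches(f, "*.{csv,json,xml}")])
# ['report_q1.csv', 'data_2025-01.csv', 'feed.json']
print([f for f in files if matches(f, "report_*.csv")])
# ['report_q1.csv']
```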
How this looks in the real world
Here are a few common ways teams use these commands and filters.
The daily database backup
If your database drops a compressed SQL dump to an SMB share every night, you can use copy to ship it to cloud storage. By filtering for *.sql.gz, you avoid copying any random temp files someone might have left in the folder.
apiVersion: weik.io/v1alpha1
kind: MFT
metadata:
  name: db_backup_to_cloud
spec:
  source:
    name: backup_server_smb
    path: backups/database/
  destination:
    name: backup_s3
    path: database/
  command: copy
  schedule: 0 0 2 * * ?
  filters: "*.sql.gz"
Partner file exchange
When you generate invoices and drop them in a folder, you want to send them to your partner’s SFTP server and then get them out of your local folder. move handles this perfectly.
apiVersion: weik.io/v1alpha1
kind: MFT
metadata:
  name: invoice_to_partner
spec:
  source:
    name: company_smb
    path: exports/invoices/
  destination:
    name: partner_sftp
    path: incoming/
  command: move
  schedule: 0 0 */2 * * ?
  filters: "invoice_*.xml"
Because it uses move, the source folder stays clean, and the next run won’t accidentally re-send old invoices.
Cloud-to-cloud migration
Maybe you’re moving from Azure Blob to AWS S3. You can set up a simple copy job that runs overnight to migrate the archive.
apiVersion: weik.io/v1alpha1
kind: MFT
metadata:
  name: azure_to_s3_migration
spec:
  source:
    name: weikio_blob
    path: archive/
  destination:
    name: backup_s3
    path: migration/
  command: copy
  schedule: 0 0 3 * * ?
Keeping a replica in sync
If you have a local document share that you want mirrored to the cloud every four hours, use sync. Just remember: if someone deletes a document locally, the sync job will delete the cloud copy too.
apiVersion: weik.io/v1alpha1
kind: MFT
metadata:
  name: document_sync
spec:
  source:
    name: company_smb
    path: shared/documents/
  destination:
    name: weikio_blob
    path: documents/
  command: sync
  schedule: 0 0 */4 * * ?
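The schedule values in these examples look like Quartz-style six-field cron expressions, with a seconds field first and ? meaning "no specific value" (this is an assumption from the examples, not part of the MFT API itself). A tiny helper that labels the fields makes them easier to read:

```python
# Label the fields of a Quartz-style cron expression (assumed order:
# second, minute, hour, day-of-month, month, day-of-week).
FIELDS = ["second", "minute", "hour", "day-of-month", "month", "day-of-week"]

def describe(expr: str) -> dict:
    parts = expr.split()
    if len(parts) != len(FIELDS):
        raise ValueError(f"expected {len(FIELDS)} fields, got {len(parts)}")
    return dict(zip(FIELDS, parts))

print(describe("0 0 2 * * ?")["hour"])  # 2 -> fires at 02:00 every day
```

Under this reading, `0 0 */4 * * ?` fires at minute 0 of every fourth hour, which matches the "every four hours" schedule above.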
Next steps
- Scheduling File Transfers - See how to build your own cron schedules
- MFT Setup - Learn how to define the CoreSystems used in these examples
- MFT Overview - Go back to the basics