Upload your rclone.conf file to automatically create secrets and locations in DataRaven — no manual re-entry required.
Prerequisites
- A DataRaven account with Admin or Owner role on the team
- An existing rclone.conf file (usually at ~/.config/rclone/rclone.conf)
Supported Providers
The importer supports object-storage remotes:

| rclone type | DataRaven provider |
|---|---|
| s3 (AWS) | AWS S3 |
| s3 (Cloudflare) | Cloudflare R2 |
| s3 (Wasabi) | Wasabi |
| s3 (Tigris) | Tigris |
| s3 (Other) | S3 Compatible |
| azureblob | Azure Blob Storage |
| gcs | Google Cloud Storage |
| b2 | Backblaze B2 |
Remotes using non-object-storage types (Google Drive, Dropbox, SFTP, etc.) are automatically
skipped with a friendly message — they won’t cause errors.
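The mapping and skip behavior described above can be sketched in a few lines. The dictionaries, field names, and sample config below are illustrative assumptions, not DataRaven's actual code:

```python
import configparser

# Illustrative mapping mirroring the table above.
PROVIDER_BY_TYPE = {
    "azureblob": "Azure Blob Storage",
    "gcs": "Google Cloud Storage",
    "b2": "Backblaze B2",
}
S3_PROVIDER_BY_VENDOR = {
    "AWS": "AWS S3",
    "Cloudflare": "Cloudflare R2",
    "Wasabi": "Wasabi",
    "Tigris": "Tigris",
}

# Hypothetical sample config: one supported remote, one that gets skipped.
SAMPLE_CONF = """
[prod-s3]
type = s3
provider = AWS
access_key_id = AKIAEXAMPLE
secret_access_key = example-secret

[my-drive]
type = drive
"""

def classify(conf_text):
    """Split remotes into (ready, skipped), the way the review screen
    groups them: supported types map to a provider, others are skipped."""
    parser = configparser.ConfigParser()
    parser.read_string(conf_text)
    ready, skipped = {}, []
    for name in parser.sections():
        remote = parser[name]
        rtype = remote.get("type", "")
        if rtype == "s3":
            # rclone's s3 backend carries a "provider" field per vendor.
            vendor = remote.get("provider", "Other")
            ready[name] = S3_PROVIDER_BY_VENDOR.get(vendor, "S3 Compatible")
        elif rtype in PROVIDER_BY_TYPE:
            ready[name] = PROVIDER_BY_TYPE[rtype]
        else:
            skipped.append(name)  # e.g. drive, dropbox, sftp
    return ready, skipped
```

Note that any s3 remote with an unrecognized vendor falls back to the generic S3 Compatible provider rather than being skipped.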
Step 1: Upload Your Config
Upload the file
Drag and drop your rclone.conf file onto the upload area, or click to browse. The file is parsed on the server — credentials are not stored at this stage.

Step 2: Review and Edit
After parsing, you’ll see a breakdown of your remotes:

- Ready to import — Recognized remotes with their mapped provider type and detected credentials
- Skipped — Recognized but unsupported types (e.g., Google Drive)
- Errors — Remotes that couldn’t be parsed (missing fields, unknown types)

Before importing, you can adjust each remote:

- Edit the name — Change the display name for the secret and location
- Set the bucket name — Required for all remotes (enter your bucket or container name)
- Edit region and endpoint — Adjust if needed
- Deselect — Uncheck any remote you don’t want to import
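The review rules above amount to a small validation pass per remote. A minimal sketch, assuming hypothetical field names (not DataRaven's real schema):

```python
def review(remote):
    """Return the problems that would block importing one remote:
    a bucket name is required for every remote, and remotes with
    missing credentials land in the Errors group."""
    problems = []
    if not remote.get("bucket"):
        problems.append("bucket name is required")
    if not remote.get("access_key_id"):
        problems.append("missing credentials")
    return problems
```

A remote with an empty problem list is ready to import once you confirm its name, region, and endpoint.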
Step 3: Confirm Import
Click Import to create resources. For each selected remote, two resources are created:

- A Secret — Credentials are encrypted and stored in AWS SSM Parameter Store
- A Location — Linked to the new secret, ready to use in tasks
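The secret/location pairing can be sketched as follows; the record shapes and field names are illustrative assumptions, not DataRaven's actual resource model:

```python
def plan_import(selected):
    """For each selected remote, plan the two resources the importer
    creates: a secret, and a location linked back to that secret."""
    resources = []
    for name, info in selected.items():
        secret = {"kind": "secret", "name": name,
                  "provider": info["provider"]}
        location = {"kind": "location", "name": name,
                    "bucket": info["bucket"], "secret": name}
        resources.extend([secret, location])
    return resources
```

Because the location references the secret by name, deleting a secret later would orphan its location, which is why the importer always creates them as a pair.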
What’s Next?
Once your locations are imported, you can immediately use them in tasks.

Create a Task
Set up sync, copy, or move operations between your imported locations.