Storage API Examples

Production-ready code examples using AWS CLI, boto3, and other popular tools.

AWS CLI

Installation

pip install awscli

Configuration

# Configure AWS CLI
aws configure set aws_access_key_id "wayscloud"
aws configure set aws_secret_access_key "wayscloud_storage_abc123_YourSecretKey"
aws configure set default.region "eu-west-1"
aws configure set default.s3.signature_version "s3v4"

# Set endpoint
export AWS_ENDPOINT_URL="https://api.wayscloud.services/v1/storage"

Commands

# List buckets
aws s3 ls --endpoint-url $AWS_ENDPOINT_URL

# List objects in bucket
aws s3 ls s3://my-bucket/ --endpoint-url $AWS_ENDPOINT_URL

# Upload file
aws s3 cp document.pdf s3://my-bucket/ --endpoint-url $AWS_ENDPOINT_URL

# Download file
aws s3 cp s3://my-bucket/document.pdf ./download.pdf --endpoint-url $AWS_ENDPOINT_URL

# Delete file
aws s3 rm s3://my-bucket/document.pdf --endpoint-url $AWS_ENDPOINT_URL

# Sync directory
aws s3 sync ./local-folder s3://my-bucket/remote-folder --endpoint-url $AWS_ENDPOINT_URL

rclone

Configuration

# Install rclone
curl https://rclone.org/install.sh | sudo bash

# Configure
rclone config create wayscloud s3 \
provider=Other \
access_key_id=wayscloud \
secret_access_key=wayscloud_storage_abc123_YourSecretKey \
endpoint=https://api.wayscloud.services/v1/storage \
region=eu-west-1
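For reference, the command above writes a remote definition like the following to rclone's config file (typically `~/.config/rclone/rclone.conf`):

```ini
[wayscloud]
type = s3
provider = Other
access_key_id = wayscloud
secret_access_key = wayscloud_storage_abc123_YourSecretKey
endpoint = https://api.wayscloud.services/v1/storage
region = eu-west-1
```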

Commands

# List buckets
rclone lsd wayscloud:

# List files
rclone ls wayscloud:my-bucket

# Upload file
rclone copy document.pdf wayscloud:my-bucket/

# Download file
rclone copy wayscloud:my-bucket/document.pdf ./

# Sync folder
rclone sync ./local-folder wayscloud:my-bucket/remote-folder

# Mount as drive (Linux/macOS)
rclone mount wayscloud:my-bucket /mnt/wayscloud --daemon

Cyberduck (GUI)

  1. Download from cyberduck.io
  2. Go to File → Open Connection
  3. Select Amazon S3
  4. Server: api.wayscloud.services/v1/storage
  5. Access Key ID: wayscloud
  6. Secret Access Key: Your API key
  7. Connect

Backup Scripts

Daily Backup Script

#!/bin/bash
# daily-backup.sh
set -euo pipefail

WAYSCLOUD_API_KEY="wayscloud_storage_abc123_YourSecretKey"
BUCKET="my-backups"
DATE=$(date +%Y-%m-%d)
BACKUP_DIR="/var/backups"

# Create tar archive
tar -czf "/tmp/backup-${DATE}.tar.gz" "${BACKUP_DIR}"

# Upload to WAYSCloud (--fail plus set -e aborts before cleanup if the upload fails)
curl --fail -X PUT "https://api.wayscloud.services/v1/storage/${BUCKET}/backups/backup-${DATE}.tar.gz" \
  -H "Authorization: Bearer ${WAYSCLOUD_API_KEY}" \
  --data-binary "@/tmp/backup-${DATE}.tar.gz"

# Clean up
rm "/tmp/backup-${DATE}.tar.gz"

echo "Backup completed: backup-${DATE}.tar.gz"
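To run the backup nightly, add a crontab entry like the one below (the script path and log file are assumptions; adjust to where you install the script):

```
# Run daily-backup.sh every night at 02:00
0 2 * * * /usr/local/bin/daily-backup.sh >> /var/log/wayscloud-backup.log 2>&1
```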

Automated Cleanup

import os
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta, timezone

import requests

API_KEY = os.getenv('WAYSCLOUD_API_KEY')
bucket = 'my-backups'
days_to_keep = 30

# List all backups
response = requests.get(
    f'https://api.wayscloud.services/v1/storage/{bucket}/?prefix=backups/',
    headers={'Authorization': f'Bearer {API_KEY}'}
)

# Parse the S3-style XML listing
NS = {'s3': 'http://s3.amazonaws.com/doc/2006-03-01/'}
root = ET.fromstring(response.content)

# Use an aware UTC timestamp so it can be compared to LastModified below
cutoff_date = datetime.now(timezone.utc) - timedelta(days=days_to_keep)

for obj in root.findall('.//s3:Contents', NS):
    key = obj.find('s3:Key', NS).text
    last_modified = obj.find('s3:LastModified', NS).text

    # LastModified is ISO 8601 with a trailing 'Z' (UTC)
    obj_date = datetime.fromisoformat(last_modified.replace('Z', '+00:00'))

    # Delete if older than the retention cutoff
    if obj_date < cutoff_date:
        requests.delete(
            f'https://api.wayscloud.services/v1/storage/{bucket}/{key}',
            headers={'Authorization': f'Bearer {API_KEY}'}
        )
        print(f'Deleted old backup: {key}')
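The retention check in the script above can be isolated into a small pure function and tested without touching the network; a sketch (the function name is illustrative):

```python
from datetime import datetime, timedelta, timezone

def is_expired(last_modified_iso: str, days_to_keep: int, now: datetime) -> bool:
    """Return True when an S3 LastModified timestamp is older than the cutoff."""
    modified = datetime.fromisoformat(last_modified_iso.replace('Z', '+00:00'))
    return modified < now - timedelta(days=days_to_keep)

# Example against a fixed "now": a 42-day-old object is past a 30-day window.
now = datetime(2024, 6, 1, tzinfo=timezone.utc)
assert is_expired('2024-04-20T00:00:00Z', 30, now)        # 42 days old -> expired
assert not is_expired('2024-05-25T00:00:00Z', 30, now)    # 7 days old -> kept
```

Passing `now` as a parameter keeps the function deterministic, which makes retention bugs easy to catch in a unit test.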

Next Steps