Upload Objects
Upload files to WAYSCloud Storage using the PUT object endpoint. This guide covers single file uploads, content types, and metadata.
Endpoint
PUT /v1/storage/{bucket}/{key}
Parameters:
- bucket (required) - Bucket name
- key (required) - Object key (file path/name)
Headers:
- Authorization: Bearer {api_key} (required)
- Content-Type (recommended) - MIME type
- Content-Length (optional) - Size in bytes
- x-amz-meta-* (optional) - Custom metadata
Basic Upload
cURL
curl -X PUT "https://api.wayscloud.services/v1/storage/my-bucket/document.pdf" \
-H "Authorization: Bearer wayscloud_storage_abc123_YourSecretKey" \
-H "Content-Type: application/pdf" \
--data-binary @document.pdf
Python (requests)
import requests
import os
API_KEY = os.getenv('WAYSCLOUD_API_KEY')
bucket = 'my-bucket'
key = 'document.pdf'
with open('document.pdf', 'rb') as file:
    response = requests.put(
        f'https://api.wayscloud.services/v1/storage/{bucket}/{key}',
        headers={
            'Authorization': f'Bearer {API_KEY}',
            'Content-Type': 'application/pdf'
        },
        data=file
    )

print(f'Status: {response.status_code}')
print(f'ETag: {response.headers.get("ETag")}')
Python (boto3)
import boto3
import os
s3 = boto3.client(
    's3',
    endpoint_url='https://api.wayscloud.services/v1/storage',
    aws_access_key_id='wayscloud',
    aws_secret_access_key=os.getenv('WAYSCLOUD_API_KEY')
)
# Upload file
s3.upload_file(
    'document.pdf',   # Local file
    'my-bucket',      # Bucket
    'document.pdf'    # Key
)
# Or use put_object for more control
with open('document.pdf', 'rb') as file:
    s3.put_object(
        Bucket='my-bucket',
        Key='document.pdf',
        Body=file,
        ContentType='application/pdf'
    )
JavaScript (axios)
const axios = require('axios');
const fs = require('fs');
const API_KEY = process.env.WAYSCLOUD_API_KEY;
const bucket = 'my-bucket';
const key = 'document.pdf';
const fileBuffer = fs.readFileSync('document.pdf');
// await is only valid inside an async function in CommonJS modules
async function upload() {
  const response = await axios.put(
    `https://api.wayscloud.services/v1/storage/${bucket}/${key}`,
    fileBuffer,
    {
      headers: {
        'Authorization': `Bearer ${API_KEY}`,
        'Content-Type': 'application/pdf'
      }
    }
  );

  console.log(`Status: ${response.status}`);
  console.log(`ETag: ${response.headers.etag}`);
}

upload();
JavaScript (AWS SDK v3)
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const fs = require('fs');
const s3Client = new S3Client({
  endpoint: 'https://api.wayscloud.services/v1/storage',
  region: 'eu-west-1',
  credentials: {
    accessKeyId: 'wayscloud',
    secretAccessKey: process.env.WAYSCLOUD_API_KEY
  }
});
const fileContent = fs.readFileSync('document.pdf');
const command = new PutObjectCommand({
  Bucket: 'my-bucket',
  Key: 'document.pdf',
  Body: fileContent,
  ContentType: 'application/pdf'
});

// await is only valid inside an async function in CommonJS modules
async function upload() {
  const response = await s3Client.send(command);
  console.log('Upload successful:', response);
}

upload();
Upload with Content Type
Setting the correct Content-Type ensures proper file handling:
# PDF
curl -X PUT "https://api.wayscloud.services/v1/storage/my-bucket/doc.pdf" \
-H "Authorization: Bearer $WAYSCLOUD_API_KEY" \
-H "Content-Type: application/pdf" \
--data-binary @doc.pdf
# Image
curl -X PUT "https://api.wayscloud.services/v1/storage/my-bucket/photo.jpg" \
-H "Authorization: Bearer $WAYSCLOUD_API_KEY" \
-H "Content-Type: image/jpeg" \
--data-binary @photo.jpg
# Video
curl -X PUT "https://api.wayscloud.services/v1/storage/my-bucket/video.mp4" \
-H "Authorization: Bearer $WAYSCLOUD_API_KEY" \
-H "Content-Type: video/mp4" \
--data-binary @video.mp4
Common MIME Types
| File Type | MIME Type | Extension |
|---|---|---|
| PDF Document | application/pdf | .pdf |
| JPEG Image | image/jpeg | .jpg, .jpeg |
| PNG Image | image/png | .png |
| MP4 Video | video/mp4 | .mp4 |
| MP3 Audio | audio/mpeg | .mp3 |
| JSON | application/json | .json |
| Plain Text | text/plain | .txt |
| HTML | text/html | .html |
| CSV | text/csv | .csv |
| ZIP Archive | application/zip | .zip |
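Rather than hard-coding a MIME type for every upload, you can derive it from the file extension with Python's standard mimetypes module. A minimal sketch; the fallback to application/octet-stream is a common convention, not an API requirement:

```python
import mimetypes

def guess_content_type(filename: str) -> str:
    # Map the file extension to a MIME type (as in the table above);
    # fall back to a generic binary type for unknown extensions.
    ctype, _ = mimetypes.guess_type(filename)
    return ctype or "application/octet-stream"

# Usage: pass the result as the Content-Type header when uploading, e.g.
# headers={'Content-Type': guess_content_type('photo.jpg')}
```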
Upload from String/Memory
Upload content without creating a file:
Python
import requests
import os
API_KEY = os.getenv('WAYSCLOUD_API_KEY')
# Upload string content
content = "Hello, WAYSCloud!"
response = requests.put(
    'https://api.wayscloud.services/v1/storage/my-bucket/hello.txt',
    headers={
        'Authorization': f'Bearer {API_KEY}',
        'Content-Type': 'text/plain'
    },
    data=content.encode('utf-8')
)
# Upload JSON
import json
data = {'name': 'WAYSCloud', 'services': ['storage', 'llm', 'database']}
response = requests.put(
    'https://api.wayscloud.services/v1/storage/my-bucket/config.json',
    headers={
        'Authorization': f'Bearer {API_KEY}',
        'Content-Type': 'application/json'
    },
    data=json.dumps(data)
)
JavaScript
const axios = require('axios');
const API_KEY = process.env.WAYSCLOUD_API_KEY;
// await is only valid inside an async function in CommonJS modules
async function main() {
  // Upload string content
  const content = 'Hello, WAYSCloud!';
  await axios.put(
    'https://api.wayscloud.services/v1/storage/my-bucket/hello.txt',
    content,
    {
      headers: {
        'Authorization': `Bearer ${API_KEY}`,
        'Content-Type': 'text/plain'
      }
    }
  );

  // Upload JSON
  const data = { name: 'WAYSCloud', services: ['storage', 'llm', 'database'] };
  await axios.put(
    'https://api.wayscloud.services/v1/storage/my-bucket/config.json',
    JSON.stringify(data),
    {
      headers: {
        'Authorization': `Bearer ${API_KEY}`,
        'Content-Type': 'application/json'
      }
    }
  );
}

main();
Upload with Metadata
Add custom metadata to files:
curl -X PUT "https://api.wayscloud.services/v1/storage/my-bucket/photo.jpg" \
-H "Authorization: Bearer $WAYSCLOUD_API_KEY" \
-H "Content-Type: image/jpeg" \
-H "x-amz-meta-author: John Doe" \
-H "x-amz-meta-camera: Canon EOS R5" \
-H "x-amz-meta-location: Oslo, Norway" \
--data-binary @photo.jpg
Python with Metadata
import boto3
import os
s3 = boto3.client(
    's3',
    endpoint_url='https://api.wayscloud.services/v1/storage',
    aws_access_key_id='wayscloud',
    aws_secret_access_key=os.getenv('WAYSCLOUD_API_KEY')
)

with open('photo.jpg', 'rb') as file:
    s3.put_object(
        Bucket='my-bucket',
        Key='photo.jpg',
        Body=file,
        ContentType='image/jpeg',
        Metadata={
            'author': 'John Doe',
            'camera': 'Canon EOS R5',
            'location': 'Oslo, Norway',
            'date': '2025-11-04'
        }
    )
Metadata Guidelines:
- Prefix: x-amz-meta-*
- Key names: lowercase, alphanumeric, hyphens
- Value size: Max 2KB per key
- Total metadata: Max 8KB per object
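These limits can be checked client-side before uploading. A sketch that mirrors the guidelines above; the helper itself is illustrative, not part of the API:

```python
import re

_KEY_RE = re.compile(r"^[a-z0-9-]+$")
MAX_VALUE_BYTES = 2 * 1024   # 2KB per value
MAX_TOTAL_BYTES = 8 * 1024   # 8KB per object

def validate_metadata(metadata: dict) -> dict:
    """Raise ValueError if metadata violates the guidelines above."""
    total = 0
    for key, value in metadata.items():
        if not _KEY_RE.match(key):
            raise ValueError(f"key must be lowercase alphanumeric/hyphens: {key!r}")
        size = len(value.encode("utf-8"))
        if size > MAX_VALUE_BYTES:
            raise ValueError(f"metadata value for {key!r} exceeds 2KB")
        total += len(key.encode("utf-8")) + size
    if total > MAX_TOTAL_BYTES:
        raise ValueError("total metadata exceeds 8KB")
    return metadata
```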
Upload to Nested Paths
Organize files with folder-like structures:
# Upload to folder structure
curl -X PUT "https://api.wayscloud.services/v1/storage/my-bucket/docs/2025/report.pdf" \
-H "Authorization: Bearer $WAYSCLOUD_API_KEY" \
-H "Content-Type: application/pdf" \
--data-binary @report.pdf
# Multiple levels
curl -X PUT "https://api.wayscloud.services/v1/storage/my-bucket/users/john/photos/vacation.jpg" \
-H "Authorization: Bearer $WAYSCLOUD_API_KEY" \
-H "Content-Type: image/jpeg" \
--data-binary @vacation.jpg
Python Example:
import boto3
import os
s3 = boto3.client(
    's3',
    endpoint_url='https://api.wayscloud.services/v1/storage',
    aws_access_key_id='wayscloud',
    aws_secret_access_key=os.getenv('WAYSCLOUD_API_KEY')
)

# Upload with folder structure
s3.upload_file(
    'report.pdf',
    'my-bucket',
    'docs/2025/november/report.pdf'
)
Storage is flat - "folders" are simulated using forward slashes in object keys.
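Because keys are plain strings, building a nested key is just string joining. A small hypothetical helper that normalizes segments so you never emit leading or doubled slashes:

```python
def build_key(*parts: str) -> str:
    # Strip stray slashes from each segment and join with "/".
    # No "create folder" step is needed: the prefix exists as soon
    # as any object key contains it.
    segments = [p.strip("/") for p in parts if p.strip("/")]
    return "/".join(segments)

# build_key("users", "john/", "/photos", "vacation.jpg")
# -> "users/john/photos/vacation.jpg"
```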
Response
Success (200 OK)
HTTP/1.1 200 OK
Content-Length: 0
ETag: "1a2b3c4d5e6f"
Date: Mon, 04 Nov 2025 12:00:00 GMT
ETag: MD5 hash of the uploaded content
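Since the ETag is the MD5 of the uploaded content for single-request uploads like these, you can verify integrity after upload by comparing digests. A sketch using the standard hashlib module (multipart uploads produce a different ETag format, so this check only applies here):

```python
import hashlib

def md5_hex(data: bytes) -> str:
    # Hex digest in the same form as the ETag (without surrounding quotes)
    return hashlib.md5(data).hexdigest()

# After a PUT (sketch):
# etag = response.headers.get("ETag", "").strip('"')
# if md5_hex(file_bytes) != etag:
#     raise RuntimeError("upload corrupted in transit")
```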
Overwrite Existing File (200 OK)
Uploading to an existing key overwrites the file:
# First upload
s3.upload_file('version1.txt', 'my-bucket', 'file.txt')
# This overwrites the previous file
s3.upload_file('version2.txt', 'my-bucket', 'file.txt')
No confirmation required - existing files are silently overwritten.
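If you need to keep earlier revisions rather than overwrite them, one option is to write each revision to a new key. The `-vN` suffix below is purely a naming convention for illustration, not an API feature:

```python
def versioned_key(key: str, version: int) -> str:
    # Insert a version marker before the extension: file.txt -> file-v2.txt
    stem, dot, ext = key.rpartition(".")
    if not dot:
        return f"{key}-v{version}"
    return f"{stem}-v{version}.{ext}"

# Instead of overwriting file.txt:
# s3.upload_file('version2.txt', 'my-bucket', versioned_key('file.txt', 2))
```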
Error Responses
401 Unauthorized
{
  "error": "Invalid or expired API key",
  "code": "AUTH_FAILED"
}
Solution: Check API key is correct and active.
403 Forbidden
{
  "error": "API key does not have write permission",
  "code": "INSUFFICIENT_PERMISSIONS"
}
Solution: Ensure API key has storage write permissions.
404 Not Found
{
  "error": "Bucket not found",
  "code": "BUCKET_NOT_FOUND"
}
Solution: Create bucket first or check bucket name.
413 Request Entity Too Large
{
  "error": "File size exceeds maximum allowed (50GB)",
  "code": "PAYLOAD_TOO_LARGE"
}
Solution: Use multipart upload for large files.
507 Insufficient Storage
{
  "error": "Storage quota exceeded",
  "code": "QUOTA_EXCEEDED"
}
Solution: Delete unused files or upgrade storage plan.
Best Practices
1. Set Correct Content-Type
# Good - explicit Content-Type
s3.upload_file('photo.jpg', 'bucket', 'photo.jpg',
               ExtraArgs={'ContentType': 'image/jpeg'})

# Bad - missing Content-Type (defaults to binary)
s3.upload_file('photo.jpg', 'bucket', 'photo.jpg')
2. Use Meaningful Keys
# Good - descriptive, organized
'users/john-doe/documents/invoice-2025-11.pdf'
'images/products/laptop-macbook-pro-2025.jpg'
# Bad - unclear, unorganized
'file1.pdf'
'img.jpg'
3. Handle Errors
import boto3
from botocore.exceptions import ClientError
try:
    s3.upload_file('large-file.zip', 'my-bucket', 'large-file.zip')
    print("Upload successful")
except ClientError as e:
    if e.response['Error']['Code'] == 'EntityTooLarge':
        print("File too large, use multipart upload")
    else:
        print(f"Upload failed: {e}")
4. Validate Before Upload
import os
def upload_with_validation(local_file, bucket, key):
    # Check file exists
    if not os.path.exists(local_file):
        raise FileNotFoundError(f"{local_file} not found")

    # Check file size
    file_size = os.path.getsize(local_file)
    if file_size > 50 * 1024**3:  # 50GB
        raise ValueError("File too large for single upload")

    # Upload
    s3.upload_file(local_file, bucket, key)
Next Steps
- Multipart Upload - Upload files larger than 5GB
- Download Objects - Retrieve uploaded files
- List Objects - Browse bucket contents
- Storage Examples - Complete code examples