Hard · Senior Backend Engineer · Video Platform
Design a file upload service that handles large files with resumable uploads and processing
Posted 18/04/2026
by Mehedy Hasan Ador
Question Details
At a video platform:
> "Users upload videos up to 5GB. Uploads often fail on slow connections. We need resumable uploads, progress tracking, and automatic transcoding after upload."
Suggested Solution
Architecture
Client → Upload API → S3 (multipart) → SQS → Transcoder → CDN
              ↓
    Progress Tracking (Redis)
Chunked/Resumable Upload
// Step 1: Initialize upload
POST /api/uploads/initiate
{ fileName: "video.mp4", fileSize: 5000000000 }
→ { uploadId: "up123", chunkSize: 5242880, presignedUrls: [...] }  // 5 MiB, S3's minimum part size
// Step 2: Upload chunks (parallel, resumable)
PUT presignedUrls[0] → chunk data
PUT presignedUrls[1] → chunk data
// Failed chunk? Just retry that chunk, not the whole file
// Step 3: Complete upload
POST /api/uploads/complete
{ uploadId: "up123", parts: [{ partNumber: 1, etag: "..." }, ...] }
→ Triggers transcoding pipeline
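The three-step flow above can be sketched from the client's side. This is a minimal sketch, not the platform's actual client: the `PutChunk` type and `uploadChunk`/`chunkRanges` helper names are illustrative, with the network call injected so the retry logic stands alone. The point it demonstrates is the one in the comment above: a failed part is re-sent individually, never the whole file.

```typescript
// Resolves to the part's ETag, which /complete needs later.
type PutChunk = (url: string, body: Uint8Array) => Promise<string>;

// Split a file of `fileSize` bytes into [start, end) byte ranges,
// one per presigned URL. The last chunk may be smaller.
export function chunkRanges(fileSize: number, chunkSize: number): Array<[number, number]> {
  const ranges: Array<[number, number]> = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, fileSize)]);
  }
  return ranges;
}

// Upload one chunk with bounded retries; only this chunk is re-sent
// on failure, which is what makes the upload resumable.
export async function uploadChunk(
  put: PutChunk,
  url: string,
  body: Uint8Array,
  maxRetries = 3
): Promise<string> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await put(url, body);
    } catch (err) {
      lastError = err; // transient network error: retry just this part
    }
  }
  throw lastError;
}
```

A real client would also persist which parts have succeeded (e.g. in localStorage keyed by `uploadId`) so an interrupted session can resume from the last completed part.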
Server Implementation
// Initiate upload
export async function POST(req: Request) {
  const { fileName, fileSize } = await req.json();
  const uploadId = crypto.randomUUID();
  // Note: with real S3 multipart, CreateMultipartUpload returns its own
  // UploadId, which must be carried through UploadPart and Complete.
  const chunkSize = 5 * 1024 * 1024; // 5 MiB chunks (S3's minimum part size)
  const totalParts = Math.ceil(fileSize / chunkSize);

  // Generate a presigned URL for each part
  const presignedUrls = await Promise.all(
    Array.from({ length: totalParts }, (_, i) =>
      s3.getPresignedUrl(`uploads/${uploadId}/${fileName}`, i + 1)
    )
  );

  // Track progress in Redis
  await redis.set(`upload:${uploadId}`, JSON.stringify({
    fileName, fileSize, chunkSize, totalParts, uploadedParts: 0,
  }));

  return Response.json({ uploadId, chunkSize, presignedUrls });
}
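The matching `/complete` handler can be sketched the same way. This is an assumption-laden sketch, not a specific SDK's API: the `S3Like` and `QueueLike` interfaces and the `completeUpload` name are invented here so the validation and ordering logic can be shown (and tested) without real AWS clients.

```typescript
// Narrow, hypothetical interfaces standing in for the S3 and SQS clients.
interface S3Like {
  completeMultipartUpload(key: string, parts: { partNumber: number; etag: string }[]): Promise<void>;
}
interface QueueLike {
  send(message: { uploadId: string }): Promise<void>;
}

export async function completeUpload(
  s3: S3Like,
  queue: QueueLike,
  upload: { uploadId: string; fileName: string; totalParts: number },
  parts: { partNumber: number; etag: string }[]
): Promise<{ status: string }> {
  // Fail early if parts are missing: S3 would also reject, but this
  // gives the client a clearer error than a passed-through SDK failure.
  if (parts.length !== upload.totalParts) {
    return { status: `incomplete: ${parts.length}/${upload.totalParts} parts` };
  }

  // S3 requires the part list in ascending partNumber order.
  const ordered = [...parts].sort((a, b) => a.partNumber - b.partNumber);
  await s3.completeMultipartUpload(`uploads/${upload.uploadId}/${upload.fileName}`, ordered);

  // Enqueue transcoding: this is the SQS hop in the architecture diagram.
  await queue.send({ uploadId: upload.uploadId });
  return { status: "processing" };
}
```

Injecting the clients as interfaces keeps the handler's logic unit-testable; in production they would be thin wrappers over the AWS SDK.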
Processing Pipeline
// After upload completes → SQS message → Transcoder
async function processVideo(uploadId: string) {
// 1. Download from S3
// 2. Transcode to multiple resolutions (720p, 1080p, 4K)
// 3. Generate thumbnails
// 4. Upload processed versions to S3 + CDN
// 5. Update database with video metadata
// 6. Notify user (WebSocket or push)
}
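The six commented steps can be sketched as a worker that runs them in order. This is one possible shape, not the definitive pipeline: the `PipelineSteps` interface and every method on it are hypothetical names, injected so the orchestration is visible and testable without ffmpeg, S3, or a socket server.

```typescript
// Hypothetical step interface; each method wraps one stage of the
// pipeline (download, ffmpeg transcode, thumbnailing, S3/CDN upload,
// DB write, user notification).
export interface PipelineSteps {
  download(uploadId: string): Promise<string>;             // returns a local path
  transcode(path: string, resolution: string): Promise<string>;
  thumbnail(path: string): Promise<string>;
  publish(outputs: string[]): Promise<void>;               // S3 + CDN
  saveMetadata(uploadId: string, outputs: string[]): Promise<void>;
  notify(uploadId: string): Promise<void>;                 // WebSocket or push
}

const RESOLUTIONS = ["720p", "1080p", "4k"];

export async function processVideo(steps: PipelineSteps, uploadId: string): Promise<string[]> {
  const src = await steps.download(uploadId);

  // Sequential here for clarity; per-resolution transcodes are
  // independent and could run in parallel on separate workers.
  const outputs: string[] = [];
  for (const res of RESOLUTIONS) {
    outputs.push(await steps.transcode(src, res));
  }
  outputs.push(await steps.thumbnail(src));

  await steps.publish(outputs);
  await steps.saveMetadata(uploadId, outputs);
  await steps.notify(uploadId); // only after everything durable is written
  return outputs;
}
```

Because the worker is fed by SQS, a crash mid-pipeline leaves the message in the queue for redelivery, so each step should be idempotent (safe to repeat for the same `uploadId`).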