FastAPI File Uploads and Form Data
FastAPI provides UploadFile for robust file handling and Form() for standard HTML form fields. Unlike JSON endpoints, form data must be declared as individual function parameters rather than a single Pydantic model. For efficiency, UploadFile uses a spooled temporary file internally, meaning small files stay in memory while large ones are written to disk automatically. Access contents via await file.read() and metadata through file.filename and file.content_type.
Uploading a Single File with Validation
The UploadFile class is a wrapper around a Python SpooledTemporaryFile. This is critical for performance: it keeps small files in RAM for speed but offloads larger files to the temporary directory of your OS to prevent memory exhaustion. In production, always validate the content_type and enforce a maximum file size.
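The spooling behavior described above comes straight from Python's standard library and can be observed directly. A minimal sketch (note: `_rolled` is a CPython implementation detail, inspected here purely for illustration):

```python
from tempfile import SpooledTemporaryFile

# Spool with a 1 KB threshold: data stays in RAM until it exceeds max_size.
f = SpooledTemporaryFile(max_size=1024)
f.write(b"x" * 512)
in_memory = f._rolled   # False: still an in-memory buffer

f.write(b"x" * 1024)    # total size now exceeds max_size
on_disk = f._rolled     # True: transparently moved to a real temp file on disk
f.close()
```

Starlette wraps exactly this object, which is why a 2 GB upload never has to fit in RAM.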
```python
from pathlib import Path

from fastapi import FastAPI, File, HTTPException, UploadFile, status

app = FastAPI()

UPLOAD_DIR = Path("uploads")
UPLOAD_DIR.mkdir(exist_ok=True)

@app.post("/upload")
async def upload_document(file: UploadFile = File(...)):
    # 1. MIME type validation (don't trust the extension!)
    if file.content_type not in ("application/pdf", "image/jpeg"):
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail="Invalid file type. Only PDF and JPEG are allowed.",
        )

    # 2. Size validation while streaming to disk in 1 MB chunks,
    #    so the whole file never sits in memory at once.
    #    (Note: file.size exists in newer Starlette versions, but counting
    #    bytes as you read is the only value you can fully trust.)
    MAX_SIZE = 10 * 1024 * 1024  # 10 MB
    real_file_size = 0
    save_path = UPLOAD_DIR / file.filename
    try:
        with open(save_path, "wb") as buffer:
            while chunk := await file.read(1024 * 1024):
                real_file_size += len(chunk)
                if real_file_size > MAX_SIZE:
                    raise HTTPException(status_code=413, detail="File too large")
                buffer.write(chunk)
    except HTTPException:
        save_path.unlink(missing_ok=True)  # remove the partial file
        raise

    return {
        "filename": file.filename,
        "saved_at": str(save_path),
        "final_size": real_file_size,
    }
```
Mixing Form Fields with File Uploads
A frequent pain point for developers is trying to send a Pydantic JSON body alongside a file. Due to the way HTTP works, you cannot easily mix application/json and multipart/form-data. Instead, you must declare each form field using Form(). FastAPI will then parse the multipart body and map the keys to your arguments.
```python
from typing import Annotated

from fastapi import FastAPI, File, Form, UploadFile, status

app = FastAPI()

@app.post("/forge/profile-update", status_code=status.HTTP_201_CREATED)
async def update_profile(
    username: Annotated[str, Form()],
    bio: Annotated[str, Form(min_length=10)],
    # Optional file: defaults to None if not present in the request
    avatar: Annotated[UploadFile | None, File()] = None,
):
    response = {
        "user_id": 1024,
        "username": username,
        "bio": bio,
        "avatar_received": False,
    }
    if avatar:
        # Process the avatar here (e.g., upload to S3 or resize)
        response["avatar_received"] = True
        response["avatar_name"] = avatar.filename
    return response
```
🎯 Key Takeaways
- UploadFile is superior to raw bytes as it uses a spooled temporary file, preventing RAM overflow for large assets.
- File metadata like filename and content_type are accessible immediately without reading the file body.
- Synchronous vs Asynchronous: file.read() is an awaitable method; however, for truly massive files, use a streaming chunk approach to keep the memory footprint constant.
- The 'No Pydantic' Rule: When using multipart/form-data, you must declare fields individually using Form(); under the hood, FastAPI depends on the python-multipart library to parse the body.
- Always use a library like python-magic, or check the file header yourself, if you need deep validation of file types beyond the client-provided MIME type.
Interview Questions on This Topic
- Q: Explain the difference between `bytes` and `UploadFile` in a FastAPI endpoint. Which one would you use for a 2GB video upload and why?
- Q: How does the `python-multipart` library interact with FastAPI under the hood? Why is it a required dependency for form handling?
- Q: Describe the security risks associated with allowing user-defined filenames. How would you sanitize a filename before saving it to a local filesystem?
- Q: How would you implement a progress bar or status tracker for a large file upload in a purely asynchronous FastAPI environment?
- Q: What is a 'SpooledTemporaryFile', and how does it prevent the 'Out of Memory' (OOM) killer from terminating your API process during large transfers?
Frequently Asked Questions
How do I save an uploaded file to disk in FastAPI?
While you can use `file.read()`, the most professional approach for production is to stream the file in chunks to a permanent location. This prevents your server from buffering gigabytes of data. Use `shutil.copyfileobj(file.file, destination)` for synchronous writes, or an async chunk loop for better concurrency: `while chunk := await file.read(size): buffer.write(chunk)`.
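The synchronous `shutil.copyfileobj` route can be sketched with in-memory streams; here `BytesIO` stands in for the upload's underlying `file.file` object and the opened destination file:

```python
import io
import shutil

src = io.BytesIO(b"a" * 3072)  # stands in for file.file
dst = io.BytesIO()             # stands in for open(save_path, "wb")

# copyfileobj copies in fixed-size chunks, so memory use stays constant
# regardless of the total upload size.
shutil.copyfileobj(src, dst, length=1024)
```

The `length` argument controls the chunk size; the default (64 KB in CPython) is usually fine.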
Why can't I use a Pydantic model for form data?
This is fundamentally due to how browsers and HTTP handle the multipart/form-data encoding. It does not map natively to a single JSON object. While you can hack this by receiving a JSON string as a Form field and parsing it manually inside the endpoint, the standard FastAPI pattern is to define individual Form() parameters.
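A minimal sketch of that JSON-string workaround, using only the stdlib. In a real endpoint you would receive `raw` as a `Form()` string parameter and validate it with a Pydantic model (e.g. `Model.model_validate_json(raw)`) rather than a dataclass; the field names here are hypothetical:

```python
import json
from dataclasses import dataclass

# Hypothetical metadata the client packs into one multipart form field
# as a JSON string, alongside the file parts.
@dataclass
class ProfileMeta:
    username: str
    bio: str

raw = '{"username": "ada", "bio": "Pioneer of computing."}'
meta = ProfileMeta(**json.loads(raw))
```

The trade-off: you lose automatic OpenAPI documentation and request validation for that field, which is why individual `Form()` parameters remain the standard pattern.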
Can I upload multiple files at once?
Yes. Simply define the type hint as a list: files: list[UploadFile] = File(...). This allows the client to send multiple files under the same key. You can then iterate through the list and process each file individually.
Developer and founder of TheCodeForge. I built this site because I was tired of tutorials that explain what to type without explaining why it works. Every article here is written to make concepts actually click.