AI FastAPI Learning Zone

Click a hack → follow the guided steps → unlock deeper tasks

Hack 1 — Hello World (FastAPI basics)

Install → create app → run. Start simple and unlock the rest.

AI Mentor: Run pip install fastapi uvicorn in your env (or use a venv).
pip install fastapi uvicorn
AI Mentor: Create main.py with this minimal API:
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    return {"message": "Hello World"}
AI Mentor: Run the server:
uvicorn main:app --reload
Open http://127.0.0.1:8000/docs to see the automatic Swagger UI. Try the GET endpoint!
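If you'd rather exercise the endpoint from code than from the browser, FastAPI's bundled test client can call the app directly (it needs the httpx package installed; importing app like this assumes your file is named main.py):

from fastapi.testclient import TestClient
from main import app

client = TestClient(app)

response = client.get("/")
print(response.status_code)  # 200
print(response.json())       # {'message': 'Hello World'}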

Hack 2 — Routes & Methods

Learn GET/POST/PUT/DELETE patterns and path/query params.

AI Mentor: Example: Path + Query
@app.get("/items/{item_id}") def read_item(item_id: int, q: str | None = None): return {"item_id": item_id, "q": q}
AI Mentor: Example: POST body
from pydantic import BaseModel

class Item(BaseModel):
    name: str
    price: float

@app.post("/items/")
def create_item(item: Item):
    return item
AI Mentor: Combine these patterns into a small in-memory CRUD app to practice (a sketch follows below).
# Implement update/delete endpoints and test them with curl or the Swagger UI
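One possible starting point, a minimal sketch that keeps items in a plain dict and reuses the Item model above (the items store and the 404 handling are illustrative, not part of the lesson code):

from fastapi import HTTPException

items: dict[int, Item] = {}

@app.put("/items/{item_id}")
def update_item(item_id: int, item: Item):
    # Replace the stored item, or report that it does not exist
    if item_id not in items:
        raise HTTPException(status_code=404, detail="Item not found")
    items[item_id] = item
    return item

@app.delete("/items/{item_id}")
def delete_item(item_id: int):
    if item_id not in items:
        raise HTTPException(status_code=404, detail="Item not found")
    del items[item_id]
    return {"deleted": item_id}

Swagger UI at /docs picks these endpoints up automatically, so you can try them without writing a client.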

Hack 3 — ML Model Integration

Integrate scikit-learn models with FastAPI endpoints.

AI Mentor: Load a scikit-learn model and prepare it for predictions.
import joblib
import numpy as np
from pydantic import BaseModel

# Load your pre-trained model
model = joblib.load('model.pkl')

# Define the input schema
class PredictionInput(BaseModel):
    features: list[float]
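If you don't have a model.pkl yet, one throwaway way to produce one for testing (this sketch trains on scikit-learn's bundled iris dataset; any estimator saved with joblib works):

import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a small classifier on the iris dataset
X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Save it next to main.py so joblib.load('model.pkl') finds it
joblib.dump(clf, 'model.pkl')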
AI Mentor: Create an endpoint that accepts input and returns predictions.
@app.post("/predict") def predict(input_data: PredictionInput): # Convert input to numpy array features = np.array(input_data.features).reshape(1, -1) # Make prediction prediction = model.predict(features) return {"prediction": prediction.tolist()[0]}
AI Mentor: Add validation and error handling for robust ML APIs.
from fastapi import HTTPException

@app.post("/predict")
def predict(input_data: PredictionInput):
    try:
        features = np.array(input_data.features).reshape(1, -1)
        # Validate that the input shape matches what the model expects
        if features.shape[1] != model.n_features_in_:
            raise HTTPException(
                status_code=400,
                detail=f"Expected {model.n_features_in_} features",
            )
        prediction = model.predict(features)
        return {"prediction": prediction.tolist()[0]}
    except HTTPException:
        # Let deliberate HTTP errors (like the 400 above) pass through unchanged
        raise
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))
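To call the endpoint from code, a quick request sketch using the test client (the four feature values are placeholders; the list length has to match model.n_features_in_):

from fastapi.testclient import TestClient
from main import app

client = TestClient(app)

# Example payload; adjust the number of features to your model
response = client.post("/predict", json={"features": [5.1, 3.5, 1.4, 0.2]})
print(response.status_code, response.json())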

Hack 4 — Deep Learning API

Serve PyTorch/TensorFlow models with FastAPI.

AI Mentor: Load a PyTorch model and set it to evaluation mode.
import torch
import torchvision.models as models
from torchvision import transforms

# Load a pre-trained model and switch it to inference mode
model = models.resnet50(pretrained=True)
model.eval()

# Define the preprocessing pipeline (standard ImageNet normalization)
transform = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
AI Mentor: Create an endpoint that accepts image uploads.
import io

from fastapi import UploadFile, File
from PIL import Image

@app.post("/classify")
async def classify_image(file: UploadFile = File(...)):
    # Read and preprocess the uploaded image
    image_data = await file.read()
    # convert("RGB") guards against grayscale or RGBA uploads breaking the 3-channel normalization
    image = Image.open(io.BytesIO(image_data)).convert("RGB")
    input_tensor = transform(image).unsqueeze(0)
    # Make prediction
    with torch.no_grad():
        outputs = model(input_tensor)
        _, predicted = outputs.max(1)
    return {"class_id": predicted.item()}
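A quick client-side check of the upload endpoint (cat.jpg is a placeholder path; any image file will do):

from fastapi.testclient import TestClient
from main import app

client = TestClient(app)

# Send the image as multipart/form-data under the "file" field
with open("cat.jpg", "rb") as f:
    response = client.post(
        "/classify",
        files={"file": ("cat.jpg", f, "image/jpeg")},
    )
print(response.json())  # e.g. {"class_id": ...}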
AI Mentor: Optimize for high throughput with async operations.
import asyncio
from concurrent.futures import ThreadPoolExecutor

# Thread pool for CPU-bound preprocessing and inference
executor = ThreadPoolExecutor(max_workers=4)

def process_image(image_data: bytes) -> dict:
    # CPU-intensive work: decode, preprocess, and run the model
    image = Image.open(io.BytesIO(image_data)).convert("RGB")
    input_tensor = transform(image).unsqueeze(0)
    with torch.no_grad():
        outputs = model(input_tensor)
        _, predicted = outputs.max(1)
    return {"class_id": predicted.item()}

@app.post("/classify")
async def classify_image(file: UploadFile = File(...)):
    loop = asyncio.get_event_loop()
    # Read the upload asynchronously
    image_data = await file.read()
    # Run preprocessing and inference in the thread pool so the event loop stays free
    result = await loop.run_in_executor(executor, process_image, image_data)
    return result
