Deployment

This guide covers deploying the Email Assistant for production use.

Deployment Options

Local Deployment

macOS LaunchAgent

Create a LaunchAgent for automatic scheduling:

<!-- ~/Library/LaunchAgents/com.emailassistant.digest.plist -->
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.emailassistant.digest</string>

    <key>ProgramArguments</key>
    <array>
        <string>/usr/bin/python3</string>
        <string>/Users/you/EmailAssistant/src/main.py</string>
    </array>

    <key>WorkingDirectory</key>
    <string>/Users/you/EmailAssistant</string>

    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>8</integer>
        <key>Minute</key>
        <integer>0</integer>
    </dict>

    <key>StandardOutPath</key>
    <string>/Users/you/EmailAssistant/logs/stdout.log</string>

    <key>StandardErrorPath</key>
    <string>/Users/you/EmailAssistant/logs/stderr.log</string>

    <key>EnvironmentVariables</key>
    <dict>
        <key>GOOGLE_API_KEY</key>
        <string>your-api-key</string>
    </dict>
</dict>
</plist>

Load the agent:

launchctl load ~/Library/LaunchAgents/com.emailassistant.digest.plist

Linux Cron

# Edit crontab
crontab -e

# Run daily at 8 AM
0 8 * * * cd /home/user/EmailAssistant && /usr/bin/python3 src/main.py >> logs/cron.log 2>&1

Web Server

Run the Flask server as a service:

# Using gunicorn
pip install gunicorn
gunicorn -w 2 -b 0.0.0.0:8001 server:app

# Using systemd (Linux)
sudo nano /etc/systemd/system/emailassistant.service

# /etc/systemd/system/emailassistant.service
[Unit]
Description=Email Assistant Web Server
After=network.target

[Service]
Type=simple
User=www-data
WorkingDirectory=/opt/emailassistant
ExecStart=/opt/emailassistant/venv/bin/gunicorn -w 2 -b 0.0.0.0:8001 server:app
Restart=always
Environment=GOOGLE_API_KEY=your-key

[Install]
WantedBy=multi-user.target

Enable and start the service:

sudo systemctl daemon-reload
sudo systemctl enable emailassistant
sudo systemctl start emailassistant

Docker Deployment

Dockerfile

FROM python:3.11-slim

WORKDIR /app

# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application
COPY . .

# Create non-root user
RUN useradd -m appuser && chown -R appuser:appuser /app
USER appuser

# Expose port
EXPOSE 8001

# Run server
CMD ["gunicorn", "-w", "2", "-b", "0.0.0.0:8001", "server:app"]

Docker Compose

# docker-compose.yml
version: '3.8'

services:
  web:
    build: .
    ports:
      - "8001:8001"
    environment:
      - GOOGLE_API_KEY=${GOOGLE_API_KEY}
    volumes:
      - ./data:/app/data
      - ./credentials:/app/credentials:ro
    restart: unless-stopped
    healthcheck:
      # python:3.11-slim does not ship curl, so probe with Python instead
      test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:8001/api/status')"]
      interval: 30s
      timeout: 10s
      retries: 3

  scheduler:
    build: .
    command: python src/scheduler.py
    environment:
      - GOOGLE_API_KEY=${GOOGLE_API_KEY}
    volumes:
      - ./data:/app/data
      - ./credentials:/app/credentials:ro
    restart: unless-stopped

Run with Docker Compose:

# Start services
docker-compose up -d

# View logs
docker-compose logs -f

# Stop services
docker-compose down

Cloud Deployment

Google Cloud Run

1. Build and Push

# Authenticate
gcloud auth configure-docker

# Build
docker build -t gcr.io/YOUR_PROJECT/email-assistant:latest .

# Push
docker push gcr.io/YOUR_PROJECT/email-assistant:latest

2. Deploy

gcloud run deploy email-assistant \
--image gcr.io/YOUR_PROJECT/email-assistant:latest \
--platform managed \
--region us-central1 \
--allow-unauthenticated \
--set-env-vars "GOOGLE_API_KEY=xxx"

3. Cloud Scheduler

Set up scheduled execution:

gcloud scheduler jobs create http email-digest \
--schedule="0 8 * * *" \
--uri="https://your-service-url.run.app/api/refresh" \
--http-method=POST \
--time-zone="America/New_York"

Railway / Render

Both platforms support automatic deployment from GitHub:

  1. Connect GitHub repository
  2. Set environment variables in dashboard
  3. Deploy automatically on push

Environment Variables:

Variable         Required  Description
---------------  --------  ----------------------
GOOGLE_API_KEY   Yes       Gemini API key
PORT             No        Server port (auto-set)
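
Since these platforms inject PORT at runtime, the server should read it with a fallback to the default used elsewhere in this guide. A minimal sketch (`resolve_port` is a hypothetical helper, not part of the existing codebase):

```python
import os

def resolve_port(default: int = 8001) -> int:
    """Return the port injected by the platform via PORT, or the default."""
    return int(os.environ.get("PORT", default))
```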

Environment Configuration

Production Settings

# config/production.py

import os

class ProductionConfig:
    """Production configuration."""

    # Security
    DEBUG = False
    TESTING = False

    # API keys (from environment)
    GOOGLE_API_KEY = os.environ["GOOGLE_API_KEY"]

    # Logging
    LOG_LEVEL = "WARNING"
    LOG_FILE = "/var/log/emailassistant/app.log"

    # Performance
    CACHE_SIZE = 2000
    REQUEST_TIMEOUT = 30

    # Rate limiting
    RATE_LIMIT = "100/hour"
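
To switch between production and local settings without editing code, the app can pick a config class from an environment variable. A minimal sketch under assumed names (`APP_ENV` and `DevelopmentConfig` are illustrative, not part of the existing codebase):

```python
import os

class DevelopmentConfig:
    """Relaxed settings for local development (hypothetical)."""
    DEBUG = True
    LOG_LEVEL = "DEBUG"

class ProductionConfig:
    """Mirrors the settings from config/production.py relevant here."""
    DEBUG = False
    LOG_LEVEL = "WARNING"

def load_config():
    """Return the config class named by APP_ENV (defaults to production)."""
    env = os.environ.get("APP_ENV", "production")
    return ProductionConfig if env == "production" else DevelopmentConfig
```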

Security Checklist

  • API keys stored in environment variables or secret manager
  • Gmail credentials secured with appropriate permissions
  • HTTPS enabled for web interface
  • Rate limiting configured
  • Log files do not contain sensitive data
  • Regular credential rotation scheduled
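
One way to enforce the first checklist item is to fail fast at startup when a required variable is missing, rather than crashing mid-run. A minimal sketch (the variable list is illustrative; extend it with any other secrets your deployment needs):

```python
import os

REQUIRED_VARS = ["GOOGLE_API_KEY"]

def validate_environment(env=os.environ):
    """Raise at startup if any required variable is missing or empty."""
    missing = [name for name in REQUIRED_VARS if not env.get(name)]
    if missing:
        raise RuntimeError(
            f"Missing required environment variables: {', '.join(missing)}"
        )
```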

Monitoring

Health Check Endpoint

from datetime import datetime

from flask import jsonify

# `app` is the Flask application object defined in server.py
@app.route("/health")
def health():
    """Health check for container orchestrators."""
    checks = {
        "api": check_gemini_connection(),
        "gmail": check_gmail_connection(),
        "disk": check_disk_space(),
    }

    healthy = all(checks.values())
    status_code = 200 if healthy else 503

    return jsonify({
        "status": "healthy" if healthy else "unhealthy",
        "checks": checks,
        "timestamp": datetime.now().isoformat(),
    }), status_code
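
The check helpers themselves are not shown above; as one example, a possible implementation of check_disk_space using the standard library (the path and threshold are assumptions, not values from the codebase):

```python
import shutil

def check_disk_space(path: str = "/", min_free_bytes: int = 500_000_000) -> bool:
    """Return True if at least min_free_bytes are free on the given filesystem."""
    usage = shutil.disk_usage(path)
    return usage.free >= min_free_bytes
```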

Logging Configuration

# config/logging.py

import logging
from logging.handlers import RotatingFileHandler

def setup_logging():
    """Configure production logging."""
    handler = RotatingFileHandler(
        "logs/app.log",
        maxBytes=10_000_000,  # 10 MB
        backupCount=5,
    )

    formatter = logging.Formatter(
        "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
    )
    handler.setFormatter(formatter)

    logger = logging.getLogger()
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)

Alerting

Set up alerts for:

  • Script execution failures
  • API rate limit errors
  • Low cache hit rates
  • High error rates

Example with email notification:

import os
import smtplib
from email.message import EmailMessage

def send_alert(subject: str, body: str):
    """Send alert email on critical errors."""
    msg = EmailMessage()
    msg["Subject"] = f"[Email Assistant] {subject}"
    msg["From"] = "alerts@yourdomain.com"
    msg["To"] = "admin@yourdomain.com"
    msg.set_content(body)

    with smtplib.SMTP("smtp.yourdomain.com", 587) as server:
        server.starttls()
        # Read SMTP credentials from the environment rather than hardcoding them
        server.login(os.environ["SMTP_USER"], os.environ["SMTP_PASSWORD"])
        server.send_message(msg)

Backup and Recovery

Data Backup

#!/bin/bash
# backup.sh - Run daily

BACKUP_DIR="/backups/emailassistant"
DATE=$(date +%Y%m%d)

mkdir -p "$BACKUP_DIR"

# Backup data directory
tar -czf "$BACKUP_DIR/data-$DATE.tar.gz" /app/data/

# Keep only last 30 days
find "$BACKUP_DIR" -name "*.tar.gz" -mtime +30 -delete

Recovery Procedure

  1. Stop the service
  2. Restore data directory from backup
  3. Verify Gmail credentials
  4. Restart the service
  5. Run manual refresh to verify

Scaling Considerations

Horizontal Scaling

For high-volume deployments:

Redis for Shared Cache

import json
import os

import redis

# Fail fast if REDIS_URL is unset instead of passing None to from_url
redis_client = redis.from_url(os.environ["REDIS_URL"])

def get_cached(key: str) -> dict | None:
    """Get from Redis cache."""
    data = redis_client.get(key)
    return json.loads(data) if data else None

def set_cached(key: str, value: dict, ttl: int = 3600):
    """Set in Redis cache with TTL."""
    redis_client.setex(key, ttl, json.dumps(value))
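
On top of such helpers, a cache-aside wrapper avoids recomputing a digest that another worker has already produced. A sketch under assumptions: `get_or_compute` is a hypothetical helper, and the client is passed in explicitly so it works with any object exposing Redis-style get/setex methods:

```python
import json

def get_or_compute(client, key: str, compute, ttl: int = 3600) -> dict:
    """Return the cached value for key, or compute, store, and return it."""
    data = client.get(key)
    if data is not None:
        # Cache hit: deserialize and return without recomputing
        return json.loads(data)
    value = compute()
    client.setex(key, ttl, json.dumps(value))
    return value
```

Because the result is stored with a TTL, stale digests expire on their own and the next request recomputes them.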