Quick Start Guide

Get Vows Social AI running locally in under 30 minutes.

Prerequisites

Required

  • Node.js 18+ (for Cloudflare Workers)
  • Python 3.9+ (for ML services)
  • Git (for version control)

Accounts Needed (Free Tier)

  • Cloudflare (Workers and Wrangler)
  • Qdrant Cloud (1GB vector database cluster)
  • Supabase (PostgreSQL)
  • Fly.io (ML inference hosting)

Step 1: Clone Repository

git clone https://github.com/yourusername/vows_social_ai.git
cd vows_social_ai

Step 2: Install Dependencies

Cloudflare Workers

cd workers/orchestrator
npm install

ML Inference Service

cd services/ml-inference
pip install -r requirements.txt

Step 3: Configure Services

Qdrant Vector Database

  1. Create a free account at https://cloud.qdrant.io
  2. Create a cluster (1GB free tier)
  3. Create collection:
    import os

    from qdrant_client import QdrantClient
    from qdrant_client.models import Distance, VectorParams

    client = QdrantClient(
        url=os.environ["QDRANT_URL"],
        api_key=os.environ["QDRANT_API_KEY"],
    )

    client.create_collection(
        collection_name="content_embeddings",
        vectors_config=VectorParams(
            size=384,  # Sentence-BERT dimensions
            distance=Distance.COSINE,
        ),
    )
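
To confirm the collection was created with the expected settings, you can fetch its info with the same client (an optional check):

    info = client.get_collection(collection_name="content_embeddings")
    print(info.status, info.config.params.vectors)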
    

Supabase PostgreSQL

  1. Create a project at https://supabase.com
  2. Run migrations:
    cd migrations
    psql $DATABASE_URL < 001_initial_schema.sql
    

Cloudflare Workers

  1. Install Wrangler CLI:

    npm install -g wrangler
    wrangler login
    

  2. Configure secrets:

    wrangler secret put QDRANT_API_KEY
    wrangler secret put SUPABASE_URL
    wrangler secret put SUPABASE_KEY
    

Step 4: Run Locally

Start ML Inference Service

cd services/ml-inference
python main.py
# Runs on http://localhost:8000
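
If you are curious what the service looks like, the sketch below is an illustrative minimal version only. It assumes main.py is a FastAPI app served by uvicorn on port 8000 (an assumption based on the default port); the real service also loads the embedding model and exposes its inference endpoints:

# Illustrative sketch only, not the repo's actual main.py
from fastapi import FastAPI
import uvicorn

app = FastAPI()

@app.get("/health")
def health():
    # Matches the /health checks used in "Verify Everything Works"
    return {"status": "ok"}

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)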

Start Cloudflare Workers (Dev Mode)

cd workers/orchestrator
wrangler dev
# Runs on http://localhost:8787

Test the API

# Get personalized feed
curl http://localhost:8787/api/feed/user-123

# Record interaction
curl -X POST http://localhost:8787/api/interactions \
  -H "Content-Type: application/json" \
  -d '{
    "userId": "user-123",
    "contentId": "content-456",
    "action": "view",
    "duration": 5
  }'
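
The same two calls can be made from Python, which is handy for scripted smoke tests (the requests library here is an assumption; any HTTP client works):

import requests

BASE_URL = "http://localhost:8787"

# Get personalized feed
feed = requests.get(f"{BASE_URL}/api/feed/user-123")
print(feed.status_code, feed.json())

# Record an interaction
resp = requests.post(
    f"{BASE_URL}/api/interactions",
    json={
        "userId": "user-123",
        "contentId": "content-456",
        "action": "view",
        "duration": 5,
    },
)
print(resp.status_code)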

Step 5: Seed Sample Data

# Load sample wedding content
python scripts/seed-sample-data.py

# Verify in Qdrant
python scripts/verify-qdrant.py
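
For reference, the core of a seed script like this embeds each item's text with a 384-dimensional Sentence-BERT model and upserts the vectors into the content_embeddings collection. The sketch below is illustrative only: the model name, sample items, IDs, and payload fields are assumptions, and it presumes sentence-transformers is installed via requirements.txt:

import os

from qdrant_client import QdrantClient
from qdrant_client.models import PointStruct
from sentence_transformers import SentenceTransformer

# all-MiniLM-L6-v2 outputs 384-dim vectors, matching the collection config
model = SentenceTransformer("all-MiniLM-L6-v2")

samples = [
    {"id": 1, "caption": "Rustic barn wedding with string lights"},
    {"id": 2, "caption": "Minimalist city-hall ceremony and rooftop reception"},
]

client = QdrantClient(url=os.environ["QDRANT_URL"], api_key=os.environ["QDRANT_API_KEY"])

client.upsert(
    collection_name="content_embeddings",
    points=[
        PointStruct(
            id=item["id"],
            vector=model.encode(item["caption"]).tolist(),
            payload={"caption": item["caption"]},
        )
        for item in samples
    ],
)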

Step 6: Deploy to Production

Deploy ML Service to Fly.io

cd services/ml-inference
fly launch
fly deploy

Deploy Workers to Cloudflare

cd workers/orchestrator
wrangler deploy
# Wrangler v2 and earlier used: wrangler publish

Verify Everything Works

# Run integration tests
npm run test:integration

# Check services
curl https://your-worker.workers.dev/health
curl https://your-ml-service.fly.dev/health

Project Structure

vows_social_ai/
├── workers/
│   └── orchestrator/          # Cloudflare Workers API
├── services/
│   └── ml-inference/          # Fly.io ML service
├── migrations/                # Database schemas
├── scripts/                   # Utility scripts
├── docs/                      # Architecture docs
└── docs-site/                 # MkDocs documentation

Common Issues

Qdrant Connection Fails

  • Check API key is correct
  • Verify cluster is running
  • Test the connection: python scripts/test-qdrant.py (a minimal equivalent is sketched below)
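
A rough equivalent of that test script, assuming QDRANT_URL and QDRANT_API_KEY are exported in your environment:

import os

from qdrant_client import QdrantClient

client = QdrantClient(
    url=os.environ["QDRANT_URL"],
    api_key=os.environ["QDRANT_API_KEY"],
    timeout=10,
)

# Raises an error if the cluster is unreachable or the API key is rejected
print(client.get_collections())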

Workers Can't Reach ML Service

  • Ensure ML service is deployed and accessible
  • Check CORS settings
  • Verify secrets are set

Out of Free Tier Resources

  • Qdrant: 1GB limit (optimize embeddings, e.g. with quantization; see the sketch below)
  • Supabase: 500MB limit (archive old data)
  • Fly.io: 3 VMs, 256MB each (use shared CPU)
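
One way to reduce the memory footprint of the embeddings is scalar quantization, which keeps int8 copies of the vectors in RAM. The sketch below shows the relevant option when (re)creating the collection, reusing the client from Step 3; this is an optional optimization, not something the guide's scripts do for you:

from qdrant_client.models import (
    Distance,
    ScalarQuantization,
    ScalarQuantizationConfig,
    ScalarType,
    VectorParams,
)

client.create_collection(
    collection_name="content_embeddings",
    vectors_config=VectorParams(size=384, distance=Distance.COSINE),
    quantization_config=ScalarQuantization(
        scalar=ScalarQuantizationConfig(type=ScalarType.INT8, always_ram=True)
    ),
)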

Next Steps

Ready to build? Check out the Phase 1 Implementation Plan for what's next.

Getting Help

  • Issues: https://github.com/yourusername/vows_social_ai/issues
  • Docs: https://docs.vows.social
  • Architecture: See docs/ARCHITECTURE.md
