Global Flavor Archive — Infrastructure & System Design
Global Flavor Archive is built as a modern, full-stack web application using SvelteKit for the frontend and presentation layer, Deno as the server runtime, Supabase for the relational database and auth layer, and a self-hosted Ollama instance on a Mac Mini for AI-powered recipe intelligence. The architecture is designed for developer velocity, cost efficiency at early scale, and the ability to run sophisticated AI inference without per-token API costs.
The platform serves three distinct workloads: (1) static/SSR page delivery, (2) real-time API calls for recipe search and user data, and (3) AI inference for substitution suggestions and flavor pairing. The Ollama Mac Mini handles workload (3) exclusively, keeping AI costs predictable and enabling fine-tuned models specific to food and migration domain knowledge.
| Layer | Technology | Purpose | Hosting |
|---|---|---|---|
| Frontend Framework | SvelteKit 2.x | SSR + MPA + API routes | Fly.io |
| Server Runtime | Deno 2.x | TypeScript execution, edge-compatible | Fly.io |
| Database | Supabase (PostgreSQL 15) | Recipes, migrations, users, content | Supabase Cloud (EU) |
| Auth | Supabase Auth | Email + OAuth, JWT sessions | Supabase Cloud |
| File Storage | Supabase Storage | Recipe images, audio narrations | Supabase Cloud |
| AI Inference | Ollama + Llama 3.1 8B | Substitution engine, flavor pairing | Mac Mini (home lab) |
| Vector Search | Supabase pgvector | Semantic recipe + ingredient search | Supabase Cloud |
| CDN / Media | Cloudflare CDN | Image optimization, static asset delivery | Cloudflare |
| Email | Resend | Transactional emails, receipts | Resend Cloud |
| Payments | Stripe | Subscriptions, billing management | Stripe Cloud |
| Analytics | PostHog | Product analytics, funnel tracking | PostHog Cloud |
| Error Tracking | Sentry | Frontend + backend error monitoring | Sentry Cloud |
| Maps | Leaflet.js + Natural Earth | Interactive migration path maps | Client-side (CDN) |
| Rate Limiting | Upstash Redis | API rate limiting, caching layer | Upstash Cloud |
The central entity. Every recipe is a migration artifact.
CREATE TABLE recipes (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
slug TEXT UNIQUE NOT NULL,
title TEXT NOT NULL,
description TEXT,
origin_region TEXT NOT NULL, -- e.g. "Punjab, India"
origin_country TEXT NOT NULL, -- ISO 3166-1 alpha-2
destination_country TEXT, -- where it migrated to
migration_wave TEXT, -- e.g. "1960s–80s Punjabi Sikh immigration to UK"
-- Migration metadata
migrated_at TIMESTAMPTZ, -- approximate historical date
adaptation_type TEXT CHECK (adaptation_type IN ('preserved','hybridized','replaced','fused')),
preservation_score INTEGER CHECK (preservation_score BETWEEN 1 AND 5),
-- Content
story_html TEXT, -- migration narrative (rich text)
story_audio_url TEXT,
cultural_context_json JSONB, -- festivals, occasions, regional variants
controversy_text TEXT, -- origin disputes, etc.
-- Recipe data
prep_time_minutes INTEGER,
cook_time_minutes INTEGER,
servings INTEGER,
difficulty TEXT CHECK (difficulty IN ('beginner','intermediate','advanced')),
cuisine_tags TEXT[], -- array: ['punjabi', 'vegetarian', 'lacto-vegetarian']
-- Ingredients (JSONB for flexibility; also denormalized for search)
ingredients_json JSONB, -- [{name, quantity, unit, substitutes[], substitution_role}]
instructions_json JSONB, -- [{step, text, tip}]
-- AI metadata
embedding VECTOR(1024), -- for semantic search via pgvector; ivfflat indexes support at most 2,000 dimensions, so a <=2,000-dim embedding model is required (1024 shown here)
ai_substitution_prompt TEXT, -- custom system prompt for this recipe's substitutions
-- Media
hero_image_url TEXT,
gallery_urls TEXT[],
-- Access
access_tier TEXT DEFAULT 'free' CHECK (access_tier IN ('free','pro','explorer')),
is_published BOOLEAN DEFAULT false,
published_at TIMESTAMPTZ,
-- Attribution
contributor_id UUID REFERENCES auth.users(id),
editor_notes TEXT,
-- Timestamps
created_at TIMESTAMPTZ DEFAULT NOW(),
updated_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_recipes_origin ON recipes(origin_country);
CREATE INDEX idx_recipes_destination ON recipes(destination_country);
CREATE INDEX idx_recipes_cuisine ON recipes USING GIN(cuisine_tags);
CREATE INDEX idx_recipes_embedding ON recipes USING ivfflat(embedding vector_cosine_ops);
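The `access_tier` and `is_published` columns drive visibility at the API layer. A minimal sketch of that gating logic, assuming a free < pro < explorer ordering (the tier ranking and the `RecipeRow` shape are illustrative, not from the source):

```typescript
// Sketch: gate recipes by subscription tier, mirroring the access_tier
// and is_published columns above. Tier ordering is an assumption.
type AccessTier = "free" | "pro" | "explorer";

interface RecipeRow {
  slug: string;
  access_tier: AccessTier;
  is_published: boolean;
}

// Assumed ordering: free < pro < explorer (explorer sees everything).
const TIER_RANK: Record<AccessTier, number> = { free: 0, pro: 1, explorer: 2 };

function visibleRecipes(rows: RecipeRow[], userTier: AccessTier): RecipeRow[] {
  return rows.filter(
    (r) => r.is_published && TIER_RANK[r.access_tier] <= TIER_RANK[userTier],
  );
}

const rows: RecipeRow[] = [
  { slug: "saag-paneer", access_tier: "free", is_published: true },
  { slug: "chicken-tikka-masala", access_tier: "pro", is_published: true },
  { slug: "draft-dish", access_tier: "free", is_published: false },
];

console.log(visibleRecipes(rows, "free").map((r) => r.slug)); // ["saag-paneer"]
```

In production this filter would live in the SQL WHERE clause rather than application code; the helper just makes the tier semantics explicit.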
Documents specific migration events that moved recipes between locations.
CREATE TABLE migrations (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
-- Names for the migration event
migration_name TEXT, -- e.g. "Great Migration — Southern US to Chicago"
description TEXT,
-- Geographic scope
origin_location TEXT NOT NULL,
origin_lat NUMERIC(9,6),
origin_lng NUMERIC(9,6),
dest_location TEXT NOT NULL,
dest_lat NUMERIC(9,6),
dest_lng NUMERIC(9,6),
-- Time window
start_year INTEGER,
end_year INTEGER,
-- Context
reason TEXT, -- economic, political, war, education, etc.
historical_summary TEXT,
-- Linked recipes live in the recipe_migrations junction table below;
-- PostgreSQL does not support REFERENCES on array elements, so a
-- UUID[] column with a foreign key would be invalid.
created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_migrations_origin ON migrations(origin_location);
CREATE INDEX idx_migrations_dest ON migrations(dest_location);
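The origin/destination coordinate pairs exist so migration paths can be drawn and labeled on the Leaflet map. A small illustration (not from the source) of computing the great-circle length of a path from those columns:

```typescript
// Illustration: great-circle distance between a migration's origin and
// destination coordinates (origin_lat/origin_lng -> dest_lat/dest_lng),
// e.g. for labeling path lengths on the map. Standard haversine formula.
function haversineKm(
  lat1: number, lng1: number, lat2: number, lng2: number,
): number {
  const R = 6371; // mean Earth radius, km
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLng = toRad(lng2 - lng1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Amritsar (Punjab) to Birmingham is roughly 6,400 km.
console.log(Math.round(haversineKm(31.634, 74.872, 52.486, -1.890)));
```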
Normalized ingredient reference with substitution mapping and flavor chemistry data.
CREATE TABLE ingredients (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
name TEXT NOT NULL,
aliases TEXT[], -- ["tomato", "tomates", "tomaat"]
category TEXT, -- "nightshade", "dairy", "grain", "spice"
subcategory TEXT, -- "fresh herb", "dried legume"
-- Flavor chemistry (for pairing engine)
flavor_profile JSONB, -- {umami: 0.7, sweetness: 0.2, bitterness: 0.1, ...}
chemical_notes TEXT, -- "contains glutamates", "high in capsaicin"
-- Dietary flags
dietary_tags TEXT[], -- ["vegetarian", "vegan", "gluten-free", "halal"]
-- Substitution graph (adjacency list)
direct_substitutes UUID[], -- ingredient IDs that substitute this one
inverse_substitutes UUID[], -- ingredients this one substitutes
-- Regional availability
availability_regions TEXT[], -- ["NA_EAST", "NA_WEST", "EU_NORTH", "ASIA_EAST"]
-- Data quality
confidence_score INTEGER CHECK (confidence_score BETWEEN 1 AND 5),
created_at TIMESTAMPTZ DEFAULT NOW(),
updated_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_ingredients_name ON ingredients USING GIN(to_tsvector('english', name || ' ' || COALESCE(array_to_string(aliases, ' '), '')));
CREATE INDEX idx_ingredients_category ON ingredients(category);
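Because `direct_substitutes` is an adjacency list, second-degree substitutes fall out of a short graph walk. A sketch, assuming rows have been loaded into memory (the ingredient IDs here are illustrative names, not real UUIDs):

```typescript
// Sketch: walk the direct_substitutes adjacency list to find transitive
// substitutes up to a given depth. The in-memory Map stands in for rows
// fetched from the ingredients table.
type SubstituteGraph = Map<string, string[]>; // id -> direct_substitutes

function transitiveSubstitutes(
  graph: SubstituteGraph,
  start: string,
  maxDepth: number,
): Set<string> {
  const seen = new Set<string>([start]);
  let frontier = [start];
  for (let depth = 0; depth < maxDepth && frontier.length > 0; depth++) {
    const next: string[] = [];
    for (const id of frontier) {
      for (const sub of graph.get(id) ?? []) {
        if (!seen.has(sub)) {
          seen.add(sub);
          next.push(sub);
        }
      }
    }
    frontier = next;
  }
  seen.delete(start); // an ingredient is not its own substitute
  return seen;
}

const graph: SubstituteGraph = new Map([
  ["coconut-cream", ["coconut-milk", "cashew-cream"]],
  ["coconut-milk", ["whole-milk"]],
]);

// Depth 1 yields direct substitutes; depth 2 also reaches whole-milk
// via coconut-milk.
console.log([...transitiveSubstitutes(graph, "coconut-cream", 2)]);
```

Capping the depth matters: substitution chains degrade quickly ("cream of coconut" is not a good stand-in for "whole milk" even if the graph connects them), which is also why each suggestion carries a `confidence_score`.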
CREATE TABLE recipe_migrations (
recipe_id UUID REFERENCES recipes(id) ON DELETE CASCADE,
migration_id UUID REFERENCES migrations(id) ON DELETE CASCADE,
role TEXT, -- 'origin', 'transit', 'destination'
notes TEXT,
PRIMARY KEY (recipe_id, migration_id, role)
);
CREATE TABLE user_profiles (
id UUID PRIMARY KEY REFERENCES auth.users(id),
display_name TEXT,
avatar_url TEXT,
bio TEXT,
subscription_tier TEXT DEFAULT 'free' CHECK (subscription_tier IN ('free','pro','explorer')),
stripe_customer_id TEXT,
stripe_subscription_id TEXT,
subscription_status TEXT,
dietary_restrictions TEXT[], -- ["halal", "vegetarian"]
preferred_cuisines TEXT[],
-- Personal recipe vault
saved_recipes UUID[], -- recipe IDs; PostgreSQL cannot enforce REFERENCES on array elements, so integrity is checked at the application layer
family_recipes JSONB, -- user-submitted family recipe data
created_at TIMESTAMPTZ DEFAULT NOW(),
updated_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE TABLE ai_substitution_logs (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID REFERENCES auth.users(id),
recipe_id UUID REFERENCES recipes(id),
original_ingredient TEXT NOT NULL,
suggested_ingredient TEXT NOT NULL,
acceptance BOOLEAN, -- did the user use it?
rating INTEGER CHECK (rating BETWEEN 1 AND 5),
response_time_ms INTEGER, -- for latency monitoring
created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE TABLE community_stories (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID REFERENCES auth.users(id),
recipe_id UUID REFERENCES recipes(id),
story_text TEXT NOT NULL,
is_published BOOLEAN DEFAULT false,
moderation_status TEXT DEFAULT 'pending' CHECK (moderation_status IN ('pending','approved','rejected')),
upvotes INTEGER DEFAULT 0,
created_at TIMESTAMPTZ DEFAULT NOW()
);
The substitution engine runs on a dedicated Mac Mini (M2 Pro, 32GB RAM) hosted at our office, running Ollama with llama3.1:8b-instruct-q4_0. The model was fine-tuned on a custom corpus of 50,000 recipe-ingredient pairs and 10,000 curated substitution examples from professional chef datasets.
1. User requests a substitution for an ingredient (POST /api/recipes/[slug]/substitute)
2. API validates subscription tier, checks rate limit (Upstash Redis)
3. API fetches recipe ingredients_json + cultural context from Supabase
4. API builds structured prompt with:
• Recipe context (title, origin, migration story excerpt)
• Ingredient role (why this ingredient matters culturally)
• User's dietary restrictions & location
• Substitution history (if any previous substitutions exist)
5. Prompt sent to Ollama endpoint (http://mac-mini:11434/api/generate)
6. Ollama returns JSON with 3 substitution options + explanations
7. API parses response, logs to ai_substitution_logs
8. Response returned to client (target <1.5s P95)
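Steps 4 and 5 can be sketched as a pure prompt builder plus the Ollama request payload. The `PromptContext` field names are illustrative assumptions; the request shape (`model`, `prompt`, `format`, `stream`) follows the standard Ollama `/api/generate` REST API:

```typescript
// Sketch of steps 4-5: assemble the structured substitution prompt and
// the Ollama /api/generate payload. PromptContext fields are assumptions.
interface PromptContext {
  title: string;
  origin: string;
  storyExcerpt: string;
  ingredient: string;
  ingredientRole: string;
  dietaryRestrictions: string[];
  userRegion: string;
}

function buildSubstitutionRequest(ctx: PromptContext) {
  const prompt = [
    `Recipe: ${ctx.title} (origin: ${ctx.origin})`,
    `Migration context: ${ctx.storyExcerpt}`,
    `Ingredient to replace: ${ctx.ingredient} — ${ctx.ingredientRole}`,
    `Dietary restrictions: ${ctx.dietaryRestrictions.join(", ") || "none"}`,
    `User region: ${ctx.userRegion}`,
    `Return JSON with a "substitutions" array of 3 options, each with ` +
      `ingredient, role_preservation, explanation, cultural_note, adaptation_type.`,
  ].join("\n");

  return {
    model: "llama3.1:8b-instruct-q4_0",
    prompt,
    format: "json" as const, // ask Ollama to constrain output to valid JSON
    stream: false,
  };
}

// The API route would then POST this body to
// `${OLLAMA_BASE_URL}/api/generate` and parse the returned text.
```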
Response format from Ollama:
{
"substitutions": [
{
"ingredient": "coconut milk",
"role_preservation": 0.85,
"explanation": "Coconut milk mirrors the rich, fatty base of coconut cream while maintaining the dish's South Indian identity. It adds a subtle sweetness that complements the spice profile.",
"cultural_note": "Common in Kerala cuisine; maintains vegetarian identity if keeping dairy-free.",
"adaptation_type": "local_equivalent"
},
...
]
}
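Step 7's parsing should be defensive, since model output is not guaranteed to match the format above. A minimal validation sketch (the rules enforced here, such as `role_preservation` in [0,1], are assumptions derived from the example response):

```typescript
// Sketch of step 7: validate the model's JSON before logging it to
// ai_substitution_logs and returning it to the client.
interface Substitution {
  ingredient: string;
  role_preservation: number;
  explanation: string;
  cultural_note: string;
  adaptation_type: string;
}

function parseSubstitutions(modelText: string): Substitution[] {
  const parsed = JSON.parse(modelText);
  if (!Array.isArray(parsed.substitutions)) {
    throw new Error("missing substitutions array");
  }
  // Drop malformed entries rather than failing the whole request.
  return parsed.substitutions.filter(
    (s: Substitution) =>
      typeof s.ingredient === "string" &&
      typeof s.role_preservation === "number" &&
      s.role_preservation >= 0 &&
      s.role_preservation <= 1,
  );
}

const raw = JSON.stringify({
  substitutions: [
    { ingredient: "coconut milk", role_preservation: 0.85,
      explanation: "…", cultural_note: "…", adaptation_type: "local_equivalent" },
    { ingredient: "bad entry", role_preservation: 7, // out of range, dropped
      explanation: "…", cultural_note: "…", adaptation_type: "…" },
  ],
});
console.log(parseSubstitutions(raw).length); // 1 valid option survives
```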
The pairing engine uses a two-stage approach: first, pgvector similarity search on recipe embeddings to find candidates, then the Ollama model scores cultural compatibility between dishes.
-- Stage 1: semantic vector search for recipe candidates
SELECT id, title, origin_region, 1 - (embedding <=> query_embedding) AS similarity
FROM recipes
WHERE access_tier IN ('free', 'pro', 'explorer')
ORDER BY embedding <=> query_embedding
LIMIT 50;
-- Stage 2: Ollama scores cultural linkage
-- "These two recipes share migration lineage: Recipe A traveled
-- from Punjab to Birmingham in the 1970s; Recipe B traveled from
-- Punjab to Toronto in the 1990s. Are they culturally complementary
-- for a single meal? Score 1-10 and explain."
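The two stage outputs then need to be merged into one ranking. A sketch of a combined score, where the 0.6/0.4 weighting between vector similarity and the model's cultural score is an illustrative assumption, not a documented value:

```typescript
// Sketch: merge stage 1 (pgvector cosine similarity) and stage 2 (the
// model's 1-10 cultural compatibility score) into one ranking.
// The 0.6/0.4 weighting is an assumption for illustration.
interface PairingCandidate {
  id: string;
  similarity: number; // 1 - cosine distance, from stage 1
  culturalScore: number; // 1-10, from the Ollama prompt in stage 2
}

function rankPairings(candidates: PairingCandidate[]): PairingCandidate[] {
  const score = (c: PairingCandidate) =>
    0.6 * c.similarity + 0.4 * (c.culturalScore / 10);
  return [...candidates].sort((a, b) => score(b) - score(a));
}

const ranked = rankPairings([
  { id: "dal-makhani", similarity: 0.72, culturalScore: 9 },
  { id: "butter-chicken", similarity: 0.81, culturalScore: 4 },
]);
// Shared migration lineage outweighs raw embedding similarity here.
console.log(ranked[0].id); // "dal-makhani"
```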
Acceptance data for every suggestion is recorded in ai_substitution_logs. The Mac Mini runs a Cloudflare Tunnel (cloudflared) to expose it as ollama-internal.globalflavorarchive.com with automatic reconnect. Fallback: static cached substitution recommendations from a pre-computed table for any recipe with a >90% acceptance rate.
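The fallback path can be sketched as a timeout race: try the tunneled Ollama endpoint, and if it fails or exceeds the latency budget, serve the cached suggestions when their logged acceptance rate clears the 90% bar. Function names and the cache shape are assumptions:

```typescript
// Sketch: live Ollama call with a timeout, falling back to the
// pre-computed cache only when acceptance rate > 0.9.
type Fetcher = () => Promise<string[]>;

async function substituteWithFallback(
  callOllama: Fetcher,
  cached: { suggestions: string[]; acceptanceRate: number } | null,
  timeoutMs = 1500, // matches the <1.5s P95 target
): Promise<string[]> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error("ollama timeout")), timeoutMs);
  });
  try {
    return await Promise.race([callOllama(), timeout]);
  } catch {
    if (cached && cached.acceptanceRate > 0.9) return cached.suggestions;
    throw new Error("substitution unavailable");
  } finally {
    clearTimeout(timer); // avoid leaving a live timer behind
  }
}

// Simulated outage: the live call rejects, the cache satisfies the request.
const result = await substituteWithFallback(
  () => Promise.reject(new Error("tunnel down")),
  { suggestions: ["coconut milk"], acceptanceRate: 0.94 },
);
console.log(result); // ["coconut milk"]
```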
SvelteKit App (Deno runtime)
├── Primary region: ams (Amsterdam)
├── Replica regions: sjc (San Jose), fra (Frankfurt)
├── Autoscaling: 1-4 VMs per region based on CPU
└── Volume: 10GB for SvelteKit build cache

Deno Deploy (API routes for heavy SSR)
└── Edge functions: 35+ regions via Deno Deploy

Supabase
├── Primary: eu-west-1 (Ireland)
└── Replica: us-east-1 for read queries

Ollama (Mac Mini @ office)
└── Cloudflare Tunnel → ollama-internal.globalflavorarchive.com:11434
Deploys ship from main after passing CI + a manual approval gate. Database schema changes are applied with supabase db push in CI; migration files live in /supabase/migrations/. Required environment variables:

SUPABASE_URL=
SUPABASE_ANON_KEY=
SUPABASE_SERVICE_ROLE_KEY=
STRIPE_SECRET_KEY=
STRIPE_WEBHOOK_SECRET=
POSTHOG_API_KEY=
RESEND_API_KEY=
OLLAMA_BASE_URL=http://ollama-internal.globalflavorarchive.com:11434
CLOUDFLARE_TUNNEL_TOKEN=
UPSTASH_REDIS_REST_URL=
UPSTASH_REDIS_REST_TOKEN=
SENTRY_DSN=
| Route | Method | Auth | Description |
|---|---|---|---|
| /api/recipes | GET | No | List/search recipes with filters |
| /api/recipes/[slug] | GET | No | Single recipe with full story |
| /api/recipes/[slug]/substitute | POST | Pro+ | Get AI substitution suggestions |
| /api/migrations | GET | No | List migration events |
| /api/migrations/[id]/map | GET | No | GeoJSON for migration path map |
| /api/ingredients/search | GET | No | Semantic ingredient search |
| /api/pair | POST | Pro+ | Flavor pairing suggestions |
| /api/user/vault | GET/PUT | Yes | User's saved recipes vault |
| /api/stories | POST | Yes | Submit community story |
| /api/webhooks/stripe | POST | Stripe | Handle subscription events |
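For the list endpoint, filter parsing is the only non-trivial piece. A sketch of mapping the query string onto the schema's columns, assuming parameter names of `cuisine` (repeatable), `origin`, and `destination` (these names are illustrative, not from the source):

```typescript
// Sketch: parse GET /api/recipes query params into typed filters that map
// onto cuisine_tags, origin_country, and destination_country.
interface RecipeFilters {
  cuisineTags: string[];
  originCountry?: string; // ISO 3166-1 alpha-2
  destinationCountry?: string;
}

function parseRecipeFilters(url: URL): RecipeFilters {
  return {
    cuisineTags: url.searchParams.getAll("cuisine"), // repeatable param
    originCountry: url.searchParams.get("origin") ?? undefined,
    destinationCountry: url.searchParams.get("destination") ?? undefined,
  };
}

const filters = parseRecipeFilters(
  new URL(
    "https://globalflavorarchive.com/api/recipes?cuisine=punjabi&cuisine=vegetarian&origin=IN",
  ),
);
console.log(filters.cuisineTags, filters.originCountry);
// ["punjabi", "vegetarian"] "IN"
```

In the SvelteKit route handler, these filters would translate into `cuisine_tags @> ARRAY[...]` (served by the GIN index) and equality predicates on the origin/destination columns.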