# Architecture

An overview of how PersonaForge is structured and how data flows through the system.

## Project Structure

```
PersonaForge/
├── Demos/
│   ├── RPG Character Generator/    ← Main demo scene & scripts
│   ├── Simple LlamaCPP/            ← Basic LlamaCPP example
│   ├── Simple Ollama/              ← Basic Ollama example
│   ├── Simple OpenAI/              ← Basic OpenAI-compatible API example
│   └── Simple Automatic/           ← Basic Automatic1111 example
├── Scripts/
│   ├── Runtime/
│   │   ├── Llamacpp/               ← LlamaCPP integration
│   │   ├── Ollama/                 ← Ollama API wrapper + OllamaConfig
│   │   ├── OpenAI/                 ← Unified OpenAI-compatible API
│   │   └── Automatic/              ← Image generation
│   └── Editor/                     ← Custom inspectors & AI Chat Window
└── Settings/                       ← ScriptableObject configs
```

## Data Flow

The character generation pipeline flows through these stages:

```
┌─────────────────────────┐
│  CharacterConfigurator  │  ← Collects UI inputs
└───────────┬─────────────┘
            │
            ▼
┌─────────────────────────┐
│        Prompt.cs        │  ← Builds prompts, calls LLM, triggers SD
└───────────┬─────────────┘
            │
    ┌───────┼───────┐
    ▼       ▼       ▼
┌──────┐ ┌──────┐ ┌──────────────────────────┐
│LlamaC│ │Ollama│ │OpenAI API (unified)      │  ← Triple inference backends
└──────┘ └──────┘ │ OpenRouter, OpenAI,      │
                  │ LocalAI, LM Studio, etc. │
                  └──────────────────────────┘
            │
            ▼
┌─────────────────────────┐
│     Automatic1111       │  ← Portrait generation
└───────────┬─────────────┘
            │
            ▼
┌─────────────────────────┐
│    RPGCharacter.cs      │  ← Interactive roleplay chat
└─────────────────────────┘
```
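Under the hood, the LLM and portrait stages of this pipeline are HTTP calls to the chosen backends. The sketch below illustrates the shape of those requests using the public Ollama (`POST /api/generate`) and Automatic1111 (`POST /sdapi/v1/txt2img`) APIs; it is not the asset's C# code, and the model name and image parameters are placeholder assumptions.

```python
# Illustrative request payloads for two pipeline stages. Endpoint shapes
# follow the public Ollama and Automatic1111 HTTP APIs; model names and
# image settings below are placeholder assumptions, not asset defaults.

def build_llm_request(character_prompt: str, model: str = "llama3") -> dict:
    """Stage 2: Prompt.cs sends the assembled prompt to the LLM backend.
    This mirrors the body of Ollama's POST /api/generate."""
    return {
        "model": model,
        "prompt": character_prompt,
        "stream": False,  # request the full response in one reply
    }

def build_portrait_request(sd_prompt: str, steps: int = 20) -> dict:
    """Stage 3: the portrait stage calls Automatic1111's
    POST /sdapi/v1/txt2img with the generated image prompt."""
    return {
        "prompt": sd_prompt,
        "steps": steps,
        "width": 512,
        "height": 512,
    }

llm_body = build_llm_request("Generate a dwarf ranger with a tragic backstory.")
sd_body = build_portrait_request("portrait of a dwarf ranger, fantasy art")
```

The same two-call pattern holds regardless of which LLM backend is selected; only the endpoint and payload schema of the first call change.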

## Triple Inference Support

The asset supports three LLM backends. Select your preferred backend via the Inference Method dropdown in the character configuration UI:

| Backend | Use Case | Configuration |
|---------|----------|---------------|
| LlamaCPP | Remote llama.cpp server | Set host and port on LLM component |
| Ollama | Local Ollama server | Set URL and model in OllamaConfig component |
| OpenAI | Any OpenAI-compatible server (OpenRouter, OpenAI, LocalAI, LM Studio, etc.) | Set server preset, API key, and model in OpenAIConfig component |
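The OpenAI backend can unify many providers because they all expose the same `POST {base_url}/chat/completions` shape; only the base URL (the "server preset") and credentials differ. The sketch below shows that idea with a hypothetical preset map. The asset's actual OpenAIConfig preset list may differ; the URLs shown are the providers' commonly documented defaults, and the model name is a placeholder.

```python
# Hypothetical preset map for the unified OpenAI-compatible backend.
# The asset's real OpenAIConfig presets may differ; these base URLs are
# the providers' commonly documented defaults.
SERVER_PRESETS = {
    "OpenAI":     "https://api.openai.com/v1",
    "OpenRouter": "https://openrouter.ai/api/v1",
    "LM Studio":  "http://localhost:1234/v1",  # LM Studio's default local port
    "LocalAI":    "http://localhost:8080/v1",  # LocalAI's default local port
}

def build_chat_request(preset: str, model: str, user_message: str) -> dict:
    """Every preset shares the OpenAI-compatible chat-completions schema,
    so switching providers only swaps the base URL."""
    return {
        "url": SERVER_PRESETS[preset] + "/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": user_message}],
        },
    }

req = build_chat_request("OpenRouter", "meta-llama/llama-3-8b-instruct",
                         "Describe an elven bard.")
```

This is why the table above lists a single "OpenAI" row for many services: one request builder covers them all.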