Custom LLM Configuration
Platform ID: FU-10146 | Document Version: 1.0 | Date: 13-02-2026
1. Introduction
Custom LLM Configuration enables organizations to configure how artificial intelligence operates within structured workflows inside Unifize.
This feature allows administrators to define organization-wide AI behavior, configure workflow-level instructions, and select AI models for controlled AI-assisted output generation.
AI operates only within defined workflows and only when explicitly triggered by a user.
Custom LLM Configuration allows organizations to:
Define an organization-level System Prompt
Configure workflow-level Add Prompts
Select approved AI models
Control how AI suggestions are generated
Maintain structured and governed AI output
This ensures AI-generated summaries, analyses, and suggestions are:
Consistent with organizational expectations
Scoped to workflow data
Controlled through explicit configuration
2. Capabilities
With Custom LLM Configuration, users can:
Configure an organization-wide System Prompt
Add contextual prompts within specific workflow fields
Select from approved AI models
Upload supported files as AI context (PDF, .md, .json)
Generate structured summaries and analyses
Rely on a consistently enforced prompt hierarchy
Administrators manage configuration. Organization members use AI within permitted workflows.
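As a purely illustrative sketch, the snippet below models how these configuration elements might relate to one another. The class and attribute names are hypothetical and do not reflect Unifize's internal data model.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class FieldAIConfig:
    """Hypothetical per-field AI settings for an AI-enabled workflow field."""
    add_prompt: Optional[str] = None          # workflow-level instructions
    model: Optional[str] = None               # approved AI model, scoped to this field only
    context_files: list[str] = field(default_factory=list)  # supported uploads: PDF, .md, .json

@dataclass
class OrgAIConfig:
    """Hypothetical organization-wide AI settings."""
    system_prompt: Optional[str] = None       # governs AI behavior globally
    field_configs: dict[str, FieldAIConfig] = field(default_factory=dict)  # keyed by field ID
```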
3. User Journey
The configuration and usage flow involves both Admins and Organization Members.
Step 1: Access Organization Settings (Admin Only)
Admin logs into Unifize.
Navigate to:
Profile → Org Settings → Org Details → System Prompt
Only Admin users can edit this configuration.
If a non-admin attempts to modify the System Prompt, the action is restricted.
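A minimal sketch of the admin-only restriction described above, assuming a simple role check. The function and role names are illustrative and are not part of the Unifize API.

```python
def update_system_prompt(user_role: str, new_prompt: str) -> str:
    """Allow only Admin users to edit the organization-level System Prompt."""
    if user_role.lower() != "admin":
        raise PermissionError("Only Admin users can edit the System Prompt.")
    return new_prompt  # prompt accepted for saving
```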
Step 2: Configure System Prompt
The Admin can:
Enter organization-level instructions for AI behavior
Use structured formatting or multilingual content
Save the System Prompt
Once saved:
A confirmation message appears
Configuration persists across sessions
All admins see the updated prompt
The System Prompt applies across the organization and governs AI behavior globally.
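The System Prompt itself is free-form text defined by the Admin. The snippet below is an illustrative example of what organization-level instructions might look like; it is not a default or recommended prompt.

```python
# Illustrative example of organization-level instructions an Admin might save.
SYSTEM_PROMPT = """\
You are assisting with quality-management workflows.
- Write in concise, professional English.
- Structure output with clear headings and bullet points.
- Base all statements only on the record data provided.
- Flag any missing information instead of guessing.
"""
```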
Step 3: Configure Workflow-Level Settings
Within a workflow that contains an AI-enabled field:
Navigate to the field settings
Add an Add Prompt
Select an AI Model
Save configuration
The Add Prompt:
Provides field-specific instructions
Works in combination with the System Prompt
Model selection:
Applies only to the selected field
Does not affect other fields
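The sketch below illustrates the field-scoped nature of model selection: each AI-enabled field carries its own Add Prompt and model, so changing one field's settings leaves other fields untouched. The field IDs and model names are hypothetical.

```python
# Hypothetical per-field configuration: model selection is scoped to a single field.
field_settings = {
    "summary_field": {"add_prompt": "Summarize the complaint in three bullet points.",
                      "model": "model-a"},
    "risk_field":    {"add_prompt": "List potential risks and their severity.",
                      "model": "model-b"},
}

# Changing the model for one field does not affect the other.
field_settings["summary_field"]["model"] = "model-c"
assert field_settings["risk_field"]["model"] == "model-b"
```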
Step 4: Upload Context Files (Optional)
Within field configuration, users may upload:
PDF
.md
.json
Supported files are included in AI execution context.
Unsupported formats are rejected with a validation error.
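A minimal sketch of the supported-format check described above, assuming extension-based validation; the actual validation logic in Unifize may differ.

```python
from pathlib import Path

SUPPORTED_EXTENSIONS = {".pdf", ".md", ".json"}

def validate_context_file(filename: str) -> None:
    """Reject unsupported context files with a validation error."""
    if Path(filename).suffix.lower() not in SUPPORTED_EXTENSIONS:
        raise ValueError(f"Unsupported file format: {filename}")

validate_context_file("sop-checklist.pdf")   # accepted
# validate_context_file("notes.docx")        # would raise ValueError
```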
Step 5: Invoke AI
Within a record containing an AI-enabled field:
User enters relevant input
Clicks the AI action button
AI generates structured output
AI execution uses:
Record data
System Prompt (if configured)
Add Prompt (if configured)
Selected AI model
AI runs only when explicitly triggered by the user.
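Conceptually, an execution combines the configured prompts, record data, and selected model into a single request. The sketch below is only an assumption about how that assembly might look; none of these names correspond to Unifize's actual API.

```python
from typing import Optional

def build_execution_request(record_data: dict,
                            selected_model: str,
                            system_prompt: Optional[str] = None,
                            add_prompt: Optional[str] = None) -> dict:
    """Assemble the inputs used when a user explicitly triggers AI on a field."""
    instructions = [p for p in (system_prompt, add_prompt) if p]  # skip unconfigured prompts
    return {
        "model": selected_model,
        "instructions": instructions,   # System Prompt first, then Add Prompt
        "input": record_data,           # scoped to the record's workflow data
    }
```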
Step 6: View Generated Output
Generated output:
Appears within the workflow field
Follows structured system format
Reflects configured prompts
May vary slightly in wording between executions
If the System Prompt is updated, future executions reflect the updated instructions.
If no System Prompt exists, AI continues to function using field-level configuration.
4. Prompt Governance
AI behavior follows a defined hierarchy:
1. System Prompt (Organization Level)
2. Add Prompt (Workflow Level)
3. Record Data / User Input
4. Model-Specific Behavior
The System Prompt governs tone, structure, and overall behavior.
Changing the AI model affects output characteristics but does not override prompt governance.
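To express the hierarchy in code terms, the sketch below layers the four levels in order of precedence. It uses no real Unifize interfaces and is only a conceptual illustration; note that swapping the model changes only the last layer, not the governing prompts.

```python
def layered_context(system_prompt: str, add_prompt: str,
                    record_data: dict, model: str) -> list:
    """Order the inputs by the governance hierarchy, highest precedence first."""
    return [
        ("system_prompt", system_prompt),   # 1. organization level
        ("add_prompt", add_prompt),         # 2. workflow level
        ("record_data", record_data),       # 3. record data / user input
        ("model", model),                   # 4. model-specific behavior
    ]

# Swapping the model changes output characteristics only; the prompts above it are unchanged.
ctx_a = layered_context("Org instructions", "Field instructions", {"status": "open"}, "model-a")
ctx_b = layered_context("Org instructions", "Field instructions", {"status": "open"}, "model-b")
assert ctx_a[:3] == ctx_b[:3]
```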
Customers are responsible for validating AI use within their processes.
Variability in generated output, such as minor differences in wording between executions, is inherent to AI-assisted systems.