User Guide

Comprehensive guides for using Fortified LLM Client in your projects.

Contents

This section covers:

  • Quick Reference: CLI and library basics
  • Common Workflows: basic LLM interaction, PDF analysis, structured JSON output, and input validation
  • Next Steps: pointers to the CLI, library, and configuration guides
Quick Reference

CLI Basics

# Minimal invocation
fortified-llm-client --api-url URL --model MODEL --user-text "prompt"

# With config file
fortified-llm-client -c config.toml --user-text "prompt"

# Save output to file
fortified-llm-client -c config.toml --user-text "prompt" -o output.json
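The `-c` flag loads settings from a TOML file. A minimal sketch of what such a file might look like, assuming the top-level key names mirror the corresponding CLI flags (the key names here are an assumption; check them against the Configuration guide):

```toml
# config.toml -- hypothetical minimal configuration.
# Key names are assumed to mirror the CLI flags; verify against the docs.
api_url = "http://localhost:11434/v1/chat/completions"
model = "llama3"
```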

Library Basics

use fortified_llm_client::{evaluate, EvaluationConfig};

// Build a config; unspecified fields fall back to their defaults.
let config = EvaluationConfig {
    api_url: "http://localhost:11434/v1/chat/completions".to_string(),
    model: "llama3".to_string(),
    user_prompt: "Your prompt".to_string(),
    ..Default::default()
};

// `evaluate` is async, so this must be called from an async context
// (for example, inside a `#[tokio::main]` function).
let result = evaluate(config).await?;

Common Workflows

1. Basic LLM Interaction

CLI:

fortified-llm-client \
  --api-url http://localhost:11434/v1/chat/completions \
  --model llama3 \
  --user-text "Explain Rust ownership"

Library:

let config = EvaluationConfig {
    api_url: "http://localhost:11434/v1/chat/completions".to_string(),
    model: "llama3".to_string(),
    user_prompt: "Explain Rust ownership".to_string(),
    ..Default::default()
};
let result = evaluate(config).await?;

2. PDF Analysis

CLI:

fortified-llm-client \
  --api-url https://api.openai.com/v1/chat/completions \
  --model gpt-4 \
  --pdf-file document.pdf \
  --system-text "Summarize the key points"

See PDF Extraction for details.

3. Structured JSON Output

CLI:

fortified-llm-client \
  --api-url https://api.openai.com/v1/chat/completions \
  --model gpt-4 \
  --user-text "Generate a product catalog" \
  --response-format json-schema \
  --response-format-schema schema.json
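The `--response-format-schema` flag points at a JSON Schema file. A minimal sketch of what `schema.json` might contain for the product-catalog prompt above (the schema itself is illustrative; which JSON Schema features your provider accepts varies by API):

```json
{
  "type": "object",
  "properties": {
    "products": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "name": { "type": "string" },
          "price": { "type": "number" }
        },
        "required": ["name", "price"]
      }
    }
  },
  "required": ["products"]
}
```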

See Response Formats for details.

4. With Input Validation

CLI:

fortified-llm-client -c config.toml \
  --enable-input-validation \
  --max-input-length 1MB \
  --user-text "Your prompt"

Config file:

[guardrails.input]
type = "patterns"
max_length_bytes = 1048576  # 1 MiB, matching --max-input-length 1MB

[guardrails.input.patterns]
detect_pii = true
detect_prompt_injection = true

See Guardrails for comprehensive security options.

Next Steps

  • New users: Start with CLI Usage for a complete flag reference
  • Library users: See Library API for Rust integration
  • Advanced users: Explore Configuration for complex setups
