Getting Started

Install Conduit, enable provider traits, and run your first generation.

Overview

Conduit uses Swift package traits to control which providers are compiled. No traits are enabled by default, keeping the package lightweight and Linux-compatible.

Installation

Add Conduit to your Package.swift:

swift
dependencies: [
    .package(url: "https://github.com/christopherkarani/Conduit", from: "0.3.0")
]

Then add "Conduit" to your target's dependencies:

swift
.target(
    name: "MyApp",
    dependencies: ["Conduit"]
)

Enabling Provider Traits

Enable specific providers with traits:

swift
// MLX for on-device inference (Apple Silicon only)
.package(url: "https://github.com/christopherkarani/Conduit", from: "0.3.0", traits: ["MLX"])

// Cloud providers
.package(
    url: "https://github.com/christopherkarani/Conduit",
    from: "0.3.0",
    traits: ["Anthropic", "OpenAI", "OpenRouter"]
)

// Multiple providers
.package(
    url: "https://github.com/christopherkarani/Conduit",
    from: "0.3.0",
    traits: ["MLX", "Anthropic", "OpenAI", "HuggingFaceHub"]
)

Trait Reference

| Trait | Compile Flag | Providers Enabled |
|---|---|---|
| OpenAI | CONDUIT_TRAIT_OPENAI | OpenAIProvider (OpenAI, Ollama, Azure, custom) |
| OpenRouter | CONDUIT_TRAIT_OPENROUTER | OpenAIProvider (OpenRouter mode) |
| Anthropic | CONDUIT_TRAIT_ANTHROPIC | AnthropicProvider |
| Kimi | CONDUIT_TRAIT_KIMI | KimiProvider (requires OpenAI trait too) |
| MiniMax | CONDUIT_TRAIT_MINIMAX | MiniMaxProvider (requires OpenAI trait too) |
| MLX | CONDUIT_TRAIT_MLX | MLXProvider (Apple Silicon only) |
| CoreML | CONDUIT_TRAIT_COREML | CoreMLProvider |
| HuggingFaceHub | | HuggingFace Hub downloads |
| Llama | | LlamaProvider (llama.cpp via llama.swift) |
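
The compile flags in the table can also guard your own provider-specific code, so a target builds whether or not a given trait is enabled. The flag names come from the table above; using them in consumer code like this is a sketch, not a prescribed pattern:

```swift
import Conduit

#if CONDUIT_TRAIT_ANTHROPIC
// Compiled only when the Anthropic trait is enabled.
let provider = AnthropicProvider(apiKey: "sk-ant-...")
#else
// Fail loudly (or fall back to another provider your build does include).
fatalError("Rebuild with the Anthropic trait enabled")
#endif
```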

API Keys

Cloud providers need API keys. Set them as environment variables:

bash
export ANTHROPIC_API_KEY=sk-ant-api03-...
export OPENAI_API_KEY=sk-...
export OPENROUTER_API_KEY=sk-or-...
export MOONSHOT_API_KEY=sk-moonshot-...
export MINIMAX_API_KEY=...
export HF_TOKEN=hf_...

Most providers support .auto authentication that resolves keys from the environment automatically.
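
With Anthropic, for example, that might look like the sketch below. The initializer label (auth:) is an assumption made for illustration; check the provider's actual API for the exact spelling:

```swift
import Conduit

// With ANTHROPIC_API_KEY exported (see above), .auto resolves the key
// from the environment, so no key appears in source code.
// NOTE: the `auth:` parameter name here is an assumption.
let provider = AnthropicProvider(auth: .auto)
```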

Quick Start — Cloud (Anthropic)

swift
import Conduit

let provider = AnthropicProvider(apiKey: "sk-ant-...")
let response = try await provider.generate(
    "Explain quantum computing in one paragraph",
    model: .claudeSonnet45,
    config: .default.maxTokens(300)
)
print(response)

Quick Start — Local (MLX)

swift
import Conduit

let provider = MLXProvider()
let response = try await provider.generate(
    "Explain quantum computing in one paragraph",
    model: .llama3_2_1b,
    config: .default.maxTokens(300)
)
print(response)

Quick Start — Streaming

swift
import Conduit

let provider = AnthropicProvider(apiKey: "sk-ant-...")
for try await text in provider.stream("Tell me a joke", model: .claude35Sonnet) {
    print(text, terminator: "")
}

Building on Linux

Conduit supports Linux for server-side Swift. Build normally with Swift 6.2+:

bash
swift build
swift test

No traits are enabled by default, so MLX and Foundation Models dependencies are excluded. Cloud providers (Anthropic, OpenAI, HuggingFace) work out of the box. For local inference on Linux, use Ollama via OpenAIProvider.
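
A sketch of the Ollama route, assuming a local Ollama server on its default port with its OpenAI-compatible endpoint at /v1. The parameter names (apiKey:, baseURL:) and the .custom model spelling are assumptions; adjust them to OpenAIProvider's actual initializer:

```swift
import Conduit
import Foundation

// Point OpenAIProvider at a local Ollama server.
let provider = OpenAIProvider(
    apiKey: "ollama", // Ollama ignores the key, but the field may be required
    baseURL: URL(string: "http://localhost:11434/v1")!
)
let response = try await provider.generate(
    "Explain quantum computing in one paragraph",
    model: .custom("llama3.2"), // assumed spelling for a custom model ID
    config: .default.maxTokens(300)
)
print(response)
```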

Released under the MIT License.