A token-minimal, platform-agnostic programming language designed from first principles for AI agents. Interpreted, JIT-compiled, or compiled to native C, JavaScript, and WASM — all from one source file. One source. Every platform. Every ecosystem.
Not a systems language with agent wrappers bolted on. Designed from first principles for how AI agents actually reason about programs.
Every character carries meaning. No keywords, no ceremony. A full API server in 15 tokens of meaningful syntax. More than 3× denser than Python across a real codebase.
Declare @ios+@android once. The compiler resolves platform-specific output. Web, mobile, edge, embedded — one source file.
Interpreted for instant iteration. JIT for warm workloads. Compiled to C for native binaries, JavaScript for Node.js, and HTML for browsers. Same source file, zero rewrites.
No exceptions. No hidden control flow. Every I/O returns Ok|Err. Propagate with ?. Agents always know exactly what path executes.
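As an illustrative sketch (the handler, table, and field names are ours, composed from the syntax shown in the examples on this page):

```
-- each ? unwraps Ok or returns the Err to the caller
^getOrder->req:
  id = req.json()?             -- parse failure propagates as Err
  order = db.exec("SELECT", id)?  -- query failure propagates as Err
  !{status:200, body:order}
```

There is no invisible unwinding: every fallible call is marked at the call site, so the control flow an agent reads is the control flow that runs.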
All I/O is concurrent. No async/await noise. Parallel with &. Sequential with |>. Race with |~.
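A hedged sketch combining the three operators (the task and function names are illustrative, not part of any KARN standard library):

```
-- fan out in parallel; & collects both results
[posts, likes] = fetchPosts() & fetchLikes()

-- sequential pipeline: each |> passes the value forward
merge(posts, likes) |> rank |> render

-- race: first source to resolve wins
feed = fresh()|~cached()
```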
Native bindings to pip, npm, cargo, and system libs. No subprocess overhead. 40 years of human-built libraries, absorbed in one `from` line.
Parametric types with <T>. Protocol-based polymorphism with trait. No inheritance. Composition by pipe.
Structural match on enums, records, and ADTs. State machines and complex dispatch — clean and exhaustive.
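The state-machine claim can be sketched as follows; the variant-type declaration syntax is our assumption, while the `match` form mirrors the examples on this page:

```
-- assumed variant-type syntax; match form as in the examples below
type Conn: Idle | Open{fd:N} | Closed{code:N}

match conn{
  Idle -> dial()
  Open o -> send(o.fd, msg)
  Closed c -> log.err(c.code) |> !nil
}
```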
Built-in .retry(n) and .t(ms) on every async operation. Production-grade resilience without boilerplate.
Every operator maps directly to one intent. No ambiguity, no overloading of meaning.
```
-- function: name -> args: body
add->a:N b:N:N a+b

-- exported function
^createUser->req:
  body = req.json()?
  user = db.exec("INSERT", body)?
  !{status:201, body:user}

-- named type
type Order:{id:N, amt:N, uid:S}

-- recursive type
type Tree<T>:{val:T, kids:[Tree<T>]}

-- trait / protocol
trait Fmt: fmt->self:S
```
```
-- parallel: & collects all results
[a, b, c] = taskA() & taskB() & taskC()

-- sequential: |> passes value forward
auth.verify(tok) |> db.q("users")

-- race: first to resolve wins
result = primary()|~fallback()

-- timeout + retry baked in
data = http.get(url).retry(3).t(5000)?

-- error propagation + fallback
val = cache.get(key)??fetchFresh(key)
```
```
-- map, filter, range
doubled = items*(x->x*2)
actives = users%active
sequence = 1..10

-- deep pattern match
match result{
  Ok v -> !v
  Err e -> log.err(e) |> !nil
}

-- spread + compose
updated = {*original, status:"done"}
```
```
@web+@ios+@android
#ui #http.ws #auth

~msgs:[S]=[]
~input:S=""

ws = http.ws("wss://api/room")?
ws.on(m -> ~msgs=[*msgs, m])

ui.view[
  msgs*(m -> ui.bubble m)
  ui.row[
    ui.input{val:input}
    ui.btn{tap:ws.send(input)} "Send"
  ]
]
```
From millisecond iteration to LLVM-optimized native binaries. The same .kn file runs everywhere.
```
karn run script.kn
```
Zero startup. Agents run generated code in the same turn. Perfect for tool use, REPL sessions, and agentic loops where code is written and executed immediately.
```
karn run --jit server.kn
```
Starts interpreted, profiles execution, and JIT-compiles hot functions to native code on the fly. Best for warm services and ML training loops.
```
karn build app.kn --target c
```
Compiles to C (gcc → native binary for any platform), JavaScript (Node.js), or HTML (browser). Production deployments compile to C for maximum performance.
KARN doesn't compete with 40 years of libraries. It consumes them. Native bindings — no subprocess, no IPC overhead, same memory space.
NumPy, PyTorch, Pandas, scikit-learn, OpenCV, transformers.
```
from pip numpy as np
from pip torch as torch

arr = np.array([1,2,3])
pred = model.forward(arr)?
```
React, Three.js, Stripe, Zod, date-fns, D3, everything.
```
from npm react as R
from npm zod as z

schema = z.object({name:z.string()})
ok = schema.parse(body)?
```
Tokio, Serde, Rayon, image, sqlx — systems performance.
```
from cargo image as img
from cargo rayon as rayon

@simd imgs*(i->i.grayscale())
```
CUDA, OpenBLAS, FFmpeg, libpq — direct C ABI binding.
```
from sys ffmpeg as ff

ff.input("in.mp4")
  |ff.scale(1280,720)
  |ff.output("out.mp4")?
```
All of these are excellent languages for humans. None was designed for an agent's cognitive model.
| Dimension | KARN v1.0 | Python 3.12 | Rust 1.80 | TypeScript |
|---|---|---|---|---|
| Token density | ~2.1 tok/LOC | ~6.8 tok/LOC | ~11.5 tok/LOC | ~9.2 tok/LOC |
| Platform targets | All (1 source) | Server + CLI | Native + WASM | Web + Node |
| Error handling | Result + chain | Exceptions | Result (best) | Mixed |
| Async model | Default, 1 operator | async/await | Tokio (best) | async/await |
| Ecosystem access | pip+npm+cargo+sys | pip native | cargo + C FFI | npm native |
| Execution modes | Interp + JIT + AOT | Interp only | AOT only | Transpile only |
| Agent ergonomics | Native | Human-designed | Human-designed | Human-designed |
| Memory model | ARC (safe) | GC + GIL | Borrow (zero-cost) | GC (V8) |
Clone and install. Adds karn to your PATH with run, build, repl, and check commands.
```
git clone https://github.com/karn-lang/karn.git && pip install -e .
```
Create hello.kn and add one line.
```
! "Hello from KARN"
```
```
karn run hello.kn
karn build hello.kn --target c
```