Why Chuks?

Chuks sits at the intersection of scripting-language simplicity and compiled-language performance. This page explains where Chuks excels compared to popular alternatives, where it still has ground to cover, and why its combination of features is unique.

| Feature | Python | Node.js | Bun | Java | Go | Chuks |
| --- | --- | --- | --- | --- | --- | --- |
| Clean, concise syntax | ✅ | ✅ | ✅ | ⚠️ | ✅ | ✅ |
| Static type checking | ❌¹ | ❌² | ❌² | ✅ | ✅ | ✅ |
| Single binary deployment | ❌ | ❌ | ❌ | ❌ | ✅ | ✅ |
| Classes & OOP | ✅ | ✅ | ✅ | ✅ | ❌ | ✅ |
| Generics | ⚠️¹ | ❌² | ❌² | ✅ | ✅ | ✅ |
| Async/await | ✅ | ✅ | ✅ | ⚠️³ | ❌ | ✅ |
| Structured concurrency | ✅ | ❌ | ❌ | ⚠️⁴ | ❌ | ✅ |
| VM + AOT dual mode | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
| Built-in HTTP server | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |

¹ Python has optional type hints, but they’re not enforced at runtime.
² TypeScript adds types to JS/Bun, but requires a separate transpilation step.
³ Java uses CompletableFuture — verbose compared to native async/await.
⁴ Java 21+ has structured concurrency as a preview feature.

HTTP throughput is what matters most for real-world backend services. Chuks’ multi-threaded HTTP runtime handles more requests per second than Go, Java, Node.js, Bun, and Python — out of the box.

Benchmarked with `wrk -t4 -c100 -d10s` on Apple M4 Max, macOS, March 2026.

| Language | Requests/sec |
| --- | --- |
| Chuks (AOT) | 175,742 |
| Bun | 172,190 |
| Node.js | 107,525 |
| Java | 100,367 |
| Go (net/http) | 82,921 |
| Python | 20,001 |

Chuks achieves 2.1× the throughput of Go’s net/http because it uses gnet (epoll/kqueue event-loop) as the transport layer — combined with a zero-overhead VM runtime and pooled request handling:

```
import { createServer, Request, Response } from "std/http"

var app = createServer();

app.get("/", function(req: Request, res: Response) {
    res.send("ok")
})
app.get("/json", function(req: Request, res: Response) {
    res.json('{"status":"ok"}')
})

app.listen(9000)
```

That’s a production-grade, multi-threaded HTTP server in 12 lines.

Chuks was built from the ground up for backend services:

  • Multi-threaded by default — HTTP requests are handled across all CPU cores, not single-threaded like Node.js or Python
  • spawn for parallel work — Fire off background tasks (database queries, API calls) in parallel with zero boilerplate
  • async/await for sequential flow — Write asynchronous code that reads like synchronous code
  • Structured concurrency — Parent tasks automatically cancel child tasks. No dangling Promises
  • Native compilation — AOT compiles to machine code. No interpreter overhead, no JIT warmup
```
// Handle a request with parallel database + API calls
async function handleRequest(req: Request, res: Response): Task<void> {
    // Spawn two parallel tasks
    var userTask: Task<any> = spawn fetchUser(req.params.id);
    var ordersTask: Task<any> = spawn fetchOrders(req.params.id);

    // Await both results
    var user: any = await userTask;
    var orders: any = await ordersTask;

    res.json({ "user": user, "orders": orders })
}
```

Pure compute benchmarks test raw arithmetic, recursion, and data structure performance. These measure the compiler’s code generation quality.

All benchmarks run on Apple M4 Max, macOS, March 2026. Measured with hyperfine (1 warmup, 5 runs, mean ± σ).

| Benchmark | Chuks AOT | Go | Java | Node.js | Bun | Python |
| --- | --- | --- | --- | --- | --- | --- |
| Fibonacci (fib 38) | 86.4 ms | 91.1 ms | 80.8 ms | 230.4 ms | 147.1 ms | 5,251 ms |
| N-Body (500K steps) | 18.1 ms | 17.1 ms | 49.7 ms | 44.7 ms | 33.0 ms | 4,601 ms |
| Binary Trees (depth 16) | 34.4 ms | 28.2 ms | 35.1 ms | 36.1 ms | 27.3 ms | 606 ms |
| Quicksort (100K) | 7.7 ms | 6.3 ms | 35.0 ms | 36.2 ms | 34.8 ms | 146 ms |
| Matrix Multiply (200×200) | 7.7 ms | 5.1 ms | 30.0 ms | 30.4 ms | 19.3 ms | 1,064 ms |
| Prime Sieve (1M) | 7.4 ms | 3.0 ms | 28.7 ms | 24.5 ms | 13.2 ms | 151 ms |
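
For reference, the Fibonacci benchmark is the classic doubly recursive version. A sketch of what it might look like in Chuks, with the type and function syntax modeled on the server example above (the `int` type name and `print` function are assumptions, not confirmed API):

```
// Naive doubly recursive Fibonacci — the standard "fib 38" micro-benchmark.
// Syntax modeled on this page's examples; `int` and `print` are assumed.
function fib(n: int): int {
    if (n < 2) {
        return n
    }
    return fib(n - 1) + fib(n - 2)
}

print(fib(38))
```

Because the work is pure recursion and integer arithmetic, this benchmark isolates function-call overhead and the compiler’s code generation from I/O and runtime effects.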

Key takeaways:

  • Chuks beats Go on Fibonacci — 86.4 ms vs 91.1 ms. Essentially identical on N-Body (18.1 ms vs 17.1 ms)
  • Chuks beats Node.js on every benchmark (up to 4.7× faster), beats Bun on five of six (Bun edges ahead on Binary Trees: 27.3 ms vs 34.4 ms), and beats Java on everything except Fibonacci (up to 4.5× faster)
  • 18–254× faster than Python — Fibonacci: 86 ms vs 5,251 ms (61×), N-Body: 18 ms vs 4,601 ms (254×)
  • Near-native code generation — The AOT compiler produces devirtualized method dispatch, typed arithmetic, and zero-cost nil checks — code that looks nearly identical to hand-written Go

Chuks is what Python would be if it compiled to native code and had real types.

| Advantage | Detail |
| --- | --- |
| 8.8× more HTTP throughput | 176K req/s vs 20K req/s. Multi-threaded vs Python’s GIL-bound single thread |
| 18–254× faster compute | Native compilation vs interpreted bytecode. Fibonacci: 86 ms vs 5,251 ms (61×), N-Body: 18 ms vs 4,601 ms (254×) |
| True parallelism | `spawn` creates real parallel tasks across all CPU cores. Python’s GIL limits concurrency to one thread |
| Static types | Errors caught at compile time, not at 3am in production |
| Single binary | `chuks build` → one file to deploy. No virtualenv, no pip, no requirements.txt |
| No dependency chaos | No version conflicts, no `pip freeze` nightmares |

Where Python still wins: Massive ecosystem (ML/data science/AI), 35+ years of libraries, universal availability on every server.

Chuks gives you TypeScript’s type safety and Go’s deployment model — with 63% more throughput.

| Advantage | Detail |
| --- | --- |
| 63% more HTTP throughput | 176K req/s vs 108K req/s. Multi-threaded vs Node’s single-threaded event loop |
| True multi-threading | `spawn` runs tasks across all CPU cores. Node runs everything on one thread (worker_threads are cumbersome) |
| Structured concurrency | `spawn` + `await` with automatic parent-child cancellation. No leaked Promises, no unhandled rejections |
| Single binary | Deploy one file vs node_modules/, package.json, and a Node runtime |
| Types built in | No TypeScript transpilation step, no tsconfig.json, no source maps |
| Up to 4.7× faster compute | Fibonacci: 86 ms vs 230 ms (2.7×), Quicksort: 7.7 ms vs 36 ms (4.7×), N-Body: 18 ms vs 45 ms (2.5×) |
| No toolchain churn | No Webpack/Vite/esbuild/Turbopack debate. `chuks build` just works |
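
To make the structured-concurrency claim concrete, here is a sketch of how parent-child cancellation would play out, assuming the `spawn`/`await` semantics described on this page (`fetchProfile` and `fetchFeed` are hypothetical handlers, not standard library functions):

```
// If handleDashboard fails — or its caller is cancelled — before the awaits
// complete, both child tasks are cancelled with it. Nothing is leaked.
async function handleDashboard(req: Request, res: Response): Task<void> {
    var profileTask: Task<any> = spawn fetchProfile(req.params.id);  // hypothetical
    var feedTask: Task<any> = spawn fetchFeed(req.params.id);        // hypothetical

    var profile: any = await profileTask;
    var feed: any = await feedTask;
    res.json({ "profile": profile, "feed": feed })
}
```

In Node.js, the equivalent `Promise.all` pattern leaves the second request running if the first one rejects; here the task tree ties each child’s lifetime to its parent.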

Where Node.js still wins: NPM ecosystem (2M+ packages), mature web frameworks (Express, Fastify, Next.js).

Chuks compiles to true machine code — same HTTP throughput as Bun, but with true parallelism and static types.

| Advantage | Detail |
| --- | --- |
| Matched HTTP throughput | 176K req/s vs 172K req/s. Both fastest in class — but Chuks has true multi-threading |
| True AOT | Compiled to a native binary, not JIT-compiled at runtime. Zero startup cost |
| Multi-threaded concurrency | `spawn` distributes work across CPU cores. Bun is fundamentally single-threaded |
| Structured concurrency | Parent-child task cancellation and contexts. Bun has no equivalent |
| Faster on most compute | Fibonacci: 86 ms vs 147 ms (1.7×), Quicksort: 7.7 ms vs 35 ms (4.5×), N-Body: 18 ms vs 33 ms (1.8×) |
| Static types | Built in, not bolted on via TypeScript |
| Zero runtime | The binary is fully self-contained. Bun requires the Bun runtime installed |

Where Bun still wins: Drop-in Node.js replacement with the existing NPM ecosystem, built-in bundler and test runner, and a slight edge on the Binary Trees benchmark (27.3 ms vs 34.4 ms).

Chuks is Java without the ceremony — 75% more HTTP throughput and faster compute too.

| Advantage | Detail |
| --- | --- |
| 75% more HTTP throughput | 176K req/s vs 100K req/s. Lightweight native binary vs heavy JVM thread pool |
| Up to 4.5× faster compute | Quicksort: 7.7 ms vs 35 ms (4.5×), N-Body: 18 ms vs 50 ms (2.7×), Matrix: 7.7 ms vs 30 ms (3.9×) |
| Instant startup | Native binary starts in milliseconds vs JVM cold start (30 ms+). Critical for serverless/containers |
| Modern concurrency | `spawn` + `await` with structured cancellation vs `CompletableFuture<Void>` boilerplate |
| Concise syntax | `var x = 5` vs `int x = 5;`. No `public static void main(String[] args)` |
| Simple deployment | Single binary vs JAR + JVM + classpath |
| Low memory | Native binary uses MBs vs the JVM’s hundreds of MBs |
| No build system | No Maven, no Gradle, no pom.xml. Just `chuks build` |

Where Java still wins: JIT compiler can be faster on long-running CPU-bound tasks (Fibonacci: 81 ms vs 86 ms), 30 years of enterprise libraries (Spring, Hibernate), best-in-class tooling (IntelliJ, profilers, debuggers).

Chuks beats Go on HTTP throughput — and offers classes, async/await, and a developer experience Go doesn’t have.

| Advantage | Detail |
| --- | --- |
| 2.1× more HTTP throughput | 176K req/s vs 83K req/s. gnet event loop vs Go’s net/http |
| Tied on CPU compute | Fibonacci: 86 ms vs 91 ms. N-Body: 18 ms vs 17 ms. Essentially identical |
| Classes & OOP | Full class hierarchy with inheritance, abstract classes, interfaces. Go has none |
| Async/await | `async function` + `await` vs Go’s goroutine + channel + select ceremony |
| Structured concurrency | Task trees with automatic child cancellation. Go leaks goroutines by default |
| Expression-based | `if`, `while`, `for` all produce values — like Rust and Kotlin |
| Try/catch | Familiar error handling vs Go’s `if err != nil` on every single line |
| Concise syntax | Closures, ternary operator, generics with familiar angle brackets |
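
As an illustration of the expression-based claim, an `if` that yields a value might look like this in Chuks. This is a sketch — the exact expression syntax is assumed by analogy with Rust and Kotlin, not taken from confirmed Chuks documentation:

```
var latency: int = 42

// `if` as an expression: each branch produces a value that can be
// bound directly — no temporary variable or mutation needed.
// Exact syntax assumed; `int` and `string` type names are assumptions.
var status: string = if (latency < 100) "fast" else "slow"
```

In Go, the same logic requires declaring the variable first and assigning it inside an `if`/`else` statement, since Go’s control flow constructs are statements rather than expressions.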

Where Go still wins: Slightly faster on some CPU micro-benchmarks (matrix, prime sieve), goroutines are extremely lightweight, massive cloud-native ecosystem (Kubernetes, Docker, Terraform).

No other language combines all six of these properties:

  1. Serve faster than Go — 176K req/s HTTP throughput, multi-threaded, native binary
  2. Write like TypeScript — Clean, concise syntax with closures, classes, and expression-based control flow
  3. Type-check like Java — Static types catch bugs before your code runs
  4. Deploy like Go — Single binary, zero dependencies, zero runtime
  5. Concurrency done right — spawn + await with structured concurrency. No leaked tasks, no callback hell, no if err != nil
  6. Develop like a scripting language — VM/REPL for instant feedback, then chuks build for production
  • HTTP Throughput ← beats Go, Node, Bun, Java, Python
  • Concurrency Model ← spawn + await + structured cancellation
  • Developer Experience ← Python, TypeScript
  • Type Safety ← Java, TypeScript
  • Deployment Model ← Go
  • Dual Execution ← Chuks only

Chuks is built for what developers actually ship: backend services. It beats Go on HTTP throughput (2.1×), matches or beats Java, Node.js, and Bun on both HTTP and compute benchmarks, and offers a developer experience that Go, Java, and Python can’t match.