Introduction: What WebAssembly Is and Why It Exists

This chapter explains what WebAssembly is and why it matters, then sets up the Rust toolchain and compiles your first module.

What WebAssembly Is

WebAssembly (abbreviated Wasm) is a portable binary instruction format for a stack-based virtual machine. That sentence is dense; here's the plain version.

  • Portable. One .wasm file runs in Chrome, Firefox, Safari, Node, Wasmtime, Wasmer, and many places in between.
  • Binary. It's a compact byte format, designed to be fetched over HTTP and decoded fast.
  • Instruction format. It looks like a machine language: small, typed, low-level opcodes.
  • Virtual machine. There's no "Wasm CPU"; runtimes interpret or JIT-compile the instructions.
  • Stack-based. Operations push and pop operands from a stack, similar to JVM or .NET IL.
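
To make "stack-based" concrete, here is a toy evaluator in Rust. It is an illustration, not real Wasm: the `Op` enum and `eval` function are invented for this sketch, mimicking just the shape of `i32.const` and `i32.add`.

```rust
// Toy stack machine in the spirit of Wasm's i32.const and i32.add.
// Real Wasm has many more opcodes, four value types, and a validator.
enum Op {
    Push(i32), // like i32.const: push an operand onto the stack
    Add,       // like i32.add: pop two operands, push their sum
}

fn eval(ops: &[Op]) -> Option<i32> {
    let mut stack = Vec::new();
    for op in ops {
        match op {
            Op::Push(n) => stack.push(*n),
            Op::Add => {
                let b = stack.pop()?;
                let a = stack.pop()?;
                stack.push(a + b);
            }
        }
    }
    stack.pop()
}

fn main() {
    // Same shape as the Wasm sequence: i32.const 2, i32.const 3, i32.add
    assert_eq!(eval(&[Op::Push(2), Op::Push(3), Op::Add]), Some(5));
}
```

The `add` function you compile later in this chapter produces exactly this push-push-add pattern, as the WAT listing will show.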

If that sounds like a weird target to ship over the web, it is. It's also what makes Wasm possible. The format is designed to be validated in milliseconds, compiled to native code quickly, and sandboxed by default.

Why Wasm Exists

The original motivation was in-browser performance. JavaScript is fast, but it's dynamically typed and garbage-collected, which puts a ceiling on what you can do with CPU-bound workloads: image processing, games, audio synthesis, 3D rendering, crypto.

Before Wasm, the workaround was asm.js: a restricted subset of JS that browsers could pattern-match and compile ahead of time. It worked, but was a hack. Wasm is the principled replacement: a real binary format with its own semantics, designed for the job.

Three things Wasm gave over asm.js:

  1. Smaller downloads. The binary format is typically 30 to 50% smaller than the equivalent JS.
  2. Faster parsing. Validating and compiling Wasm is much faster than parsing minified JS.
  3. Better performance. Closer to native for compute-bound code.

Wasm shipped in all major browsers in 2017. It hasn't replaced JS; JS is still the right tool for 99% of web work. But for the 1% where performance matters, Wasm is uncontested.

What Wasm Isn't

A few misconceptions worth dispelling early.

  • Not a JavaScript replacement. Wasm doesn't touch the DOM directly (yet; a proposal exists). You still need JS to manipulate the page. Wasm is a compute engine; JS is the glue.
  • Not inherently faster than JS for every task. For I/O-bound code, Wasm is not meaningfully faster. For arithmetic-heavy code, it can be 2 to 10x faster.
  • Not a magic portability solution. "Write once, run anywhere" is only true within the constraints Wasm supports. If your code relies on OS APIs, you need WASI (Chapter 8), and even then the surface is narrower than Linux.
  • Not just for browsers. The server-side ecosystem (Wasmtime, Wasmer, WasmEdge) is growing fast. Plugin systems, multi-tenant execution, and edge compute are the big server-side use cases.

The Two Worlds

Wasm lives in two worlds, with different concerns.

Browser Wasm

  • Loads from .wasm files over HTTP.
  • Runs in the same V8 or SpiderMonkey process as your JS.
  • Talks to the outside world via imported JS functions.
  • Used for CPU-bound hotspots: image editing, games, ML inference, cryptography.
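
The "imported JS functions" point is visible from the Rust side. Below is a sketch under stated assumptions: `host_log` and the `env` module name are hypothetical, and the host must supply whatever the module declares as imports. A native stub is included so the snippet also compiles outside Wasm.

```rust
// On wasm32, `host_log` becomes an import the host (e.g. JS) must provide
// under the hypothetical module name "env".
#[cfg(target_arch = "wasm32")]
#[link(wasm_import_module = "env")]
extern "C" {
    fn host_log(value: i32);
}

// Native stub so the sketch compiles and runs off-Wasm too.
#[cfg(not(target_arch = "wasm32"))]
unsafe fn host_log(value: i32) {
    println!("[host] log: {value}");
}

// In the real crate you'd also add #[no_mangle], as in src/lib.rs below,
// so the export keeps its name.
pub extern "C" fn double_and_log(x: i32) -> i32 {
    let y = x * 2;
    unsafe { host_log(y) };
    y
}

fn main() {
    assert_eq!(double_and_log(21), 42);
}
```

Chapter 5 covers how the JS side supplies these imports when instantiating the module.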

Server-Side Wasm

  • Loads in a runtime like Wasmtime or Wasmer, embedded in your application.
  • Talks to the outside world via WASI (system calls: files, env, stdio, sockets).
  • Used for: sandboxing untrusted code, plugin systems, edge functions, polyglot codebases.

Chapters 1 through 7 lean browser-side. Chapters 8 through 10 pivot to server-side. Chapter 11 maps the ecosystem across both.

Installing the Toolchain

You need Rust and a few Cargo-installed helpers.

Rust

If you don't have Rust:

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

Add the Wasm targets:

rustup target add wasm32-unknown-unknown   # for browsers
rustup target add wasm32-wasip1            # for WASI / server

wasm-bindgen CLI

For browser interop (Chapter 6):

cargo install wasm-bindgen-cli

wasm-pack

For packaging Wasm as an npm module (Chapter 7):

cargo install wasm-pack

WABT

For converting between binary Wasm and text (WAT):

brew install wabt
# or: https://github.com/WebAssembly/wabt/releases

A Server-Side Runtime

For Chapters 8 and 9:

brew install wasmtime
# or: curl https://wasmtime.dev/install.sh -sSf | bash

Verify

rustc --version
cargo --version
rustup target list --installed | grep wasm
wasm-bindgen --version
wasm-pack --version
wasm2wat --version
wasmtime --version

If each prints a version, you're set.

Your First Wasm Module

The smallest complete example: a Rust function that adds two numbers, compiled to Wasm.

cargo new --lib hello-wasm
cd hello-wasm

Edit Cargo.toml:

[package]
name = "hello-wasm"
version = "0.1.0"
edition = "2021"

[lib]
crate-type = ["cdylib"]

Edit src/lib.rs:

#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}

Build:

cargo build --target wasm32-unknown-unknown --release

The output:

target/wasm32-unknown-unknown/release/hello_wasm.wasm

Inspect it:

wasm2wat target/wasm32-unknown-unknown/release/hello_wasm.wasm | head -20
(module
  (type (;0;) (func (param i32 i32) (result i32)))
  (func $add (type 0) (param i32 i32) (result i32)
    local.get 0
    local.get 1
    i32.add)
  (table (;0;) 1 1 funcref)
  (memory (;0;) 16)
  (global $__stack_pointer (mut i32) (i32.const 1048576))
  (export "memory" (memory 0))
  (export "add" (func $add)))

That's your function as Wasm: load local 0, load local 1, i32.add. The memory, table, stack-pointer global, and memory export are boilerplate that rustc emits for every cdylib.

Chapter 2 goes through what each of those lines means.

What Actually Ran

You wrote Rust. rustc compiled it to the Wasm target. The output is a binary file that:

  • Is 100% portable across runtimes.
  • Contains a single exported function (add).
  • Can't talk to the outside world without imports.
  • Can't read or write files (it's wasm32-unknown-unknown, no OS).

That's minimal Wasm. Everything else in this tutorial expands on that core: memory, imports, JS glue, WASI, interfaces.

A Quick Look at Use Cases

To anchor the rest of the tutorial, here's where Wasm actually ships:

  • Figma: the rendering engine is C++ compiled to Wasm. A big part of why Figma is fast in the browser.
  • Photoshop on the web: the C++ codebase ported via Emscripten. Wasm is what made this possible.
  • Shopify Functions: merchants write Rust, compile to Wasm, Shopify runs their code in an isolated sandbox to customize checkout and discounts.
  • Cloudflare Workers and Fastly Compute@Edge: edge functions where Wasm provides fast start-up and sandboxed execution.
  • Envoy proxy filters: Wasm modules extend Envoy's request-processing pipeline without linking into the binary.
  • ML inference: running models in the browser with Wasm SIMD for speed.

Every chapter hereafter is building blocks for these kinds of use cases.

Common Pitfalls

Assuming Wasm speeds up any code. Wasm helps for CPU-bound work. For I/O-heavy or DOM-heavy code, no speedup.

Forgetting to add the target. cargo build --target wasm32-unknown-unknown fails with a useful error if you haven't run rustup target add.

Using a default crate-type. Without cdylib, Cargo builds a normal Rust library, not a Wasm module. The build succeeds, but the output isn't what you want.

Skipping --release and wondering why the output is huge. Debug builds include a lot of metadata. Always --release for real Wasm.

Treating Wasm like JS. It's not. Different types, different memory model, different boundary semantics. Chapter 3 covers the memory boundary; Chapter 5 covers JS interop.

Next Steps

Continue to 02-wat-and-binary.md to see what's inside a Wasm module.