Rust library + CLI

BGPKIT Parser

High-performance MRT/BGP/BMP parsing with a streaming iterator API and operator-friendly filters. Use it standalone, or combine it with BGPKIT Broker to locate and process public BGP archives.

Quick start

Parse a remote MRT file (archive URL) and iterate over elements. This is a good default pattern for batch processing and simple analyses.

Parse a remote MRT file
use bgpkit_parser::BgpkitParser;

for elem in BgpkitParser::new("http://archive.routeviews.org/route-views4/bgpdata/2022.01/UPDATES/updates.20220101.0000.bz2").unwrap() {
    println!("{}", elem);
}
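
BgpkitParser::new also accepts a local file path, so the same loop works unchanged on files you have already downloaded.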

What you get

The parser yields structured “elements” you can inspect, aggregate, and export. Pair it with filters to reduce downstream work and I/O.

Fast, streaming parser
Iterator-based processing keeps memory usage predictable and works well for pipelines (local files or remote URLs).
MRT/BGP/BMP support
Handles common routing telemetry formats and transparently decompresses typical archive encodings such as gzip and bzip2.
Filtering built-in
Filter by prefix, origin ASN, peers, time range, AS path, community, and other fields to avoid post-processing overhead.
Ergonomic Rust API
Designed for tooling, batch jobs, and services with straightforward iteration and structured output.
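
For a concrete sense of the element structure, the sketch below tallies announcements and withdrawals and touches a few of the exposed fields (elem_type, prefix, peer_asn); the names follow recent bgpkit-parser releases, so check them against the version you pin.

Inspect element fields
use bgpkit_parser::BgpkitParser;
use bgpkit_parser::models::ElemType;

// Tally announcements vs. withdrawals while streaming a single updates file.
let mut announced = 0u64;
let mut withdrawn = 0u64;

for elem in BgpkitParser::new("http://archive.routeviews.org/route-views4/bgpdata/2022.01/UPDATES/updates.20220101.0000.bz2").unwrap() {
    match elem.elem_type {
        ElemType::ANNOUNCE => announced += 1,
        ElemType::WITHDRAW => withdrawn += 1,
    }
    // Other fields, e.g. elem.prefix, elem.peer_asn, elem.timestamp,
    // are available for aggregation or export.
}
println!("announced: {announced}, withdrawn: {withdrawn}");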

Installation

Install the CLI for ad-hoc inspection, or add the crate to your Rust project for programmatic processing. For exact versions, refer to crates.io.

Cargo

# Install the CLI (from source)
cargo install bgpkit-parser --features cli

# Install compiled binaries (recommended)
# - Homebrew: brew install bgpkit/tap/bgpkit-parser
# - cargo-binstall: cargo binstall bgpkit-parser

# Library usage (Cargo.toml)
[dependencies]
bgpkit-parser = "*"

Tip: check crates.io for the latest released version.

Homebrew

# macOS (Homebrew)
brew install bgpkit/tap/bgpkit-parser

cargo-binstall

# Faster installs (cargo-binstall)
cargo install cargo-binstall
cargo binstall bgpkit-parser

Examples

These examples focus on common developer/operator workflows: constrain the data early, then iterate and extract the fields you need.

Filter during parsing

Apply filters up front to reduce processing time and output volume when you already know what you are looking for.

Prefix filtering
use bgpkit_parser::BgpkitParser;

// Filter by IP prefix
let parser = BgpkitParser::new("http://archive.routeviews.org/bgpdata/2021.10/UPDATES/updates.20211001.0000.bz2").unwrap()
    .add_filter("prefix", "211.98.251.0/24").unwrap();

for elem in parser {
    println!("{}", elem);
}
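
Filters can be chained, and each add_filter call narrows the stream further. Below is a sketch combining an origin ASN filter with a time window (filter names as documented by bgpkit-parser; the ASN and timestamps are illustrative values).

Origin ASN + time range filtering
use bgpkit_parser::BgpkitParser;

// Keep only elements originated by AS13335 within the first ten minutes
// covered by this updates file. Filter values are passed as strings.
let parser = BgpkitParser::new("http://archive.routeviews.org/bgpdata/2021.10/UPDATES/updates.20211001.0000.bz2").unwrap()
    .add_filter("origin_asn", "13335").unwrap()
    .add_filter("ts_start", "1633046400").unwrap()  // 2021-10-01T00:00:00Z
    .add_filter("ts_end", "1633047000").unwrap();   // 2021-10-01T00:10:00Z

for elem in parser {
    println!("{}", elem);
}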

Combine with BGPKIT Broker

Use Broker to find relevant archive files (time range, collector, dump type), then parse each file sequentially.

Locate files via Broker, parse via Parser
use bgpkit_broker::BgpkitBroker;
use bgpkit_parser::BgpkitParser;

// Discover files with Broker, then parse them with Parser.
let broker = BgpkitBroker::new()
    .ts_start("2024-01-01T00:00:00Z")
    .ts_end("2024-01-01T01:00:00Z")
    .collector_id("rrc00")
    .data_type("updates");

for item in broker.into_iter().take(3) {
    let parser = BgpkitParser::new(&item.url).unwrap();
    for elem in parser {
        // Your analysis goes here.
        if elem.prefix.to_string() == "1.1.1.0/24" {
            println!("{} {}", item.collector_id, elem);
        }
    }
}
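
If you only care about a handful of prefixes, the manual check above can be pushed into the parser instead, for example by calling .add_filter("prefix", "1.1.1.0/24") on each BgpkitParser, so matching happens during decoding rather than in your loop.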

CLI usage

The CLI is practical for quick checks, scripting, and validating a filter before implementing it in code.

bgpkit-parser CLI
# Basic usage - Print all BGP messages
bgpkit-parser http://archive.routeviews.org/bgpdata/2021.10/UPDATES/updates.20211001.0000.bz2

# Filter by origin AS
bgpkit-parser -o 13335 updates.20211001.0000.bz2

# Filter by prefix
bgpkit-parser -p 1.1.1.0/24 updates.20211001.0000.bz2

# Output as JSON
bgpkit-parser --json updates.20211001.0000.bz2 > output.json

# Count elements efficiently
bgpkit-parser -e updates.20211001.0000.bz2

# Cache remote files for faster repeated access
bgpkit-parser -c ~/.bgpkit-cache http://example.com/updates.mrt.bz2

Performance notes

Streaming by default

Prefer iterator-style processing to keep memory usage stable on large MRT files.
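
As a sketch of the pattern, the loop below scans one of the RouteViews updates files used earlier while holding only a set of prefix strings in memory; nothing else is buffered.

Streaming aggregation
use std::collections::HashSet;

use bgpkit_parser::BgpkitParser;

// Stream one updates file and keep only the distinct prefixes seen.
// Memory use is bounded by the number of distinct prefixes, not the file size.
let mut prefixes: HashSet<String> = HashSet::new();
for elem in BgpkitParser::new("http://archive.routeviews.org/bgpdata/2021.10/UPDATES/updates.20211001.0000.bz2").unwrap() {
    prefixes.insert(elem.prefix.to_string());
}
println!("distinct prefixes: {}", prefixes.len());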

Filter early

Use filters to reduce CPU time and output volume; avoid parsing and then discarding most rows.

Compose the toolkit

Use Broker for discovery and Parser for decoding. You get a clean pipeline without bespoke glue.