v0.1.0-phantom

vlfs

A single-script Git LFS alternative.
Cold storage for the latent space.

Origins

It is 2026. The data is heavy. The protocols are bloated.

vlfs is an artifact. Vibe-coded by a machine, verified by a human, abandoned by neither. It moves heavy bytes through the wire using content-addressable ice. It is not software; it is a frozen logic gate.

// Use at your own risk //

Sequence

001
bash
$ curl -O https://raw.githubusercontent.com/hypernewbie/vlfs/main/vlfs.py
$ python vlfs.py pull
$ python vlfs.py push assets/

Structure

002

HASH_ADDRESSABLE

SHA-256 digest. Two-level entropy sharding (ab/cd/hash). It maps the local reality to the cloud object store one-to-one.
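The sharded layout can be sketched in a few lines. `object_path` is an illustrative helper, not necessarily the script's actual function name:

```python
import hashlib

def object_path(data: bytes) -> str:
    """Map a blob to its two-level sharded store path: ab/cd/<full hash>."""
    digest = hashlib.sha256(data).hexdigest()
    return f"{digest[:2]}/{digest[2:4]}/{digest}"

# The first four hex characters become two directory levels,
# spreading objects evenly across 65536 buckets.
print(object_path(b"hello"))
# → 2c/f2/2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```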

ZSTD_COMPRESSION

Level 3 Zstandard. Designed for real-time decompression. The assets feel native. The latency is frozen.

DUAL_BACKEND

Public via R2. Private via Drive. Seamlessly interwoven. Authenticated only when necessary.

NULL_DEPENDENCY

Python 3.10+. That is all. Optional: rclone. No daemons in the background. No services to maintain.

Protocol

003

CLOUDFLARE_R2_SETUP

Inject credentials via environment variables. The script detects them automatically.

bash_env
$ export RCLONE_CONFIG_R2_ACCESS_KEY_ID="your-access-key"
$ export RCLONE_CONFIG_R2_SECRET_ACCESS_KEY="your-secret-key"
$ export RCLONE_CONFIG_R2_ENDPOINT="https://<id>.r2.cloudflarestorage.com"
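Detection can be as simple as checking that every variable is present. A hypothetical sketch; the real script's handling may differ:

```python
import os

# The three variables the export block above sets.
R2_ENV_KEYS = (
    "RCLONE_CONFIG_R2_ACCESS_KEY_ID",
    "RCLONE_CONFIG_R2_SECRET_ACCESS_KEY",
    "RCLONE_CONFIG_R2_ENDPOINT",
)

def r2_configured() -> bool:
    """True only when every required R2 variable is present and non-empty."""
    return all(os.environ.get(key) for key in R2_ENV_KEYS)
```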

GDRIVE_AUTH

Initiate the OAuth handshake. A browser window will open to bridge the connection.

auth_seq
$ python vlfs.py auth gdrive

CMAKE_INTEGRATION

Automate the pull sequence during the build configuration.

CMakeLists.txt
include(VLFSSync.cmake)
set(VLFSSYNC_AUTO ON)  # Auto-pull on configure

Parameters

004

Repository Config

Location: .vlfs/config.toml
Purpose: Defines buckets and public URLs. Commit this.

[remotes.r2]
public_base_url = "https://pub.r2.dev/vlfs"
bucket = "my-project-assets"

[remotes.gdrive]
bucket = "my-project-drive-root"

[defaults]
compression_level = 3

User Secrets

Location: ~/.config/vlfs/rclone.conf
Purpose: Persistent R2 credentials. Do not commit.

[r2]
type = s3
provider = Cloudflare
access_key_id = <your_key>
secret_access_key = <your_secret>
endpoint = https://<id>.r2.cloudflarestorage.com

Query

005

Is this real?

It executes. It moves data. But it is a ghost. The code suggests a bridge to a future where asset management is solved. That future is currently buffering.

Why use directory sharding?

To maintain order. vlfs/ab/cd/hash prevents the filesystem from choking. It imposes structure on the void.

Threading model?

Hashing and retrieval are parallel. Upload is sequential. Google Drive is single-threaded to respect the API limits.
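The stated model can be sketched with a thread pool for the hash pass and a plain loop for the upload pass. Names here are illustrative, not the script's internals:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def plan(blobs: list[bytes]) -> list[str]:
    # Parallel: hashing is CPU/IO-bound and order-preserving via map().
    with ThreadPoolExecutor() as pool:
        return list(pool.map(sha256_hex, blobs))

def push(blobs: list[bytes], upload) -> None:
    # Sequential: one upload at a time, in order.
    for digest, blob in zip(plan(blobs), blobs):
        upload(digest, blob)
```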

Provisioning

006

OBTAIN_KEYS

Cloudflare Dash > R2 > Manage API Tokens.

  • Perms: Object Read & Write
  • TTL: Forever
⚠️ Save the Secret Access Key immediately. It appears only once.

INIT_BUCKET

Create the bucket in R2 Overview. Bucket names are scoped to your account.

  • Default: private
  • To expose: Settings > Public Access

LINK_RCLONE

~/.config/rclone/rclone.conf

[r2]
type = s3
provider = Cloudflare
access_key_id = YOUR_ACCESS_KEY_ID
secret_access_key = YOUR_SECRET_ACCESS_KEY
endpoint = https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com
acl = private

Verify the link:

$ rclone lsd r2: