
Asset library auto-tagging

Infrastructure Orchestration

The Hypothesis

Can AI auto-tag an entire image library from filenames, folder structure, and image analysis — making assets queryable by AI agents?

The Concept

AI agents that generate landing pages, emails, or ads need to know what images are available and what they represent. But most asset libraries are just folders of files with cryptic names. This experiment connects to an S3/R2 bucket, scans every asset, and auto-generates metadata tags using filename patterns, folder structure, and AI image analysis. A file at `/products/blue-widget/hero.jpg` gets tagged `product`, `blue-widget`, `hero`. The result: an asset library that AI agents can query semantically.
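The path-to-tags idea can be sketched in a few lines. This is an illustrative sketch, not the experiment's actual code; the function name and the decision to keep folder names as-is (no singularisation or normalisation yet) are assumptions:

```python
from pathlib import PurePosixPath

def tags_from_key(key: str) -> list[str]:
    """Derive candidate tags from an object key's folders and filename stem."""
    path = PurePosixPath(key)
    # Folder names become category tags; the filename stem becomes a descriptor.
    tags = [part for part in path.parts[:-1] if part != "/"]
    tags.append(path.stem)
    return tags

print(tags_from_key("/products/blue-widget/hero.jpg"))
# ['products', 'blue-widget', 'hero']
```

A real pipeline would normalise these raw tokens (plural folder names to singular categories, say) before they reach the queryable index.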

The Flow

  1. Connect to bucket — S3, R2, or similar object storage
  2. Scan all assets — index files, folder paths, MIME types
  3. Extract metadata from structure — folder names → categories, filenames → descriptors
  4. AI image analysis — visual classification: product shot, lifestyle, icon, trust badge, headshot
  5. Generate queryable tags — structured metadata: type, product, variant, usage context
  6. Human review and enrich — grid UI for correcting and adding tags

From unstructured folder of files to a semantically queryable asset library.
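The scan step above can be sketched as a record per object. In practice the keys would come from paginated S3/R2 listing calls (for example boto3's `list_objects_v2` paginator); here a static list stands in so the sketch is self-contained, and the record schema is an assumption:

```python
import mimetypes

# Stand-in for keys returned by a paginated bucket listing.
SAMPLE_KEYS = [
    "/products/blue-widget/hero.jpg",
    "/team/headshots/alice.png",
    "/icons/checkout/lock.svg",
]

def index_asset(key: str) -> dict:
    """Build the raw index record for one object: its key and inferred MIME type."""
    mime, _ = mimetypes.guess_type(key)
    return {"key": key, "mime": mime or "application/octet-stream"}

index = [index_asset(k) for k in SAMPLE_KEYS]
print(index[0])
# {'key': '/products/blue-widget/hero.jpg', 'mime': 'image/jpeg'}
```

MIME type matters downstream: it decides which assets are worth sending to the image-analysis step at all.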




Where it goes next

This feeds directly into the self-optimising landing pages experiment. When the page agent can query a tagged asset library, it makes better visual choices. We’re testing whether the tagging system can also infer brand guidelines from visual patterns — not just categorise, but understand style.
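The kind of query a page agent could run against the tagged index can be sketched as simple tag-set matching. The schema, field names, and sample records here are hypothetical, not the experiment's data model:

```python
# Hypothetical tagged index after scan, AI analysis, and human review.
ASSETS = [
    {"key": "/products/blue-widget/hero.jpg", "tags": {"product", "blue-widget", "hero"}},
    {"key": "/products/blue-widget/thumb.jpg", "tags": {"product", "blue-widget", "thumbnail"}},
    {"key": "/badges/ssl.png", "tags": {"trust-badge", "icon"}},
]

def query(required: set[str]) -> list[str]:
    """Return keys of every asset carrying all of the required tags."""
    return [a["key"] for a in ASSETS if required <= a["tags"]]

print(query({"product", "hero"}))
# ['/products/blue-widget/hero.jpg']
```

A semantic version would match on embeddings rather than exact tag sets, but the agent-facing contract is the same: ask for a usage context, get back candidate assets.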

Want early access?
Some of these become products.

Innovation and frustration start in the sandbox. Tell us about your what-ifs and let's test something.

Start a conversation