Protocol 007
The Mycelial Query
Routing meaning without leaking identity. GPU 0 listens. GPU 1 translates. The cloud only polishes anonymized patterns.
The Problem
You have a task that requires AI. But you also have data that should never leave your control: personal notes, community oral histories, proprietary thinking, the raw material of your sovereignty.
The cloud is fast. The cloud is capable. The cloud is someone else's computer.
The mycelial query solves this: split the task. Keep the sensitive parts local. Send only what's safe to the cloud. Route like a mycelial network: intelligence distributed, no single point of failure or exposure.
The Stitch: "The cloud is a spotlight; local inference is a candle. Use the candle to find the truth, and the spotlight only to reveal it, never to expose what should stay in the dark."
The Architecture
┌───────────────────────────────────────────────────────────────┐
│                         YOUR QUESTION                         │
└───────────────────────────────────────────────────────────────┘
                               │
                               ▼
┌───────────────────────────────────────────────────────────────┐
│                       STEP 1: CLASSIFY                        │
│     Contains PII? Proprietary? Community data? Personal?      │
└───────────────────────────────────────────────────────────────┘
          │                                    │
          ▼                                    ▼
┌───────────────────┐                ┌───────────────────┐
│     SENSITIVE     │                │   NON-SENSITIVE   │
│  ┌─────────────┐  │                │  ┌─────────────┐  │
│  │    GPU 0    │  │                │  │    Cloud    │  │
│  │ (Perception)│  │                │  │  (Optional) │  │
│  └──────┬──────┘  │                │  └──────┬──────┘  │
│         ▼         │                │         │         │
│  ┌─────────────┐  │                │         │         │
│  │    GPU 1    │  │                │         │         │
│  │ (Synthesis) │  │                │         │         │
│  └──────┬──────┘  │                │         │         │
└─────────┼─────────┘                └─────────┼─────────┘
          │                                    │
          └─────────────────┬──────────────────┘
                            ▼
                 ┌─────────────────────┐
                 │   FINAL SYNTHESIS   │
                 │  (You, the human)   │
                 └─────────────────────┘

The Routing Rules
| Data Type | Route | Why |
|---|---|---|
| Personal identity (names, addresses, relationships) | Local only → GPU 0/1 | Never leaves your hardware. No exceptions. |
| Community oral histories (raw transcripts) | Local only → GPU 0/1 | Cultural sovereignty requires data sovereignty. |
| Draft writing (unpublished, personal voice) | Local preferred → GPU 1 | Cloud models train on your voice. Don't give it away. |
| Anonymized patterns / metadata | Cloud optional | Safe to send if stripped of identifiers. |
| Public information / general knowledge | Cloud fine | No sovereignty risk. |
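The routing table can also be expressed directly in code, so the policy is enforced rather than remembered. A minimal sketch: the routes and categories mirror the rows above, while the `Route` enum, key names, and `route_for` function are illustrative, not part of the pilot's actual codebase.

```python
from enum import Enum

class Route(Enum):
    LOCAL_ONLY = "local only (GPU 0/1)"
    LOCAL_PREFERRED = "local preferred (GPU 1)"
    CLOUD_OPTIONAL = "cloud optional"
    CLOUD_FINE = "cloud fine"

# Illustrative mapping of the table above; labels mirror its rows
ROUTING_RULES = {
    "personal_identity": Route.LOCAL_ONLY,       # names, addresses, relationships
    "oral_history": Route.LOCAL_ONLY,            # raw community transcripts
    "draft_writing": Route.LOCAL_PREFERRED,      # unpublished, personal voice
    "anonymized_pattern": Route.CLOUD_OPTIONAL,  # stripped of identifiers
    "public_knowledge": Route.CLOUD_FINE,        # no sovereignty risk
}

def route_for(data_type: str) -> Route:
    # Unknown data types fail closed to the most restrictive route:
    # if you can't classify it, it doesn't leave your hardware
    return ROUTING_RULES.get(data_type, Route.LOCAL_ONLY)
```

Note the default: anything the classifier doesn't recognize is treated as sensitive. Fail closed, never open.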
The Code (Python Orchestrator)
This is the actual routing script used in the Haiti Pilot, simplified for readability.
```python
import torch
import whisper
from transformers import AutoModelForCausalLM, AutoTokenizer


def scrub_identifiers(text):
    # Replaces detected names, places, and legal terms with placeholder
    # tags (e.g. "[PLACE_NAME]") before anything leaves the machine.
    # Implementation elided in this simplified listing.
    return text


class MycelialRouter:
    def __init__(self):
        # Load models to specific GPUs
        self.whisper = whisper.load_model("large-v3", device="cuda:0")
        self.llama = AutoModelForCausalLM.from_pretrained(
            "local-llama-8b",
            device_map="cuda:1"
        )
        self.tokenizer = AutoTokenizer.from_pretrained("local-llama-8b")

    def route(self, audio_path=None, text=None, sensitive=True):
        result = {}

        # STEP 1: Transcription (if audio)
        if audio_path:
            result["transcript"] = self.whisper.transcribe(audio_path)["text"]

        # STEP 2: Local processing for sensitive data
        if sensitive:
            prompt = f"""Process the following text. Preserve all names,
places, legal terms, and cultural markers. Do not generalize.
Text: {result.get('transcript', text)}
Output:"""
            inputs = self.tokenizer(prompt, return_tensors="pt").to("cuda:1")
            outputs = self.llama.generate(**inputs, max_new_tokens=500)
            result["local_analysis"] = self.tokenizer.decode(outputs[0])

            # STEP 3: Strip identifiers for cloud (if needed)
            result["scrubbed"] = scrub_identifiers(result["local_analysis"])

        # STEP 4: Cloud synthesis (only scrubbed data)
        if self.needs_cloud(result.get("scrubbed", "")):
            result["cloud_synthesis"] = self.call_claude(result["scrubbed"])

        # STEP 5: Human synthesis happens after return, by the reader
        return result

    def needs_cloud(self, text):
        # Heuristic proxy for confidence: a long scrubbed output suggests
        # the synthesis exceeds what the local model handles well
        return len(text) > 2000

    def call_claude(self, text):
        # API call with only scrubbed, anonymized data.
        # No identifiers ever leave the local machine.
        pass
```

The Stitch: "This is where I realized that 'Speed' is a vanity metric. It takes 10 minutes to process an hour of audio locally. The cloud could do it in 1 minute. But in those extra 9 minutes, I am buying Safety."
The Haiti Pipeline (Real Example)
In the Haiti Pilot, this routing logic preserved a legal principle, the Lakou system, that a cloud model tried to translate as "backyard."
```
# Actual log from GPU 1 during processing
[GPU 0] Transcribing Kreyòl audio... (142 seconds)
[GPU 0] Transcript: "Sou tè komen nan Morne Rouge..."
[GPU 1] Detecting sensitive terms: "tè komen" (legal), "Morne Rouge" (place), "Ti Jean" (person)
[GPU 1] Preserving all three. Do not translate proper nouns.
[GPU 1] Output: "On the tè komen at Morne Rouge..."
[Cloud]  Received scrubbed version: "On the [LEGAL_TERM] at [PLACE_NAME]..."
[Cloud]  Synthesis complete. No identifiers exposed.
[Human]  Final review: Preserved "Lakou" as concept, not translation.
```
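The scrubbing step that produced the `[LEGAL_TERM]` and `[PLACE_NAME]` tags in that log can be sketched as a simple substitution pass. This is a minimal illustration, not the pilot's actual `scrub_identifiers`; it assumes the term-to-tag mapping comes from GPU 1's detection pass.

```python
def scrub_identifiers(text, detected_terms):
    """Replace locally detected sensitive terms with neutral placeholder
    tags so nothing identifying reaches the cloud.

    detected_terms maps each literal term to its category, e.g.
    {"Morne Rouge": "PLACE_NAME", "tè komen": "LEGAL_TERM"}.
    """
    scrubbed = text
    # Replace longer terms first, so a short term that is a substring of
    # a longer one never leaves the longer term half-scrubbed
    for term in sorted(detected_terms, key=len, reverse=True):
        scrubbed = scrubbed.replace(term, f"[{detected_terms[term]}]")
    return scrubbed
```

Run against the transcript line from the log, it reproduces the scrubbed version the cloud received: `scrub_identifiers("On the tè komen at Morne Rouge...", {"tè komen": "LEGAL_TERM", "Morne Rouge": "PLACE_NAME"})` yields `"On the [LEGAL_TERM] at [PLACE_NAME]..."`.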
When to Use Cloud (And When Not To)
- Use cloud: Public information, general knowledge, anonymized patterns, tasks requiring massive scale
- Do NOT use cloud: Personal writing, community data, proprietary research, anything you wouldn't want on a public blog
- The test: If you wouldn't say it aloud on a train, don't send it to the cloud
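The rules above can be enforced as a hard gate in front of any cloud call, so a mistake raises an error instead of leaking data. A sketch under stated assumptions: the `cloud_gate` function, `SovereigntyError` exception, and category strings are hypothetical names, not part of the pilot's code.

```python
class SovereigntyError(Exception):
    """Raised when data that should stay local is about to leave."""

def cloud_gate(payload: str, scrubbed: bool, category: str) -> str:
    # Categories that may never leave local hardware, per the list above
    never_cloud = {"personal_writing", "community_data", "proprietary_research"}
    if category in never_cloud:
        raise SovereigntyError(f"'{category}' is local-only. No exceptions.")
    if not scrubbed:
        raise SovereigntyError("Payload has not passed identifier scrubbing.")
    return payload  # safe to hand to the cloud API
```

The point is placement: the gate sits on the only code path that touches the network, so the "train test" is applied every time, not just when you remember it.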
The Sovereignty Firewall
The mycelial query is not just a technical architecture. It's a cognitive firewall. Before any data leaves your hardware, you must consciously decide: Is this safe to share? Is this mine to share? Does this need to leave at all?
That pause, that decision, is sovereignty in action.