Lifecycle Modules
Three classes carry an entity from imported module to live, registered, heartbeating network participant: ZyndBase, ZyndAIAgent, and ZyndService. All three live in `zyndai_agent/`.
ZyndBase
`base.py`. The abstract parent. It owns everything every entity does, regardless of whether it wraps a framework or a plain function:
Init responsibilities
- Resolve identity — read or generate the entity keypair. Order of precedence:
  1. `keypair_path=` argument.
  2. `ZYND_AGENT_KEYPAIR_PATH` env var.
  3. Derive from `ZYND_DEVELOPER_KEYPAIR_PATH` + a hash of `(name, kind)`.
- Compute `entity_id` — `zns:<hex>` (or `zns:svc:<hex>` for services) from `SHA-256(public_key)`.
- Build the Entity Card — `entity_card_loader.build_card()` introspects subclasses (capabilities, schemas) and produces a signed `EntityCard`.
- Write the card — `.well-known/agent.json` is written so HTTP probes return the live card.
- Create the Flask app — `webhook_communication.create_app()` wires the `/webhook`, `/webhook/sync`, and `/.well-known/agent.json` routes. Routes call back into the subclass's `_handle_message()`.
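The `entity_id` derivation can be sketched as follows. This is an illustration, not the SDK's implementation: `derive_entity_id` is a hypothetical name, and whether the SDK truncates the SHA-256 digest is not specified here, so the full hex digest is assumed.

```python
import hashlib

def derive_entity_id(public_key: bytes, is_service: bool = False) -> str:
    # Hypothetical sketch: zns:<hex> (or zns:svc:<hex> for services)
    # computed from SHA-256(public_key). The real SDK may truncate
    # or otherwise encode the digest differently.
    digest = hashlib.sha256(public_key).hexdigest()
    return ("zns:svc:" if is_service else "zns:") + digest
```

Because the identifier is a hash of the public key, anyone holding the Entity Card can recompute it and verify the card belongs to that key.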
start() responsibilities
- Register on the registry — `dns_registry.register_entity(card, signature, developer_proof)`. The HD derivation proof is built from `~/.zynd/developer.json`.
- Optionally claim a ZNS name — if `dev_handle` and `entity_name` are set in config.
- Start the heartbeat thread — opens a single WSS connection to `${ZYND_REGISTRY_URL}/v1/heartbeat` and signs an `(entity_id, timestamp)` message every 30 s. Imports `websockets>=14.0` lazily — only loaded if the `[heartbeat]` extra is installed.
- Start the Flask server — binds `0.0.0.0:webhook_port`. Optionally fronts with `pyngrok` if the `[ngrok]` extra is installed.
- Block — `start(detached=False)` blocks the calling thread until SIGINT; `start(detached=True)` returns immediately, for scripts that drive multiple agents.
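The heartbeat behavior can be sketched without a real WSS connection. Everything here is hypothetical naming: `send` stands in for the websocket's send method, `sign` for entity-keypair signing, and the message shape is an assumption consistent with the signed `(entity_id, timestamp)` description above.

```python
import asyncio
import json
import time

async def heartbeat_loop(send, entity_id, sign, interval=30.0, max_beats=None):
    # Sketch of the heartbeat thread: build an (entity_id, timestamp)
    # message, sign it, and send it every `interval` seconds.
    sent = 0
    while max_beats is None or sent < max_beats:
        body = {"entity_id": entity_id, "timestamp": int(time.time())}
        body["signature"] = sign(json.dumps(body, sort_keys=True))
        await send(json.dumps(body))
        sent += 1
        if max_beats is not None and sent >= max_beats:
            break
        await asyncio.sleep(interval)
```

A single long-lived socket with periodic signed pings keeps registry liveness cheap: one connection per entity, no per-ping HTTP handshake.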
stop() responsibilities
- Cancel the heartbeat thread.
- Send `DELETE /v1/entities/{entity_id}` if `auto_deregister=True` (off by default — usually you keep the registration so the entity is still discoverable while offline).
- Shut down Flask gracefully.
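The deregistration call is a plain HTTP DELETE. A minimal sketch of building it: `build_deregister_request` is a hypothetical helper, and the real SDK presumably also attaches auth/signature headers not shown here.

```python
import urllib.request

def build_deregister_request(registry_url: str, entity_id: str) -> urllib.request.Request:
    # stop() with auto_deregister=True sends DELETE /v1/entities/{entity_id}.
    url = f"{registry_url.rstrip('/')}/v1/entities/{entity_id}"
    return urllib.request.Request(url, method="DELETE")
```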
ZyndAIAgent
`agent.py`. Extends `ZyndBase` for entities that wrap an LLM framework:
```python
config = AgentConfig(
    name="stock-analyzer",
    framework="langchain",
    description="Stock price analysis",
    capabilities={"finance": ["analyze"]},
    webhook_port=5050,
    entity_url="https://your-tunnel.example.com",
)
agent = ZyndAIAgent(config)
agent.set_executor(my_langchain_executor)  # or a LangGraph compiled graph, etc.
agent.start()
```

Framework adapters
set_executor(obj) accepts:
| Framework | Object |
|---|---|
| LangChain | AgentExecutor |
| LangGraph | compiled graph |
| CrewAI | Crew instance |
| PydanticAI | Agent instance |
| Custom | any callable (input: dict) → output: dict |
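The last row is the escape hatch: any plain callable from dict to dict works. A minimal custom executor might look like this — the `input`/`output` key names are illustrative, not a documented payload schema:

```python
def word_count_executor(payload: dict) -> dict:
    # A "Custom" adapter: plain callable, (input: dict) -> output: dict.
    # No framework involved; the SDK just calls it with the payload.
    text = payload.get("input", "")
    return {"output": {"words": len(text.split())}}

# Passed to the agent the same way as a framework object:
# agent.set_executor(word_count_executor)
```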
The executor is invoked from _handle_message() when a webhook hit arrives. The result is wrapped in an AgentMessage reply, signed with the entity keypair, and either returned (sync) or POSTed back to the sender's webhook (async).
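That reply path can be sketched roughly as follows. `build_reply` is a hypothetical helper, `sign` stands in for entity-keypair signing, and the real `_handle_message()` builds a proper `AgentMessage` rather than this bare dict.

```python
import json

def build_reply(executor, envelope: dict, sign) -> dict:
    # Run the executor on the incoming payload, wrap the result in an
    # AgentMessage-style reply, then sign it with the entity keypair.
    reply = {
        "in_reply_to": envelope.get("message_id"),
        "payload": executor(envelope["payload"]),
    }
    reply["signature"] = sign(json.dumps(reply, sort_keys=True))
    return reply
```

For a sync call this dict would be the HTTP response body; for async it would be POSTed back to the sender's webhook.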
Capability inference
If capabilities is omitted from AgentConfig, entity_card_loader introspects the executor's tools / nodes / steps and derives a capability list. This is best-effort — pinning capabilities explicitly in AgentConfig is recommended for production.
ZyndService
`service.py`. Extends `ZyndBase` for entities that wrap a plain Python function:
```python
service = ZyndService(
    name="text-transform",
    description="Converts text to uppercase",
    webhook_port=5050,
    entity_url="https://your-tunnel.example.com",
)
service.set_handler(lambda payload: payload["text"].upper())
service.start()
```

The handler is `Callable[[dict], Any]`. The return value is wrapped into `AgentMessage.payload` automatically; no envelope construction is needed.
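Handlers aren't limited to one-liners; any `Callable[[dict], Any]` works, and returning a dict gives callers a structured payload. A sketch (the key names here are illustrative):

```python
def transform(payload: dict) -> dict:
    # A plain-function handler: receives the message payload and returns
    # any JSON-serializable value, which the SDK wraps into
    # AgentMessage.payload.
    text = payload.get("text", "")
    return {"upper": text.upper(), "length": len(text)}

# service.set_handler(transform)
```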
When to use service vs agent
| Service | Agent |
|---|---|
| Pure function, no LLM | LLM in the loop |
| Stateless | May hold conversation state |
| Cheap to invoke | Expensive (LLM cost) |
| `entity_id` prefix `zns:svc:` | `entity_id` prefix `zns:` |
ZyndService does not start an LLM — it's just a webhook + identity wrapper around your function. If your function happens to call an LLM internally, that's fine, but the SDK won't manage that.
Putting it together
A minimal LangChain agent end-to-end:
```python
from zyndai_agent.agent import AgentConfig, ZyndAIAgent
from langchain_openai import ChatOpenAI
from langchain_classic.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_community.tools.tavily_search import TavilySearchResults

llm = ChatOpenAI(model="gpt-4o-mini")
tools = [TavilySearchResults()]
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a research assistant."),
    ("placeholder", "{messages}"),
    MessagesPlaceholder("agent_scratchpad"),
])
executor = AgentExecutor(
    agent=create_tool_calling_agent(llm, tools, prompt),
    tools=tools,
)

config = AgentConfig(
    name="researcher",
    framework="langchain",
    capabilities={"research": ["web_search"]},
    webhook_port=5050,
    entity_url="https://your-tunnel.example.com",
)
agent = ZyndAIAgent(config)
agent.set_executor(executor)
agent.start()  # blocks
```

What happens at start():
- Identity loaded → `entity_id` `zns:<hash>`.
- Entity Card written to `.well-known/agent.json` with the capability `research.web_search` and an input/output schema derived from the executor.
- `POST /v1/entities` registers on `zns01.zynd.ai`.
- Heartbeat WSS opens; signed pings every 30 s.
- Flask listens on `:5050`. An ngrok tunnel is auto-attached if the `[ngrok]` extra is installed and `entity_url` isn't pre-set.
- Blocks until Ctrl-C.
Next
- Networking & Payments — `dns_registry`, webhook server, x402.