How the projects compose. Two subsystems, one shared thesis — an operating layer that learns, and a signal pipeline that feeds it context.
The top cluster is the Cascade layer — the personal operating system in the middle, with three satellites that either run on top of it or restore it. Cascade is the hub because almost every other project invokes it at some point in its lifecycle.
The bottom row is the signal pipeline — a deliberately narrow chain. A digest agent surfaces weekly signal from the target-account universe, and the sales intelligence surface turns that signal into on-demand conversations. It doesn’t touch Cascade by design. Different problem, different architecture, same author.
Cascade is where judgment lives. Agents, skills, and MCP connectors feed it context; every correction a human makes becomes a rule that fires next time. The satellites in the top cluster are each the thing Cascade does when pointed at a specific problem — a deck to draft, an attribution call to make, a full operating environment to restore on a fresh machine.
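The correction-to-rule loop can be sketched in a few lines. This is a hypothetical illustration of the idea, not Cascade's actual API — the `Rule` and `RuleStore` names and the substring-match trigger are assumptions made for the example:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Rule:
    trigger: str       # context pattern the rule fires on
    instruction: str   # what to do differently next time


class RuleStore:
    """Illustrative store: each human correction persists as a rule."""

    def __init__(self) -> None:
        self._rules: dict[str, Rule] = {}

    def record_correction(self, context: str, correction: str) -> Rule:
        # A correction made once becomes a rule keyed on its context.
        rule = Rule(trigger=context, instruction=correction)
        self._rules[context] = rule
        return rule

    def applicable(self, context: str) -> list[Rule]:
        # Rules whose trigger appears in the new context fire again.
        return [r for r in self._rules.values() if r.trigger in context]


store = RuleStore()
store.record_correction("weekly deck", "lead with pipeline deltas, not totals")
hits = store.applicable("drafting the weekly deck for Q3 review")
print([r.instruction for r in hits])
```

The point of the sketch is the shape, not the matching logic: a correction is written once and consulted on every subsequent invocation, which is what makes the hub accumulate judgment rather than just executing tasks.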
The signal pipeline could be a Cascade skill — and eventually, it probably will be. It isn’t today because the benefit of tighter integration didn’t outweigh the cost of giving up a clean, legible pipeline someone else could operate without learning my whole setup. That’s a taste call, not an architecture call. Every line in this graph is one.