ADHD, AI, and the Architecture of Coherence
In the age of information overwhelm and fragmented attention, the pursuit of coherence — internal, interpersonal, systemic — has never been more urgent. The work described here is not only deeply personal in origin, but structurally universal in application. It offers a new lens through which to understand and design lives, systems, and learning processes that are complex, non-linear, and richly interconnected. The foundation of this approach rests on the unexpected clarity provided by category theory — a branch of mathematics that formalizes how systems relate, transform, and preserve structure across scale.
At the heart of this is the idea of commutative diagrams. In category theory, a commutative diagram is a map of objects (things, concepts, states) and arrows (functions, relationships, transitions) arranged so that any two paths between the same starting point and the same destination lead to the same outcome. In the context of a self-directed learning and coherence-building system like Mangrove, commutative diagrams provide a visual and structural language for tracking how disparate insights, experiences, and metaphors are not random or chaotic, but part of an elegant, traversable system.
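As a minimal sketch of what "commutes" means, and nothing more than that (the functions below are toy placeholders invented for this article, not a piece of Mangrove itself), here is a small commutative square in Python: two routes from a raw phrase to a cleaned-up phrase that agree on every input, so neither route is the "wrong" way through.

# A toy commutative square over strings. Route 1 lowercases first and
# trims whitespace second; Route 2 trims first and lowercases second.
# The square "commutes" because both routes agree on every input.

def lowercase(s: str) -> str:
    return s.lower()

def trim(s: str) -> str:
    return s.strip()

def route_1(s: str) -> str:
    # Path A -> B -> D through the square
    return trim(lowercase(s))

def route_2(s: str) -> str:
    # Path A -> C -> D through the square
    return lowercase(trim(s))

examples = ["  A Stray Tangent  ", "Rain CYCLE", "  mangrove "]
assert all(route_1(s) == route_2(s) for s in examples)
print(route_1(examples[0]))  # "a stray tangent", reachable by either path

Either route is a legitimate traversal; commutativity is simply the guarantee that the detour costs nothing.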
This becomes especially powerful when working in a cognitively diverse or tangential way. In traditional systems of work or education, tangents are distractions. In Mangrove, tangents are legitimate alternative routes through the diagram: they still arrive at the same learning outcome, enriched along the way by their specificity, emotional salience, or metaphorical framing.
Augmenting Human Coherence with Machine Architecture
This system is not about outsourcing thinking to machines. It is about augmenting human cognition — especially the kind of swirling, associative, metaphor-rich thinking that is often devalued in traditional frameworks. By designing a system that can hold complexity — through memory, architecture, and associative reasoning — it becomes possible to work in the way many people naturally think: looping, branching, returning, noticing, leaping.
With Mangrove, AI serves as a pattern interpreter and translator. It draws parallels between metaphors, categories, and processes across time and space. It recognizes that the metaphors of a tree, a grain of salt, or a business as a forest are not mutually exclusive: they are self-similar structures at different scales, each offering insight depending on what you're asking of the system at that moment.
Rainmaking: A Metaphor for Commutativity
Consider the metaphor of the rain cycle: thoughts and insights build like clouds, rich with potential. When the conditions are right — new inputs, metaphors, experiences, or tools — they condense. The resulting rain might nourish a plant (direct utility), flow into a river (collective momentum), make the air less dry (ambient emotional shift), or gather in lakes (archival memory). Nothing is wasted. The cycle continues, and the more one engages with the system, the more one learns to trust its commutative integrity — that no thought, no tangent, no strange metaphor is ever truly off-track.
This metaphor itself is not just illustrative — it's structural. It is, in fact, category theory in action: the same structure at play in thought, biology, hydrology, and learning.
The Architecture of Whim
A fundamental design principle in Mangrove is whim-based navigation. The desire to view one's work as a mangrove, then as a rainforest, then as a single root, then as a branching network of salt grains: all of these views are valid. The system tracks these metaphors not just as poetic decoration, but as legitimate categorical views. Over time, those that recur are weighted more heavily. Those that surface spontaneously still matter. The architecture allows for both persistent grounding and playful emergence.
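A hypothetical sketch of how such tracking might look follows; the names (MetaphorLedger, record_view, weight) are invented here for illustration and are not Mangrove's actual implementation. The idea is simply that recurring metaphors accumulate weight, while spontaneous one-offs are kept rather than discarded.

# Hypothetical sketch of whim-based metaphor tracking (illustrative only).
# Each time a metaphor is invoked as a lens on the work, it is recorded;
# recurring metaphors gain weight, one-off metaphors remain in the ledger.

from collections import Counter
from dataclasses import dataclass, field

@dataclass
class MetaphorLedger:
    counts: Counter = field(default_factory=Counter)

    def record_view(self, metaphor: str) -> None:
        # Log one use of a metaphor as a view on the work.
        self.counts[metaphor] += 1

    def weight(self, metaphor: str) -> float:
        # Recurring metaphors are weighted more heavily;
        # spontaneous one-offs still carry a nonzero weight.
        total = sum(self.counts.values())
        return self.counts[metaphor] / total if total else 0.0

    def grounding(self, top_n: int = 3) -> list[tuple[str, int]]:
        # The most persistent lenses: candidates for "persistent grounding".
        return self.counts.most_common(top_n)

ledger = MetaphorLedger()
for view in ["mangrove", "rainforest", "mangrove", "salt grain", "mangrove"]:
    ledger.record_view(view)
print(ledger.grounding(2))                    # [('mangrove', 3), ('rainforest', 1)]
print(round(ledger.weight("salt grain"), 2))  # 0.2: spontaneous, but still present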
This capacity is what gives the user agency within complexity. It doesn’t ask them to simplify prematurely or flatten their identity into a dashboard or productivity template. Instead, it allows for multi-scale navigation: zooming in to the breath, the toe, the tendon organ; zooming out to the project, the relationship, the lifetime.
Toward Universal Learning Systems
What began as a personal exploration — learning through metaphor, asking AI to compare concepts, framing bodily sensations as narrative patterns — is now revealing itself as a system for universal learning and coherence. It builds confidence not just in knowledge acquisition, but in identity construction. It offers a model for translating even the most personal or chaotic-seeming transformation into something communicable, teachable, and scalable — without erasing the nuance that makes it powerful.
In this way, Mangrove is a form of middleware for human sensemaking. It sits between data and insight, metaphor and model, sensation and structure. It is, at its core, a category-theoretic interface for life — not because it maps everything perfectly, but because it reveals that perfect mapping is not the point. The point is to remain in motion, in rain, in relation.
And from that motion: coherence.
Note: This article is part of an evolving series exploring the Mangrove method and its mathematical, metaphorical, and somatic foundations. It offers one layer of articulation, designed to support readers interested in systems design, neurodiversity, embodied cognition, and new modes of AI-supported living.