Tracing Interdependence

1/31/2026 · 7 min read


Our vision evolved to perceive boundaries, depth, and color, capacities that allow us to navigate the world, reach out, touch, and feel our surroundings. Vision made it possible to distinguish self from environment, a capability most pronounced in advanced organisms with complex social lives, such as chimpanzees, dolphins, and Caribbean cleaning gobies. Yet with this separation came something new: a sense of difference between what is like us and what is not. These boundaries enabled individual action, but they did not sever connection. Something essential continued to bind us together.

That connective thread is our senses. Sensation weaves our lives into one another and into the broader web of life. When we hear a sound, vibrations in the air are translated into movements in the ear, recreating the sound within our nervous system. When we watch another perform an action, mirror neurons simulate what it would feel like for us to perform that same act. Through sensing, we do not merely observe the world; we partially re-enact it within ourselves.

These sensory systems form the infrastructure for symbiotic relationships, allowing species to compensate for one another’s limitations. Our perceptual lives are often described in terms of interoception, the sensing of internal bodily states, and exteroception, the sensing of signals from the external world. Together, they anchor organisms in both their bodies and their environments, making coordinated life possible.

Evolution has produced a vast diversity of complex behaviors without following a single trajectory toward intelligence or any privileged endpoint. Across the animal kingdom, these behaviors invite wonder by revealing alternative ways of engaging with the world. Eusocial insects such as ants and termites demonstrate that large-scale organization and adaptive problem-solving can emerge without centralized control or individual awareness of the system as a whole. Colonies construct elaborate nests, regulate temperature, allocate labor dynamically, and even practice agriculture, all through local interactions. Complexity, in these cases, arises not from conscious planning but from self-organizing processes shaped by evolution (Hölldobler & Wilson, 2008).

Some ants, for example, practice a form of farming that depends on close cooperation with a cultivated fungus. They collect leaves and other plant material, carry it back to their nests, and feed it to the fungus, which converts the material into a soft, nutritious food source (Hölldobler & Wilson, 2010). Maintaining this system depends heavily on the ants’ sensory abilities. Smell allows them to recognize the correct fungus and detect harmful molds or bacteria (Currie et al., 1999). Chemical signals coordinate labor within the colony, guiding which ants gather plants, which tend the fungus, and which defend the nest (Schultz et al., 2015). Touch enables ants to groom and manage the fungal garden through direct contact (Weber, 1972). Through the integration of smell, touch, and chemical communication, the colony sustains a stable food system without any single ant overseeing the process. This partnership, maintained for millions of years, illustrates how sensing can support long-term cooperation between species (Schultz et al., 2015).

The expansion of cognition beyond the boundaries of the individual is not limited to biological relationships; it also occurs between ourselves and the tools we use. Andy Clark argues that the mind is not confined to the brain but often extends into the body, the environment, and the objects we interact with daily (Clark, 2008). From this perspective, thinking is shaped by external supports such as notebooks, maps, language, gestures, and digital devices. When a person uses a notebook to remember information or a smartphone to navigate a city, these tools do not merely assist cognition; they become part of the cognitive process itself (Clark & Chalmers, 1998). By distributing thinking across the brain, body, and environment, humans solve problems that would be far more difficult through internal processing alone.

This form of extended cognition is not uniquely human. Spiders, for instance, experience much of their world through their webs. A web is not only a trap but also a sensory extension of the spider’s body. Vibrations traveling through silk convey information about the location, size, and movement of prey, allowing the spider to perceive events at a distance. In this way, the web becomes part of the spider’s perceptual system, shaping how it experiences and responds to its environment (Barth, 2002). In both humans and spiders, perception and action emerge from a tight coupling between organism and environment, suggesting that minds are often distributed across bodies, materials, and spaces rather than contained within a single individual.

This dependence on sensory coupling also reveals a vulnerability. When environments change too rapidly or become saturated with artificial signals, sensory systems can be misled or overwhelmed. Bats have been observed colliding with smooth building surfaces that reflect sound in ways similar to water, disrupting echolocation. Human circadian rhythms are increasingly disturbed by indoor lifestyles and artificial lighting that reduce exposure to natural daylight. Whales and other marine animals experience stress and disorientation from constant ship noise that interferes with communication and navigation. In such cases, the very senses that once enabled effective engagement with the world become sources of confusion when signal and noise blur together.

In artificial intelligence research, world models refer to internal systems that allow an agent to represent aspects of its environment, predict future states, and plan actions rather than merely reacting to immediate inputs. Recent work shows that AI systems can learn compact internal models of physical environments, enabling improved decision-making in tasks such as navigation, robotics, and game-like settings (Ha & Schmidhuber, 2018; Hafner et al., 2020). These models emphasize spatial structure, temporal continuity, and causal regularities, marking a shift away from purely reactive systems.
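The loop described above can be sketched in miniature: an agent first learns a transition model from observed interactions, then plans by imagining rollouts inside that model rather than reacting to immediate inputs. Everything in this sketch (the 1-D toy environment, the averaging model, the shooting planner) is an illustrative assumption, not a reconstruction of any of the cited systems.

```python
import itertools

def true_step(x, a):
    """Hidden environment dynamics: position shifts by the chosen action."""
    return x + a

class WorldModel:
    """Predicts the next state from the average state change seen per action."""
    def __init__(self):
        self.deltas = {}                      # action -> observed state changes
    def observe(self, x, a, x_next):
        self.deltas.setdefault(a, []).append(x_next - x)
    def predict(self, x, a):
        d = self.deltas.get(a)
        return x + (sum(d) / len(d) if d else 0.0)

def plan(model, x, goal, horizon=3, actions=(-1, 0, 1)):
    """Shooting planner: imagine every action sequence inside the model and
    return the first action of the sequence with the lowest imagined cost."""
    best_action, best_cost = 0, float("inf")
    for seq in itertools.product(actions, repeat=horizon):
        sim, cost = x, 0.0
        for a in seq:                         # the rollout happens in the model,
            sim = model.predict(sim, a)       # not in the real environment
            cost += abs(goal - sim)
        if cost < best_cost:
            best_action, best_cost = seq[0], cost
    return best_action

# Phase 1: learn the dynamics from a few observed transitions.
model = WorldModel()
for a in (-1, 0, 1):
    model.observe(0.0, a, true_step(0.0, a))

# Phase 2: act by planning inside the learned model.
x, goal = 0.0, 5.0
for _ in range(10):
    x = true_step(x, plan(model, x, goal))
print(x)  # the agent reaches and then holds the goal state
```

The point of the toy is the separation of concerns the literature emphasizes: the agent never consults `true_step` while planning, only its own learned model, which is what distinguishes a world-model agent from a purely reactive one.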

However, most existing world models function as internal simulations trained in constrained or virtual environments. They rely on predefined sensory channels and focus largely on physical dynamics such as motion and control. While effective for efficiency and planning, they remain disconnected from lived interaction. They do not emerge through open-ended sensing, nor do they participate continuously in shared environments with humans, tools, or ecosystems (LeCun, 2022).

What is missing is the kind of distributed and extended cognition found in biological systems. Unlike ants, spiders, or humans, AI world models do not extend into their environments through tools, materials, or shared structures. They lack interoceptive signals that would allow them to sense internal limits or instability, and they rarely integrate social or ecological relationships into their representations. Feedback remains manually designed and task-specific, rather than arising through long-term participation in a world (Clark & Chalmers, 1998; Clark, 2008).

As a result, contemporary AI systems can calculate possible futures but do not inhabit space in the way living systems do. The challenge ahead is not simply to make world models larger or more accurate, but to rethink how sensing, environment, and cognition might co-develop—so that AI systems become participants in existing webs of interaction rather than isolated predictors.

Core Questions Moving Forward
  • What responsibility should AI have toward the sensory worlds of other species?
    If bats, whales, and humans are affected by changes in sound, light, and movement, how can AI help preserve environments that remain perceptible and livable for diverse forms of life?

  • What would it take for AI to develop spatial understanding through interaction rather than predefined representations?
    If spatial intelligence emerges through movement, feedback, and environmental coupling, how can AI learn space as something lived rather than pre-mapped?

  • How might AI extend cognition into tools, environments, or shared structures instead of containing it internally?
    If notebooks, webs, trails, and habitats function as components of thinking systems, how can AI rely on external structures to stabilize memory, perception, and coordination?

  • What forms of feedback would allow AI to sense long-term consequences rather than short-term performance?
    Biological sensing ties action to delayed and distributed effects. How might AI receive feedback that reflects lasting impacts on ecosystems and societies rather than immediate optimization signals?

  • How have modern environments reshaped what we are able to sense and what we have lost the ability to notice?
    As climate change alters weather patterns, seasons, and ecosystems, which sensory cues have become unreliable, and which new signals are we failing to recognize?

  • What does responsibility mean in a world where human activity shapes the sensory lives of other species?
    When artificial light, constant noise, and chemical signals disrupt migration, communication, and rest, how should humans account for sensory harm alongside physical and ecological damage?

  • How has sensory overload changed human attention, empathy, and social connection?
    In environments saturated with notifications, screens, and engineered stimulation, how do our capacities for presence, care, and long-term thinking shift—and what does that mean for collective action on climate and ecological crises?

  • What forms of knowledge emerge only through embodied and local sensing?
    As global models and abstractions dominate decision-making, how might humans recover forms of understanding rooted in direct contact with land, weather, animals, and place?

  • How can humans sense long-term consequences in systems that reward short-term comfort and growth?
    Climate change unfolds across decades and generations—what cultural, sensory, or social practices could make distant impacts emotionally and perceptually real?

  • What would it mean to design human habitats that are perceptually hospitable rather than maximally efficient?
    How might cities, technologies, and infrastructures be reshaped to support rest, orientation, and multispecies coexistence rather than constant stimulation and extraction?

  • How do humans distinguish meaningful signals from noise in an era of ecological instability?
    When alarms are constant and crises overlap, how can we learn to listen more carefully: to ecosystems, to each other, and to the limits of our own bodies?

  • What does it mean to belong to the web of life rather than stand apart from it?
    If humans are not outside observers but active participants in Earth’s sensory and ecological systems, how should that reshape ethics, governance, and everyday choices?



References

Barth, F. G. (2002). A spider’s world: Senses and behavior. Springer.

Clark, A. (2008). Supersizing the mind: Embodiment, action, and cognitive extension. Oxford University Press.

Clark, A., & Chalmers, D. (1998). The extended mind. Analysis, 58(1), 7–19. https://doi.org/10.1093/analys/58.1.7

Currie, C. R., Scott, J. A., Summerbell, R. C., & Malloch, D. (1999). Fungus-growing ants use antibiotic-producing bacteria to control garden parasites. Nature, 398(6729), 701–704. https://doi.org/10.1038/19519

Ha, D., & Schmidhuber, J. (2018). Recurrent world models facilitate policy evolution. Advances in Neural Information Processing Systems, 31. https://arxiv.org/abs/1809.01999

Hafner, D., Lillicrap, T., Ba, J., & Norouzi, M. (2020). Dream to control: Learning behaviors by latent imagination. International Conference on Learning Representations (ICLR). https://arxiv.org/abs/1912.01603

Hölldobler, B., & Wilson, E. O. (2008). The superorganism: The beauty, elegance, and strangeness of insect societies. W. W. Norton & Company.

Hölldobler, B., & Wilson, E. O. (2010). The leafcutter ants: Civilization by instinct. W. W. Norton & Company.

LeCun, Y. (2022). A path towards autonomous machine intelligence. Meta AI. https://openreview.net/forum?id=BZ5a1r-kVsf

Schultz, T. R., Brady, S. G., Fisher, B. L., & Ward, P. S. (2015). The evolution of agriculture in ants. Biological Journal of the Linnean Society, 115(4), 891–910. https://doi.org/10.1111/bij.12550

Weber, N. A. (1972). Gardening ants: The attines. American Philosophical Society.