Conflict & Consensus in Swarms
To the ethnographer, the internet is an information-sharing assemblage. Information sharing—often called the information economy—is now shaped mainly by AI-driven content generation and discovery. At the individual level, feeling informed amid so many sources and outlets requires an active process of curation. At the community level, feeling integrated means confronting information (a)symmetries with others. AI’s role in curating content and community on the user’s behalf is both opaque and highly fragmented. This erosion of informed collective action forces us to reconsider how information and everyday politics intertwine, and what kinds of relationships we actually want.
Each field has its own solutions for facilitating collective action: cryptography’s proofing schemes, copyright’s expanded legal templates, machine learning’s study of bias, entrepreneurship’s communal business models, journalism’s defused or satirical language, and so on. Yet for every solution aimed at securing the distribution of information and capital, there are efforts to maintain their accumulation. Why does that happen, and what lessons might guide safer future iterations of the internet? We’ve created an information machine with the potential to both empower and disempower individuals; information-sharing is a tale as old as time, and so are its dilemmas. But today’s information machine moves faster than ever, and escaping glib conclusions demands the very creativity that drives any paradigm shift.
Breathing creativity into an internet reality we’re all so familiar with starts with taking a step back. Let’s “enstrange” ourselves from these details in order to get at a process of comparison, a perspective that can inspire our own context. Science fiction uses enstrangement as a device for social commentary, reformulating our experiences as encounters with something strongly othered. Like aliens. Closer to home, I draw inspiration from swarming insects, specifically ants. I’ll anthropomorphize them to explore questions of information, collaboration, and resistance, so that we may come back to our own processes with a fresh perspective.
The “Anternet”
Ants survive and reproduce through collaborative action. Information-sharing is the substrate upon which those goals are achieved. Ants collaborate by constantly exchanging chemical signals (pheromones), which they sense with their antennae. Exchange in a swarm is about circulation: receiving and distributing. It’s cyclic, and failure in one means failure in the other. Looking at those failures shows what collaboration really requires (a short simulation sketch follows the list):
Overload: Excessive influx of information overwhelming the cognitive capacity of individual ants, leading to reduced attention, memory, and decision-making capabilities.
Noise: Propagation of erroneous or irrelevant information within the colony, creating a noisy environment that hampers the transmission of accurate information.
Disintegration: Breakdown of communication channels or loss of information flow, causing fragmentation and impaired coordination within the colony.
Diffusion: Incomplete or restricted distribution of important information, inhibiting the colony from effectively utilizing valuable knowledge.
Misinterpretation: Incorrect understanding or decoding of shared information, leading to confusion and potential misalignment of collective actions.
Feedback: Reinforcement of existing beliefs or information within specific ant groups, limiting exposure to diverse perspectives and hindering the exploration of alternative solutions.
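To make these failure modes concrete, here is a minimal toy model (not a biological simulation) of two competing pheromone trails. The quality values, deposit rates, and noise levels are invented for illustration; the point is only that reinforcement amplifies whatever signal happens to circulate, and noise erodes the colony’s ability to converge on the better option.

```python
import random

def run_colony(noise=0.05, steps=500, ants=100, seed=0):
    """Toy model of two competing pheromone trails.

    Trail A leads to a rich food source, trail B to a poor one. Each ant
    picks a trail with probability proportional to its pheromone level;
    a fraction of ants ("noise") ignore pheromone entirely; trips deposit
    pheromone scaled by source quality; evaporation erases stale signal.
    All constants are invented for illustration.
    """
    random.seed(seed)
    pheromone = {"A": 1.0, "B": 1.0}    # undifferentiated starting trails
    quality = {"A": 1.0, "B": 0.4}      # trail A is objectively better
    evaporation = 0.02

    for _ in range(steps):
        for _ in range(ants):
            if random.random() < noise:
                choice = random.choice(["A", "B"])       # uninformed / noisy choice
            else:
                p_a = pheromone["A"] / (pheromone["A"] + pheromone["B"])
                choice = "A" if random.random() < p_a else "B"
            pheromone[choice] += 0.01 * quality[choice]  # reinforcement (feedback)
        for trail in pheromone:
            pheromone[trail] *= 1 - evaporation          # decay of stale information
    return pheromone

for noise in (0.02, 0.2, 0.8):
    p = run_colony(noise=noise)
    print(f"noise={noise:.2f}  share on better trail: {p['A'] / (p['A'] + p['B']):.2f}")
```

At low noise the loop produces consensus on the richer trail; at high noise the same loop produces drift toward chance, which is the noise and diffusion failure in miniature.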
Communication works when it’s clear, diverse, credible, and attentive, and mitigating miscommunication is an important part of information-sharing. When validation or consensus falters, a swarm can unravel from within, so preserving those validation mechanisms is critical to its survival. Unchecked information exists only as belief; there are mechanisms that solidify it, that justify it, as epistemology would put it. Redundancy, reasoning, consensus, memory, and intuition are what regulate an information economy.
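One of those mechanisms, consensus built on redundancy, can be sketched in a few lines. The site names and quorum threshold below are hypothetical, loosely inspired by quorum sensing during colony emigration (Pratt et al., 2005): no single report is acted on until enough independent scouts corroborate it.

```python
import random
from collections import Counter

def quorum_decision(reports, quorum):
    """Accept a candidate nest site only once enough independent scouts
    have reported it: a lone (possibly erroneous) report stays belief,
    while repeated corroboration counts as justification."""
    counts = Counter(reports)
    return {site for site, n in counts.items() if n >= quorum}, counts

random.seed(1)
# 20 returning scouts: most report "oak-hollow", a few report spurious finds
scout_reports = ["oak-hollow"] * 13 + ["birdhouse"] * 5 + ["wet-crack"] * 2
random.shuffle(scout_reports)

accepted, counts = quorum_decision(scout_reports, quorum=6)
print("reports:", dict(counts))
print("accepted as justified:", accepted)
```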
But information isn’t unbiased, and there’s no such thing as a foundational truth (anthropology calls this a radical ideal). Instead, there are interpretations and judgements, and how those are made is highly variable. Much like us, ants don’t always agree on the same things. Disagreement can undermine their collaboration to the point that they can’t work together. But, just like us, that doesn’t mean they stop working or that their goals disappear. Instead, their goals come into conflict. This point elegantly relates the volatility of information to conflict: not working together turns into working against one another. Information breakdowns become a source of competitive behavior.
Miscommunication becomes an arena for competition, and some ant species evolve opportunistic tricks that tip the scales: they manipulate information sharing and its corrective mechanisms. To see how, expand the cycle of communication from receive and distribute to receive, interpret, and distribute. Interpretation opens up a new domain for miscommunication:
Misdirection: Deliberate dissemination of misleading or false information, intentionally steering the colony towards suboptimal decisions or behaviors.
Exploitation: Manipulation of shared information to gain advantages or control over resources or other ants.
Sabotage: Intentional disruption or interference with the transmission or reception of information within the colony.
Manipulation thrives not on information sharing but on information disparity. We can associate strategies for manipulating information with kinds of coalition building (politics and marketing), activities whose goal is to organize collective behavior. This flip shows that information isn’t neutral or evenly shareable; swapping it doesn’t guarantee collaboration. Instead, information is actively shaped and communicated in meaningful ways (manipulated) between individuals to facilitate a mutually beneficial negotiation that is constantly renewed.
Ant-Sized Information Wars
Here, information shifts from substrate to resource, shaped at both the individual and collective scales. Manipulation, filtering, and framing are central features, not corruptions, of communication. Because survival hinges on shared information, colonies evolve ways to counter and redirect it. Some fascinating instances of this at work:
Enslavement through pupal capture: Species like Polyergus lucidus raid neighboring colonies and steal pupae, which eclose within the raiders’ nest and adopt colony-specific odors, allowing them to function as integrated workers. This coerces non-kin individuals into informational and labor systems not their own. (Wheeler, 1904; PMC2999504)
Genetic self-recognition in queen policing: In red fire ants (Solenopsis invicta), workers use a genetically encoded recognition system to identify and kill queens lacking a specific allele, ensuring genetic homogeneity among reproductive individuals and reinforcing colony identity. (Keller & Ross, 1998; doi:10.1038/29064)
Subgroup-based information roles: Social network analyses reveal that individual ants in a colony often self-sort into subgroups (e.g., nurses vs. foragers), each maintaining different information access and transmission roles, reflecting localized informational economies within the same colony. (Mersch et al., 2013; PMC11288679)
Asymmetrical chemical signaling for conflict mitigation: In some eusocial systems, individuals use graded chemical signals to resolve disputes without escalating to physical conflict—enabling dominance hierarchies and task allocation to emerge through low-cost informational control. (Van Wilgenburg & Sulc, 2014; doi:10.1016/j.anbehav.2014.05.013)
Resistance via brood destruction: In slave-making systems, enslaved Temnothorax workers have been documented killing parasite brood, disrupting the informational reproduction of their captors and signaling misalignment within imposed hierarchies. (Achenbach & Foitzik, 2009; PubMed)
Information asymmetry in mutualisms: In ant–homopteran associations, ants protect homopterans in exchange for access to sugary secretions, but the value of this relationship can fluctuate—leading colonies to adjust protective behavior, compete over high-output partners, or regulate access among nestmates. (Cole, 1983; Bulletin of the Ecological Society of America)
Trail sabotage and defense mechanisms: Experimental “detractor” ants using misleading pheromone trails can significantly disrupt colony foraging; colonies respond by evolving “cautionary” pheromones to counteract deceptive routing; a toy sketch of this dynamic follows the list. (Aswale et al., 2022; arXiv preprint)
Chemical/vibroacoustic deception by inquiline species: Certain workerless parasitic ants infiltrate host nests using chemical mimicry and vibration mimicry to bypass recognition systems (Inquiline ant Myrmica karavajevi; 2024 preprint)
Chemical disguise by social parasites: Temporary parasitic queens of Polyrhachis lamellidens invade host colonies and apply host cuticular hydrocarbons through rubbing behavior to disguise themselves chemically, reducing aggression and gaining acceptance. (Iwai et al., 2022; Front. Ecol. Evol.)
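The detractor-trail case above lends itself to a small sketch. The update rules and constants here are my own illustrative assumptions rather than the model from Aswale et al. (2022): foragers duped onto an empty trail deposit a “cautionary” marker, and that counter-signal gradually neutralizes the deceptive recruitment signal.

```python
def forage_round(trails, n_ants=100, caution_weight=3.0):
    """One foraging round over competing trails.

    Ants distribute in proportion to max(recruitment - w * caution, 0.01).
    Ants on a trail with food reinforce it; ants duped onto an empty trail
    deposit a small cautionary marker instead. Constants and update rules
    are illustrative assumptions, not the model from Aswale et al. (2022).
    """
    pull = {name: max(t["recruit"] - caution_weight * t["caution"], 0.01)
            for name, t in trails.items()}
    total = sum(pull.values())
    for name, t in trails.items():
        followers = n_ants * pull[name] / total
        if t["food"]:
            t["recruit"] += 0.05 * followers   # successful foragers recruit more
        else:
            t["caution"] += 0.02 * followers   # duped foragers warn nestmates
        t["recruit"] *= 0.95                   # evaporation
        t["caution"] *= 0.95

trails = {
    "honest":    {"recruit": 1.0, "caution": 0.0, "food": True},
    "deceptive": {"recruit": 4.0, "caution": 0.0, "food": False},  # detractor-laid
}
for _ in range(50):
    forage_round(trails)
print({name: round(max(t["recruit"] - 3.0 * t["caution"], 0.0), 2)
       for name, t in trails.items()})
```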
Taken together, these ant maneuvers sketch a miniature atlas of information power, one that travels with us as we pivot back to our own, human-scale infrastructures. What’s at stake in each behavior isn’t just action, but the regulation of how information flows, accumulates, or is withheld within the colony. At this point, information appears inseparable from those who wield it. Pheromones and online data are both real objects, but the patterns they form are extensions of the internal activity of whoever produces them. Information isn’t only a resource but an appendage to agency. Cognition emerges from swarm dynamics, which undermines the sender–receiver model by shifting the focus to co-production. Returning to humans and AI, this shift reopens our ethics of relation.
Back to the Internet
Comparing our internet to non-human systems productively defamiliarizes how we think about communication. Humans and ants share resemblances in information sharing, in threats to their information economies, and in conflict-resolution mechanisms. As the colony showed, miscommunication doesn’t just stall action; it becomes the very terrain on which advantage is fought for. The point isn’t whether humans are like ants. The point is to look at systems and scaffolds as forms of collective sense-making. Once motivations enter, those systems become competitive. Discovering that our systems can be retuned to funnel capital toward dominant interests doesn’t indict ‘human nature.’ It exposes a flaw in how we treat information: as neutral, transferable, and evenly distributed.
The systems we build to manage communication (rate limits, moderation protocols, relay structures) are not just technical. They encode expectations about who participates, who adapts, and how disagreement is absorbed. The same is true of AI-mediated infrastructures, where information is filtered through centralized models and uneven feedback loops. If these systems shape how people collaborate, they also shape what counts as a legitimate contribution, credible evidence, or shared understanding. That makes the design of AI models a normative act, one that requires specific strategies to ensure outputs are justifiable, not merely probable.
Each mechanism (rate limits, proofing schemes, friction protocols) encodes a theory of what collaboration is, what coordination should feel like, and who gets to matter within it. Because negotiation happens inside the architectures we build, those architectures also delimit what can be meant: a platform’s constraints are linguistic constraints. In AI infrastructure, this plays out through centralized training pipelines, asymmetric feedback loops, and increasingly obfuscated logics of negotiation. AI-mediated information environments therefore need strategies for generating justified outputs, strategies that disarm miscommunication as a negotiation tactic. Here is one pragmatic sketch of safeguards, design constraints that translate the ant lessons into AI infrastructures (a minimal code sketch follows the list):
Diversity: Train and prompt across heterogeneous corpora. Ensure models are exposed to epistemically diverse datasets to prevent monocultures.
Credibility: Weight by source integrity. Calibrate model responses based on source provenance.
Verification: Implement recursive validation layers. Integrate mechanisms for cross-referencing claims.
Context: Encode situational sensitivity. Align model outputs with contextual cues such as audience, geographic norms, or historical reference frames to prevent semantic drift and misalignment.
Bias Correction: Surface and counteract latent model priors. Apply interpretability tools to render transparent model bias.
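As one deliberately simplified illustration, the sketch below composes three of these constraints (credibility weighting, verification through cross-referencing, and source diversity) into a filter over candidate claims. Every field name, threshold, and score here is an assumption invented for the example; context sensitivity and bias auditing would have to live elsewhere in a real pipeline.

```python
from dataclasses import dataclass

@dataclass
class Evidence:
    claim: str          # normalized claim text
    source: str         # provenance identifier (outlet, dataset, model)
    credibility: float  # 0..1, assigned by an upstream provenance audit
    community: str      # crude proxy for the epistemic community of the source

def justified_claims(evidence, min_sources=2, min_weight=1.0, min_communities=2):
    """Keep only claims a system could present as justified, not merely probable.

    - Verification/redundancy: at least `min_sources` independent sources.
    - Credibility: provenance-weighted support must clear `min_weight`.
    - Diversity: support must span `min_communities` distinct communities,
      so a single echo chamber cannot self-certify a claim.

    Field names, thresholds, and scores are illustrative assumptions, not a
    reference implementation of any particular pipeline.
    """
    by_claim = {}
    for ev in evidence:
        by_claim.setdefault(ev.claim, []).append(ev)

    accepted = {}
    for claim, support in by_claim.items():
        sources = {ev.source for ev in support}
        communities = {ev.community for ev in support}
        weight = sum(ev.credibility for ev in support)
        if (len(sources) >= min_sources
                and weight >= min_weight
                and len(communities) >= min_communities):
            accepted[claim] = round(weight, 2)
    return accepted

evidence = [
    Evidence("reservoir levels fell in July", "gov-hydro-report", 0.9, "agency"),
    Evidence("reservoir levels fell in July", "local-newsroom", 0.7, "press"),
    Evidence("reservoir levels fell in July", "forum-repost", 0.2, "social"),
    Evidence("the dam has already failed", "forum-repost", 0.2, "social"),
    Evidence("the dam has already failed", "forum-mirror", 0.2, "social"),
]
print(justified_claims(evidence))
# only the cross-verified, credibility-weighted, community-diverse claim survives
```

The design choice worth noting is the same one the colony makes: no single signal, however loud, is allowed to certify itself.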
Communication infrastructures shape the conditions under which meaning is negotiated. Collaboration breaks down when interpretive distance becomes too wide, when information is patterned by assumptions that don’t align. AI accelerates the negotiations over attention: what is foregrounded, withheld, and made legible. These systems reinforce asymmetries through design choices that shape visibility, sequencing, and semantic weight. Much like the ants hijacking pheromone trails, our infrastructures are shaped as much by capital flows as by technical intent, and those flows often reward distortion over clarity. Continuing to build means asking not only how information is transmitted, but also a question of semantic recalibration: how is meaning made available, and to whom?
I’ve approached the internet sideways, through ants and asymmetries, to loosen our assumptions about communication. My aim is to frame information as context-bound negotiation, where the systems we build pull us toward or away from collaboration. Seeing the internet through ant eyes doesn’t solve our dilemmas; it simply reminds us that every communication architecture embeds a politics of collaboration—or conflict.
Bibliography
Achenbach, A., & Foitzik, S. (2009). First evidence for slave rebellion: Enslaved ant workers systematically kill the brood of their social parasite. Evolution, 63(4), 1068–1075. https://pubmed.ncbi.nlm.nih.gov/19243573/
Aswale, P., et al. (2022). Detractor ants reduce foraging efficiency through deceptive pheromone trails. arXiv preprint. https://arxiv.org/abs/2202.01808
Berlina, A. (2018). Let Us Return Ostranenie to Its Functional Role: On Some Lesser-Known Writings of Viktor Shklovsky. Common Knowledge.
Cole, B. J. (1983). Mutualism and Competition in Ant-Homopteran Associations. Bulletin of the Ecological Society of America, 64(4), 90–91.
Couzin, I. D., Krause, J., James, R., Ruxton, G. D., & Franks, N. R. (2005). Collective memory and spatial sorting in animal groups. Journal of Theoretical Biology, 232(4), 587–594. https://doi.org/10.1016/j.jtbi.2004.09.035
Greene, M. J., & Gordon, D. M. (2007). Colony odor and recognition of resident and nonresident ant species by the Argentine ant, Linepithema humile. Insectes Sociaux, 54(2), 161–170. https://doi.org/10.1007/s00040-007-0938-7
Hölldobler, B., & Wilson, E. O. (1990). The Ants. Harvard University Press.
Iwai, H., et al. (2022). Temporary social parasites acquire chemical disguise by rubbing against host workers in ants. Frontiers in Ecology and Evolution, 10, 915517. https://doi.org/10.3389/fevo.2022.915517
Keller, L., & Ross, K. G. (1998). Selfish genes: a green beard in the red fire ant. Nature, 394(6693), 573–575. https://doi.org/10.1038/29064
Kronauer, D. J., & Pierce, N. E. (2013). Myrmecia pilosula, an Ant with Only a Single Pair of Chromosomes. PNAS, 110(2), 691–692.
Mersch, D. P., Crespi, A., & Keller, L. (2013). Tracking individuals shows spatial fidelity is a key regulator of ant social organization. Science, 340(6136), 1090–1093. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11288679/
Pratt, S. C., Mallon, E. B., Sumpter, D. J., & Franks, N. R. (2005). Quorum sensing, recruitment, and collective decision-making during colony emigration by the ant Leptothorax albipennis. Behavioral Ecology and Sociobiology, 59(1), 149–158. https://doi.org/10.1007/s00265-005-0022-1
Tateo, L. (2020). Viktor Shklovsky, Bronislaw Malinowski, and the invention of a narrative device: Implications for a history of ethnographic theory. HAU: Journal of Ethnographic Theory. https://www.journals.uchicago.edu/doi/10.1086/708111
Van Wilgenburg, E., & Sulc, R. (2014). Conflict resolution mediated by asymmetrical chemical signals in a termite. Animal Behaviour, 94, 29–35. https://doi.org/10.1016/j.anbehav.2014.05.013
Wheeler, W. M. (1904). A new type of social parasitism among ants. Bulletin of the American Museum of Natural History, 20, 347–375. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2999504/