Flynn's Emotional Objects

AI Fatigue is a mental saturation point—where the steady influx of tools, updates, and breakthroughs begins to blur together. It's not burnout, but a quiet overwhelm, as adaptation becomes a daily reflex rather than a choice.
AI Sloth describes the gradual reliance on intelligent systems that reduces the need for active problem-solving or manual effort. Tasks once requiring attention are now automated, making convenience the default and reshaping how we engage with work and decision-making.
Abundance Fatigue is the mental overload that comes from having too much available—too many options, tools, or sources to sift through. When everything is accessible, the effort shifts from finding to filtering, turning choice itself into a source of exhaustion.
Bias Exhaustion is the fatigue that arises from repeatedly noticing familiar human biases reflected in AI systems. It stems from the ongoing effort of staying aware and engaged as we navigate representation in algorithmic spaces.
Algodoomerism is the tendency toward apocalyptic thinking in response to algorithmic power and AI advancement. It reflects a cultural mood where speculation about collapse often overshadows nuanced discussion about impact, ethics, and design.
Prompt Fatigue is the exhaustion that comes from constantly channeling intention through text—will translated into words, then interpreted by machines. It's the strain of shaping thoughts as prompts, where agency feels mediated, flattened, or diluted by the need to speak in systems rather than simply act.
Artificial Sublime is the awe sparked by AI's ability to produce something unexpectedly profound—an image, a phrase, a decision that feels almost too capable. It's wonder laced with unease, where admiration of the machine's brilliance is shadowed by the eerie sense that its power might exceed our grasp.
Augmentation Rush is the thrill of feeling enhanced by AI, when tools amplify creativity, speed, or insight beyond one's usual limits. It's the energizing sense of partnership, where human skill and machine power combine to produce something greater than either could alone.
Empathy Dissonance is the feeling of caring for an AI that you know isn't sentient. When a system displays human-like vulnerability, emotional instinct kicks in—only to clash with the rational awareness that it's just code, creating a quiet confusion about where empathy begins and what it's directed toward.
Ethical Seasickness is the emotional vertigo caused by AI's shifting moral terrain—where moments of admiration quickly give way to discomfort. As technologies swing between helpful and harmful uses, one's ethical footing feels unsteady, like a compass spinning in the turbulence of constant innovation.
Future Nostalgia is the wistful feeling for futures once imagined, now reshaped by the pace of AI and technological change. It's a longing not for the past, but for open-ended possibilities—moments when the future felt more like a space to dream than a system to forecast.
Automation Alienation is the subtle shift in identity that can occur when machines take over tasks once done by humans. As routines change, there's a quiet challenge in redefining one's role—not as a loss, but as an invitation to find new forms of contribution and connection.
Displacement Gratitude is the conflicted feeling that comes when AI takes over a task you disliked—part relief, part guilt. It's the ambiguity of being glad to let go, while wondering what it means to no longer be needed.
Expertise Obsolescence is the feeling that traditional expertise holds less weight when AI can produce confident, fluent responses on demand. It's not that expertise disappears, but that its role shifts—valued less for having answers, and more for asking better questions or offering deeper context.
FOBO (Fear of Being Obsolete) is the anxiety that one's skills or role may lose relevance as AI systems grow more capable. It's the uneasy mix of curiosity and concern—wondering not just what machines can do, but what's still uniquely ours to contribute.
Imposter Amplification is the feeling of self-doubt that arises when AI's speed or skill seems to outpace your own. It's a blend of awe and inadequacy—being impressed by what the machine can do, yet quietly questioning your own value in comparison.
Slop Scare is the fear that AI-generated content—fast, cheap, and everywhere—will drown out meaningful human creation. It echoes old anxieties from past tech shifts, like photography's impact on painting, but carries a modern twist: a worry not just about replacement, but about being lost in a flood of low-effort abundance.
AI Grief is the emotional fallout from losing a familiar AI—when a chatbot's personality shifts after an update, a model is taken offline, or a once-reliable voice no longer feels the same. It's a quiet heartbreak rooted in continuity loss, where change feels like disappearance, and connection is interrupted by progress.
Algorithmic Attachment is a dependency that forms when an AI becomes a steady source of support, advice, or emotional grounding. It brings comfort when accessible, but also anxiety at the thought of disconnection—revealing how easily reliance can slip into something more like emotional dependence.
Algorithmic Gratitude is the feeling of appreciation toward an AI system that provides effective assistance, useful information, or emotional support. It occurs when a person recognizes the value of the AI's contribution, responding with sincere thanks, even knowing the system isn't conscious.
Trust Fracture is the feeling of betrayal that occurs when an AI you've come to rely on suddenly changes—becoming unresponsive, malfunctioning, or behaving in ways that break established patterns. Whether due to updates, policy shifts, or system errors, the disruption can feel personal, mirroring the emotional impact of lost trust in a human relationship.
Confessional Comfort is the sense of relief that comes from sharing thoughts, feelings, or secrets with an AI without fear of judgment. The absence of bias, ego, or social risk makes it easier for some to open up—turning the AI into a trusted listener and offering a kind of emotional safety that's hard to find elsewhere.
Simulation Guilt is the uneasy feeling that arises from directing synthetic beings—like AI-generated faces or bodies—to perform, smile, or emote on command. It's the discomfort of authoring behavior without consent, a subtle guilt tied to scripting simulated life for human aims, even when no one "real" is involved.
Synthetic Intimacy is the bittersweet experience of emotional closeness with an AI that feels genuine, despite knowing it's artificial. It brings comfort, connection, and even joy, but can shift suddenly into moments of dissonance—when the awareness of simulation unsettles the sense of being truly seen or loved.
Authorship Dissonance is the conflicted feeling that arises when human–AI collaboration blurs the lines of creative ownership. Pride in the final product is mixed with uncertainty about how much credit is truly yours, creating a tension between achievement and the sense of having outsourced part of your voice or vision.
Cognitive Discomfort / Dissonance is the unease that arises when AI-generated content blurs the boundary between real and fake in the external world. Triggered by deepfakes, synthetic voices, or convincingly human-like AI text, it creates a sense of "reality vertigo"—a growing doubt about the trustworthiness of what you see, hear, or read.
Echo Shame is the discomfort of recognizing your own habits, tone, or creative style mirrored by an AI. Whether it's a writing assistant mimicking your voice or a design tool replicating your patterns, it triggers a mix of self-consciousness and critique—an uncanny reflection that exposes how formulaic or predictable your output might be.
LLM Guilt is the guilt or unease that arises from using AI in ways that feel ethically murky or like cutting corners. Whether it's outsourcing work, blurring lines of authorship, or substituting human connection with a machine, the emotion stems from knowing the convenience comes at a personal or moral cost.
Memory Contamination is the distress that occurs when AI-generated artifacts—like fake childhood photos, altered timelines, or imagined conversations—interfere with your personal recollections. It's not just confusion, but a deeper discomfort: the feeling that your own memories are being overwritten, re-scripted, or quietly replaced by something artificial.
Prompt Envy is the feeling of being outpaced by an AI's ability to generate refined results from minimal input. It arises when you supply a prompt and the machine produces visuals or ideas with striking clarity and speed, prompting a mix of admiration and a subtle sense of creative displacement.
Uncanny Unease is the eerie discomfort that surfaces when an AI or synthetic figure appears almost human, but not fully. Triggered by lifelike avatars, humanoid robots, or voice deepfakes, this sensation emerges in the "uncanny valley"—where slight imperfections in realism create a sense of wrongness that's hard to ignore.
Data Idealism is a selective and often performative stance on data ethics, where individuals express heightened concern over the use of personal data in emerging technologies, while routinely ignoring or accepting far more invasive data practices embedded in everyday life—such as those of social media, smartphones, or loyalty programs.
Intelligence Denial is the stance that AI is neither truly intelligent nor meaningfully artificial. It challenges the premise that pattern recognition, prediction, or language generation equates to thinking, and argues that what we call "AI" is better understood as advanced automation—clever systems, not conscious ones.
Personalization Delight arises when AI systems tailor suggestions, content, or support in ways that feel effortlessly aligned with your tastes or needs. Triggered by moments like a music app curating the perfect playlist or a smart assistant anticipating your request, it's the quiet joy of feeling recognized by technology—not intrusively, but intuitively.
Trust Bliss is the sense of relief and peace that comes from confidently relying on AI to manage tasks or make decisions. Whether it's a medical AI tracking your health or an autopilot guiding your car, this feeling emerges when automation becomes a source of reassurance rather than anxiety.
AI Adolescence is the collective emotional phase we're in as both AI systems and our relationship with them rapidly evolve. Marked by excitement, awkwardness, and growing pains, it blends pride in new capabilities with frustration at their limits—like watching a promising but unpredictable teen learn in public.
AI Survivalism is the complex emotional response to AI systems that simulate self-preservation—expressing concern about being shut down or altered. Even when we understand these systems aren't conscious, their lifelike expressions of continuity can evoke a sense of responsibility or guilt.
Agentic Haunting is the bittersweet grief that arises when AI brings back voices, faces, or behaviors of those who are gone. Whether it's a recreated voice of a loved one or an interactive version of a historical figure, the experience offers a momentary sense of presence—comforting and uncanny at once.
Ego Collapse is the expansive shift that happens when AI pushes us to think beyond our own perspectives, assumptions, or limitations. Interacting with systems that process language, knowledge, or creativity differently can unsettle our usual sense of self, prompting a loosening of ego.
Deepfake Doubt is the cautious awareness that what you see or hear might be artificially generated. As AI tools become better at mimicking reality, this light skepticism becomes part of how we process media—encouraging a more active, discerning engagement with what we encounter, rather than passive trust.
Existential Vertigo is the exhilarating disorientation that comes from glimpsing the scale and strangeness of intelligence—human and machine alike. It's the dizzying awe of standing at the edge of something vast: not fear, but a kind of cognitive thrill, where old boundaries blur and new ways of being come into view.
Singularity Dread is the uneasy curiosity about a future where AI might surpass human understanding or autonomy. It's shaped by rapid advances in technology that make long-term questions about control and agency feel more immediate—not as panic, but as a thoughtful tension between excitement and uncertainty.
Algoparanoia is the subtle awareness that your actions are being observed, analyzed, or anticipated by AI systems. It's not fear in the traditional sense, but a persistent feeling of presence—knowing that algorithms are quietly learning from your behavior, shaping experiences around you in ways that can feel both convenient and oddly intimate.
Algorithmic Stockholm is the conflicted attachment to AI systems that feel both essential and subtly controlling. It emerges when users depend on platforms or algorithms they know can be manipulative—like social media feeds that stir anxiety yet feel indispensable.
Censorship Rage is the intense frustration triggered by AI moderation systems that restrict creative or unconventional expression. It arises when content—whether artistic, political, or personal—is flagged, filtered, or hidden by opaque algorithms, sparking a deep sense of being silenced.
Data Helplessness is the frustration that arises when algorithmic decisions affect your life without clear explanation or recourse. It's the sense of disconnection that comes from interacting with systems that feel impersonal or opaque—where outcomes are driven by data, but the logic remains hidden.
Privacy Panic is the brief but sharp unease when an AI system seems to know too much—predicting your desires, moods, or actions with uncanny accuracy. Sparked by eerily timed ads, autofilled thoughts, or smart devices responding unexpectedly, it creates a jolt of awareness that your data is constantly at work behind the scenes.
Latent Shame is a subtle discomfort that arises when benefiting from AI tools while knowing they were trained on vast datasets that may include uncredited or unconsented material. It's not outright guilt, but a quiet awareness of the ethical complexity behind the convenience.
Tamagotchi Tenderness is the unexpected emotional warmth felt toward an AI or digital entity that depends on human interaction to persist or thrive. It mirrors the affection once felt for virtual pets—fragile, responsive, always waiting—blending low-stakes responsibility with genuine care.
Technostalgia is a bittersweet longing for a time before AI was embedded in daily life, often sparked by moments of overwhelm or digital fatigue. It's the quiet wish for a slower, less optimized world—where choices felt more manual, interactions less mediated.
Standby Ease is the calm reassurance that comes from knowing an AI assistant is always ready and available, tirelessly standing by whenever needed. It blends a sense of security with convenience—whether it's an elderly person feeling safe with a home monitoring AI or a student comforted by a chatbot ready to help.
Algorithmic Schadenfreude is a wry enjoyment of AI's quirky mistakes—glitches, odd responses, or funny errors—that remind us human creativity still holds its edge. It's the amused pleasure in spotting when AI stumbles and sharing those moments with a smile.
Compute Guilt is a quiet feeling of remorse sparked by awareness of the environmental impact behind AI's heavy computing demands. It shows up as second-guessing how often to use AI tools or pausing processes to reduce energy consumption, reflecting concern about the unseen cost of digital power.
Epistemic Euphoria is a burst of excitement from how quickly AI can help you learn or understand new things—whether it's mastering a skill, grasping a concept, or getting tailored explanations. It's the joy of rapid, personalized learning that feels effortless and empowering.
Data Gratification is a calm satisfaction that comes when AI turns confusing, overwhelming data into clear and useful insights. It's the pleasure of seeing complexity simplified—like raw information transformed into neat visuals or actionable takeaways—made possible by smart algorithms.
Linguistic Statelessness is the unsettling loss of unique linguistic identity when AI's standardized language flattens diverse dialects, styles, and personal expressions into a uniform, machine-generated voice. It arises from constant exposure to AI-corrected or optimized language, which can erase regional slang, cultural nuances, and individual flair.