What Is Cyberspace? Who Are You When You’re Online? Technology, Identity, and What It Means to Be Human.

Cyberspace: The Digital Frontier of Modern Business

Cyberspace is the interconnected digital environment where global networks, the Internet, and technology converge to enable communication, commerce, and innovation. It’s a space where how we communicate is just as critical as what we communicate—a concept media theorist Marshall McLuhan captured in the phrase “the medium is the message.”

Origins and Evolution

Science fiction author William Gibson introduced “cyberspace” in 1982, describing it as a “consensual hallucination”—a shared digital reality experienced by billions. Combining “cybernetics” and “space,” the term entered everyday language as the Internet expanded in the 1990s.

What Cyberspace Encompasses Today

  • Digital Infrastructure: The Internet, cloud systems, and communication networks that power modern business
  • Virtual Environments: Spaces for collaboration, training, and innovation
  • Online Communities: Professional networks and knowledge-sharing platforms
  • Digital Commerce: E-commerce, fintech, and digital transformation ecosystems
  • Communication Systems: The backbone of global connectivity and data exchange

Strategic Implications

From speculative concept to business reality, cyberspace now drives societal and economic transformation. For organizations, it represents both opportunity and responsibility—requiring thoughtful navigation of emerging technologies, AI ethics, digital security, and human-centered design.

The Strategic Question: As we shape this digital frontier, businesses face a fundamental choice: Will technology serve as a tool for empowerment and innovation, or create new barriers? The answer lies in how we architect, govern, and leverage these digital spaces.

Technology is Anthropology: A Human-Centered Perspective

Technology isn’t just about systems and code—it’s fundamentally about people. When we say “technology is anthropology,” we’re acknowledging that every digital solution reflects and shapes human culture, behavior, and relationships.

Core Principles

1. Technology Reflects Our Values
What we build reveals who we are. The smartphone’s ubiquity demonstrates our collective prioritization of connectivity, instant access to information, and networked communication. Our tools mirror our cultural priorities and societal organization.

2. Technology Shapes Human Interaction
Just as anthropologists study how tools influence societies, we observe technology transforming communication patterns, relationship dynamics, and community formation. Social media hasn’t just connected us—it’s fundamentally altered how we relate to one another.

3. Development is a Cultural Process
Creating technology involves more than technical problem-solving. It’s influenced by social norms, organizational power structures, and cultural assumptions. Consider facial recognition: its development both reflects and can reinforce existing societal biases.

4. Context Determines Impact
Like any cultural artifact, technology cannot be understood in isolation. Its true meaning emerges from how people use it, the organizational structures within which it operates, and the practices communities develop around it.

The Innovation-Education-Collaboration Imperative

This perspective demands that we approach technology, innovation, implementation, and education differently. Understanding digital transformation requires understanding people—their workflows, cultural norms, communication patterns, and decision-making processes. This is what we call Collaboration!

The bottom line: Successful technology consulting requires anthropological thinking. We’re not just deploying systems; we’re shaping how organizations work, communicate, and create value in an increasingly digital world.

Who Are You When You’re Online? Technology, Identity, and What It Means to Be Human

Every time you open an app, post a photo, or let an algorithm recommend your next video, you’re not just using a tool—you’re being shaped by it. This talk explores how technology has become far more than gadgets and code: it’s the medium through which we build our identities, find community, and understand what it means to be human in the 21st century. Drawing on ethics and anthropology of technology, we’ll examine the hidden ways platforms mold us into predictable categories, the promise and peril of our hybrid digital selves, and what we can do to reclaim agency in an algorithmic age.

STEP ONE: SEE – Looking at Our Digital Reality

Think about the last time you posted something online. A photo on Instagram, a thought on X, Bluesky, or Facebook, a professional update on LinkedIn. Before you hit “post,” did you pause? Did you think about how it would look, how many likes it might get, and whether it fits your “brand”?

Now here’s the real question: When did we start thinking of ourselves as having a brand?

What We’re Actually Seeing

We live in a world where identity has layers we didn’t have twenty years ago. There’s the person you are at the dinner table, the person you are at work, and then there’s this whole other dimension—your digital self. Your profile. Your data trail. The algorithmic version of you that platforms have constructed from every click, every pause, every late-night scroll.

Let me paint the picture more clearly:

The Multiplied Self

You wake up. Your phone already knows you better than you know yourself—it knows you’re a night owl who watches cooking videos at 2 a.m., that you’re probably thinking about upgrading your laptop, that you pause longest on posts about climate anxiety. You have a LinkedIn you that’s professional and accomplished, an Instagram you that’s curated and aspirational, a private account you that’s messy and real, and somewhere buried in recommendation algorithms, there’s a data-ghost you made of predictions about what you’ll buy, watch, and believe next.

The Feedback Loop

And here’s where it gets weird: these digital versions of you start to shape the real you. You post a joke that flops—five likes. You feel that slight sting of rejection. You post a sunset photo—200 likes. You feel validated. Slowly, subtly, you start performing for an invisible audience, shaping yourself to match what gets rewarded. Teenagers describe feeling like they’re “on stage” all the time. Adults find themselves crafting stories not to share experience, but to harvest engagement.

The Cultural Kaleidoscope

This isn’t happening in a vacuum. Across the world, technology is creating what I call “hybrid identities”—people weaving together their local culture, their family traditions, and global digital norms. A teenager in Lagos might navigate traditional family expectations, global Black identity politics on Twitter, and K-pop fandom communities. At the same time, TikTok’s algorithm tries to figure out which version of her to feed. Diaspora communities maintain “virtual homelands” through WhatsApp groups and YouTube channels, keeping languages and rituals alive across oceans.

We’re experiencing something unprecedented: an identity that is simultaneously more diverse and more homogeneous than ever before.

What the Research Shows

Anthropologists, philosophers, and historians studying digital culture describe what we’re living through as a shift to “liquid identity”—identity that flows and changes rapidly, no longer anchored in stable communities or physical places. We move between roles, profiles, and personas with unprecedented fluidity.

Digital anthropologists talk about “algorithmic selves”—the way we’re learning to anticipate what the algorithm wants, performing not just for human audiences but for the recommendation systems themselves. Content creators describe “speaking to the algorithm.” We’ve internalized the machine’s logic.

And here’s what makes this both fascinating and troubling: these technologies aren’t neutral tools we pick up and put down. They’re the water we swim in, shaping how we think, who we can become, and what feels possible.

STEP TWO: JUDGE – What’s Really at Stake

The Core Question

So what’s the problem? Isn’t it amazing that we can explore different identities online, connect across borders, and find communities that accept us?

Yes. And also no.

Because what looks like freedom and personalization is actually something more complicated. Let me explain what’s happening beneath the surface.

The Hidden Machinery of Homogenization

The Reduction of Personhood

Right now, as you sit here, dozens of algorithms have compressed you into categories. You’re not a whole human being to them—you’re a cluster of predicted behaviors. “Users like you” typically click on this, buy that, and believe these things. The system takes your rich, contradictory, evolving self and flattens it into a statistical profile.

And here’s the kicker: these systems often treat their prediction of you as more real than your own self-understanding. Is this the “False Self” that Thomas Merton talked about? The algorithm “knows” you’re interested in luxury goods because you watched one video about watches, even though you were actually researching a gift. But now you’re in the “luxury consumer” bucket, and that will determine what ads you see, what prices you’re offered, maybe even what job opportunities appear in your feed.
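To make the flattening concrete, here is a minimal sketch of the “bucket” logic described above. Everything here is invented for illustration—the feature names, the bucket labels, and the numbers—but the mechanism (assign each person to the single nearest pre-built profile) is the basic shape of nearest-centroid categorization:

```python
import math

# Hypothetical behavioral features: [late-night activity, video clicks, price-page views]
# Invented audience "buckets" of the kind a platform might maintain.
BUCKETS = {
    "luxury consumer": [0.2, 0.9, 0.8],
    "budget shopper":  [0.4, 0.3, 0.2],
    "night-owl cook":  [0.9, 0.8, 0.1],
}

def assign_bucket(user_vector):
    """Flatten a whole person into the single nearest bucket label."""
    def dist(a, b):
        # Euclidean distance between two feature vectors
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(BUCKETS, key=lambda label: dist(user_vector, BUCKETS[label]))

# One watch video while gift-shopping nudges the profile toward "luxury":
user = [0.3, 0.85, 0.75]
print(assign_bucket(user))  # prints "luxury consumer"
```

Note what the function returns: not the rich, contradictory behavior history, but one label. Every downstream decision then sees only the label.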

The Discrimination Engine

This gets darker when we look at the consequences. These homogenizing systems don’t just predict—they decide. They decide who gets the job interview, who pays higher insurance rates, and who gets flagged as “risky” by the system.

And because they’re trained on historical data that already contains human prejudices, they amplify those biases. Hiring systems that assume women are less interested in tech jobs or skilled trades like auto repair. Facial recognition that works poorly on darker skin tones. Credit algorithms that penalize people from specific zip codes. The same “people like you” logic that seems harmless in entertainment becomes oppressive in housing, employment, and justice. Are we seeing way too much of the False Self?

What makes this particularly insidious is that it’s presented as objective, data-driven, and fair. But it’s none of those things—it’s just the automation of existing inequalities, made invisible and harder to contest.

The Cultural Flattening

Here’s what happens when algorithms optimize for engagement: they privilege whatever is most popular, most mainstream, most advertiser-friendly. Minority languages, non-Western aesthetics, challenging perspectives—these get pushed to the margins because they don’t fit the dominant patterns.

TikTok’s algorithm has been shown to suppress content from creators it deems “not attractive enough.” What does that mean? Too fat? Too old? Wrong sex? Indigenous creators struggle to share traditional knowledge that gets flagged as “misinformation.” LGBTQ+ content gets shadow-banned as “sensitive.”

The result? A slow, subtle pressure toward conformity, enforced not through force but through visibility: be like everyone else, or be unseen. Now think of these algorithms in the hands of nefarious leaders.
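A toy ranking function makes the flattening pressure visible. The posts, engagement scores, and weights below are all invented, but the contrast is real: a pure engagement objective always surfaces the dominant pattern, while even a crude diversity bonus changes who gets seen.

```python
# Hypothetical posts: (title, predicted_engagement, matches_dominant_norms)
posts = [
    ("mainstream dance trend",       0.90, True),
    ("indigenous language lesson",   0.35, False),
    ("challenging long-form essay",  0.30, False),
    ("advertiser-friendly unboxing", 0.80, True),
]

# Engagement-only objective: popular, mainstream content wins every slot.
by_engagement = sorted(posts, key=lambda p: p[1], reverse=True)

# A diversity-aware objective: a bonus for under-served content.
DIVERSITY_BONUS = 0.6  # invented weight; a real system would tune this carefully
by_diverse = sorted(
    posts,
    key=lambda p: p[1] + (0 if p[2] else DIVERSITY_BONUS),
    reverse=True,
)

print([p[0] for p in by_engagement])  # the dominant pattern, top to bottom
print([p[0] for p in by_diverse])     # the margins become visible again
```

The point isn’t that a fixed bonus is the right fix—it’s that “what the algorithm optimizes for” is a design choice, not a law of nature.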

What We’re Losing

Autonomy and Agency

When algorithms constantly predict and steer your behavior, showing you prepackaged paths of “people like you choose X,” something subtle happens: you stop exploring. You stop surprising yourself. The system optimizes for engagement, not for your growth, your complexity, or your capacity to change.

Think about it: How often does your algorithm show you something that truly challenges you, versus something that confirms what you already believe? How often does it introduce you to a perspective that might change your mind?

We’re being habituated to outsourced judgment. Why think hard about what to watch when the algorithm already knows? Why explore uncomfortable questions when your feed confirms you’re already right?

The Illusion of Connection

We have more “connections” than ever and report being lonelier than ever. We have platforms designed for “community,” yet we experience more isolation. Why?

Because mediated connection—connection through screens, through profiles, through curated highlight reels—is not the same as embodied presence. You can have 5,000 friends on Facebook and no one to call at 3 a.m. when you’re scared.

The technology promises to connect us, but actually interposes itself between us. We relate not to each other but to representations, to performances, to data.

Again, think about Merton’s True Self vs the False Self. Which one is the algorithm playing to?

The Ethical Crisis (you know I would get to this)

Here’s what’s at stake, in plain terms:

Human Dignity: When systems treat you as a cluster of data points rather than a person with inherent worth, they violate your dignity. You become a means to an end—engagement, profit, control—rather than an end in yourself.

Justice: When algorithmic homogenization entrenches discrimination and makes it invisible, it creates systemic injustice that’s harder to fight than explicit prejudice ever was.

Authenticity and Freedom: When your identity is constantly shaped by invisible forces optimizing for someone else’s goals, you lose the freedom to become who you might be. You’re sculpted into who the system wants you to be.

Cultural Diversity: When global platforms flatten difference, we lose the richness of human cultures, the wisdom of diverse ways of being, the creativity that comes from genuine pluralism.

Hear me on this one: This isn’t about technology being “bad.” It’s about technology being designed and deployed in ways that don’t respect the whole reality of human personhood.

STEP THREE: ACT – What We Can Do About It

A Different Vision

First, we need to be clear: we’re not going back to a world without technology. That ship has sailed. (If I said that phrase once, I have said it a million times.)

The question is: what kind of technological future do we want to build?

Imagine systems designed not for engagement and profit, but for human flourishing. Not for convergence toward an average, but for protection of diversity and dignity. Not for prediction and control, but for exploration and growth.

This isn’t naive idealism. It’s possible. But it requires action at multiple levels.

For Individuals: Reclaiming Agency

Cultivate Awareness

Start noticing when you’re performing for the algorithm. When you’re shaping yourself to match platform logic. It’s okay to do it—do it consciously. Ask yourself: “Is this who I want to be, or who the algorithm rewards me for being?”

Diversify Your Information Diet

Actively seek out perspectives different from your own. Follow people who challenge you. Read sources that the algorithm wouldn’t recommend. Visit the library. Have conversations with humans who aren’t filtered through screens.

Practice Digital Sabbaths

Regular breaks from devices aren’t just about rest—they’re about remembering who you are without the feedback loops, without the metrics, without the performance.

Protect Your Data

As a former CISO (Chief Information Security Officer), let me be clear: use privacy tools. Delete old accounts. Understand what you’re consenting to. You can’t opt out entirely, but you can reduce your datafication. Have you looked at who/what you are on the Dark Web? You might be surprised.

Cultivate Embodied Community

Invest in face-to-face relationships. Join local groups. Experience the messiness and richness of unmediated human presence. Your digital life should supplement your embodied life, not replace it.

Hear me, Designers and Developers: Building Better Systems

Design from the Ground Up

Include affected communities in design processes. Don’t assume you know what users need—ask them, watch them, learn from them. Particularly include marginalized groups whose experiences often reveal system failures invisible to the majority of users.

Diversify Teams and Data

Homogeneous teams build homogenizing systems. If your entire design team looks the same, thinks the same, comes from the same background, your product will embed those blind spots. Actively recruit diverse perspectives and compensate people fairly for that expertise.

Build for Transparency and Control

Give users meaningful information about how they’re being categorized and predicted. Provide ways to contest algorithmic decisions. Make opting out actually possible, not just technically available.

Measure Success Differently

Stop optimizing solely for engagement or time on the platform. What if success meant helping users achieve their own goals, not keeping them scrolling? What if platforms measured their impact on user wellbeing, connection quality, or learning?

Protect Cultural Diversity Intentionally

Design systems that actively prevent convergence on dominant norms. Create space for minority languages, alternative aesthetics, and challenging perspectives. Treat diversity as a feature to preserve, not noise to filter out.

For Big Brother & Sister in Institutions and Policymakers: Creating Accountability

Require Algorithmic Audits

Just as we have financial audits and environmental impact assessments, we need regular, independent audits of algorithmic systems to assess bias, fairness, and their impact on vulnerable groups.
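What might the core of such an audit compute? A minimal sketch, using invented decision data: tally selection rates per group and check the disparate-impact ratio against the “four-fifths rule” commonly used as a screening threshold in employment-discrimination analysis.

```python
from collections import Counter

# Hypothetical audit log: (applicant_group, was_selected) pairs — invented data.
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def selection_rates(decisions):
    """Fraction of applicants selected, per group."""
    totals, selected = Counter(), Counter()
    for group, chosen in decisions:
        totals[group] += 1
        selected[group] += chosen
    return {g: selected[g] / totals[g] for g in totals}

rates = selection_rates(decisions)

# Disparate-impact ratio: lowest group rate over highest group rate.
# Under the four-fifths rule, a ratio below 0.8 flags the system for review.
ratio = min(rates.values()) / max(rates.values())
print(rates, ratio)
```

A real audit would go far beyond this one metric (error rates by group, feature provenance, contestability), but even this tiny check makes a bias visible that the system’s own dashboards never would.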

Establish Real Penalties

Self-regulation hasn’t worked. It obviously didn’t work for Cain and Abel; what makes you think it works now? When systems cause discriminatory harm, there need to be meaningful consequences. When platforms amplify misinformation or hate, they must be held accountable.

Protect Digital Rights

The right to contest algorithmic decisions. The right to explanation. The right to human review in high-stakes contexts. The right to be forgotten. These need legal protection.

Fund Alternative Models

Public investment in non-commercial platforms, community-owned infrastructure, and cooperative models. Not everything needs to be ad-funded and engagement-optimized.

Teach Digital Literacy. Let me say that again: Teach Digital Literacy.

Digital literacy must run from elementary school through adulthood, and ongoing adult education matters more than ever with the rise of the autonomous revolution. People need to understand how these systems work, what they’re optimizing for, and how to maintain agency within them.

For All of Us: Cultural Shift

Ultimately, this requires a shift in how we think about technology itself. Think Societal Phase Change.

Technology as Practice, Not Tool

Stop thinking of technology as neutral tools we pick up and use. Recognize them as practices that shape us, as social systems that embody values and power relations. Ask: What kind of people do these technologies encourage us to become?

Evaluation Beyond Efficiency

Don’t just ask “Does it work?” Ask “What does it work for?” “Who benefits?” “What does it assume about what humans are and should be?” “What forms of life does it support or undermine?” (the last one is one of my favorites)

Human Flourishing as the Goal

Center the question: Does this technology support or undermine human dignity, meaningful work, democratic participation, cultural diversity, ecological sustainability, and genuine connection?

Not “can we build it?” but “should we build it?” Not “is it legal?” but “is it good for humanity and the greater good?”

Conclusion: The Stakes and the Hope

Where We Are

We stand at a hinge point in human history. For the first time, the tools we use to communicate, work, and understand ourselves are actively reshaping what it means to be human. Identity itself is being reworked—multiplied, datafied, predicted, and steered.

This is happening unevenly across cultures, but the pattern is clear: we’re becoming hybrid beings, our personhood co-produced with algorithms, our communities mediated by platforms, our self-understanding shaped by invisible systems optimizing for goals we didn’t choose.

What’s at Risk

If we continue on the current path—systems designed for engagement and profit, homogenization disguised as personalization, discrimination automated and made invisible—we risk:

  • A future where human dignity is subordinated to platform metrics
  • A world of diminished cultural diversity and enforced conformity
  • The erosion of autonomy and the capacity for genuine self-determination
  • Deepening inequality masked by the illusion of algorithmic objectivity
  • The loss of embodied community and authentic connection

What’s Possible

But it doesn’t have to be this way. We can build different technologies, grounded in different values:

  • Systems designed for human flourishing, not just engagement
  • Algorithms that protect and celebrate diversity rather than flattening difference
  • Platforms that enhance rather than replace embodied community
  • Technologies that support autonomy, exploration, and growth
  • Digital infrastructure accountable to democratic values and human rights

The Invitation

This isn’t just a technological challenge—it’s a profoundly human one. It asks us to think carefully about who we want to become, what kind of communities we want to build, and what kind of world we want to inhabit.

You don’t have to be a developer or policymaker to participate in this. Every time you choose awareness over automation, every time you invest in embodied presence over curated performance, every time you resist the pressure to conform to algorithmic logic—you’re participating in the creation of a different digital future.

The technologies we build and the way we use them will shape what it means to be human for generations to come. That’s not a burden. It’s an invitation to be thoughtful, intentional, and creative about the world we’re making together.

Final Thought

When you finish reading this and open your phone, I invite you to pause and ask: “Who am I when I’m online? Is that who I want to be? And if not, what am I going to do about it?”

Because in the end, technology doesn’t determine our future. We do. But only if we stay awake, stay human, and remain committed to the hard work of building systems worthy of human dignity.

The question isn’t whether technology will continue to shape us. It will. The question is: will we shape it back?

Discussion Questions (for Q&A)

  1. How have you noticed algorithms shaping your own identity or self-presentation online?
  2. What trade-offs between convenience and autonomy feel worth it to you? Which don’t?
  3. How do we balance individual responsibility with the need for systemic change?
  4. What examples have you seen of technology being designed well—for flourishing rather than just engagement?
  5. How might different cultures approach digital identity differently, and what can we learn from that diversity?
  6. What would accountability look like for the platforms and systems you use daily?
