When violence is served to us with such ease, might we be at risk of becoming numb? Or even worse, might we be letting algorithms determine not only what we see, but also how we feel about it?
Evi Varthi | September 18, 2025
It is not only the events themselves that shock us, but also the way they reach us. Within a single week, two shocking videos dominated social networks: the brutal murder of Irina Zarutska on a light-rail train and the cold-blooded execution of Charlie Kirk at a public event. Beyond the tragedies themselves, what caused greater concern was their “packaging”: the manner in which they were presented, shared, and turned into an arena of online confrontation.
The first video caused division, with each side promoting its own “truth.” The second provoked shock but also a strange indifference, as if it were consumed with the same ease as any other piece of online content. And the question remains: when violence is served to us with such ease, might we be at risk of becoming numb? Or, even worse, might we be letting algorithms determine not only what we see, but also how we feel about it?
The illusion of neutrality
Social media are not neutral. They track our every move: what we click on, what we comment on, even where we stop scrolling. Whatever captures our attention gets amplified, not to offer us the “truth,” but to keep us glued to the screen.
This is how the so-called “filter bubble” is created: most of the time we see content that agrees with our views, while every now and then the opposing view slips in, not by chance, but because anger yields more engagement.
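A similarly hedged sketch of that selection logic might look like the toy function below; the structure, names, and the spark_rate value are all assumptions for illustration:

```python
import random

# A toy "filter bubble" selector, invented for illustration only.
def build_bubble(posts: list[dict], user_leaning: str,
                 feed_size: int = 10, spark_rate: float = 0.1) -> list[dict]:
    aligned = [p for p in posts if p["leaning"] == user_leaning]
    opposing = [p for p in posts if p["leaning"] != user_leaning]
    feed = []
    for _ in range(feed_size):
        if opposing and random.random() < spark_rate:
            # The occasional opposing post is not an accident: it is
            # the engineered "spark" that provokes anger and replies.
            feed.append(random.choice(opposing))
        elif aligned:
            feed.append(random.choice(aligned))
    return feed

posts = [
    {"title": "Agreeable take", "leaning": "ours"},
    {"title": "Provocative take", "leaning": "theirs"},
]
bubble = build_bubble(posts, user_leaning="ours")
# Mostly agreeable content, with a rare, rage-inducing exception.
print([p["title"] for p in bubble])
```

Nothing in either sketch ever asks whether a post is true; that omission is the whole point.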
When violence becomes “content”
Repeated exposure to violent images gradually leads to desensitization. The first video shocks us. The second unsettles us. By the tenth, we might simply scroll past it. In this way, a murder or an assault is turned into mere “content,” into spectacle, and the line between tragedy and entertainment begins to blur.
The platforms and their traps
- Facebook & Instagram: reinforce our views but also throw in “sparks” of disagreement.
- TikTok: traps users in a “bubble” more quickly, with violent clips going viral within hours.
- X (Twitter): a mix of echo chambers and extreme confrontations.
- YouTube: the classic “rabbit hole,” where one video leads you into an entire chain of one-dimensional information.
The impact on our consciousness
Algorithms are changing the way we see the world:
• They make us believe that “everyone” agrees with us.
• They amplify the most extreme voices of the “other side.”
• They numb us emotionally.
• They create the illusion that the world is more dangerous than it actually is.
What we can do
The solution is not easy, but there are antidotes:
• Follow a variety of voices, even ones we disagree with.
• Cross-check news across different sources.
• Limit our exposure to harsh images.
• Ask ourselves: does this post inform me, or does it merely enrage me?
The only certain thing is that nothing happens by chance. What we see, and how we perceive it, is the result of design. And the more conscious we become of this reality, the freer we will remain in what we truly believe, and feel.
About the author: Evi Varthi is a communications specialist, image consultant, and founder of Idol Image Consulting, based in the USA. She holds a Master’s degree in Communication and New Journalism, as well as degrees in Hellenic Culture Studies from the Open University of Cyprus and in Liberal Arts from the University of Hawaii, Maui College. She has lived and worked in Greece, Bahrain, and the USA, gaining valuable international experience. She specializes in strategic image management, intercultural communications, and crisis communication management, with a focus on building strong personal and professional identities centered on public narratives and their influence on public opinion.