XR EthicAll Dilemmas

by paradoxig

XR, with its fantastic new experiences, worlds, interactions and avatars, is becoming more and more mainstream. It is, and will be, used not only in arts and entertainment, but also in games, education, therapy and training!
So it is the best of times for XR! Let's just make sure that in a few years we won't be saying it was the worst of times for XR!

As we start enjoying the new ways of story living, we must also keep in mind the dangers.
XR is a technology, capable of good and bad, so it is up to us. Creators, designers and users will have to start thinking about the associated risks and how to diminish them in time.
There are no golden rules, yet, which makes the debates even more important.

So, we highlighted a few risks related to data, stories, the dangers of embodiment, and concerns around ethics washing, and we recommend a few more in-depth readings for those who are really interested in the field. At the end, we will highlight why we need curated, high-quality creative content. PHI Centre is one such physical, multi-art exhibition space that prioritises XR experiences and has mediators who help with onboarding and offboarding. (read our interview HERE)

VEER VR, meanwhile, is primarily an online platform, but it also curates Location Based Entertainment (LBE) spaces, targeting the tech-savvy generation. (read our interview HERE)

Plus: the importance of building an ethics culture around XR.
So, let's walk through some of the issues related to ethics in XR that are also important for creators.

Nowadays, it comes as no surprise that Big Tech is collecting and monetizing our data.
Our innocent clicks, likes and texts mean data and revenue for companies (data is the new oil, right?).
But XR takes this to a completely new level. In VR, once you are immersed in an environment perceived as real, your behavioral and emotional reactions are very authentic – even the unconscious ones.
AR is no different – soon glasses will be as common as phones, and they will become part of our daily wearable devices.
Your reactions to certain triggers can be measured, and just imagine all that data about emotions and reactions, even subconscious ones, collected globally, segmented by age, race and so on, in the hands of one or two companies…
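
To make this concrete: even without any special biometric sensors, a headset already exposes a surprisingly rich behavioral signal. Here is a minimal, purely illustrative sketch in TypeScript – assuming a standard WebXR session and the WebXR type definitions – of how an experience (or the platform running it) could sample your head position and orientation dozens of times per second:

```typescript
// Illustrative sketch only: sampling head pose from a WebXR session.
// Assumes WebXR type definitions (e.g. @types/webxr) are available.
function samplePose(session: XRSession, refSpace: XRReferenceSpace): void {
  session.requestAnimationFrame((time, frame) => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      const { position, orientation } = pose.transform;
      // Where the user's head is and where it is pointing, typically
      // 72–90 times per second. Here we just log it, but it could as
      // easily be buffered and uploaded somewhere.
      console.log(time, position.x, position.y, position.z,
                  orientation.x, orientation.y, orientation.z, orientation.w);
    }
    samplePose(session, refSpace); // schedule the next sample
  });
}
```

Aggregate a stream like this over an emotionally charged scene and you already have a record of attention, hesitation and startle responses – before eye tracking, hand tracking or heart-rate sensors even enter the picture.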

And once those data are collected, the question is: where and how are they stored?

For example, users of a VR world may be assured by its creators that their data are not used.
But what about the rest of the chain…
Does the distribution platform use it?
What happens in the cloud?
How will these data be used in the future?
Similarly, in an AR experience where you try on a cool outfit at home, what does the device see, and where does it store that information – and the scan of your room?
And XR creators have to be aware of these issues and choose accordingly.

Should we remind ourselves how data are collected on social media platforms? Or take another look at the Cambridge Analytica scandal?

And let's not forget the user agreements – those long, long lists informing the user – is anybody actually reading them?… The question is: will those be 'disrupted'? 🙂

Everywhere in the XR ecosystem, two keywords keep coming up: your design has to be safe and secure!
But, what exactly does this mean?
What is risk-free for one user might be so risk-free for another user that it is actually… boooooring!
And for a third user the same thing might be… super-scary!
In most cases there are no guidelines.
It is trial-and-error and learning by doing, because hey, we are just playing around with this new tech environment.
But, still several things should be taken into consideration.

One of the very specific characteristics of VR experiences is to transport you into worlds that you can't physically go to.
Sometimes these are worlds that exist physically, but where you can't be (or where only very few people can go): for example a trip through the universe, an encounter with wildlife, or an underwater experience.
There can also be worlds that do not exist: fantastic worlds, such as those created in SF movies or fairy tales. And the full immersion in these worlds is why VR experiences are so fantastic.
What if even genuinely crafted experiences become “scary”?
For example, to have some ‘action’ in a story-driven experience, a monster appears, or you get “hit” by a car or plane, or you bump into the “bad-guy” or you find yourself underwater when actually you are afraid of water?
It might be funny to say that, well, it is only a digital world, and the car was a digital asset.
But just as a positive experience uplifts the user, isn't it possible that a negative experience has a negative emotional impact?
It can be argued that it is the same in other forms of content.
For example, films also show emotionally disturbing content, but you can easily switch off the TV. In VR, because your entire mind is tricked into thinking that this is real (which is why VR is outstanding), users seem to forget that they can simply close their eyes, or which button on the hand-controller interrupts the experience.
Taking off the HMD, especially when you have the controllers in your hands, might take some time, and the monster might kill you 🙂
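
This is one reason why many designers build an explicit "safe exit" into the experience itself, rather than relying on the user to remember the right button under stress. As a minimal sketch – again in TypeScript against WebXR, with the exit mapped to a hypothetical button index that will differ between controllers – it can be as simple as ending the session the moment that button is pressed:

```typescript
// Illustrative "safe exit" sketch for a WebXR session.
// Assumption: EXIT_BUTTON_INDEX is a placeholder; real button mappings
// differ per controller and should be shown to the user during onboarding.
const EXIT_BUTTON_INDEX = 4;

function watchForSafeExit(session: XRSession): void {
  const onFrame = (_time: number, _frame: XRFrame) => {
    for (const source of session.inputSources) {
      if (source.gamepad?.buttons[EXIT_BUTTON_INDEX]?.pressed) {
        // End the immersive session immediately: the user is back in
        // the real world – no monster included.
        void session.end();
        return;
      }
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```

Whatever the engine or platform, the design principle is the same: the user should always have one obvious, instant way out of the experience.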
Creators have to take their audiences into consideration: viewers will not come back to unpleasant or traumatic VR experiences and, for a time, might even avoid VR altogether.
On the other hand, creativity should not be limited, creators would argue.
The threshold for what is scary might really differ from user to user. Sometimes a deeper understanding is needed on the user's side before immersion in a certain experience.

To see the world from another perspective: in VR, you can be an adult or a child, your skin color can be different, you can be a butterfly, an alien, or a person with a disability.
The effect of embodiment in VR has been studied for years, and this is one of the reasons VR has earned its name as an "empathy machine".
There is also research by Mel Slater showing the positive effect on a person of being embodied in VR as Einstein or Steve Jobs.
But, what if you are embodied as a dictator? Or a genius criminal?

Can this have a negative effect and how?
Do we need to do debriefing after the immersive experience?
Or do we need to build some sort of debriefing inside the story?
What if you are a positive character but you have to catch the criminals or you are fighting and shooting?
What if those groups you are chasing/killing are from a certain gender or race?
Would actions you do in the VR world influence your actions in the physical world? (for example, considering people that you don’t like as “disposable assets”?)
Further research in the field is needed, but violence and killing enemies in 2D games and in VR might have different (long-term) effects.

At least for the moment, VR is considered less addictive than social media, as you have to put the HMD on, so it is not "one click away". It is also less manipulative than film, as it does not have a frame carefully controlled by the director to lead you to a certain conclusion.
But VR is persuasive, as it tricks you into believing that the digital world is reality. And this opens the door to the question: how much can it make you believe?
We know that human beings can be tricked quite easily into believing things that never happened.
Research in producing false memories is very conclusive in this sense.
A false memory is a recollection that seems real for the mind but is fabricated in part or in whole.
Elizabeth Loftus showed that false memories can easily be implanted through suggestion and leading questions.
When subjects were asked to tell about the time they got lost in a shopping mall, they did so with detailed recollections and a range of emotions – even though the event never happened!

Elizabeth Loftus – How reliable is your memory?

As VR is persuasive, this raises some questions that we should consider for the future, such as:
Can VR facilitate memory implantation through the resynthesis of experiences?
Can VR become the magic Mirror of Erised from Harry Potter, where you can see "your deepest, most desperate desires" and start to believe in them – just like young Harry, who kept coming back to relive the vision for days, until Dumbledore found out and took the mirror away?

Ethics becomes more complex as XR develops alongside other technologies, such as AI, and alongside social VR and social media.
So, as Thomas K. Metzinger would argue, you have to think about the ethics of all of them together; you can't specialize in only one part. But companies know these technologies well too, and there can be a conflict between users' interests and the pressure from shareholders to make more profit.
In addition, companies come up with techniques of their own, such as ethics washing.

This is when industry and/or politicians constantly organise and cultivate ethical debates to buy time, to distract the public, and to prevent or at least delay effective regulation and policy-making.
Or, simply put: apply the rules of old-school PR – "show that you care, talk about it endlessly, but avoid doing the real thing that might actually change something!". And these ethics issues are reinforced by other factors.

One is the pacing gap: existing government structures are not able to respond to the challenges fast enough.
Technology is evolving so quickly that it is almost impossible to regulate it, or even understand its effects, in due time.
How governance can become much faster when it comes to regulation is a question for the future.
In addition to all of this, there are the lobbying mechanisms that are hard for the general public to see and comprehend, also known as "the invisible bottleneck".
These lobby machines are not debated in the media, but between lobbyists and regulators at different levels.
How to deal with them? The future will need an answer for that as well, but so far it seems like a "mission impossible". (For further reading, please check the Ethics Guidelines for Trustworthy AI and the articles by Thomas K. Metzinger referenced below.)


We should really start building a culture of ethics around XR, AR and AI, have more discussions, and raise future creators with an "ethical mindset" in which users' best interests always prevail over companies' responsibilities toward shareholders demanding profit. There will be ongoing discussions about ethics in XR.
Laws are really important, but on the other hand tech is evolving so fast that complicated legal procedures struggle to regulate the issues in time (remember the Facebook hearings, when politicians could not properly understand the business model of a global company).

Another important issue is high-quality curated content: content that is safe, without violence, that has been checked and does not raise questions.
As with any new tech, it is crucial to have a creative experimental space for creators.
But if consumers have a bad experience, they might not come back, and might give up on XR entirely. Because of this, the first contact with XR is really crucial, and it is important to have curated content and to prepare people for the new experience.
This is just another reason (besides the need for more distributors) why we need more spaces with curated content, such as PHI Centre, that understand the ethical issues involved and are open to both emerging and established artists, and to experiments.
And, obviously, platforms such as VEER VR, open to both hobbyist and high-quality content, with a curated LBE branch that is able to engage a great number of people.

So, investors are really welcome to join the XR creative party! 🙂

And remember some main ideas 🙂
Do not Harm!
Empower users to stay in control!
Invest in research on the long-term implications of XR!
Be careful – it's about human perception of…
Respect Privacy!
Give us our data! And so on…

For those who want to dig deeper into these issues, here are some further materials for study:

XR Ethics Manifesto – Kent Bye

The Ethics of Realism in Virtual and Augmented Reality – Mel Slater and collaborators
Real Virtuality: A Code of Ethical Conduct. Recommendations for Good Scientific Practice and the Consumers of VR-Technology – Michael Madary and Thomas K. Metzinger

Why Is Virtual Reality Interesting for Philosophers? – Thomas K. Metzinger

Ethics Guidelines for Trustworthy AI – European Commission, High-Level Expert Group on AI, 2019
Policy and Investment Recommendations for Trustworthy Artificial Intelligence – European Commission, High-Level Expert Group on AI, 2020

Books:
Emerging Ethical Issues of Life in Virtual Worlds (2010) – Charles Wankel and Shaun Malleck (editors)

The Myth of Repressed Memory – Elizabeth F. Loftus and Katherine Ketcham, St. Martin's Press, 1994
