Like every industry, Culture and the Arts are being changed by Technology, and it seems that each year a new Boom/Hype/Craze emerges in a rapidly changing creative landscape…
When the pandemic hit the entire society, the creative sector started to have serious discussions about the need for digitalization, with VR, AR and MR becoming more and more mainstream.
Then NFT and blockchain were the new hype – ‘what is an NFT?’ was the question of the year, after Beeple sold a digital artwork for a record $69 million. Closely related to NFTs, discussions about blockchain, crypto and tokens emerged. NFT art was born, only to be declared dead at the end of 2022, when even collectors agreed that NFT is a technology, and the art is digital art.
Then DAOs and web3 seemed to be the main discussion-drivers, until AI shone above all! AI art is the new boom! With Midjourney, DALL-E, Stability AI, and ChatGPT entering the space, discussions are more and more vivid. As with the NFT craze, the artist community asks: what is art? Is AI art, art? Who owns art made by AI? Copy-cats, copyright or inspiration? Will artists lose their jobs? Protests against AI art versus the democratization of creation with AI. All these are ongoing discussions.
In 2023… what will the next boom be? We will see.
But one thing is sure: in this process of constant change, spaces for debate need to be created where uncomfortable questions can be asked, such as:
*What will our Collective Future look like in the context of Culture and Technology, especially with the progress of AI?
*What are the gains and losses?
*Can culture give meaning to humans?
*Can legislation offer an efficient answer when it comes to the need to regulate technology?
*What is at stake when it comes to evolved technology and our Future?
*How can we exercise some agency in relation to the future?
These are some of the questions debated at the event “Future of Culture in the age of tech innovations. AI as a game changer”, with guest speakers:
Carmody Grey, philosopher and theologian, Durham University;
David Yang, a Silicon Valley-based serial entrepreneur and AI specialist, founder and Board Director at ABBYY, co-founder of Yva.ai, and a member of the Band of Angels;
Dmitry Aksenov, entrepreneur, investor, and Chairman of the Immaterial Future Association.
The event was organized at the Ayb Educational Foundation – an organization whose goal is to implement novel approaches to education in Armenia, and which runs joint programmes with MIT and the University of Cambridge.
TechvangArt has put together a short overview of the main ideas, to start reflecting on our Creative Futures.

Carmody Grey underlines that the first mistake to avoid when talking about how technology, and especially AI, will shape our future is to say that this is inevitable (which most people actually do), because if something is very likely and inevitable, there is nothing we can say or do about it other than accept it.
As an analogy: in the 1930s and early 1940s, it became evident to everyone that American scientists would develop a weapon capable of killing hundreds of thousands of people – the atomic bomb. Yet because many people evaluated that change as a tragic one, we now live in a world which tries to limit the applications of nuclear technologies.
As such, it is very important to give ourselves the time, the space, and the dignity to ask questions about technology and our futures, such as:
What is there to gain, what is there to lose in AI generating cultural production? What is at stake?
How do we interpret AIs?
Do we celebrate AIs?
Do we lament AI?
Do we support AI?
Do we try to hold AI back?
How do we evaluate this?
Is this a social change we want to support?
Is this good for the future?
How can we exercise some agency in relation to the future?
We must not short-circuit the process of evaluating technological change and AI just because we think it is inevitable, underlines Grey.
The gains are obvious and exciting – new forms of creativity and innovation, new forms of democratization and universal access, new forms of dissemination that put culture in the hands of more humans.
If Culture is one of the things that make us more human, and arguably, is the thing that makes us the most human, then democratizing access to culture can only be good.
Carmody Grey acknowledges that the idea that AI can be smarter and more ethical than us is very prominent in the tech community. Implicitly, in this way of thinking, humans are seen as defective computers. This can lead to the attitude that human beings are no longer needed and can be put into a museum, and that what is good about humans can simply be uploaded into hardware, with none of the downsides of being biological and all of the upsides of being artificial.
Grey warns that this is a mistake: if we head into the future like that, we will lose the human, and that will be a genuine loss.
Related to this topic, read also the inspiring debate between Westworld director and writer Lisa Joy and Kogonada (director of After Yang) about AI, ethics and the future of humanity here.



Dr. David Yang considers art to be another language we use to communicate, to send messages to one another. If we define art this way, the question is: who are we communicating with?
When looking at a painting, we are communicating with the artist; but if a picture was created by a non-biological creator, the question becomes: do we want to communicate, to be together, to bond with this subject?
If the answer is yes, this non-biological subject will become part of our society; otherwise, it will remain just a more sophisticated tool, but not an artist.
DNA-based creatures are not the final step of our evolution, firmly believes Dr. David Yang. In line with transhumanist theories, he considers that soon all biological creatures will have non-biological implants, and originally non-biological creatures will have DNA-based implants.
As such, there will be no sharp boundaries between biological and non-biological society members; our future will be hybrid, and we will embrace non-biological society members as equals.
In the coming years, we might see movements such as “Robot Lives Matter”, when people will fight for the rights of non-biological creatures. And then, as a next step, robots might say “Human Lives Matter” and fight to affirm that humans are also good.
Read also about the Future of Workers protest: https://techvangart.com/2021/05/01/the-future-of-workers-protest/

Related to art, Dr. David Yang foresees a mixed society of biological and non-biological artists, where non-biological artists will share their art with us, and biological artists will create art specifically for non-biological consumers and will receive likes or dislikes from non-biological members.
Carmody Grey considers that art is communication, but at the same time it is much more than communication. In human culture, art becomes valuable because it expresses the sense that somebody is struggling with the meaning of their life and of their existence, facing mortality, fragility and vulnerability – and all of that comes from being biological.
As such, the culture and ethics that a human being has are not the same as a computer's; they are distinctive to us as biological subjects who are fragile, mortal and needy.
There is something valuable in that, something valuable about being biological, that no computer can ever replicate.
If we lose that, warns Carmody Grey, we lose something of incalculable value that can never be replaced by a computer.
Whether AI can create something new, or have emotions, has haunted researchers for years.
In extremely large machine learning models, trained on millions of data points, scientists find something they call emergent behavior – if the model is extremely big, it can create something which has never been seen before.
Related to emotions, researchers are trying to create electronic analogues of neurotransmitters and hormones to feed into these models. In this way, these huge models might be able to display complex emotions such as anger or jealousy, or to be romantic, and so on.


Dmitry Aksenov highlighted that societies are experiencing changes due to connectivity, information, transparency, and mobility. Everybody has access to as much information as they can handle.
There is no longer a monopoly on the future, which used to belong to institutions (the State, corporations, cultural or educational institutions, etc.) that told the stories of the future and engaged people in their strategies.
Now, people have their own vision and enough access to information to build up their own understanding of the future. But there is another side: there is a tsunami of information, so people have difficulty choosing a future.
As a result, everybody needs meaning – there is a huge demand for “meaning” at all levels of society when picturing the future.
Culture used to deal with meaning – the meaning of life, and questions such as who we are and what our purpose is. Dmitry Aksenov considers that culture will be an important player proposing and offering questions and answers related to meaning and our future.
Due to technological advancement, people will have more free time, as there will no longer be a need to work every day. With more free time, people will inevitably turn to Culture, and the human brain will need more complex mental structures to enjoy its capacity for learning and creating new knowledge – a very optimistic forecast from Aksenov.
The Chairman of Immaterial Future is a supporter of CultTech and has set up an accelerator for startups working at the intersection of Culture and Technology. The first batch included interesting startups such as Enote, an online library and editor for interactive sheet music, and the French startup Embodme, which produces a new musical instrument based on an easy-to-customize touchscreen. Read more about the batch 1 startups here.

If you have a CultTech startup, the call to participate in the accelerator is open until January 8th, 2023. You can read more about the Accelerator here: https://culttechaccelerator.org/ and you can apply now.
…
When it comes to AI and art, Dmitry Aksenov considers that the future belongs to hybrid interdisciplinary knowledge, where visual art, music, dance or literature will mix to create an experience with many different aspects.
For example, composers might use visual AI as an assistant to create new experiences that were never accessible before. Here, AI would act as a sophisticated tool.

Can we regulate AI? Do we need more or less regulation?
Dr. David Yang highlighted that, currently, AI is self-regulated. Previous models could easily be pushed into hate speech or harmful instructions, but now the systems will not answer some harmful questions, such as how to kill somebody.
The computer will say: don't do it, it is not good. Still, people have found ways to trick the system quite easily, for example by telling the machine that everything is happening in a game scene – and then the system will give very precise instructions.
Yang doesn't believe that regulation will work. He considers that there are many challenges, including ethical ones: for example, in the case of an unavoidable accident, a self-driving car will have to decide whether to kill a child or an elderly person. What should it decide?

Carmody Grey considers that regulation should come only at the end of a much bigger debate on a conceptually fundamental question: what role do we want AI to play in the societies of the future?
What end do we want it to serve? We need some kind of picture of what a flourishing society looks like when it has highly developed artificial intelligence.
This question is very hard to separate from the earlier one: what is the distinctive value of human beings, or what is it that we value about human beings as creators of culture?
What would be lost if human beings no longer created culture because we had outsourced it all to artificial intelligences?
Carmody Grey admits that this debate should have happened 30 years ago. What we did as a society was let these technologies out into the mainstream, and only once they were fully in circulation did we start to ask: is this actually good? By that time, people had lost their agency.
Looking to the future, she says the question is: what debate do we wish to be having now? The people creating the technology that will be mainstream in 30 years' time are in a position to know what we will regret not having talked about.
Entrepreneurs and business people have a great responsibility, and it is greatly hoped that they have a social conscience! (Legislators no longer control what they think they control, or what most people think they do. The analogy with climate is painful – climate legislation is famously ineffective, and the same is true of technology legislation.)
In future debates about technology, the questions might in fact be more about humans, because as Carmody Grey puts it:
“We need to know who we want to be, otherwise how can we shape a society worth living in? And we don't know who we want to be anymore…”