
<< I Don't Understand >>

“What is a lingo-futurist?” you ask?

It is a fictional expert who makes predictions
about the pragmatics and shifts in social connotations of a word.

Here is one such prediction by a foremost lingo-futurist:

“2023 will be the year where ‘understand’ will be one of the most contested words.

No longer will ‘understand’ be understood with understanding as once one understood.

Moreover, ‘I don’t understand’ will increasingly —for humans— mean ‘I disapprove’ or, for non-human human artifacts, ‘the necessary data was absent from my training data.’

‘Understand’, as wine during a recession, will become watered down, making not wine out of water yet water out of wine, while hyping the former as the latter.

All is well, all is fine wine, you understand?”

—animasuri’23

<< In The Age Of Information >>

In the Age of Information, the Age of Reason has been surpassed, signaling the return to finding meaning —confused as “knowledge”— in mesmerization, fad, hype, snake oil and data as snowflakes, moldable in any shape one desires, and quickly diffused, convolutedly, in the blinding Sun. In erotic dance this Age of Information copulates with the Age of Sharing giving its offspring heads to bump in shared gaseous dogmas.

—animasuri’22


<< Transition By Equation >>

focus pair: Mechanomorphism | Anthropomorphism

One could engage in the following over-simplifying, dichotomizing and outrageous exercise:

if we were to imagine that our species succeeded in collectively transforming humanity, that is, succeeded in transforming how the species perceives its own ontological being into one of:

“…we are best defined and relatable through mechanomorphic metaphors, mechanomorphic self-images, mechanomorphic relations and datafying processes,”

At that imaginary point, any anthropomorphism (as engine for designs or visionary aims) within technologies (and that with particular attention to those associated with the field of “AI”) might be imagined to be(come) empowered, enabled or “easier” to accomplish with mechanomorphized “humans.”

In such imagination, the mechanomorphized human, with its flesh turned powerless and stale, and its frantic fear of frailty, surrenders.

It could be imagined to be “easy,” & this since the technology (designer) would “simply” have to mimic the (human as) technology itself: machine copies machine to become machine.

Luckily this is an absurd imagination as much as Guernica is forgettable as “merely” cubistic surrealism.

<< Not Condemning the Humane into a Bin of Impracticality >>


There’s a tendency to reassign shared human endeavors into a corner of impracticality, via labels of theory or thing-without-action-nor-teeth: Philosophy (of science & ethics), art(ists), (fore)play, fiction, IPR, consent & anything in between the measurability of 2 handpicked numbers. Action 1: Imagine a world without these. Action 2: Imagine a world only with these.

Some will state that if it can’t be measured it doesn’t exist. If it doesn’t exist in terms of being confined as a quantitative pool (e.g. data set) it can be ignored. Ignoring can be tooled in a number of ways: devalue, or grab to revalue through one’s own lens on marketability.

(Re-)digitization, re-categorization and re-patterning of the debased, to create a set for a remodeled reality, equals a process that is of “use” in anthropomorphization and mechanomorphization: a human being is valued as datasets of “its” output, e.g., a mapping of behavior, results of an (artistic or other multimodal) expression, a KPI, a score.

While technology isn’t neutral, the above is not solely a technological issue either. It is an ideologically systematized issue with complexity and multiple vectors at play (i.e., see above: what seems of immediate practicality, or what is of obvious value, is not dismissed).

While the scientific methods & engineering methods shouldn’t be dismissed nor confused, the humans in their loops aren’t always perceiving themselves as engines outputting discrete measurables. Mechanomorphism takes away the “not always” & replaces it with a polarized use vs waste.

Could it be that mechanomorphism, reasonably coupled with anthropomorphism, is far more of a concern than its coupled partner, which itself is a serious process that should also allow thought, reflection, debate, struggle, negotiation, nuance, duty-of-care, discernment & compassion?

epilogue:

…one could engage in the following over-simplifying, dichotomizing and outrageous exercise: if we were to imagine that our species succeeded in collectively transforming humanity (as how the species perceives its own ontological being) to be one of “we are best defined and relatable through mechanomorphic metaphors, relations and datafying processes,” then any anthropomorphism within technologies (with a unique attention to those associated with the field of “AI”) might be imagined to be(come) easier to accomplish, since it would simply have to mimic itself: machine copies machine to become machine. Luckily this is absurd as much as Guernica is cubistically surreal.

Packaging the above, one might then reread Robert S. Lynd’s words penned in 1939: “…the responsibility is to keep everlastingly challenging the present with the question: But what is it that we human beings want, and what things would have to be done, in what ways and in what sequence, in order to change the present so as to achieve it?”

(thank you to Dr. WSA for triggering this further imagination)

Lynd, R. S. (1939). Knowledge for What? Princeton: Princeton University Press.

<< Morpho-Totem >>


Decomposition 1

my hammer is like my_______
my car is like my______
my keyboard is like my______
my coat is like my_____
my watch is like my______
my smart phone is like my______
my artificial neural network is like my______
my ink is like my_______
my mirror is like my________
my sunglasses are like my______
my golden chains are like my_________
my books are like my_________

Decomposition 2

my skin is like a_______
my fingertips are like a_______
my fist is like a_____
my foot is like a_______
my hair is like a_________
my bosom is like a________
my abdominal muscles are like a______
my brain is like a__________
my eyes are like a________
my genitalia are like a______
my dna is like a______
my consciousness is like a______

reference, extending
to the other desired thing
not of relatable life

—animasuri’22

<< One Click To Climbing A Techno Mountain >>


A Rabbi once asked: “Is it the helicopter to the top that satisfies?”

At times, artistic expression is as climbing. It is the journey that matters, the actual experience of the diffusion of sweat, despair, and to be taken by the clawing hand of an absent idea about to appear through our extremities into an amalgamation of tool- and destination-media.

The genius lies in the survival of that journey, no, in the rebirth through that unstable, maddening journey and that incisive or unstopping blunt critique of life.

That’s clogs of kitsch as blisters on one’s ego, sifted away by the possible nascence of art, the empty page from the vastness of potential, the noise pressed into a meaning-making form as function.

Artistry: to be spread out along paths, not paved by others. And if delegated to a giant’s shoulder, a backpack or a mule: they are companions, not enslaved shortcuts.

That’s where the calculated haphazardness unveiled the beauty slipping away from the dismissive observer, through awe or disgust alike, ever waiting for you at your Godot-like top, poking at you.

—animasuri’22

<< data in, fear & euphoria out >>


A recent New Scientist article stub [5] claims “More than one-third of artificial intelligence researchers around the world agree…”

Further on, in this article’s teaser (the remainder seems safely and comfortably behind a paywall), “more than one third” seems equated with a sample of 327 individuals in a 2022 global population estimated at 7.98 billion [2, 8] (…is that about 0.000004% of the population?)

This would deductively imply that there are fewer than 981 AI researchers in a population of 7.98 billion. …is then 0.0000124% of the population deciding for the 100% as to what is urgent and important to delegate “intelligence” to? …surely (not)… (…demos minus kratos equals…, anyone?)
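
(A back-of-envelope sketch of the arithmetic above, written here in Python; it assumes only the cited figures of 327 respondents and an estimated 7.98 billion humans:)

# Back-of-envelope check of the percentages quoted above.
# Assumed figures: 327 survey respondents, ~7.98 billion humans (2022).
population = 7.98e9
respondents = 327

share_of_population = respondents / population * 100          # ~0.0000041 %
implied_total_researchers = respondents * 3                    # "more than one third" => total < 981
implied_share = implied_total_researchers / population * 100   # ~0.0000123 %

print(f"{share_of_population:.7f}% of the population was surveyed")
print(f"fewer than {implied_total_researchers} AI researchers implied,")
print(f"i.e. about {implied_share:.7f}% of the population")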

Five years ago, in 2017, The Verge referenced reports estimating the number of individuals working in the field at 10’000, while others suggested an estimate closer to 300’000 [9] (…diffusioningly deviating).

As an opposing voice to what the 327 individuals are claimed to suggest, there is the 2022 AI Impacts poll [4], which suggests a rather different finding.

Perhaps the definitions are off or the estimations are?

When expressing ideas driven by fear, or that are to be feared, one might want to tread carefully. Fear, as much as hype & tunnel-visioned euphoria, while at times of (strategic, rhetorical, or investment-pitching) “use”, are proverbial aphrodisiacs of populist narratives [1, 3, 6, 7].

Such could harm the ability to identify & improve on the issue, or related issues, which might indeed be “real”, urgent & important.

This is not “purely” a science, technology, engineering or mathematics issue. It is more than that while, for instance, through the lens created by Karl Popper, it is also a scientific methodological issue.

—-•
References:

[1] Chevigny, P. (2003). The populism of fear: Politics of crime in the Americas. Punishment & Society, 5(1), 77–96. https://doi.org/10.1177/1462474503005001293

[2] Current World Population estimation ticker: https://www.worldometers.info/world-population/

[3] Friedrichs, J. (n.d.). Fear-anger cycles: Governmental and populist politics of emotion. (Blog). University of Oxford. Oxford Department of International Development. https://www.qeh.ox.ac.uk/content/fear-anger-cycles-governmental-and-populist-politics-emotion

[4] Grace, K., Korzekwa, R., Mills, J., Rintjema, J. (2022, Aug). 2022 Expert Survey on Progress in AI. Online: AI Impacts. Last retrieved 25 August 2022 from https://aiimpacts.org/2022-expert-survey-on-progress-in-ai/#Extinction_from_AI 

[5] Hsu, J. (2022, Sep). A third of scientists working on AI say it could cause global disaster. Online: New Scientist (Paywall). Last retrieved 24 Sep 2022 from https://www.newscientist.com/article/2338644-a-third-of-scientists-working-on-ai-say-it-could-cause-global-disaster/

[6] Lukacs, J. (2006). Democracy and Populism: Fear and Hatred. Yale University Press. 

[7] Metz, R. (2021, May). Between Moral Panic and Moral Euphoria: Populist Leaders, Their Opponents and Followers. (Event / presentation). Online: The European Consortium for Political Research (ecpr.eu). Last retrieved on 25 September 2022 from https://ecpr.eu/Events/Event/PaperDetails/57114

[8] Ritchie, H., Mathieu, E., Rodés-Guirao, L., Gerber, M.  (2022, Jul). Five key findings from the 2022 UN Population Prospects. Online: Our World in Data. Last retrieved on 20 September 2022 from https://ourworldindata.org/world-population-update-2022

[9] Vincent, J. (2017, Dec). Tencent says there are only 300,000 AI engineers worldwide, but millions are needed. Online: The Verge. Last retrieved 25 Sep 2022 from https://www.theverge.com/2017/12/5/16737224/global-ai-talent-shortfall-tencent-report

—-•

<< Philo-AI AI-Philo >>

The idea of Philosophy is far from new or alien to the field of AI. In effect, a 1969 paper was already proposing “Why Artificial Intelligence Needs Philosophy”:

“…it is important for the research worker in artificial intelligence to consider what the philosophers have had to say…” 

…have to say; will have to say

“…we must undertake to construct a rather comprehensive philosophical system, contrary to the present tendency to study problems separately and not try to put the results together…” 

…besides the observation that the “present tendency” is one that has been present since at least 1969, this quote might more hope-inducingly be implying the need for integration & transdisciplinarity

This 1969 paper, calling for philosophy, was brought to us by the founder of the field of Artificial Intelligence. Yes. That human who coined the field & its name did not shy away from transdisciplinarity

This is fundamentally important enough to be kept active in the academic & popular debates

Note, philosophy contains axiology, which contains aesthetics & ethics. These are after-thoughts in present-day narration that make up some parts of the field of “AI”

Some claim it is not practical. However, note, others claim mathematics too is impractical. Some go as far in their dismissal as to state that people studying math (which is different from Mathematics) end up with Excel

These debasing narratives, which are also systematized into our daily modes of operation & relation, are dehumanizing

Such downward narration is not rational, & is tinkering with nuances which are not contained by any model to date

Let us further contextualize this

Machine-acts are at times upwardly narrated & hyped as humanized (ie anthropomorphism). Simultaneously human acts are (at times downwardly) mechanized (ie mechanomorphism)

These opposing vectors are let loose into the wild of storytelling while answering at times rather opaque needs, & offering unclear outcomes for technologies, packaged with ideological hopes & marketable solutions. The stories are many. The stories are highly sponsored & iterative.  The stories are powered by national, financing & corporate interest.  ok.  & yet via strategic weaponization of story-telling they divide & become divisive. ok; not all. Yet not all whitewash those who do not

In these exciting & mesmerizing orations, who & what is powering the enriching philosophical narratives in a methodological manner for the young, old, the initiated, the outlier or the ignorant? 

Here, in resonance with McCarthy, philosophy (axiology) comes in as practically as mathematics does. These imply the beauty & complexity of a balancing opportunity which does not debase technological creativity. This transdisciplinarity enables humanity.

Nevertheless, Bertrand Russell probably answered, time and again, the question as to why Axiology is paid lip service yet is kept at bay: “Men fear thought as they fear nothing else on Earth” (1916)


Reference

McCarthy, J., Hayes, P.J. (1969). Some Philosophical Problems from the Standpoint of Artificial Intelligence. In B. Meltzer and D. Michie. (eds). Machine Intelligence 4, 463–502. Edinburgh University Press
http://jmc.stanford.edu/articles/mcchay69/mcchay69.pdf

OR

McCarthy, J., & Hayes, P. J. (1981). Some Philosophical Problems from the Standpoint of Artificial Intelligence. In Readings in Artificial Intelligence (pp. 431–450). Elsevier. https://doi.org/10.1016/B978-0-934613-03-3.50033-7

<< Promethean Tech >>


The Ancient Greek Gods sadistically acknowledged Prometheus for giving fire to the humans. Democratization of fire was met with fierce opposition from those who controlled it: the other Gods. These Uber-creatures chained the individual Titan, a god of lesser stature, onto a rock so that a symbol of power and might could infinitely eat his liver: Zeus’ emblematic eagle.

The eagle was attracted to the magnetism of Prometheus’ liver. The symbols are imposing and heavy. In contrast, the potential interpretative coherence elegantly follows Brownian non-motifs: randomness or arbitrariness. The audience to this theatrical display of abuse comprises both humans and those other godlike creatures. Observing the plight of Prometheus, both sentient sets might now be convinced to mute any ethical concern or dissent. One would not want to suffer what Prometheus is suffering.

A fun fact is that the story, its nascence(s), and its iterations are “controlled,” again in a Brownian manner, by collectives of humans alone. No Gods were harmed in the making of this story. A story in the making it still is nevertheless.

An abrupt intermezzo as a short interconnecting move: the humans who wrote this Greek story might have been as disturbing as the persons concocting any technology that is intended to render humans mute under the candy-flag of democratizing access to technological magic, sparkles, and meaning-making. At least both authors are dealing with the whimsicalness of themselves or with the fancifulness they think to observe and hope to control with their form of storytelling: text as technology, and technology creating text. Where lie the nuances which could distinguish or harmonize the providers of technology with the authors of Prometheus?

The low-hanging fruit is, as often, the plucking of simple polarizations such as past versus present, or fictional versus factual. And yet, nuancing these creates a proverbial gradience or spectrum: there is no “versus,” there is verse. The nuance is the poetry and madness we measure and journey together. The story is the blooming of relations with the other in the past, the now and the future.

Prometheus is crafted and hyped as the promoter of humanity. So too are they who create and they who bring technologies into the world. Their hype is as ambrosia. Yet here the analogy starts to show cracks. Prometheus was not heralded by the story-encapsulated Maker (i.e. Zeus). Only questionably was Prometheus heralded by the actual makers: the human authors of the story.

The innovations in the story were possibly intended not to be democratized, for instance: fire, and more so, the veiling of human authorship, and the diluting of agency over one’s past acts (e.g. Prometheus transferring fire). These innovative potentials (desirable or not) were fervently hidden and strategically used when hierarchical confirmations were deemed necessary. What is today hung from the pillory and what is hidden from sight? What is used as distraction by inducements of fear (of pain), bliss or a more potent mix thereof?

Zeus was hidden from sight. His User Interface (UI) flew in when data needed to be collected. The pool of data was nurtured, as was the user experience (UX): Prometheus’ liver grew back every night. In some technologies the character of the Wizard of Oz (which is as if Zeus hiding behind the acts of his eagle) is used to discuss how users were tricked into thinking that the technology is far more capable than it actually is. In some cases there are actual humans at play, as if ghosts in the machine, flying in from a ubiquitous yet hidden place.

For the latter one can find examples, as well as for: people filtering content as if a bot catering to the end-user (making the user oblivious to the suffering such a human filterer experiences); people not consenting to their output being appropriated into dehumanized databases; a linguistic construct and Q&A sliding a human into confusing consciousness, sentience or artistry as existent at genius levels in the majority of humans, or in the human-made technology as much as in the human user. This could be perceived as a slow metamorphosis toward the making of anthropomorphic dehumanization. Indeed, one can dismiss this flippantly by asking: “What makes a human, human? Surely not only this nor that, …nor that, nor…”

Democratization is not that of technology (alone). In line with the thinking of technology as democratizing, e.g. (as some are claiming) democratizing “art” via easily accessible technological *output,* one can then, by extension, argue as well that delusion is more easily democratized. Delusion of being serviced (while being used as a data source and as if being a product-offloading platform); of being cared for (while turned into a statistic); of being told to be amazing (rather than being touched by wonder, open inquiry, and duty of care); of being told to be uniquely better (rather than increasingly being part of relational life with others, as opposed to being positioned above others); and the delusion of being in control of one’s input and output (yet being controlled); of having access, and so on. In this flippant manner of promoting a technology, democratization is promoted via technologies as analogous to stale bread and child-friendly games.

The idea of access is central: access to service, amazement, uniqueness, geniuses, and cultural heritages (especially those one does not consider one’s own). A pampered escapism.

The latter is especially intriguing when observing the Diffusion Models anyone can access, which are based on billions of creations by humans who came before us or who even now still roam among us (e.g. https://beta.dreamstudio.ai/home or https://www.midjourney.com/home/).

This offers a type of access to any set of words transformed into any visual. This offers a type of access without those being accessed having any knowledge of the penetration: the artist’s work stored in the databases. Come to think of it, it is not only the artist; rather, it is any human utterance and output stored in any database. With these technologies, as if the liver of Prometheus, the human echoes are not accessed with the consent of their creators. “Democracy,” as access for all, by the beak of Zeus’ eagle. Is that democracy or is that more like unsolicited “access” to a debilitating drug slipped into one’s drink? At least Prometheus felt it when he was picked for his liver.
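
(A minimal sketch of that words-to-visual access, written in Python; it assumes the open-source diffusers library and a publicly hosted Stable Diffusion checkpoint, and the model id and prompt are illustrative only:)

# A minimal sketch of "any set of words transformed into any visual".
# Assumes the Hugging Face `diffusers` library and a public Stable Diffusion
# checkpoint; the model id and prompt are illustrative, not prescriptive.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # a model trained on billions of scraped images
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # or "cpu", far more slowly

# The prompt is the only visible trace; the humans whose work shaped the
# output remain unreferenced in what comes back.
image = pipe("Prometheus chained to a rock, oil painting").images[0]
image.save("prometheus.png")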

In closing:

The awesome and tricking power of story is that one and the same structural story can drape various types of functional intent, leaving meaning as opaque and deniable. Yes, so too is this story here. So too is Prometheus’ story as well as the stories of technologies, such as the story of Diffusion models and the word-to-visual technologies derived from this.

Transformation of history into a diffusible amalgamation via stable diffusion technologies is taking human artifacts as Promethean livers to be picked, regrown and picked again. This is irrespective of the proverbial or actually experienced “pleasurable” pain it keeps on giving. These technologies are promoted as democratizing. Yet my stating that my technology is democratizing does not mean it was actually intended to be, that it turns out to be, or that it is applied so as to be democratizing. Moreover, if this is the depth of democracy, to blindly take what came before, one might want to reconsider this Brownian interpretation of the story of democracy.

The magnetism of life hinted at in human expressions can be borrowed, adapted, and adopted. We learn from the others if we know what it is they have left us to build upon. We can innovate if we understand or are enabled to understand over time what it is that is being transformed. And yet, the nuance, reference and elegance with which it could be considered to be done allows for consciousness, discernment and awareness to be communicated, related and nurtured.

At present the vastness and opaqueness of the databases, within which our data are gluttonously stored, do not yet allow this finesse. While the stories they reinterpret and aggregate could be educational, stimulating and fun, we might want to consider what it means to place the value-adding, meaning-making randomness outside of that of expert designers and into the hands of the masses. Vast technology-driven access is not synonymous with democratization. Understanding and duty of care are intricate ingredients as well, for any demons in democracy to be kept at bay.

<< what’s in a word but disposable reminiscence >>


A suggested (new-ish) word that perhaps could use more exposure is

nonconsensuality

It hints at entropy within human relations and decay in acknowledgement of the other (which one might sense as an active vector coming from compassion). Such acknowledgement is then of the entirety of the other and their becoming through spacetime (and not only limited to their observable physical form or function).

It is, however, secondly also applicable in thinking when acting with treatment (of the other and their expressions across spacetime), with repurposing, and in the relation in the world with that which one intends to claim or repurpose.

Thirdly, this word is perhaps surprisingly also applicable to synthetic tech output. One could think about how one group is presented (more than another) in such output without their consent (to be presented as such). Such output could be an artificially generated visual (or other) that did not exist previously, nor was allowed the scale at which it could be mechanically reproduced or reiterated into quasi-infinite digital versions.

Fourthly, through such a tech-lens one could relate the word to huge databases compiled & used to create patterns from the unasked-yet-claimed other, or at least their (creative, artistic or other more or less desirable) output that is digital or digitized without consideration of the right to be forgotten or not to be repurposed ad infinitum.

Fifthly, one could argue, in nurturing future senses of various cultural references, that the word could also be considered applicable to those (alienated) creations of fellow humans who have long passed, and yet who could be offered acknowledgement (as compensation for no longer being able to offer consent) by having their used work referenced (in a metadata file), as sketched below.
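
(A minimal sketch, in Python, of what such a metadata file could hold; the schema, field names and file names are hypothetical, offered only to make the acknowledgement idea concrete:)

# A hypothetical attribution sidecar for a generated image. The schema,
# field names and file names are illustrative assumptions, not a standard.
import json

attribution = {
    "generated_file": "prometheus.png",
    "prompt": "Prometheus chained to a rock, oil painting",
    "model": "an-illustrative-diffusion-model",
    "referenced_works": [
        {
            "creator": "unknown / long passed",
            "work": "unidentified training image",
            "consent": "not obtainable",
            "acknowledgement": "posthumous, symbolic",
        },
    ],
}

with open("prometheus.attribution.json", "w", encoding="utf-8") as f:
    json.dump(attribution, f, indent=2, ensure_ascii=False)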

As such, I wish I could give an ode to those or that which came before me when I prompted a Diffusion Model to generate this visual. However, I cannot. Paradoxically, the machine is hyped to “learn” while it is unilaterally decided for humans that they will not learn where their work is used or where the output following their “prompt” came from. I sense this as a cultural loss: I cannot freely decide to learn where something might have sprouted from. It has been decided for me that I must alienate these pasts, without my consent, whether or not I want to ignore them.

—-•

#aiethics #aiaesthetics #aicivilization #meaningmaking #rhizomatichumanity

Post scriptum:

Through such a cultural lens, as suggested above, this possible dissonance seems reduced in shared intelligence. To expand that cultural lens onto another debated tech: the relation between reference, consent, acknowledgment and application seems as if an antithetical cultural anti-blockchain: severed and diffused.