Post-non-classical modification of the techno-anthropology of E. Kapp

Abstract

The article studies the evolution of techno-anthropology across the classical, non-classical and post-non-classical periods of philosophical reflection, using E. Kapp's theory of the projection of human organs as its example. The reasons for the renewed relevance of Kapp's organ projection theory in post-non-classical philosophy are described, and the categorical apparatus of the post-non-classical philosophy of technology is analyzed. In the semantics of the post-non-classical philosophy of technology, a tendency toward the dominance of technological determinism is revealed. It is shown that modern man is transforming the paradigm of anthropocentrism into competing paradigms. One of them is formulated as the paradigm of the creative disappearance of the person. The second is positioned as a strategy of the spatial expansion of a distributed human body, hybrid integrity, assembly and assemblages.

Full Text

The evolution of the philosophy of technology has passed through three stages. The first, classical stage originates in the modern era, yet the term «philosophy of technology» was introduced into German scholarly circulation by E. Kapp only in the second half of the 19th century [1], and it was E. Kapp who developed one of the first theories within the philosophy of technology [2]. According to this theory of organ projection, technical devices are natural extensions of human organs, above all the arms and legs. The maker unconsciously reproduces his own organs in all his creations and comes to know himself through these artificial creations. Man was prompted to continue himself in technical tools of labor by the physical limitations of his organic body, which stimulated thinking and creative imagination and pushed him toward design activity. Largely thanks to tool activity, man became an independent force in the biosphere: he was able to defend himself effectively against predators and to provide himself with food and safe shelter, as F. Engels noted in his research. Nevertheless, there was a long historical period in which philosophical reflection kept man and technology apart as separate entities. This period originates in ancient philosophy and was brought to a close by the studies of K. Marx and E. Kapp.

K. Marx discovered the direct influence of machinery and technology on the anthropological status of the person within the system of industrial production. Industrial workers had discovered this influence themselves; the Luddite movement was its confirmation. E. Kapp did not engage in economic philosophy and therefore detached questions of technology from acute social topics; his perspective is closer to ontology and anthropology in their general consideration. At the non-classical stage of the evolution of the philosophy of technology, however, it proved impossible to set aside the problems that technology creates for human existence, nor did philosophers aspire to do so. Among them were N. Berdyaev, J. Ortega y Gasset, S. Freud, C.G. Jung, O. Spengler and K. Jaspers. Technology was framed as a causal factor of problems and threats to mankind and was associated with the last stage of the life cycle of industrial culture.

The problems arising from the phenomenon of technology were recognized, and attempts to solve them were made, by the philosophizing engineers of the Association of German Engineers (VDI) in the Federal Republic of Germany. M. Heidegger gave technology, and the instrumental thinking associated with it, a critical characterization [3]. H. Marcuse identified and described the first anthropological product of industrial technologies, which he called one-dimensional man [4]. The risks this poses for human evolution were uncovered by E. Fromm [5].

Critical reflection on technology in non-classical philosophy did not affect the growing influence of technological determinism. It did not even form a subject field for analyzing the new threat posed by technology in the form of artificial intelligence and cybernetics. On the contrary, analytic philosophy in the 20th century concentrated on problems of mathematical logic and the philosophy of mind, which are important foundations for the creation of artificial intelligence.

The post-non-classical philosophy of technology has continued the theme of the integration of man with instrumental technological components and digital artifacts. D. Ihde proceeds from this position [6, p. 34]. The non-neutrality of technology is always concrete and is expressed through a response to an external challenge, the main parameter of which is effectiveness. The subject matter is the technological background of the expansion of the human body. Technical objects act as a way of supplementing and enriching corporeality in the life-world; an example is the vast toolkit for investigating and transforming reality created over the millennia. With the help of technical means it is possible to expand the boundaries of human sensibility and to go far beyond the limits of the individual's bounded body. The experience of the world gained through instruments differs greatly from the experience gained in the flesh. This transformation of experience is made possible by the property of instrumental transparency, which allows the world to be perceived in a new sensory range.

Technology becomes part of the person. At the moment the instrument is used, the person's perception is brought to the fore, which makes the experience unique and new. The transformation leads both to a certain expansion and intensification of experience and to its integral localization and reshaping.

Postphenomenology considers the transformation of human experience through the prism of technical operationality. For this reason the process of interaction between man and technology goes unnoticed, which in turn leads to an incorrect interpretation of its result. This gives rise to the idea of the neutrality of technology, which seems to exist in parallel with man and cannot claim any independence or activity of its own.

B. Latour, J. Law, M. Callon and T. Hughes attempted a broader interpretation of action, no longer as a traditional causal relationship. This notion of action is extended to all technical artifacts of culture and widens the boundaries of the perception of technical reality. The quasi-other arises in the process of human interaction with technical objects. The most obvious example of a forced transformation of interaction is the smartphone: to work with it, people need to know how to turn it on, launch the desired program and save information. The use of a mobile phone leads to a significant reorganization of the thinking process.

The relation to the quasi-other reveals the depth of the connection between the individual and the instrument that stands between the person and the environment. This relation manifests itself not only in interaction with complex technical devices but also with cutlery, door keys and water taps. To use even the most primitive technical objects in everyday practice, a person must be able to handle them. Learning to use cutlery and personal hygiene items, along with upright walking and the ability to speak, is an essential attribute of human civilization. The need to master technical means and then include them in individual phenomenological experience reveals the ontological relationship between man and technology. Frequent human interaction with technical objects not only transforms everyday practices; it also leads to the formation of specific connections, which are called hermeneutic.

Technology is not only a quasi-other for the individual. It also forms a hermeneutic connection between people and technical objects [7, p. 199]. In the process of being read, the device acquires the function of an intermediary between a person and the environment, and as a result of the interaction the vector of the person's further attitude toward the environment is set. Knowing what the weather will be like, a person decides what to wear. Technology organizes the environment through various transformational structures. The transformative nature of technology can be traced in the use of measuring instruments, which gather disparate information and focus a person's attention on certain parameters of the environment.

The world is perceived as a text mediated by technology and requiring decoding. People merge with technology into a single whole in order to gain the experience of the world mediated by it. According to D. Ihde, «the tool allows you to incarnate at a distance in order to get to the subject through the tool. But at the same time, it is a genuine extension of my sensuality. There is a sense of difference in experience itself, which is indicated by my preliminary distinction between 'in the flesh' experience and mediated experience» [8, p. 19]. As a result of this interaction a transformed perception of the world arises, which is subsequently modified into knowledge about the world and an individual way of existing in it.

Techno-technological transformation means the possibility of a certain expansion and enhancement of experience, as well as of its reduction or transformation. Thus the technical objects used are never neutral instruments. Their engagement, always taking concrete form through an implicitly involved body and a substantive user request, provides the conditions for something. Phenomenological experience is formed on the frontier between sensory perception mediated by a technical object and the world itself, where technology plays a mediating role. The variety of particular examples included in the contextual analysis of various technical artifacts marks the departure of the postphenomenology of technology from axiological assessments of the objects under study.

A particular artifact is comprehended not from the standpoint of good or evil but from the standpoint of the expansion of human experience. This levels the dichotomy of man and technology found in the classical philosophy of technology, revealing the technical part of the human self and of its life-world. The approach has proved very successful in the leading design laboratories of the USA, Western Europe and the Asia-Pacific countries, resulting in the design of intuitive, human-scaled and ergonomic equipment.

The digital space of communication and professional activity has taken technological shape faster than the institutional environment of this space with its corresponding normative component. In this vacuum, various forms of digital evil have become a reality, and a categorical apparatus for them has formed, containing such concepts as fake, phishing, deepfake and cyberbullying.

A fake is a modification of deception. Such deception has a technological form of implementation and responds to both social and individual demand. The fight against fakes involves methods and measures capable of protecting a certain living space from the invasion of fakes that threaten its existence. The modern media industry, whose representatives are struggling to develop principles for countering the production of fakes, finds itself in a paradoxical situation, a clash of positions. Fake news is a particular instance of the larger problem of the quality of information verification. In addition, objectivity constantly coexists with the production of illusions. Specific tools for the production of meanings become a source of content of such high quality that people are unable to recognize it as constructed. There is a whole series of precedents of underestimating the ability of digital algorithms to program, assemble and create reality. Mockumentary, as a way of creating an ironic, often mocking narrative aimed at an unsuspecting viewer, is an example of deliberate forgery. The best examples of mockumentary have been described as a kind of terrorist act against the audience. And how else should one evaluate films, television and radio programs in which any cultural norms, agreements and even heroes are viciously, surreptitiously and not always obviously ridiculed? Technologies do not merely construct an illusion; they create pictures or other objects that are more real and naturalistic than the consumer can imagine. The problem of distinguishing the real from the virtual remains relevant. The simplest tools for creating and editing digital images are being replaced by machine learning technologies whose results are changing the photography industry: any photograph easily acquires a look more suitable for a conventional work of art.

Modern children and teenagers live in a mixed reality that combines the different ethical systems existing online and offline, use the capabilities of artificial intelligence, and are aware of this. They not only consume content but also create it, and they themselves form the rules of communication. For example: you must not call strangers; you must not record voice messages if you can write instead; you must not use someone else's content without a link to the source; a photo with filters must be marked as filtered. Being in the digital environment is strongly associated with emotions: children worry, fight for justice and defend their point of view.

The Internet environment dictates a fairly strict policy of security, privacy and social support. At the same time, the online mechanisms designed to control and predict human thinking and behavior, including the rules of ethics, were not created by child psychologists. The digital world has been described by almost no one. To harmonize it, new laws and rules are needed that will set the boundaries of what is permissible. The generation of digital children is at risk: if the network is locked down, they will be left alone, without an understanding of real life, which has been lost in almost all areas: education, family, communication, friendship and even leisure.

New norms are not only emerging but are already being institutionalized. The peculiarity of digital ethics is that it must be woven into each new product and adapted for society and the younger generation. The emergence of bloggers, hackers and other representatives of the nascent industry calls for a set of ethical standards. The social sphere (family, education, military operations) is changing together with digital reality and its new ethics.

Large businesses are rapidly increasing the amount of data that users hand over in exchange for convenient services, and users often sign the personal data agreement automatically, allowing the owners of the service or application to use their data at their own discretion. Private companies try to personalize their digital products as much as possible, while themselves remaining in a situation of legislative uncertainty.

Another side of the issue in discussing the ethics of digital technologies concerns the transparency of decision-making by artificial intelligence systems, including their possible use in public services. When erroneous decisions are made, the most vulnerable social groups (residents of remote settlements, people with low digital literacy, pensioners) are the most likely to suffer, since for them it will be technically difficult to challenge a decision made by a robot. Ethical issues in the field of artificial intelligence have moved into the field of technical regulation and standardization. Technical standards are being developed that define the properties a decision-making system must have. Such systems cannot be a black box and must be able to explain why a decision was made in a particular way. The ethics of such systems will depend on what imperatives the developer embeds in them.
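
As a minimal illustration of what such a transparency requirement might mean in practice, the sketch below shows a small «glass box» model that can report the rule path behind each individual decision. It is written in Python with the scikit-learn library; the loan-eligibility scenario, the feature names and all the numbers are invented for illustration and are not taken from the article.

```python
# A minimal sketch, not a production system: a transparent ("glass box")
# decision model that can report the rules behind each of its decisions.
# The loan-eligibility features and data below are invented for illustration.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Toy training data: [monthly_income, outstanding_debt], label 1 = approve.
X = np.array([[3000, 200], [1200, 900], [4500, 100], [900, 1500],
              [2500, 300], [800, 700], [5000, 50], [1500, 1200]])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])
feature_names = ["monthly_income", "outstanding_debt"]

model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

def explain(sample):
    """Return the decision and the human-readable rule path that produced it."""
    node_ids = model.decision_path([sample]).indices  # nodes visited by this sample
    tree = model.tree_
    rules = []
    for node in node_ids:
        if tree.children_left[node] == tree.children_right[node]:
            continue  # leaf node: no test performed here
        name = feature_names[tree.feature[node]]
        threshold = tree.threshold[node]
        op = "<=" if sample[tree.feature[node]] <= threshold else ">"
        rules.append(f"{name} {op} {threshold:.1f}")
    decision = "approved" if model.predict([sample])[0] == 1 else "declined"
    return decision, rules

decision, rules = explain([1100, 950])
print(decision, "because:", "; ".join(rules))
```

The point of the sketch is only that an affected citizen could be given the concrete conditions that led to the outcome, rather than a bare verdict from an opaque system.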

People are concerned about omnipresence (the inability to hide from surveillance systems), the problem of identification (it is impossible to determine whether you are being observed at a given moment) and unpredictable behavior (the lack of warning about video recording and other data collection systems in public spaces). For young people the word «Internet» is not an operational term; they use the names of platforms instead.

Generation Z already lives in the virtual universe, with NFT technologies, personalized avatars, 3D avatars and Web3 available to it. Services and resources often abuse the fact that the user is on their site and try either to impose some service or product on them or to force them to spend more time on the service. These are dark patterns: manipulative techniques created through the design, interfaces and navigation of various applications so that the user performs the commercial action the site or digital resource needs, such as buying, spending more money, spending more time or handing over more data. The commercialization of influencers becomes a barrier because it lowers the level of trust in the influencers themselves. In the flow of information it is difficult to isolate information that has been verified. It is important to come to the understanding that people are responsible for every word in social networks, just as they are responsible for their words in physical life. But technological support is also needed. Developers are focusing on neural network technologies and computer vision in order to detect, anticipate and predict that this or that content is potentially malicious or toxic. It is important to create rules of operation, and these rules should be based on respect for the digital rights of users, among them the citizen's right to seek information freely. The goal is a comfortable digital environment. Knowledge of the ethical dilemmas that arise in connection with digitalization and of the emerging rules of ethics for digital technologies is especially important for public servants and managers.
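
The automated detection of potentially toxic content mentioned above is, at its core, a text-classification task. The toy sketch below shows the bare mechanics of such a filter in Python with scikit-learn; the training phrases, the labels and the 0.5 flagging threshold are invented for illustration, and production systems rely on large neural language models rather than this bag-of-words baseline.

```python
# A toy sketch of automated toxicity screening as a text-classification task.
# Real moderation systems use large neural models; this bag-of-words baseline
# only illustrates the pipeline: vectorize text -> train -> score new messages.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented training set: 1 = toxic, 0 = acceptable.
messages = [
    "you are worthless and everyone hates you",
    "nobody wants you here, get lost",
    "thanks for sharing, this was really helpful",
    "great point, I had not thought of it that way",
    "shut up, you idiot",
    "could you explain this step in more detail, please?",
]
labels = [1, 1, 0, 0, 1, 0]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(messages, labels)

for text in ["you idiot, get lost", "thanks, that explanation helped a lot"]:
    toxic_probability = classifier.predict_proba([text])[0][1]
    flag = "FLAG FOR REVIEW" if toxic_probability > 0.5 else "ok"
    print(f"{flag:15s} p(toxic)={toxic_probability:.2f}  {text}")
```

Even this crude baseline makes visible where the ethical imperatives enter: in the choice of training data, in the threshold for flagging, and in what happens to content once it is flagged.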

To obtain a positive effect, decision makers must be aware of the development of technologies and understand what economic and social consequences their application will cause. Attention to the ethical side will help make decisions more balanced and thereby avoid conflicts in society between stakeholders. Public administration and interaction with citizens are being digitized, and the effectiveness of such interaction depends on whether ethical risks have been taken into account. This is useful for those responsible for developing digital services, products and systems aimed at citizens as service recipients and consumers or as workers. Possible conflicts and risks associated with the ethical side of the use of technology can be prevented if attention is paid to them at the stage of designing a service or product.

Among the variety of digital technologies, digital data, artificial intelligence (AI) and the Internet of Things (IoT) are of particular interest.


About the authors

Alexander I. Loiko

Belarusian National Technical University

Author for correspondence.
Email: alexander.loiko@tut.by

Doctor of Philosophy, Professor, Head of the Department of Philosophical Teachings

Belarus, Minsk

References

  1. Kapp E. Grundlinien einer Philosophie der Technik. Zur Entstehungsgeschichte der Cultur aus neuen Gesichtspunkten. Braunschweig: George Westermann; 1877. (In German)
  2. Mitcham C. Thinking through technology: the path between engineering and philosophy. Chicago: University of Chicago Press; 1994.
  3. Heidegger M. Ontology – the hermeneutics of facticity (Studies in Continental Thought). Bloomington: Indiana University Press; 2008.
  4. Marcuse H. One-dimensional man: studies in the ideology of advanced industrial society. New York: Routledge; 1991.
  5. Fromm E. The anatomy of human destructiveness. New York: Henry Holt; 1992.
  6. Ihde D. Postphenomenology: essays in the postmodern context. Evanston: Northwestern University Press; 1993.
  7. Verbeek PP. What things do: philosophical reflections on technology, agency, and design. University Park: Pennsylvania State University Press; 2005.
  8. Ihde D. Technics and praxis. Dordrecht: D. Reidel Publishing Company; 1979.

Copyright (c) 2023 Loiko A.I.

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
