Civil Society and Monitoring the Social Impact of Technologies

Liliia Zemnukhova

Sociologist, Candidate of Sciences, Research Fellow at the Sociological Institute of the FCTAS RAS and the STS Center of European University at Saint Petersburg

Modern technologies reflect social tensions in society, enhance established structural inequalities, and reproduce cultural beliefs. Since the currently dominant technocratic approach to technological development implies a rational logic even regarding social impact, it is limited and needs to be reconsidered; additionally, new actors must be involved at different stages of production. A change of paradigm can be brought about by improved social understanding, which should become the basis for technological decision-making even before the closure of the black box, that is, before the technology is no longer subject to change.

The development process requires the involvement of participants with strong fields of expertise, especially when it comes to social development: the participation of local communities, socially oriented NGOs, and other civil representatives is necessary. At the same time, the joint participation of developers and public representatives sets a new range of problems and challenges. Who are these new actors? How do we control and monitor them, and make them accountable? What competencies will be required for this? What will define the borders of responsibility and serve as a guideline in estimating the actors' performance? Is it possible to achieve transparency, or a technology of transparency? Can technologies control technologies? This chapter is based on examples of technological trends such as privacy and ethics of technologies, AI-related development, and blockchain.

A problem area

In 2018, ethical issues of technological development, and of artificial intelligence in particular, officially became the concern of industrial players, and IT giants started to assemble internal commissions and ethics committees. Following Microsoft's lead, Google, SAP, and Facebook formed working groups on ethics. Ethics and data privacy became critical not only for companies and product users but also for government actors. It is symptomatic that in 2019 Google dissolved its committee1 after one of its members made disparaging remarks about LGBT people and migrants, making her membership on the ethics committee an ethical problem in itself. This was a sign of how technological development has become an arena for various political and ideological decision-making processes in which experts, ordinary employees, and the public can and want to participate.

Technologies cannot remain self-regulating systems because they affect too many areas of social life, and the social impact of their development depends on the degree to which social groups and communities are involved in technological production and distribution. This chapter focuses on the problem of civil society participation in technology development and the ways in which that development can be controlled. We will discuss three technological trends whose development is critically impacted not only by engineering participants: blockchain, privacy and ethics, and AI-based development. Social construction of technology (SCOT) was chosen as the theoretical and methodological background from the disciplinary field of science and technology studies (STS).

The ideas of SCOT are based on the notion that developing technology cannot be limited to engineering solutions, as the process is impacted by many other groups and participants who have no less influence on the process than the engineers themselves. In the research environment and academic literature, STS in general and SCOT in particular have evolved as an attempt to overcome the limited perspective of technological determinism, which continues to dominate in all technological directions, policies, and reports despite having been actively criticized since the 1970s.

SCOT marked a turnaround in technology and social research, allowing a critical approach to the process of development and distribution of technologies, as well as their use. The key idea was that science and technology should be considered as a process of joint participation of different social groups and their interests. In this context, civil society is becoming an active participant not only in the creation and modification of technologies but also in controlling their development. SCOT helps understand how technological decisions are made, how technological production is organized, and what actors and circumstances influence technological development. For example, the "definition of a situation"2 was proposed in order to understand the direction of behavioral changes in the context of structures, processes, groups, and individuals. Thus, SCOT provides explanatory resources for identifying and shaping the roles of such stakeholders3 as civil society.

Studies tend to focus on the role of production and consumption in the process of technological development, whereas the clarity of the boundaries between them is called into question. The technology creation process was previously considered as formation or production, as an attempt to suggest user practices.4 Users are given special attention because they are active agents of change5 in technology, its modification, and reconstruction. Users add unexpected practices6 and resist or even refuse technologies. The shift of attention towards users in the literature on the social research of technologies was designed to show how unpredictable and diverse they are; how they consume, modify, domesticate, project, reconfigure, and resist.7 Technological devices are designed through interactions, as they are seen by relevant groups. The status of relevant groups is revealed in user debates, advertising, and political messages that organize and form common views; established institutional niches are defined, and ties are routinized. In this context, civil society representatives become special users because of their active position, symbolic power, and potential political influence.

Technologies introduce changes on a range of scales, so they should always be considered in different contexts, such as infrastructure, practices, institutions, and culture. SCOT aims to avoid the extremes of sociological design and technological determinism.8 As such, it is necessary to observe the social consequences of sociotechnical changes through long-term observations of real practices, including production and use. In order not to separate the social and technical elements, it is necessary to unwind the construction event as a piece of history with its own forces, interests, and sociotechnical ensembles. Sociotechnical ensembles are created with early adopters, who provide quick feedback and cooperation with whom is the main form of interaction.

Background and context

State-of-the-art technologies act as an accurate mirror of what is happening in society. They emphasize tensions, highlight conflicts, reveal problem areas, increase fears, and even create new types of differences or inequalities. This is especially evident in AI and machine learning, the work of which is based on high volumes of publicly available data. If such data is associated with human behavior and interactions, the algorithms quickly capture and reproduce the most popular and common patterns.

Stigmatization, stereotypes, and profanity are among the first traits learned by algorithms with access to data on internet user behavior, or by applications such as voice assistants. Racism, sexism, and other types of discrimination are readily recognized and accepted as the norm. Social researchers define such patterns as bias in data: patterns are reproduced because they quantitatively dominate and are replicated almost without question. The "State of AI" report9 provides the following examples of such biases (for more, see the chapter in this volume by Gunay Kazimzade, "Technologies of diversity vs. technologies of discrimination: the case of AI-based systems"):

— the first page of results for a "CEO" query in an image search shows exclusively white men;

— the Google image recognition app labels Black people as gorillas;

— searches for names that sound African American are accompanied by ads for verifying criminal records;

— the YouTube voice-to-text function does not recognize women's voices;

— HP facial recognition cameras do not recognize Asian people;

— Amazon classifies LGBT literature into the 18+ category and removes sales ratings.

When empirical facts from a displaced "real life" turn into algorithmically confirmed facts, they change perceptions of norms and familiar beliefs. If such observations remain in the sphere of social interaction, they can be criticized, become a subject for discussion, and initiate a review of existing relationships and rules. The structural basis of such social assessments and categorizations often remains unconscious, but over time it can be brought into discourse and even reach legal levels. Algorithms play a dual role regarding such biases: on the one hand, they make them visible, but on the other, they reinforce them technologically, leaving their "normal" status as a matter of course. The objectives of civil society are to enhance the transparency of discussion, create a demand for social expertise, and express an active interest in access to technological changes and their monitoring or control.

Errors of data representation and imbalanced samples are the result of irresponsible algorithm development. This does not mean that developers deliberately make biased algorithms; rather, they rarely consider the social implications of algorithm design. It is essential that these implications be considered carefully, since algorithms can be something of a black box even for AI developers. It is no coincidence that a growing number of professional associations are developing recommendations focused on more responsible approaches to the development and "setting up" of the social parameters of technologies. For example, half of the recommendations in the AI Now Institute10 report are specifically focused on social and ethical aspects that are most often ignored by developers.

— When developing standards for database processing, it is necessary to understand the nature of biases and data errors.

— It is important to refrain from an overtly technical approach, as it oversimplifies the complexity of social systems.

— A major problem is that the diversity of social groups (women, ethnic minorities, etc.) is inadequately considered, so more comprehensive research is needed.

— When involving experts from spheres other than engineering, it is necessary to make sure that their opinion and expertise are given certain power in decision-making, particularly when it comes to long-term projects.

— There is a need for constant support of technological development regarding ethical principles.

The dominant technocratic approach to technology development needs to be reconsidered. It can only be executed with the help of extensive social knowledge, which should form the basis of technological decision-making before the black box closes.

The role of civil society

Civil society is perhaps the most important potential participant in technological development for several reasons.

First, the expertise of individual active citizens or nonprofit organizations frequently brings a balanced and critical assessment of what is currently happening in society. Civil society brings together diverse views and supports the development of balanced policy decisions.

Civil society actors are groups whose experience and opinions are essential for understanding the opportunities and limitations of specific technological solutions. Feedback from civil society representatives is characterized by their interest in social responsibility, risk prevention, and a thoughtful and balanced attitude towards technological development. NGOs introduce issues that necessarily complicate technological decisions and ensure they are not limited to the engineering community but are also widely discussed outside limited circles of experts.

Public discussions help technologies become more flexible and diverse, first at the interpretive level and later at the material level. Before final decisions are made, representatives of different social groups can make suggestions on how to enhance or adapt them, which makes feedback a meaningful constructivist argument regarding technology, its type, and even its function. Under market conditions, final users and representatives of expert circles can be such actors.

The communication process related to technology can be described as a mutual framework: engineers present their vision of the project, while external circumstances and specific participants transform it in accordance with their traditional practices and cultural beliefs.

Interactions between different actors and relevant groups result in the formation of technological frameworks that reflect the technological challenges faced by engineers and the social aspects introduced by relevant groups, especially NGOs.

When the main issues have been resolved and compromises made, technology stabilizes. Stabilization comes only after feedback is collected from users and groups at which this technology is directly or indirectly aimed. Since modern technologies require constant revision and, accordingly, responses to changes, users are forced to monitor these changes constantly.

As is already clear, when it comes to the development of AI, NGOs and activists contribute to its more even distribution. At the level of recommendations from global associations, the need to involve NGOs and vulnerable (socially disadvantaged) groups takes one of the most important places in AI design. The fact is that their unique expertise highlights aspects and biases that are not considered by developers at the design stage, which causes errors in the representativeness of the data underlying the models.

Discussions of the ethical issues of biases and technologies in general, as well as problems of privacy and security boundaries, are not yet a priority for developers or even the state. In certain cases, users openly express their discontent regarding technologies of personal concern to them (for example, social media). In other cases, ethical rules may be regulated at the level of international standards (for example, the GDPR11). However, concerns over the ethical aspects of technology development are in fact the responsibility of individual experts within companies or representatives of expert communities. At the same time, NGOs and other civil society actors take collective initiatives that, unfortunately, do not always impact the legalization of rights and freedoms.

Technologies to aid civil society


Certain technologies are designed to make social relations more equal, transparent, and direct (not mediated by institutional, corporate, or individual players). Examples of technologies of the future which can be aimed at bolstering civil society are shown below.

1. Blockchain. Blockchain technology development is based on an ideology that implies equal access, knowledge, competence, and infrastructural opportunities. Of course, blockchain itself does not change the logic or way of thinking about the nature of social relations, but rather reproduces or imports existing problems, limitations, and inequalities despite good intentions. However, opportunities may emerge, such as the creation of independent communities (e.g., confronting corporate monopolies and verifying transactions in order to combat fraud, or developing new economies and currencies as an alternative to centralized banking and currency systems). The risks of creating closed networks, of exchanging illegal resources, and of totalitarian surveillance systems are the downside of such freedoms.

2. Artificial intelligence. In addition to the previously identified shortcomings and biases, AI is also characterized by strong points that potentially contribute to civil society development. The widespread introduction of AI technology, which will be able to process large amounts of information quickly, will provide opportunities to control certain processes, such as policy decision-making or tracking common social patterns. AI will allow non-systematic but potentially productive growth points to be identified and monitored.

3. Brain-computer interfaces. On the one hand, neurointerfaces can become a powerful tool for providing equal opportunities to vulnerable population groups (for example, helping people with disabilities to compensate for skills) and can be an effective tool in the medical and educational spheres, including complex skills training. On the other hand, the technology has the potential to violate the right to mental privacy, can be used as an instrument of intervention for commercial and civil purposes, and can result in increased vulnerability.

4. Affordable satellite internet. The democratic nature of the technology brings information resources to remote regions usually excluded from the main trends of civil society development. However, from a material or infrastructural point of view, the high cost and cumbersomeness of production and operation require additional resources to install and maintain such networks, and the economic burden can fall on the local population, making it even more vulnerable.

These examples are intended to demonstrate the ambiguity and complexity of individual technologies with respect to the development of civil society. Civil society representatives can show the range of these limitations and complexities in real-life situations and cultural contexts. The full development of these technologies is impossible without an ecosystem and environment that considers multiple barriers, as well as the active participation of civil society representatives.

Weak signals

In the design process of new technology systems, people in social and cultural contexts are becoming the major focus alongside users. Human interfaces, humanitarian technologies, community management, and work with user groups reflect the general trend of development, which is the need to improve awareness of technologies through a shift towards sociotechnical interaction. Technological solutions have long ceased to be independent and require the prefix "socio-", which puts them in the real context of daily life. The following questions arise: who will have access to control of data and technology? Who will be able to track errors and negative social effects? How can the growth of existing inequalities and social vulnerabilities be prevented?

Human relations and life situations continue to be the most marginal element in all possible scenarios of the future, although there is always the potential of limited social control (in the form of states or corporations) and enshrined systems of power and relations. Whether we are talking about artificial intelligence as a routine liberator, blockchain as a platform for trust, or technology ethics as a major judge, none of the modern developments can solve social problems or change social structures independently; the involvement of expertise and horizontal mechanisms of control is also required.

Experts in the fields of social interactions and human relations who can bring together strict technological models and various life experiences are likely to be increasingly in demand in the future.12 At the same time, areas of expertise which involve close work with vulnerable groups, non-profit organizations, and individual civil activists are under-represented. The technological vision of the future does not suppose the involvement of potential channels of vertical communication or self-sufficient mechanisms of public discussion of the decisions being taken. Therefore, there is a strong sense that society, and its integration into the ideal picture in which civil society is given the special role of a stakeholder and controller, is commonly ignored in the process of technology planning.

Potential models of the future

Desirable future

Civil society and its representatives must become active and equal participants in technological development and in discussions of decisions before they are made. The ideology of the active participation of civil society has two goals. First, it provides in-depth information and training to citizens, focused on how decision-making processes work and what opportunities exist for ordinary citizens, rather than on a lack of understanding of how specific technologies are designed. Secondly, citizen involvement contributes to participation in local initiatives, which supposes certain interests in, and responsibility for, the decisions under discussion. Collective discussions are always difficult to organize and control, but they are necessary at least for collecting feedback about what developers do not take into account or which social groups become vulnerable or unfairly excluded from sociotechnical relations.

There is also a third goal of participatory interaction: civil control, which has an impact on officials and developers in terms of their level of responsibility. Participation generally involves more transparent procedures, with clear mechanisms of interaction between different layers and structures of the same society.

Social responsibility of businesses and civic responsibility of officials are a necessary minimum of the desirable future. Access to information, feedback mechanisms, local initiatives, transparency of procedures, and other activities in the framework of the technological decision-making process will make the shared future a common goal and aspiration. For example, corporations developing AI should create communication and feedback channels, which would help users report social effects or errors.

Blockchain technologies should not be centralized by state actors but should rather provide alternative opportunities to groups facing structural and institutional constraints. Technology ethics should become an open subject for public discussion, and its results should be considered when making subsequent technological decisions. However, the reality so far is rather different.

Undesirable future

In academic literature and the media, only two scenarios of a possible future are played out: either everything will be good, or everything will be bad.13 The first is techno-optimistic, with a dominant belief that technologies can solve all of humankind's problems. It is imperfect because it assumes society tends towards basic needs, with the same personalities, human relations, and standard predictive algorithms of daily routine; no place is left for complex situations and failures. In such an optimistic future, humankind and social relations are the most "wrong" aspects from the perspective of predictable behavior. To make the world better, it should be enough to digitize all spheres of life to the greatest possible extent, so that problems can be solved automatically, while people receive an unconditional basic income allowing them to engage in creative activities.

The second scenario is techno-pessimistic: it suggests halting technological development altogether in order to avoid all potential problems, threats, and difficulties that arise in the process. In other words, rather than understanding them, the focus is on limiting the choice to decide not to create or multiply technological developments. This scenario is rooted in concerns about the consequences of war and man-made disasters; in reality it is highly utopian because too many different actors are interested in technological development.

Both scenarios, of course, have limitations in our understanding of the real future. However, they help identify major problems and constraints in how the technological future is modeled or conceived by its creators. A major problem here is that the future is reduced to models ignoring all social relations other than the simplest, most primitive, frequently evaluative ones, such as good vs. bad, war vs. peace, progress vs. regression. These questions do not and cannot have easy answers, because all social processes and relations are multiple, multidimensional, and dynamic.

Warnings

Playing out scenarios of the (un)desirable future quickly reveals their faults when they face reality and the active participation of users or citizens. The problems that arise are related to social inequality, ethics and morality, technophobia, the incompetence of developers or officials, and a number of other causes that are better characterized as systemic and structural conditions and cultural beliefs.

The fact is that modern technologies reflect existing tensions in society, in which groups maintain their dominant positions on various grounds: for example, officials on the ground of power, developers on the ground of professional status, men on the ground of gender. If these positions are not discussed publicly and there is no explicit agenda of changing the status quo, all technological solutions continue to reproduce and bolster established structural inequalities which lack accessible channels of interaction and effective means of feedback. The currently dominant technocratic approach to technology development must be reviewed, and new actors need to get involved at different production stages. A shift in the development paradigm can only come about through the diversification of social knowledge, which should form the basis of technological decision-making even before the black box is closed. In a scenario of blocked civil society participation, we would face the oppression of vulnerable groups and an absence of transparent communication and discussion. The worst-case scenarios involve centralized totalitarian control by the state through technologies (for example, face identification); monopolized corporate control over data and an inability to influence social effects (for example, an increase in social inequality and the exclusion of certain social groups); or the preferential recruitment of powerful stakeholders and lobbyists who create exclusively economic or market-driven mechanisms transforming users into exclusive consumers (for example, using personal assistants or neurointerfaces for market purposes only). Such scenarios do not involve public control, as, for example, in Russia's new national strategy for the development of artificial intelligence,14 which mentions improving the quality of life across the population but does not consider the participation or expertise of social scientists and civil society. It is fair to say that the strategy prevents biased decisions made by algorithms, highlights the value of protecting human rights, freedoms, and transparency, and emphasizes the need to develop ethical rules for human interaction with AI (first of all, in a legal way). However, civil society is not mentioned in any way.

Wild cards

In a situation where overcoming the technocratic view of production becomes a shared agenda at all levels of the social structure, it is not difficult to assume that jokers, or wild cards, will emerge which are likely to affect the development of events. An example of a dangerous trend is digital totalitarianism under the veil of state security, if the boundaries between citizen control and citizen participation become blurred.

One such situation could be a collective lobby of technocratic power structures and economic elites from technological circles, which would jointly exclude citizens from participating in the development of a common agenda. They would likely follow the "state security" path at the expense of privacy issues. The wild card in this trend may be an information war, or conditions created by public services to restrict communication, including physical and infrastructural restrictions.

Isolation and "sovereignty" could become an important warning or even a threat to the development of civil society, which would have to survive under total information control and communication isolation. This is the yellow card, defined by the use of information technologies not for the benefit of citizens but for the benefit of the state.

In these circumstances, a green card would consist of civic responses to external restrictions, suggesting alternative, perhaps non-digital ways of interacting and fighting for the right to cross-border interaction, or posing questions about greater citizen independence and participation in discussions of the decisions being made. Nationwide mobilization could potentially change the situation, but it requires new standards (which do not exist yet), such as civic participation. The reactive participation of experts in the field of social and humanitarian research of technology and of the interactions of science, technology, and society could unlock further creative potential. Reasonable monitoring and control mechanisms performed by civil society can also become jokers in the system of technology manufacturers.

Likely future

It is clear that developers, businesspeople, officials, and other interested participants in technological development will not independently build an ideal future, nor will they project working models or foresee all possible consequences. Different social groups and civil society representatives, such as activists, NGOs, minorities, and vulnerable groups, add unknown elements to this picture. The more different points of view and areas of expertise there are, the greater the likelihood of combined efforts to think through the design of a balanced and harmonious future. These are the characteristics of techno-realism, the third potential perspective on the future, in which experts are not just social or humanitarian researchers but also representatives of civil society, with their unique life experience and analytical, even critical, perspective. The key mechanism would be cooperation between these participants: for example, researchers with an intellectual agenda, representatives of civil society with a social agenda, and officials and developers with the resources and standing to influence decision-making.


Remaining unknowns

A lack of transparency in policy decisions about technologies is the main barrier to understanding how technological development works in different countries. There are many conflicting reasons and interests, and the winners are those who have been able to gain more quantitatively and qualitatively convincing supporters. Once decisions are made, they are almost impossible to reverse, especially in the absence of clear feedback and communication mechanisms. Deep systemic crises in public administration only aggravate the closed nature of discussions of technological solutions, even though such decisions have a direct bearing on how technologies will impact society. Prioritizing work with civil society could change the situation and put comprehensive solutions to social problems using new technologies on the agenda. The issue lies with policymakers at every level: individual engineers, large companies, short-sighted or incompetent officials, and even inactive citizens. Technologies are a reflection of existing complexities, with unclear feedback, as if they were working in a one-way, technologically deterministic order. However, current trends show that such an approach would quickly lead to a deadlock which would be impossible to correct simply by "rolling back the system" or "regression testing." Flexible methodologies should also be fully implemented in design itself, through flexible and open discussion of production and distribution.


Conclusions

How do we involve users or citizens in developing technological solutions that can be described as transparent, responsible, and humane? The concept of social technology design calls for active participation of non-engineers at all stages of production, for open public discussions, and for rapid feedback. Companies can implement these recommendations in various ways: by involving early user groups, by engaging social researchers to develop expert opinions, and by organizing public demonstrations in accordance with the logic of social responsibility. However, much will remain closed under the pretext of NDAs or other formal grounds for non-disclosure. The state, in its turn, will also make the decisions it considers best from its own perspective, aimed at achieving its own goals, for example tightening state security. Some countries, such as Japan and Sweden, treat technological problems as a priority and make them the subject of public discussion and debate. This, in turn, compels other players - companies and civil society - to participate in these discussions. Such political experiments only begin to work effectively some 15-20 years after their implementation.

There are no universal algorithms for sociotechnical development, but there are recommendations from experienced states that have, by trial and error, achieved changes in the most rigid structural elements. This does not mean that they have solved all their social problems, but they have made technology policies more socially oriented and forced companies to play by the same rules. Civil society in developed countries can become the principal controller of the technological agenda.

doi: 10.24411/9999-

Endnotes

1 D'Onfro J. (2019). Google Scraps Its AI Ethics Board Less Than Two Weeks After Launch in the Wake of Employee Protest // Forbes.ru, April 8. URL: https://www.forbes.ru/tehnologii/374499-google-raspustil-sovet-po-voprosam-etiki-v-oblasti-iskusstvennogo-intellekta-posle (retrieval date 26.02.2020).

2 Bowker G., Star S.L. (1999). Sorting Things Out: Classification and Its Consequences. Cambridge: MIT Press.

3 Stakeholder - an interested or involved party, a project team member, or a role in a project: a person or an organization that has rights, a share, requirements, or an interest in the system or its parameters, whose requirements and expectations the system caters to.

4 Woolgar S. (1991). Configuring the User: The Case of Usability Trials // Law J. (Ed.), A Sociology of Monsters: Essays on Power, Technology and Domination. London: Routledge, P. 58-102.

5 Kline R., Pinch T. (1996). Users as agents of technological change: The social construction of the automobile in the rural United States // Technology and Culture, vol. 37, P. 763-795.

6 Suchman L. (1987). Plans and Situated Actions: The Problem of Human-Machine Communication. Cambridge: Cambridge University Press.

7 Oudshoorn N., Pinch T. (Eds.) (2003). How Users Matter: The Co-Construction of Users and Technology. Cambridge, Mass.: MIT Press.

8 Pinch T., Bijker W. (1984). The social construction of facts and artefacts: or how the sociology of science and the sociology of technology might benefit each other // Social Studies of Science, vol. 14, P. 399-441.

9 Benaich N., Hogarth I. (2019). State of AI Report 2019. URL: https://www.stateof.ai/ (retrieval date 26.02.2020).

10 Crawford K., Dobbe R., Dryer T., Fried G., Green B., Kaziunas E., Kak A., Mathur V., McElroy E., Sánchez A., Raji D., Rankin J., Richardson R., Schultz J., West S., Whittaker M. (2019). AI Now 2019 Report. New York: AI Now Institute. URL: https://ainowinstitute.org/AI_Now_2019_Report.html (retrieval date 26.02.2020).

11 General Data Protection Regulation - the European Union regulation on enforcing personal data protection. URL: https://gdpr-info.eu/.

12 "Catalog of professions. Atlas of new professions." URL: http://atlas100.ru/catalog/.

13 Bychkova O., Zemnukhova L., Rudenko N. (2018). Digital hell, technological paradise or something completely different: what technologies determine the future of mankind // Online magazine "KNIFE". URL: https://knife.media/tech-changes/ (retrieval date 26.02.2020).

14 The Decree of the President of the Russian Federation No. 490 dated 10.10.2019 "On the Development of Artificial Intelligence in the Russian Federation". URL: http://publication.pravo.gov.ru/Document/View/0001201910110003.
