6. Design Strategies


Objective

To develop strategies for dealing with value change in sociotechnical systems that improve on current value sensitive design approaches (project objective 5)

How Can We Design for Value Change?

Different types of strategies are possible for dealing with value change in sociotechnical systems. Anticipatory strategies try to predict or anticipate value changes. Most current approaches do not attempt to predict value changes but rather develop, for example, technomoral scenarios that sketch different possible futures. These scenarios can then serve as a basis for deliberation, and sociotechnical systems may be designed so that they can deal with certain anticipated value changes, or in an attempt to steer value change in a certain direction.

However, not all value changes can be predicted or even anticipated. Not only may values change in ways other than anticipated; how we will perceive and morally evaluate value changes may also be hard to anticipate beforehand. This means that anticipatory strategies alone are not enough to deal with value change. Unanticipated value change may partly be dealt with through adaptive strategies. Possible adaptive strategies include redesigning some of the technical parts of the sociotechnical system, redesigning some of the relevant institutions, adapting the operation of the sociotechnical system (operational strategies), or changing the behavior of the actors in the system (behavioral strategies).

However, it is often better to be prepared for adaptation beforehand, even if the specific value changes to be expected cannot be anticipated. This may be done by employing a range of proactive design strategies, technical as well as institutional. One might think of design strategies or principles like robustness, flexibility, adaptability, modularization, and the building in of artificial intelligence.

Proactive design strategies may employ the distinction between the intended, embedded and realized values of sociotechnical systems. One may, for example, choose to embed values more loosely into sociotechnical systems so that there is more room to realize values that deviate from the embedded ones. Alternatively, one may also choose to embed values into the system that are currently less relevant but are likely to become more important in the future; and this may be done in such a way that there is room to change the realized values later.

Similarly, the distinction between values embedded in technology and values embedded in institutions may be employed to better deal with future value change. While from the viewpoint of short-term efficacy it may be undesirable to have tensions between the technologically and institutionally embedded values, in the longer run such tensions may not always be undesirable in the light of potential value change.

As these examples already show, employing proactive design strategies to deal with future value change may bring trade-offs, or even dilemmas, for example between costs and short-term efficacy on the one hand and the ability to effectively deal with future value change on the other. These trade-offs and dilemmas will be investigated in research line 6. The ultimate aim and result of this research line will be a new approach for dealing with value change that can be integrated into Value Sensitive Design or Design for Values; such an approach will consist of possible design strategies (technical as well as institutional), but also of insights into the mentioned trade-offs and dilemmas.

 

Researchers

Anna Melnyk, M.Sc.

PhD Candidate

a.melnyk@tudelft.nl
Dr. Amineh Ghorbani

Assistant Professor

a.ghorbani@tudelft.nl


Dr.ir. Tristan de Wildt

Postdoctoral researcher

T.E.dewildt@tudelft.nl


Related events

There are no events.

Related Publications

2023

van der Weij, F.; Steinert, S.; van de Poel, I.; Alleblas, J.; Melnyk, A.; de Wildt, T.

Value Change and Technological Design Journal Article

In: IEEE Technology and Society Magazine, vol. 42, no. 3, pp. 25-32, 2023, ISSN: 1937-416X.

Links | BibTeX

2022

de Wildt, Tristan Emile; Schweizer, Vanessa Jine

Exploring value change Journal Article

In: Prometheus, vol. 38, no. 1, 2022.

Links | BibTeX

de Wildt, Tristan E.; van de Poel, Ibo R.; Chappin, Emile J. L.

Tracing Long-term Value Change in (Energy) Technologies: Opportunities of Probabilistic Topic Models Using Large Data Sets Journal Article

In: Science, Technology, & Human Values, vol. 47, no. 3, pp. 429-458, 2022.

Abstract | Links | BibTeX

2020

de Reuver, Mark; van Wynsberghe, Aimee; Janssen, Marijn; van de Poel, Ibo

Digital platforms and responsible innovation: expanding value sensitive design to overcome ontological uncertainty Journal Article

In: Ethics and Information Technology, 2020, ISSN: 1572-8439.

Abstract | Links | BibTeX

2018

van de Poel, Ibo

Design for value change Journal Article

In: Ethics and Information Technology, 2018, ISSN: 1572-8439.

Abstract | Links | BibTeX

5. Artificial Intelligence


Objective

To apply the developing theory of value change to empirical cases in the realm of robot systems and artificial intelligence (project objective 4); to study potential proactive strategies for better dealing with value change in the domain of robot systems and artificial intelligence (project objective 5).

How to Deal with Value Change in AI?

In research line 5, we will investigate value change in robot systems. One more specific question is under what conditions it is morally acceptable to design sociotechnical systems that can autonomously (i.e. without human intervention) adapt to value change through artificial intelligence. This research line combines insights and approaches from robot ethics, which focuses on the value-sensitive design of computer systems and robotics, with machine ethics, an area that focuses on the design of artificial moral intelligence. This combination is innovative.

Artificial agents are computer and robot systems that are autonomous, interactive and adaptive. This makes it possible, at least in principle, to design artificial agents that autonomously act on certain values and can adapt them on the basis of interactions with their environment. James Moor distinguishes four ways in which artificial agents may be or become moral agents:

  1. Ethical impact agents are robots and computer systems that ethically impact their environment; this is probably true of (all) robots.
  2. Implicit ethical agents are robots and programs that have been programmed (by humans) to act according to certain values (for example by employing value sensitive design).
  3. Explicit ethical agents are machines that can represent ethical categories and that can reason (in machine language) about these.
  4. Full ethical agents in addition possess some characteristics we often consider crucial for human agency, like consciousness, free will and intentionality.

It may never be possible to design robots as full ethical agents (and if it were to become possible, it is questionable whether doing so would be morally desirable). However, explicit ethical agents may be good enough to build the capacity to adapt to new values into robot systems. Today there are hardly any successful examples of explicit ethical artificial agents. As Wallach and Allen point out, the main problem might not be to design artificial agents that can act autonomously (and that can adapt themselves in interaction with the environment), but rather to build enough, and the right kind of, ethical sensitivity into such machines.

Experience with adaptable algorithms suggests that adaptability is not only an asset but also a potential risk, in particular if it is opaque how robot systems learn and adapt their own working, or if they do so in a way that seems undesirable, for example because it is done on the basis of too limited or biased ‘experiences’ [103]. This raises the question of whether it is morally desirable to build the ability to autonomously adapt to value change into sociotechnical systems through artificial intelligence.

This question will be approached in this research line by looking for possible meta-values that any acceptable artificial agent or robot system should meet. The idea is that meta-values are built into the artificial agent in such a way that they are immutable or can only be changed by humans, while other values can be autonomously adapted by the artificial agent itself. Possible candidates for such meta-values are transparency, accountability, ‘meaningful human control’ and reversibility. Such a set of meta-values would make it possible for humans to monitor when the artificial agent has changed its values (due to transparency), to understand why it did so (due to accountability), and to turn back this adaptation if necessary (due to reversibility and meaningful human control).
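To make this idea more concrete, the sketch below illustrates, in Python, one possible way to separate fixed meta-values from values that the agent may adapt. It is an illustrative assumption on our part, not a design from the project: the names (ValueProfile, adapt, rollback) and the weighted-value representation are hypothetical.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# Illustrative sketch only (not the project's design): meta-values are fixed,
# ordinary value weights can be adapted by the agent itself, every adaptation
# is logged (transparency, accountability) and can be rolled back by a human
# operator (reversibility, meaningful human control).

META_VALUES = ("transparency", "accountability",
               "meaningful human control", "reversibility")

@dataclass
class ValueProfile:
    weights: Dict[str, float]  # adaptable value weights, e.g. {"safety": 0.6}
    history: List[Tuple[Dict[str, float], str]] = field(default_factory=list)

    def adapt(self, new_weights: Dict[str, float], rationale: str) -> None:
        """Autonomous adaptation of ordinary values; meta-values stay fixed."""
        if any(name in META_VALUES for name in new_weights):
            raise PermissionError("meta-values can only be changed by humans")
        self.history.append((dict(self.weights), rationale))  # audit trail
        self.weights.update(new_weights)

    def rollback(self, approved_by_human: bool) -> None:
        """Undo the most recent adaptation, but only under human control."""
        if not approved_by_human:
            raise PermissionError("rollback requires human approval")
        if self.history:
            self.weights, _ = self.history.pop()

profile = ValueProfile(weights={"occupant safety": 0.6, "pedestrian safety": 0.4})
profile.adapt({"pedestrian safety": 0.5}, rationale="frequent near-misses with cyclists")
profile.rollback(approved_by_human=True)  # a human can always undo the adaptation

The point of the sketch is that every autonomous adaptation leaves an inspectable trace, while changes to the meta-values themselves, and any rollback, remain reserved for humans.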

Two case studies will be carried out. The first will focus on self-driving cars and the use of artificial intelligence in transportation systems. These technological innovations may make the transportation system safer and more sustainable, but they have also led to ethical debates about how self-driving cars should be programmed to behave in case of an accident and whether they should be programmed to make ethical choices in such cases. These debates will be interpreted in terms of the question to what extent the capacity to (autonomously) apply, weigh and adapt values should be programmed into self-driving cars, or whether these capacities should remain under human control, be it that of the designers, the users (drivers) or the operators of the system.

The second case study will focus on socially adaptive electronic partners (SAEPs). These are artificial agents or systems, like smart home appliances that support humans, into which certain values and norms are built. These values and norms are adaptive, so that SAEPs can adjust their behavior to the context. A crucial question in the development of SAEPs is who should be allowed to adapt the values on which their functioning is based, and whether under certain circumstances, or for certain values, the artificial agent itself should also be able to change its values.

Researchers

Tom Coggins, M.Sc.

PhD Candidate

t.n.coggins@tudelft.nl
Dr. Aimee Robbins-van Wynsberghe

Assistant Professor

a.l.robbins-vanwynsberghe@tudelft.nl


Dr. Olya Kudina

Postdoctoral Researcher

olya_kudina@yahoo.com


Dr. Michael Klenk

Postdoctoral researcher

M.B.O.T.Klenk@tudelft.nl


Related events

Workshop ‘Machines of Change: Robots, AI and Value Change’

Start date: February 1, 2022
End date: February 3, 2022
Location: Delft, the Netherlands
Artificial Intelligence | Project workshop

During the Machines of Change: Robots, AI and Value Change workshop, we will explore how the deployment of Artificial Intelligence and robots leads to value change and how we can study value change, as a phenomenon, via these technologies. The workshop will center on three themes:
1) How do AI and/or robotics contribute to value change?
2) How can we study value change via AI and/or robotics?
3) How should AI and/or robotics deal with value change?

Related Publications

2023

Coggins, Tom N.; Steinert, Steffen

The seven troubles with norm-compliant robots Journal Article

In: Ethics and Information Technology, vol. 25, no. 2, pp. 29, 2023, ISSN: 1572-8439.

Abstract | Links | BibTeX

2022

Coggins, Tom N.

More work for Roomba? Domestic robots, housework and the production of privacy Journal Article

In: Prometheus, vol. 38, no. 1, 2022.

Links | BibTeX

2021

Umbrello, Steven; van de Poel, Ibo

Mapping value sensitive design onto AI for social good principles Journal Article

In: AI and Ethics, 2021, ISSN: 2730-5961.

Abstract | Links | BibTeX

2020

van de Poel, Ibo

Embedding Values in Artificial Intelligence (AI) Systems Journal Article

In: Minds and Machines, 2020.

Abstract | Links | BibTeX

Muishout, Chantal E; Coggins, Tom N; Schipper, Roel H

More Than Meets the Eye? Robotisation and Normativity in the Dutch Construction Industry Proceeding

Springer, 2020, ISBN: 978-3-030-49915-0, (Accepted Author Manuscript; Digital Concrete 2020 - 2nd RILEM International Conference on Concrete and Digital Fabrication ; Conference date: 06-07-2020 Through 08-07-2020).

Abstract | Links | BibTeX

Hayes, Paul; van de Poel, Ibo; Steen, Marc

Algorithms and Values in Justice and Security Journal Article

In: AI&Society: the journal of human-centered systems and machine intelligence, vol. 35, no. 3, pp. 533–555, 2020, ISSN: 0951-5666.

Abstract | Links | BibTeX

van de Poel, Ibo

Three philosophical perspectives on the relation between technology and society, and how they affect the current debate about artificial intelligence Journal Article

In: Human Affairs, vol. 30, no. 4, pp. 499, 2020, ISSN: 1210-3055.

Abstract | Links | BibTeX

4. Energy Systems


Objective

To apply the developing theory of value change to empirical cases in the realm of energy systems (project objective 4); to study potential design strategies for better dealing with value change in the domain of energy systems (project objective 5).

How to Deal with Value Change in Energy Systems?

Energy systems are sociotechnical systems; they are typically not designed by a single designer and they evolve over time. Still, they are shaped by values and sociotechnical visions. Many of our current energy systems were designed at a time when sustainability was a less important value than it is today. In many countries, the need for an energy transition to more sustainable energy systems is felt. The energy transition is obviously a technical and economic transition, but it also requires changes in institutions and values.

Various kinds of energy systems have been studied from the perspective of value sensitive design, including offshore energy parks, smart grids, nuclear energy, shale gas and biofuels. These and other studies have revealed a large range of values that play a role, or should play a role, in the design of energy systems, including energy efficiency; sustainability and other environmental values; security and reliability; social justice and fairness; autonomy and power; safety; privacy; aesthetics and landscape embedding. Although these studies suggest that values change over time, the topic of value change, and how to address it in the technical and institutional design of energy systems, has not yet been systematically addressed.

Addressing value change is particularly important for energy systems because these systems have large technological and institutional momentum, while they are often socially contested. The technological and institutional momentum implies that these systems are often hard to change: technical infrastructures are usually built to last for decades, and institutional rules cannot be changed overnight. This makes it more difficult and costly to deal with value change.

At the same time, the socially contested character of many energy technologies not only makes it crucial to properly address values for the ethical acceptability and social acceptance of these systems, but also makes it likely that new values will emerge in public debates about energy technologies. So while value change may be endemic in energy systems, these systems at the same time have characteristics that make it harder to deal with such value change.

Researchers

Joost Alleblas, M.Sc.

PhD Candidate

Dr. Behnam Taebi

Associate Professor

B.taebi@tudelft.nl


Anna Melnyk, M.Sc.

PhD Candidate

a.melnyk@tudelft.nl
Dr.ir. Tristan de Wildt

Postdoctoral researcher

T.E.dewildt@tudelft.nl


Related events

Workshop ‘Energy Systems and Changing Values’

Start date: October 15, 2020
End date: October 17, 2020
Location: Delft University of Technology
Energy Systems | Project workshop

These three days brought together a wide range of scholars working on values in relation to energy systems. In total, 17 proposals prepared by 31 scholars were selected on the basis of abstracts for papers connected to the workshop’s overall theme: Changing Values and Energy Systems.

Related Publications

2022

Melnyk, Anna

An Interpretation of Value Change: A Philosophical Disquisition of Climate Change and Energy Transition Debate Journal Article

In: Science, Technology, & Human Values, vol. 47, no. 3, pp. 404-428, 2022.

Abstract | Links | BibTeX

van de Poel, Ibo; Taebi, Behnam

Value Change in Energy Systems Journal Article

In: Science, Technology, & Human Values, vol. 47, no. 3, pp. 371-379, 2022.

Abstract | Links | BibTeX

2021

de Wildt, Tristan E; Boijmans, Anne R; Chappin, Emile J L; Herder, Paulien M

An ex ante assessment of value conflicts and social acceptance of sustainable heating systems: An agent-based modelling approach Journal Article

In: Energy Policy, vol. 153, 2021, ISSN: 0301-4215.

Abstract | Links | BibTeX

Melnyk, A; Singh, A

Constructing an inclusive vision of sustainable transition to decentralised energy: Local practices, knowledge, values and narratives in the case of community-managed grids in rural India Book Chapter

In: Kumar, Ankit; Höffken, Johanna H; Pols, Auke (Ed.): Dilemmas of Energy Transitions in the Global South, pp. 39–54, Routledge - Taylor & Francis Group, United Kingdom, 1st Edition, 2021, ISBN: 978-1-032-01546-0.

Abstract | Links | BibTeX

2020

de Wildt, Tristan; Chappin, Emile; van de Kaa, Geerten; Herder, Paulien; van de Poel, Ibo

Conflicted by decarbonisation: Five types of conflict at the nexus of capabilities and decentralised energy systems identified with an agent-based model Journal Article

In: Energy Research & Social Science, vol. 64, pp. 101451, 2020, ISSN: 2214-6296.

Abstract | Links | BibTeX

van de Poel, Ibo; Taebi, Behnam; de Wildt, Tristan

Accounting for Values in the Development and Design of New Nuclear Reactors Journal Article

In: The Bridge, vol. 50, no. 3, pp. 59-65, 2020.

Links | BibTeX

3. Embedded Values


Objective

To extend philosophical analyses of the embedding of values in technical artifacts to sociotechnical systems (project objective 3)

How Are Values Embedded in Technology?

Various authors have proposed accounts of how technology may embody values. We will build on the characterization of Van de Poel and Kroes, according to which a technical artifact x embodies a value G “if the designed properties of x have the potential to achieve or contribute to G (under appropriate circumstances) due to the fact that x has been designed for G”. They further distinguish between the intended, the embedded and the realized values of a technical artifact (see figure below). The intended values are the values intended by the designers of the artifact, the embedded values are the values actually embodied in the artifact as designed, and the realized values are the values realized when the artifact is used. The realized values may differ from the embedded values, for example because a technology is used differently than intended.

[Figure: the intended, embedded and realized values of a technical artifact]

This account needs to be refined and extended in order to apply it to sociotechnical systems. Sociotechnical systems may be defined as systems that depend for their proper functioning not only on technical hardware but also on human behavior and social institutions. Sociotechnical systems are usually not designed from scratch but evolve over time and, in so far as they are designed, there is usually not one designer. This requires a redefinition of the notion of intended value and may also require developing a notion of embedded value that does not depend on intentionally designed properties of a sociotechnical system.

Moreover, we need to account for the fact that in sociotechnical systems, values may also be embedded in institutions. Institutions will be understood as rule-sets; rules can be formal (like legal rules or operational instructions) as well as informal. We will use the ADICO grammar developed by Crawford and Ostrom as a basis for analyzing institutional rules (a minimal illustrative sketch of an ADICO rule as a data structure follows the list below). This grammar analyzes institutions in terms of:

  • A: attributes, i.e. to whom a particular institution applies;
  • D: deontic operator, i.e. either permission (may), obligation (must), or prohibition (must not);
  • I: aim, i.e. the actions or results to which the deontic operator applies;
  • C: conditions, i.e. when, where, how and to what extent the deontic operator applies;
  • O: or else, i.e. the sanctions for not observing the institution.
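To illustrate how an ADICO statement can be treated as a simple data structure, consider the Python sketch below. The class AdicoRule, its field names and the example rule are hypothetical choices of ours, used only for illustration; they are not part of the Crawford and Ostrom grammar or of the project.

from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Deontic(Enum):
    MAY = "permission"        # D: permission
    MUST = "obligation"       # D: obligation
    MUST_NOT = "prohibition"  # D: prohibition

@dataclass
class AdicoRule:
    """One institutional statement in ADICO form."""
    attributes: str                # A: to whom the institution applies
    deontic: Deontic               # D: may / must / must not
    aim: str                       # I: action or result the deontic applies to
    conditions: str                # C: when, where, how, to what extent it applies
    or_else: Optional[str] = None  # O: sanction; None for a sanction-less norm

# Hypothetical example of a formal rule in an energy system:
rule = AdicoRule(
    attributes="household prosumers",
    deontic=Deontic.MUST,
    aim="report their feed-in capacity to the grid operator",
    conditions="before connecting new solar panels to the grid",
    or_else="exclusion from the feed-in compensation scheme",
)
print(f"{rule.attributes} ({rule.deontic.value}): {rule.aim}")

In ADICO terms, leaving the or_else field empty is what distinguishes a norm (ADIC) from a fully sanctioned rule (ADICO).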

A main question is what the relation between values and institutional rules is. Two possibilities will be further explored:

  1. First, the aim that is part of the institutional rules (according to the ADICO grammar) may either refer to or be motivated by a value. In some cases, however, aims are more specific or instrumental so that it may be hard to associate values with them.
  2. In such cases, it may be worthwhile to look at a second possibility, namely that the institution as such serves a certain value. Such a relation between institutions and values is indeed suggested by teleological accounts of institutions.

 

Researchers

Prof. Dr. Ibo van de Poel

Project leader

i.r.vandepoel@tudelft.nl


Dr. Steffen Steinert

Postdoctoral researcher

S.Steinert@tudelft.nl


Related events

There are no events.

Related Publications

2020

van de Poel, Ibo

Embedding Values in Artificial Intelligence (AI) Systems Journal Article

In: Minds and Machines, 2020.

Abstract | Links | BibTeX

Klenk, Michael

How Do Technological Artefacts Embody Moral Values? Journal Article

In: Philosophy & Technology, 2020, ISSN: 2210-5441.

Abstract | Links | BibTeX

2. Value Change


Objective

To develop a notion of value that can account for value change (project objective 1); to develop a taxonomy and mechanisms of value change in sociotechnical systems (project objective 2).

How Do Values Change Over Time?

To understand how values may change, we will conceive of values as emerging from earlier responses to moral problems. In line with pragmatist philosophers like Dewey, values will be seen as generalized responses to earlier moral problems. In many situations, existing values are adequate as a response to the (morally) problematic situations people encounter. However, in new types of situations, or due to new experiences, current values may no longer be adequate or sufficient. Such situations may require an adaptation of current values or the adoption of new values.

The above description already gives some clues about how, and when, values may change. They will do so particularly in response to new problematic situations or new experiences. On this basis, we will develop a more precise account of different mechanisms of value change in sociotechnical systems. Possible mechanisms include the following:

  1. Technologies lead to new types of consequences that require new evaluative dimensions and therefore new values (e.g. privacy, sustainability) to evaluate sociotechnical systems;
  2. Technologies offer new opportunities (e.g. to protect homes against earthquakes) that lead to new moral obligations and therefore new values;
  3. Technologies create new moral choices and dilemmas where there previously were none (e.g. predictive genetics), which require new values;
  4. Technologies lead to new experiences (e.g. friendship online) that lead to new values or change existing values.

In addition to distinguishing mechanisms of value change, we may also distinguish different degrees of value change, resulting in a taxonomy of value change. A first thing to note here is that the verdict whether a value has changed, and to what degree, depends on how that value is exactly defined. It is particularly important at what level of abstraction, or generality, a value is characterized. Usually values are characterized at a rather abstract or general level; they are typically referred to with a single abstract noun, like safety, sustainability, privacy or well-being, although longer expressions also occur.

One consequence of the fact that values are often understood at a high level of conceptual abstraction is that changes in the understanding or interpretation of a value can occur while the value itself remains the same. In car design, safety may refer to the safety of the driver and passengers (occupant safety) or to the safety of bystanders like pedestrians and cyclists (pedestrian safety); while the emphasis in car design was originally mainly on the first, it has gradually come to involve the second as well. This can be interpreted as a change in how the value of safety is conceptualized and specified, but it could also be interpreted, if values are understood at a somewhat lower level of abstraction, as a change in the relative importance of the values of occupant safety and pedestrian safety.

So how exactly we characterize changes in values at least partly depends on the level of abstraction at which we characterize values. With this in mind, it is nevertheless possible to distinguish between different kinds of value change:

 

  1. The emergence of new values;
  2. Changes in what values are relevant for the design of a certain technology;
  3. Changes in the priority or relative importance of values;
  4. Changes in how values are conceptualized;
  5. Changes in how values are specified, and translated into norms and design requirements.

Researchers

Dr. Steffen Steinert

Postdoctoral researcher

S.Steinert@tudelft.nl


Dr. Michael Klenk

Postdoctoral researcher

M.B.O.T.Klenk@tudelft.nl


Dr. Olya Kudina

Postdoctoral Researcher

olya_kudina@yahoo.com


Related events

Second edition of the Changing Values, Changing Technologies conference from 19-21 April 2023 in Delft (Netherlands) as part of fPET 2023 

Start date: April 19, 2023
End date: April 21, 2023
All-day event
Location: TU Delft
Project workshop | Value Change

In October 2020 we organised the first international conference on Changing Values, Changing Technologies. We are happy to announce the second edition, which will be an integral part of the 2023 Forum on Philosophy, Engineering, and Technology (fPET 2023) and will be held at the Technical University of Delft in the Netherlands from 19-21 April […]

Workshop ‘ValueMonitor, a tool for empirical investigations of values and value change in text corpora’

Date: March 10, 2023
Time: 9:30 am - 4:30 pm
Location: TU Delft
Project workshop | Value Change

ValueMonitor is an easy-to-use tool for empirical investigations of values and value change in text corpora (e.g., scientific articles, newspaper articles). Investigations performed with ValueMonitor can be used to discover neglected moral issues voiced by societal actors or to map the current state of scientific research on specific values and technologies. Although ValueMonitor relies on […]

Workshop ‘Corpus Analysis, conceptual change, and socially disruptive technologies’

Date: February 16, 2023
All-day event
Location: TU Delft
Project workshop | Value Change

New technologies like AI (deepfakes, large language models etc.) or biotechnologies (e.g., CRISPR) are not only changing our way of life and social practices. They also have the potential to disrupt or even change our concepts and values. How can we study such conceptual disruptions empirically? Do such disruptions give rise to conceptual change? […]

Conference ‘Changing Values, Changing Technologies’

Start date: October 12, 2021
End date: October 13, 2021
Location: Delft, the Netherlands
Project workshop | Value Change

The aim of the conference is to present and discuss research on the interrelations of moral values and technology. Specifically, we aim to explore how novel technological developments lead to changing moral values and, conversely, how changing moral values affect the development of new technologies. We envision the following more specific themes for the conference: 1) (Historical) case studies of value change and technology, 2) The interrelation between value change and technological change, 3) Value change and moral progress, 4) Origins of value change: individual and collective, 5) Implications of value change for value sensitive design, 6) Methods for studying and anticipating value change. More announcements will follow soon.

Workshop ‘Explaining Value Change’

Start date: April 15, 2021
End date: May 6, 2021
Location: Online
Project workshop | Value Change

This workshop series aims to shed light on the under-investigated topic of value change. Despite its seemingly common occurrence, it is not clear what value change amounts to. The first step towards a clearer understanding of value change is to make headway on what exactly we need to explain when we want to explain value change. The workshop will bring together internationally leading scholars in philosophical value theory to explore the topic of value change.

Related Publications

2022

van de Poel, Ibo; Kudina, Olya

Understanding Technology-Induced Value Change: a Pragmatist Proposal Journal Article

In: Philosophy & Technology, vol. 35, no. 2, pp. 40, 2022, ISSN: 2210-5433.

Abstract | Links | BibTeX

van de Poel, Ibo

Understanding value change Journal Article

In: Prometheus, vol. 38, no. 1, pp. 7-24, 2022.

Links | BibTeX

Hopster, J. K. G.; Arora, C.; Blunden, C.; Eriksen, C.; Frank, L. E.; Hermann, J. S.; Klenk, M. B. O. T.; O’Neill, E. R. H.; Steinert, S.

Pistols, pills, pork and ploughs: the structure of technomoral revolutions Journal Article

In: Inquiry, pp. 1-33, 2022, ISSN: 0020-174X, (doi: 10.1080/0020174X.2022.2090434).

Links | BibTeX

Nickel, Philip J.; Kudina, Olya; van de Poel, Ibo

Moral Uncertainty in Technomoral Change: Bridging the Explanatory Gap Journal Article

In: Perspectives on Science, vol. 30, no. 2, pp. 260-283, 2022, ISSN: 1063-6145.

Abstract | Links | BibTeX

2020

Steinert, Steffen

Corona and value change. The role of social media and emotional contagion Journal Article

In: Ethics and Information Technology, 2020, ISSN: 1388-1957, 1572-8439.

Abstract | Links | BibTeX

2018

van de Poel, Ibo

Design for value change Journal Article

In: Ethics and Information Technology, 2018, ISSN: 1572-8439.

Abstract | Links | BibTeX