van de Poel, Ibo. Embedding Values in Artificial Intelligence (AI) Systems. Journal Article. Minds and Machines, 2020. doi: 10.1007/s11023-020-09537-4.

Abstract: Organizations such as the EU High-Level Expert Group on AI and the IEEE have recently formulated ethical principles and (moral) values that should be adhered to in the design and deployment of artificial intelligence (AI). These include respect for autonomy, non-maleficence, fairness, transparency, explainability, and accountability. But how can we ensure and verify that an AI system actually respects these values? To help answer this question, I propose an account for determining when an AI system can be said to embody certain values. This account understands embodied values as the result of design activities intended to embed those values in such systems. AI systems are here understood as a special kind of sociotechnical system: like traditional sociotechnical systems, they are composed of technical artifacts, human agents, and institutions, but in addition they contain artificial agents and certain technical norms that regulate interactions between artificial agents and other elements of the system. The specific challenges and opportunities of embedding values in AI systems are discussed, and some lessons for better embedding values in AI systems are drawn.
Klenk, Michael. How Do Technological Artefacts Embody Moral Values? Journal Article. Philosophy & Technology, 2020. ISSN 2210-5441. doi: 10.1007/s13347-020-00401-y.

Abstract: According to some philosophers of technology, technology embodies moral values in virtue of its functional properties and the intentions of its designers. But this paper shows that such an account makes the values supposedly embedded in technology epistemically opaque and that it does not allow for values to change. Therefore, to overcome these shortcomings, the paper introduces the novel Affordance Account of Value Embedding as a superior alternative. Accordingly, artefacts bear affordances; that is, artefacts make certain actions likelier given the circumstances. Based on an interdisciplinary perspective that invokes recent moral anthropology, I conceptualize affordances as response-dependent properties. That is, they depend on intrinsic as well as extrinsic properties of the artefact. We have reason to value these properties. Therefore, artefacts embody values and are not value-neutral, which has practical implications for the design of new technologies.
Steinert, Steffen; Roeser, Sabine. Emotions, values and technology: illuminating the blind spots. Journal Article. Journal of Responsible Innovation, 7(3), pp. 298–319, 2020. ISSN 2329-9460. doi: 10.1080/23299460.2020.1738024.

Abstract: Responsible innovation and ethics of technology increasingly take emotions into consideration. Yet, there are still some crucial aspects of emotions that have not been addressed in the literature. In order to close this gap, we introduce these neglected aspects and discuss their theoretical and practical implications. We will zoom in on the following aspects: emotional recalcitrance, affective forecasting, mixed emotions, and collective emotions. Taking these aspects into account will provide a more fine-grained view of emotions that will help to improve current and future approaches and procedures that incorporate emotions.
van de Poel, Ibo. Design for value change. Journal Article. Ethics and Information Technology, 2018. ISSN 1572-8439. doi: 10.1007/s10676-018-9461-9.

Abstract: In the value sensitive design (VSD) literature, there has been little attention to how values may change during the adoption and use of a sociotechnical system, and what that implies for design. A value change taxonomy is proposed, as well as a number of technical features that allow dealing with value change.