Study information

Living with Robots: New Technologies and Ethics in Religious and Philosophical Perspectives

Module title: Living with Robots: New Technologies and Ethics in Religious and Philosophical Perspectives
Module code: THE3232
Academic year: 2025/6
Credits: 30
Module staff

Professor Esther Reed (Convenor)

Duration: Term 1 / 2 / 3
Duration: 11 weeks

Number of students taking module (anticipated): 15

Module description

Ethical challenges posed by new technologies call for wide societal debate. These challenges include: problems of algorithmic bias and the encoding of implicit prejudice; the uncertainty of data values; transparency and accountability as machine learning creates and adapts its own algorithms, thereby potentially putting AI decisions beyond human reckoning; and whether persons will still really be themselves if their brain function has changed following a machine implant. All call for public engagement, including from religious perspectives.

Much forward thinking is required about what it is to be human, the demands of justice, love of neighbour and the natural environment in a technological age, and more. This module offers you the opportunity to develop a framework for ethical reasoning in a technological age.

No prior modules or prior knowledge are required in order to join this module. Debate extends to every citizen of the world. Students of all faith perspectives and philosophical persuasions are welcome to join.

Module aims - intentions of the module

This module aims to:

- provide an introduction to ethical issues posed by new technologies, including machine-human teaming, before treating selected issues from diverse religious and ethical perspectives, with particular attention to Christianity and secular humanism(s);

- encourage awareness of the economic, social and environmental considerations posed by new technologies, and develop critical and constructive responses that include awareness of possible workplace applications;

- offer an introduction to selected traditions of theological anthropology and modern humanist variants, from which variously adequate answers might be tested against the ethical challenges posed by new technologies;

- investigate some of the most pressing challenges in machine-human teaming today, including definitions of ‘machine-human teaming’ and ‘machine-human fusion’, and new questions raised by increasing levels of autonomy in AI decision-making;

- develop a framework for discussing ethics and what it means to be ethical in a technological age, including awareness of existing and proposed legislative frameworks.

Intended Learning Outcomes (ILOs)

ILO: Module-specific skills

On successfully completing the module you will be able to...

  • 1. Give religiously and/or philosophically-informed accounts of ethical issues posed by new technologies.
  • 2. Engage analytically and constructively with religious and/or philosophical approaches to ethical issues posed by new technologies.
  • 3. Attend to, reproduce accurately, and reflect critically and independently on the ideas and arguments of major theologians and/or secular humanists or other philosophical theorists with fairness and integrity, and express, as appropriate, your own views about relevant ethical issues posed by new technologies without denigrating the views of others.
  • 4. Acquire enough technical knowledge about new technologies, notably AI and related digital technologies, to identify and critically engage with selected ethical issues posed by these new technologies.

ILO: Discipline-specific skills

On successfully completing the module you will be able to...

  • 5. Analyse and evaluate the arguments of a range of significant theorists in the field.
  • 6. Work critically and creatively in applying knowledge, understanding and skills to new social challenges, including questions of justice and care.

ILO: Personal and key skills

On successfully completing the module you will be able to...

  • 7. Communicate effectively with peers and members of the teaching staff in oral form.
  • 8. Exercise substantial autonomy in the management of your own learning.
  • 9. Develop sound critical judgement based upon awareness of key issues and scholarly debate in the area.
  • 10. Demonstrate meaningful and consistent participation in the module, including teamwork.

Syllabus plan

While module content may vary from year to year, it is envisioned that you will cover some or all of the following topics: 

  • Being human in relation to new technologies
  • Technology and justice
  • Living/teaming with machines
  • AI and ethical enfeeblement? What, when and how to let AI decide?
  • Machine-human teaming and questions of justice, truth, care and trust

Learning activities and teaching methods (given in hours of study time)

Scheduled Learning and Teaching Activities: 67 hours
Guided independent study: 233 hours
Placement / study abroad: 0 hours

Details of learning activities and teaching methods

Category | Hours of study time | Description
Learning and Teaching | 22 | 11 x 2-hour whole-cohort lectures/workshops
Learning and Teaching | 11 | 11 x 1-hour seminars
Learning and Teaching | 1 | 1-to-1 tutorials
Guided Teamwork | 33 | 11 x 3-hour student-led teamwork
Guided Independent Study | 233 | Private study

Formative assessment

Form of assessment | Size of the assessment (e.g. length/duration) | ILOs assessed | Feedback method
Presentation | 15 mins | 6, 10 | Oral feedback from class tutor

Summative assessment (% of credit)

Coursework: 100% | Written exams: 0% | Practical exams: 0%

Details of summative assessment

Form of assessment | % of credit | Size of the assessment (e.g. length/duration) | ILOs assessed | Feedback method
Essay | 30% | 2000 words | 1-5, 7-9 | 1-2-1 feedback from tutor on essay plan in tutorial, plus essay feedback sheet
Essay | 70% | 3500 words | 1-5, 7-9 | Written

Details of re-assessment (where required by referral or deferral)

Original form of assessment | Form of re-assessment | ILOs re-assessed | Timescale for re-assessment
Essay (30%) | Essay (30%) | 1-5, 7-9 | Referral/deferral period
Essay (70%) | Essay (70%) | 1-5, 7-9 | Referral/deferral period

Re-assessment notes

Deferral – if you miss an assessment for certificated reasons judged acceptable by the Mitigation Committee, you will normally either be deferred in the assessment or granted an extension. The mark given for a re-assessment taken as a result of deferral will not be capped and will be treated as it would be if it were your first attempt at the assessment.

Referral – if you have failed the module overall (i.e. a final overall module mark of less than 40%) you will be required to submit a further assessment as necessary. If you are successful on referral, your overall module mark will be capped at 40%.

Indicative learning resources - Basic reading

Indicative basic reading list (all available electronically):

  • Adib-Moghaddam, A. (2023). Is Artificial Intelligence Racist? : The Ethics of AI and the Future of Humanity (1st ed.). Bloomsbury Academic. https://doi.org/10.5040/9781350374430
  • Dignum, V. (2019). Responsible Artificial Intelligence: How to Develop and Use AI in a Responsible Way (1st ed. 2019.). Springer International Publishing AG. https://doi.org/10.1007/978-3-030-30371-6
  • Gunkel, D. J. (Ed.). (2024). Handbook on the Ethics of Artificial Intelligence (First edition.). Edward Elgar Publishing Limited. https://www.europarl.europa.eu/RegData/etudes/STUD/2020/641507/EPRS_STU(2020)641507_EN.pdf
  • Singler, B., & Watts, F. N. (Eds.). (2024). The Cambridge companion to religion and artificial intelligence (1st ed.). Cambridge University Press.
  • Xu, X. (2024). The Digitalised Image of God. Taylor & Francis Group. https://doi.org/10.4324/9781003356738-2
  • Ellul, J. (2018). The Technological System. Wipf and Stock. (Original work published 1977)

Indicative learning resources - Web based and electronic resources

Key words search

Ethics, technology, theology, anthropology, autonomy, AI, fusion, machine

Credit value: 30
Module ECTS: 15

Module pre-requisites

None

Module co-requisites

None

NQF level (module)

6

Available as distance learning?

No

Origin date

14/02/2025