Job description
Research Fellow in Explainable AI and Machine Learning (1 year)
Computer Science
We are looking for a Postdoctoral Research Fellow to join our EPSRC-funded project "Human-machine learning of ambiguities to support safe, effective, and legal decision making". The overall aim of this project is to improve the trustworthiness of next-generation autonomous systems using the latest advances in AI and machine learning.
This is a fixed-term contract for up to 12 months, with the possibility of extension.
We are looking for someone with specialist knowledge of a subject area of direct relevance to the project, i.e., Computer Science (including AI and Machine Learning) or Law, and a PhD in one of these subject areas, or equivalent significant relevant experience and qualifications. Proficiency in at least one programming language (preferably Python or C/C++) and experience or knowledge of relational learning and logic programming are desirable (but not essential).
Background
Autonomous robotic systems offer huge potential to help humans in a wide range of real-world applications. However, there is an acute lack of trust in robot autonomy in the real world - in terms of operational performance, adherence to the rules of law and safety, and human values. Furthermore, poor transparency and lack of explainability (particularly with popular deep learning methods) add to the mistrust when autonomous decisions do not align with human "common sense".
We will achieve this through the logical and probabilistic machine learning approach of Bayesian meta-interpretive learning (BMIL). This approach uses logical statements (e.g., propositions and rules) that are akin to human language. In contrast, the popular approach of deep learning uses complex multi-layered neural networks with millions of numerical connections. It is through the logical representation and human-like reasoning of BMIL that it will be possible to encode expert human knowledge into the perceptive world model and deliberative planner of the robot's artificial intelligence.
The human-like decision-making will be encoded in a variety of ways:
a. By design from operational and legal experts in the form of initial logical rules;
b. Through passive learning of new logical representations and rules when a human intervenes and overrides the robot because it is not behaving as expected;
c. Through recognising ambiguities before they arise and active learning of rules to resolve them with human assistance.
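As a loose illustration of point (a), expert knowledge can be written down as logical rules and applied by simple forward chaining. This is only a toy propositional sketch with hypothetical rule and fact names, not the project's actual BMIL system, which learns probabilistic logic programs:

```python
# Toy sketch: encoding expert decision rules as (body -> head) pairs and
# deriving conclusions by forward chaining. Hypothetical names throughout;
# the real project uses Bayesian meta-interpretive learning, not this code.

def derive(facts, rules):
    """Repeatedly apply rules whose body is satisfied until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= facts and head not in facts:
                facts.add(head)
                changed = True
    return facts

# (a) Initial rule supplied by operational/legal experts (hypothetical):
rules = [
    ({"object_detected", "in_restricted_zone"}, "require_human_approval"),
]

derived = derive({"object_detected", "in_restricted_zone"}, rules)
print("require_human_approval" in derived)  # True

# (b) A human override can be recorded as a new rule (passive learning):
rules.append(({"low_visibility"}, "surface_and_wait"))
print("surface_and_wait" in derive({"low_visibility"}, rules))  # True
```

Because the rules are human-readable statements rather than opaque numerical weights, a decision such as "require_human_approval" can be traced back to the exact rule and facts that produced it.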
More about the project
This role is part of a larger collaborative project on trustworthy autonomous robotic systems where the case study is an autonomous "robot boat" for underwater search and crime scene investigation (police and emergency services). You will be working with Dr Alireza Tamaddoni-Nezhad and his research team at the University of Surrey in the area of explainable AI and human-machine learning and reasoning. You will also be collaborating with Dr Alan Hunter and his research team at the University of Bath in the area of marine remote-sensing and autonomy.
About the University of Surrey, Dept. of Computer Science and the Surrey Institute for People-Centred AI
The University of Surrey is located on a beautiful campus in Guildford, just 30 miles from London and benefitting from excellent road and rail connections (a 35-minute train journey). We are an ambitious, research-led organisation, committed to research excellence and to the application of our research for the benefit of society. The Department of Computer Science was ranked 7th in the country for research outputs in computer science and informatics in the latest UK Research Excellence Framework (REF 2021), and 6th in the UK and top 100 globally for computer science and engineering in the Shanghai Global Ranking of Academic Subjects 2022. The successful candidate will also join our new multi-disciplinary Institute for People-Centred Artificial Intelligence, which builds on the 35-year history of world-leading foundational AI research through the Centre for Vision, Speech and Signal Processing (CVSSP), the UK's largest research centre in audio-visual AI, with over 200 researchers. CVSSP has a track record of pioneering ground-breaking AI technologies leading to successful commercialisation and is ranked 1st in the UK for Computer Vision research, 4th for AI and 5th for Robotics (csrankings.org).
We acknowledge, understand and embrace cultural diversity. Research staff are supported and encouraged in their career development through mentoring and early-career researcher training. We are also committed to helping staff balance life needs and career ambitions, for example by accommodating flexible working arrangements such as part-time work or non-traditional working hours, where the requirements of the project allow.
For informal enquiries please contact Dr Alireza Tamaddoni-Nezhad via email: [email protected]
Please submit a CV and covering letter with your application.
Please note, it is University Policy to offer a starting salary equivalent to Level 3.6 (£33,314) to successful applicants who have been awarded, but are yet to receive, their PhD certificate. Once the original PhD certificate has been submitted to the local HR Department, the salary will be increased to Level 4.1 (£34,308).