Location: Coventry, University of Warwick
Salary: £34,866 to £45,163 per annum
Hours: Full Time
Contract Type: Fixed-Term/Contract
Placed On: 19th December 2024
Closes: 19th January 2025
Job Ref: 109974-1224
Location: University of Warwick Campus, Coventry
Duration: Ending 30 August 2027
About the Role
For informal enquiries, please contact Xingyu Zhao (Assistant Professor) at Xingyu.zhao@warwick.ac.uk.
The Safe Autonomy group at WMG is looking for a postdoctoral researcher to join the EPSRC-funded project “Harnessing Synthetic Data Fidelity for Assured Perception of Autonomous Vehicles”. The project aligns with the group’s vision: to ensure the safe introduction of self-driving vehicle technology into society by preventing all potential accidents. Our mission is to create the knowledge and methodologies that industry needs to prove this technology is safe for society.
Scenario-based testing, a cornerstone of Autonomous Vehicle (AV) safety assurance, leverages synthetic data for virtual verification and validation (V&V). Despite advances in simulators and Generative AI that enhance realism, synthetic data still falls short of full fidelity. This project addresses a pivotal question: what level of fidelity is required for synthetic data to be deemed sufficient for AV safety? To answer this, we aim to develop a scientifically rigorous framework to define and quantify synthetic data fidelity, perform fidelity-informed V&V, and assess how fidelity uncertainty impacts safety confidence. This work will advance academic knowledge in synthetic data, AV safety, and Safe AI. Economically, it accelerates AV safety solutions, supports industry product development, and informs safety standards and policies. Societally, it enables the timely, safe deployment of AVs and bolsters public trust.
About You
We are seeking a talented Safe AI researcher with a systems-thinking approach to join our team.
In this role, you will apply advanced techniques in machine learning (ML) testing, formal verification, uncertainty quantification, robust ML training, and explainable AI (XAI) to train and test AI systems to meet the highest safety standards. Expertise in systems approaches (e.g., STPA), safety assurance cases, statistical inference, and probabilistic modelling is also desirable.
You will support the department’s research, contributing to its reputation both internally and externally, while assisting the Project Leader and collaborators in project execution. Enhancing the impact of your work through publications, networking, PR communications, and stakeholder engagement will be key.
For further information regarding the skills required for this role, please see the person specification section of the attached job description.
If you are near submission or have recently submitted your PhD but have not yet had it conferred, any offers of employment will be made as Research Assistant at the top of level 5 of the University grade structure. Upon receipt of evidence of the successful award of your PhD, you will be promoted to Research Fellow on the first point of level 6 of the University grade structure.
CLOSING DATE: Sunday 19 January 2025 at 11.55 pm
Full details of the duties and selection criteria for this role can be found in the vacancy advert on the University of Warwick's jobs pages. You will be routed to this when you click on the Apply button.