13075 - Research Associate
Posting date: 09 September 2025
Salary: £41,064 to £48,822 per year
Hours: Full time
Closing date: 23 September 2025
Location: Edinburgh, Scotland
Remote working: Hybrid - work remotely up to 4 days per week
Company: University of Edinburgh
Job type: Contract
Job reference: 13075
Summary
Grade UE07: £41,064 to £48,822 per annum
Science and Engineering / School of Informatics
Full-time: 35 hours per week
Fixed term: 2 years, until 31 December 2027
The Opportunity:
The School of Informatics, University of Edinburgh invites applications for a 2-year Post-Doctoral Research Associate (PDRA) to conduct research on efficient methods for foundation models under the supervision of Dr Edoardo Ponti.
The project is fully funded by the Advanced Research and Invention Agency (ARIA) through its £2M "Scaling Compute: AI at 1/1000th the Cost" (TA 4 Benchmarking) project. As part of this project, we aim to create a suite of next-generation benchmarks that track the fast-evolving landscape of AI and measure the complex trade-offs between cost, accuracy, and performance on a range of state-of-the-art hardware. The project promises to open new avenues in developing and deploying AI models that are fast, energy efficient, and environmentally friendly.
The PDRA will be part of Edinburgh NLP, which is ranked among the top groups in Europe for AI research according to CSRankings. The PDRA will be supervised by Dr Edoardo Ponti and will collaborate with other team members based at Edinburgh Informatics, Imperial, and EPCC, one of the UK's leading supercomputing centres.
The PDRA role involves (1) conducting cutting-edge research on efficient LLM architectures, building on our lab's pioneering work on end-to-end tokenization and adaptive memory compression in LLMs; and (2) assisting the project team with benchmarking different use cases of foundation models (such as parameter-efficient adaptation and inference-time scaling, among others) on a variety of hardware.
This position includes funding for international travel to attend conferences and offers access to our HPC infrastructure. The position is advertised as full-time (35 hours per week). We are also open to considering requests for hybrid working (on a non-contractual basis), combining remote and regular on-campus working.
Please include the following documents in your application:
- CV
- 1-page cover letter
- A maximum of 3 scientific papers and a link to a codebase that illustrate your research quality and your affinity with the research topic
Your skills and attributes for success:
- A PhD (completed or near completion) in NLP, ML, MLSys, or a related area of computer science, engineering, or mathematics.
- A track record of research excellence, evidenced by e.g. preprints or a publication record.
- Experience implementing foundation models / LLMs, evidenced by e.g. projects on GitHub.
- Familiarity with deep learning hardware / accelerators and GPU/CUDA kernels.