
BRL MSc Robotics 2022 January 17, 2022


Contents
Foreword ........................................................................................................................................ 8
The projects ..................................................................................................................................... 9
1 Safely Automating Drone Handling ............................................................................................................................... 9
2 Automated filming of indoor drone experiments ........................................................................................................ 9
3 Self-Replenishing Drone Team ..................................................................................................................................... 10
4 Distributed Heartbeat based Multi-UAV Health Monitoring using Bluetooth ....................................................... 10
5 Drone traffic management in urban environments................................................................................................... 11
6 MCMC methods for stochastic robot swarm control ............................................................................................... 11
7 Environmental sensing through minimal yes/no robot swarms ............................................................................ 12
8 Improving Physical Reservoir Computing [2 students] ........................................................................................... 12
9 Soft-robotics of the sperm tail...................................................................................................................................... 13
10 Soft-robotics spontaneous travelling waves locomotor (withdrawn) .................................................................. 13
11 Microorganism inspired robots Bio-SoftRobotics (withdrawn)............................................................................. 14
12 Worm-like robots (withdrawn) ..................................................................................................................................... 14
13 Amphibious soft-robot (withdrawn) ........................................................................................................................... 15
14 Environmental control of microswarms (withdrawn) .............................................................................................. 15
15 Modelling lateral sensory lines in fish (withdrawn) ................................................................................................. 15
16 Artificial motile cilia....................................................................................................................................................... 16
17 The artificial soft-robotic sperm flagellum (withdrawn) .......................................................................................... 16
18 Microorganism inspired artificial swimmers ............................................................................................................ 17
19 Resiliency to Byzantine agents in the best-of-n decision problem ...................................................................... 17
20 Morphing wing UAV development .............................................................................................................................. 18
21 Multimodal robotic design for conservation applications...................................................................................... 18
22 Neural Model Predictive Control ................................................................................................................................. 18
23 Morphology of a robotic swarm of cubic storage units .......................................................................................... 19
24 Robot swarms for warehouse stacking ..................................................................................................................... 20
25 Swarming microsystems .............................................................................................................................................. 20
26 Augmented reality for swarming intralogistics ........................................................................................................ 20
27 Robot swarms to monitor Basking Sharks ............................................................................................................... 21
28 Swarm Post-its for Brainstorming .............................................................................................................................. 21
29 Visual navigation and object tracking for industrial swarm robots performing stacking task in a warehouse ..... 21
30 Swarm Spatial Density Estimation.............................................................................................................................. 22
31 Telehaptic avatar ............................................................................................................................................................ 23
32 [co-created] Tactile deep learning for human-like dexterous robots ................................................................... 23
33 Swarm manipulation in industry ................................................................................................................................. 23
34 [Co-created] Next-generation biomimetic optical tactile sensing ......................................................................... 24
35 Neuromorphic tactile perception ................................................................................................................................ 24
36 Tactile feedback for prosthetic hands........................................................................................................................ 24
37 Gut feeling: Exploring the use of microbes for personality generation in robots. ............................................ 25
38 Bio-inspired underwater navigation in challenging environments ....................................................................... 25
39 Softly-non-spoken ......................................................................................................................................................... 26


40 Visual navigation with a Pixel Processor Array ....................................................................................................... 26


41 Visual perception for handheld robots ...................................................................................................................... 27
42 Computer Vision with Deep Learning for Wearable or Robotic systems ............................................................ 28
43 Visual-based semantic SLAM for navigation in dynamic environment ............................................................... 28
44 Visual localisation in the wild ...................................................................................................................................... 29
45 [co-created] Occluded face recognition .................................................................................................................... 29
46 [co-created] Biometric recognition from eye movements ...................................................................................... 30
47 [co-created] Eye tracking in reality and virtual reality ............................................................................................ 30
48 [co-created] Depth estimation from a single camera .............................................................................................. 30
49 Oscillatory Neural Networks for Robotic Manipulations .......................................................................................... 31
50 Modelling of Stiff Behaviours in Spacecraft Systems with Model Based Systems Engineering (MBSE)
Framework ................................................................................................................................................................................ 31
51 Rover Mission Design, Analysis and Telemetry Visualisation with OpenMCT ................................................... 32
52 Steerable Thrusters on a CubeSat .............................................................................................................................. 32
53 Digital Factory Twinning in VR Environment ............................................................................................................ 32
54 GANs for synthetic art production.............................................................................................................................. 33
55 Machine Vision in Agriculture ..................................................................................................................................... 33
56 CNNs learning to ignore ............................................................................................................................................... 33
57 Insect sexing and counting.......................................................................................................................................... 34
58 BRL Reception Support Robot: Long-term operation and visitor interaction .................................................... 34
59 Swarm dynamic task allocation .................................................................................................................................. 35
60 Remote Robotics Outreach.......................................................................................................................................... 35
61 Robotic Assisted Bending of Ceramic Extrusions .................................................................................................. 36
62 Development of Robotic 3D printing for Architectural Ceramics.......................................................................... 36
63 An Industry 4.0 demonstrator for product storage systems.................................................................................. 37
64 Hand over from robot to humans in an industrial context ..................................................................................... 37
65 Vision assisted Collaborative manufacture of jewelry ............................................................................................ 37
66 Path generation for repairing a damaged part.......................................................................................................... 38
67 Support removal in metallic additive layer manufacturing .................................................................................... 38
68 Autonomous painting robot in construction industry ............................................................................................ 39
69 Robotic cutting of large oil platform structures ....................................................................................................... 39
70 Classification of waste using machine learning ...................................................................................................... 39
71 Motor operation prototype ........................................................................................................................................... 40
72 Sparse training of a Deep Predictive Coding network ............................................................................................ 40
73 [co-created] Encoding robot motion using spike based hardware....................................................................... 41
74 Olfaction for place recognition.................................................................................................................................... 42
75 Driver intent prediction for smart driving assistance ............................................................................................. 42
76 Generalizable driving skills transfer from driver to smart vehicle ........................................................................ 43
77 Remote assistance for smart vehicles ....................................................................................................................... 43
78 Mobile robot for elderly care in home environments .............................................................................................. 44
79 Markerless 3D Human Motion Capture for Home Environments .......................................................................... 44
80 Augmented Reality System to Guide Electrode Placement for Clinical Electrophysiology ............................. 44
81 Analysis of Brain Networks to Quantify Consciousness and Cognitive Function ............................................ 45
82 Deep Learning for Clinical Neurophysiology............................................................................................................ 45


83 Intelligent Infrared Beamforming: the Future of Home Heating ............................................................................ 46


84 Where Are My Glasses? Machine Vision and Speech Interaction in the Smart Home to Support People with
Dementia ................................................................................................................................................................................... 46
85 Development of an Interactive Projector to Provide Ubiquitous Graphical Interfaces for Smart Homes ...... 47
86 Opto-genetic Tremor Suppression: Simulation and Feasibility Analysis ............................................................ 47
87 Using a Smartwatch to Detect and Disrupt Obsessive Compulsive (OCD) Hand Washing.............................. 48
88 COVID-Cam: Using computer vision to track viral contamination in shared spaces ........................................ 48
89 Robot-assisted Composite Sheet Layup Skill Learning and generalization through Perception Feedback. 49
90 Robot-assisted ultrasound scanning skill learning to optimize ultrasound image quality .............................. 50
91 Development of a multi-functional mobile robot manipulator ............................................................................... 50
92 Intelligent mobile robot control platform based on smart phone ......................................................................... 50
93 Mobile phone enabled smart car................................................................................................................................. 51
94 Robotany: An artificial ecosystem of robots [3 students] ....................................................................................... 52
95 Finding Nemo: Simulating an underwater robot for environmental monitoring ................................................ 52
96 Design and Stabilisation of an Amphibious Drone (Underwater Drone) ............................................................. 53
97 Design and Stabilisation of Distributed Flight Arrays (DFAs) ............................................................................... 53
98 Cooperative Multi-Missile Systems ............................................................................................................................ 54
99 Reinforcement control learning algorithm applied to Autonomous Underwater Vehicle (AUV) in simulation
[2 students] ............................................................................................................................................................................... 54
100 Robotics FES for Under-actuated systems [2 students] ................................................................................... 55
101 Autonomous Landing [2 students] ....................................................................................................................... 55
102 Mobile robots control via voice activation [2 students] .................................................................................... 56
103 U-model Based Sliding Mode Control (SMC) to Increase Robustness of Various Robots [2 students] ..... 56
104 Rendezvous of a Refuelling Space Tug with OneWeb Constellation............................................................... 57
105 Nukopter [3 students]............................................................................................................................................... 57
106 Parts Tracking during Spacecraft Assembly, Integration and Test with HoloLens 2 .................................... 57
107 eVTOL Digital Twin Demonstrator ........................................................................................................................ 58
108 Robot Manipulators for Space Debris Removal .................................................................................................. 58
109 Spacecraft Attitude Control with Internal Fuel Sloshing.................................................................................... 59
110 Spacecraft Station-Keeping around Libration Point Orbits in the Circular Restricted Three Body Problem
(CR3BP) ..................................................................................................................................................................................... 59
111 Soft Robotics: Novel components, Modelling and Simulation ......................................................................... 60
112 Soft Robotics: Intelligent Control .......................................................................................................................... 60
113 Soft Robotics: Noise & Vibration ........................................................................................................................... 60
114 Stabilisation of Under-actuated Mechanical Systems ........................................................................................ 61
115 New Soft Conjunction Design and Test of Human-Like Robot Hand Links .................................................... 61
116 Improved Variable Impedance Actuator System for Tendon-driven Robot Hand .......................................... 62
117 Exoskeleton Structure Improvement for Free-space Teleoperation ................................................................ 62
118 Robot Primitive Skill Learning from Few-shot Demonstrations based on Meta-Learning .......................... 62
119 Robust Robot Primitive Skill Programming Network for Object Grasping and Manipulation..................... 63
120 What my robot knows about me and who owns that knowledge: User data rights in service robotics [2
students] ................................................................................................................................................................................... 63
121 Robots in circular economy: The role of robots and cobots in product disassembly [2 students] ......... 64
122 Open-set (re-identification) recognition of pigs .................................................................................................. 64


123 Primitive Skill Library Building and Learning for Robot Composites Layup though Teleoperation and
Kinesthetic Demonstration .................................................................................................................................................... 65
124 Language games for language evolution ............................................................................................................. 65
125 A Robot Swarm to Assist British Rewilding ......................................................................................................... 66
126 Collective Memory for a Trustworthy Swarm ....................................................................................................... 66
127 Digital Twin for a Welding Cell ................................................................................................................................ 67
128 Unitree A1 Dog Surveyor ......................................................................................................................................... 67
129 Digital twin of a motor and disc assembly ........................................................................................................... 67
130 Modelling of a cable robot ....................................................................................................................................... 68
131 Data sonification for robot teleoperation.............................................................................................................. 68
132 Automated gesturing for a telepresence robot.................................................................................................... 68
133 [co-created] Reinforcement Learning for Robotic Manipulation Based on Multi-Sensor Fusion .............. 69
134 [co-created] Design of a two-finger robotic hand with tactile feedback ......................................................... 69
135 Soft robotic hand for in-hand manipulation ......................................................................................................... 70
136 Surgical Gesture Segmentation and Recognition for Medical Robotic Applications ................................... 70
137 [co-created] Human-Robot Cooperative Control for Medical Robotics .......................................................... 71
138 Sim-to-Real Transfer Learning for Robotic Manipulation .................................................................................. 71
139 Robotic Skill Learning for Cooking ....................................................................................................................... 71
140 Robot Odometry using Sparse LIDAR .................................................................................................................. 72
141 Behavior-based modelling and simulation for streaming drones .................................................................... 72
142 Speaking microbial language [2 students] ........................................................................................................... 73
143 Eating, sensing, dying like a beetle [2 students]................................................................................................. 73
144 Biological robotic tongue ........................................................................................................................................ 74
145 Design of an interface for lower-limb rehabilitation using the Franka arm .................................................... 74
146 Upper-limb rehabilitation using Franka robotic arm .......................................................................................... 74
147 Self-powered pH-responsive mechanical meta-materials [2 students] ........................................................... 75
148 Tactile-based learning for human-robot interaction ........................................................................................... 75
149 Mechanical Metamaterials for Soft Robots .......................................................................................................... 75
150 Mechanical Metamaterials for Soft Robots .......................................................................................................... 76
151 Development of fully soft Bubble Artificial Muscles to create an assistive device for human body motion ..... 76
152 Exploring designs and patterns of 2D pneumatic actuator for various robotic applications ..................... 77
153 Rapid stiffening capability of flexible 2D and 3D electroactive structures..................................................... 78
154 Tendon-driven stiffening structure ........................................................................................................................ 78
155 Bistable soft structure inspired from bistable mechanism ............................................................................... 79
156 Intelligent Visuo-Tactile Robotic Skin with Adaptive Morphology ................................................................... 79
157 Dynamic Motion Primitive realisation with mechanical springs for effective locomotion ........................... 80
158 Exploitation of Morphological Features with Predictive Information Maximisation ..................................... 80
159 Phonetic alphabet generation with Soft Structures ............................................................................................ 81
160 Peristaltic Robot for Debris Clearance.................................................................................................................. 81
161 Coverage path planning of a marsupial UAV-ground robot [2 students] ........................................................ 82
162 Cat Swarm Optimization in Swarm Robotics ....................................................................................................... 83
163 Evaluating the use of haptic feedback modality for human-in-the-loop multirobot teams [2 students] ... 83
164 Determining human intervention for multirobot decision making [2 students]............................................. 84


165 Robot companions ................................................................................................................................................... 84


166 [co-created] Assessment of human-computer interaction using vision-based behavioural cues ............. 85
167 Effect of Spatial Density on Swarm Problem Solving ........................................................................................ 85
168 Self-Organised Heterogeneous Swarm Partitioning ........................................................................................... 86
169 Low-cost Multi-Robot Tracking System ................................................................................................................ 86
170 Cross-Talk: Examining Local Communication in Swarm Robotics ................................................................. 87
171 Intention Recognition for Swarm Robotic Box Pushing .................................................................................... 87
172 Attention Holding in Human-Swarm Interaction.................................................................................................. 88
173 Objectively Interesting Robotic Drawings ............................................................................................................ 89
174 Robotic fish ................................................................................................................................................................ 89
175 Embodied digital logic ............................................................................................................................................. 90
176 Trustworthy machine learning based agile flight control of quadcopters ...................................................... 90
177 Bio-inspired soaring in urban environments ....................................................................................................... 91
178 Bio-inspired distributed flow sensing ................................................................................................................... 91
179 Design through Stimulation - Designing Robot Morphologies through Mechano-stimulation of Stem Cells
in Simulation ............................................................................................................................................................................. 91
180 Machine learning-based automatic control of living cells [2 students] .......................................................... 92
181 Human-swarm interactions for intralogistics ...................................................................................................... 92
182 Aerial robotics for conservation applications [5 students] ............................................................................... 93
183 Safety braking and manoeuvrability of hospital beds with assisted drive capabilities (Focus on
Mechanical Engineering) ........................................................................................................................................................ 93
184 IoT conversion of self-powered hospital bed (Focus on Software Engineering) .......................................... 95
185 [co-created] Decision-making and control for simulated highway autonomous driving based on
reinforcement learning............................................................................................................................................................ 96
Project Supervisors....................................................................................................................... 97
1. Arthur Richards .............................................................................................................................................................. 97
2. Edmund Hunt ................................................................................................................................................................. 98
3. Helmut Hauser ............................................................................................................................................................... 98
4. Hermes Gadelha ............................................................................................................................................................ 99
5. Chanelle Lee................................................................................................................................................................. 100
6. Shane Windsor ............................................................................................................................................................ 101
7. Tom Richardson........................................................................................................................................................... 101
8. Sebastian East.............................................................................................................................................................. 102
9. Sabine Hauert .............................................................................................................................................................. 102
10. Paul O'Dowd........................................................................................................................................................... 103
11. Nathan Lepora ....................................................................................................................................................... 104
12. Benjamin Ward-Cherrier ..................................................................................................................................... 105
13. Hemma Philamore ................................................................................................................................................ 106
14. Walterio Mayol-Cuevas ....................................................................................................................................... 106
15. Wenhao Zhang ...................................................................................................................................................... 107
16. Roshan Weerasekera............................................................................................................................................ 108
17. Yaseen Zaidi ............................................................................................................................................................ 109
18. Mark Hansen.......................................................................................................................................................... 109
19. Manuel Giuliani ..................................................................................................................................................... 110

20. Matthew Studley................................................................................... 111
21. Tavs Jorgensen ....................................................................................................................................................... 111
22. Farid Dailami .......................................................................................................................................................... 112
23. Martin Pearson ...................................................................................................................................................... 113
24. Ning Wang .............................................................................................................................................................. 114
25. David Western ....................................................................................................................................................... 115
26. Chenguang Yang .................................................................................................................................................... 116
27. Hamidreza Nemati ................................................................................................................................................ 117
28. Quan Zhu ................................................................................................................................................................ 118
29. Zhenyu Lu ............................................................................................................................................................... 118
30. Anna Chatzimichali ............................................................................................................................................... 119
31. Conor Houghton.................................................................................................................................................... 120
32. Paul Bremner ......................................................................................................................................................... 120
33. Dandan Zhang........................................................................................................................................................ 121
34. Jiseon You ............................................................................................................................................................... 122
35. Virginia Ruiz Garate .............................................................................................................................................. 122
36. Amir Bolouri ........................................................................................................................................................... 123
37. Jonathan Rossiter.................................................................................................................................................. 124
38. Mohammad Naghavi Zadeh ............................................................................................................................... 125
39. Andrew Conn ......................................................................................................................................................... 125
40. Lucia Marucci ......................................................................................................................................................... 126
41. Tony Pipe................................................................................................................................................................. 126

Foreword
This booklet lists the MSc dissertation projects that are available for you to choose
from. A short description of each project, a project ID number, and the name of the
supervisor(s) are given. The supervisors’ profiles are listed at the end of the booklet.
You can jump to the supervisor’s profile by clicking on the supervisor name next to
any of the projects. Please contact the supervisor directly if you have a question
regarding a project using the email address given in the supervisor profile.
Each student should use the online form to select 5 projects, ranked in order of
preference. Each project has a unique project ID number. Use the project ID number
to indicate your selections on the form.
https://forms.office.com/Pages/ResponsePage.aspx?id=MH_ksn3NTkql2rGM8aQVG
_PiUNuyF8BMtxn7YlTT6d9UMk0wM1MzSk9ISzRLV1BVVkRIVk1ISkxGMy4u
If you are co-creating a project, you should first agree the project description with
your supervisor. Your supervisor will submit the project description on your behalf.
When completing the online form, answer ‘Yes’ to the question, ‘Are you co-creating
your dissertation project?’.
The deadline to submit your preferences using the online form is Friday 21st of
January 2022. After this deadline, we will make the final assignment of projects to
students, trying our best to give you the highest-ranked choice possible.
Best regards,
Dr. Hemma Philamore (UoB Unit Director) hemma.philamore@bristol.ac.uk
Dr. Ning Wang (UWE Dissertation Module Coordinator) ning2.wang@uwe.ac.uk
Dr. Helmut Hauser (UoB) helmut.hauser@bristol.ac.uk


The projects

1 Safely Automating Drone Handling


Supervisor: Arthur Richards,Hirad Goudarzi

Project ID: 1 Contact: arthur.richards@bristol.ac.uk

Programme: MSc Aerial Robotics, MSc Robotics

Description:
Extensive work has been done to automate the flight of drones, but before and after flight
there are long periods of initialization and handling that are almost entirely manual.
Activities include checking and fitting batteries, starting software, verifying connectivity and
navigation, and visual inspection. These activities can be hazardous: you really don’t want a
drone to spin up while you are holding it. Some of these activities are naturally suited to
humans, while others could be automated. This project will adopt a hybrid approach to build a
semi-autonomous drone handling station in the flying arena at BRL. The overall goal will be to
support a human operator in getting drones "turned around" and back ready for flight
efficiently and safely. Ideas from smart manufacturing will be leveraged to identify a smooth
and easily supervised process flow. The project is suitable for students with good software and
system integration skills and preferably some electronics.

2 Automated filming of indoor drone experiments


Supervisor: Arthur Richards, Mickey Li

Project ID: 2 Contact: arthur.richards@bristol.ac.uk

Programme: MSc Aerial Robotics, MSc Robotics

Description:
Many publication venues expect high-quality media in order for a paper to be accepted. This
is especially true in single- and multi-drone experimentation, where video has a lot of value.
This project aims to look at the design and implementation of an automated filming system for
the Bristol Robotics Laboratory Flight Arena. This includes questions such as the optimal design
of a camera setup, the automated control of a camera rig in order to capture video of interest,
allocation of objects to cameras, the integration of the setup into the existing flight arena
systems and the user interaction. Mixed vision and tracking processing for automated editing
could also be considered, perhaps even with a machine learning approach to footage selection.
This project is suitable for candidates with backgrounds in electronics and software with
interests in robotics for filming and will involve development within the BRL.
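One sub-problem named above, the allocation of objects to cameras, can be framed as an assignment problem. A minimal sketch (the 2-D coordinates are hypothetical; a real rig would work from live tracker data, and larger setups might swap in the Hungarian algorithm):

```python
from itertools import permutations

def allocate(cameras, targets):
    """Assign one filming target to each camera, minimising the total
    camera-to-target distance. Brute force over permutations is fine
    for a handful of cameras."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    best = min(permutations(targets, len(cameras)),
               key=lambda p: sum(dist(c, t) for c, t in zip(cameras, p)))
    return list(zip(cameras, best))
```

For example, two cameras at opposite ends of an arena would each be matched with the drone nearest to them.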

3 Self-Replenishing Drone Team


Supervisor: Arthur Richards, Mickey Li

Project ID: 3 Contact: arthur.richards@bristol.ac.uk

Programme: MSc Aerial Robotics, MSc Robotics

Description:
For applications involving monitoring, it is important for system downtime to be minimised.
This is a challenge for a team of drones as any single drone is limited by short battery life and
other factors. One possible method is to have a ‘spare' drone which is available to take over the
task of a working drone, therefore freeing the drone to have its battery replaced or undergo
maintenance. But there are questions as to how the spare knows to attempt a task handover,
what happens if multiple vehicles fail and so on. Therefore, this project aims to look at planning
strategies which can minimise the mission downtime when using a team of drones for a
monitoring scenario. Interested students should have sound scientific programming skills and
either know, or be willing to pick up, Python, and also a willingness to learn about and
evaluate different task allocation/mission planning methods. If the project goes well, and the
candidate has strong programming skill, there is also the opportunity to fly the method on real
drones at the Bristol Robotics Laboratory Flight Arena.
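The handover idea described above can be sketched as a simple rule: a charged spare adopts the task of the lowest-battery working drone. The field names and thresholds below are illustrative assumptions, not the project's required design:

```python
def plan_handover(drones, threshold=0.2):
    """One step of a spare-swap policy: if any tasked drone is below
    the battery threshold and a charged spare is free, hand the task
    over. Returns (relieved, spare) names, or None if no swap applies."""
    spares = [d for d in drones if d["task"] is None and d["battery"] > 0.8]
    low = sorted((d for d in drones if d["task"] and d["battery"] < threshold),
                 key=lambda d: d["battery"])           # most urgent first
    if low and spares:
        worst, spare = low[0], spares[0]
        spare["task"], worst["task"] = worst["task"], None
        return worst["name"], spare["name"]
    return None
```

The open research questions, e.g. what to do when several drones fail at once, begin exactly where this single-swap rule ends.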

4 Distributed Heartbeat based Multi-UAV Health Monitoring using Bluetooth


Supervisor: Arthur Richards, Mickey Li, Hirad Goudarzi

Project ID: 4 Contact: arthur.richards@bristol.ac.uk

Programme: MSc Aerial Robotics, MSc Robotics

Description:
Communication is a key aspect of practical multi-UAV teaming, not only for transferring data,
but for determining the health of fellow teammates. UAVs are fairly failure prone, and knowing
the health status of teammates enables agents to decide whether they should change their
plans in order to complete their assigned mission. This project aims to apply a new synchronous
flooding communications protocol known as Atomic
(https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8726297) over Bluetooth to
implement a heartbeat based health monitoring system for multiple UAVs doing a task. This
will require designing both the communication and implementing a distributed decision-
making protocol for drones. Interested students should ideally have some experience with
embedded hardware and programming in C/C++ as well as willingness to learn about and
evaluate different distributed decision-making protocols. The project will be based in the Bristol
Robotics Laboratory Flight Arena and, if things go well, will include the opportunity to fly real
drones.
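At its simplest, a heartbeat-based health monitor reduces to a timeout per teammate. A minimal sketch (the class, API, and `timeout` value are assumptions for illustration; the Atomic protocol would supply the actual message transport):

```python
import time

class HeartbeatMonitor:
    """Minimal failure detector: a teammate is presumed failed when no
    heartbeat has arrived within `timeout` seconds."""

    def __init__(self, peers, timeout=1.0):
        self.timeout = timeout
        self.last_seen = {p: None for p in peers}

    def on_heartbeat(self, peer, t=None):
        # Record the arrival time of a heartbeat from `peer`.
        self.last_seen[peer] = time.monotonic() if t is None else t

    def failed_peers(self, now=None):
        # Peers never heard from, or silent for longer than `timeout`.
        now = time.monotonic() if now is None else now
        return [p for p, seen in self.last_seen.items()
                if seen is None or now - seen > self.timeout]
```

The distributed decision-making part of the project then decides what each agent does with the list of presumed-failed teammates.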

5 Drone traffic management in urban environments


Supervisor: Arthur Richards, Hirad Goudarzi

Project ID: 5 Contact: arthur.richards@bristol.ac.uk

Programme: MSc Aerial Robotics, MSc Robotics

Description:
A previous project validated the concept of a “ring road” for drones, where reduced ground
risk enabled higher speeds, offering both faster and safer transit of populated areas. This
project will seek to generalize the work so far. Several directions are possible, including an
abstract analysis looking at simple geometric arrangements of ring roads and parametric
studies covering speed ratios and demand densities. Alternatively, a more concrete approach
could explore the impact of different detailed designs on network performance, looking at
different ring road routing options around Bristol or another candidate city. Routing is partly a
question of policy: is it “acceptable” to route drones over roads, railways, rivers or rooftops?
However, that policy needs to be informed by studies of the impact of those different choices.
This project is entirely software-based and is suitable for students with good scientific
programming skills, especially Python but possibly Matlab.
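As a toy illustration of the speed-ratio trade-off such a parametric study would explore, compare a slow direct crossing with a longer but faster ring route (all numbers below are hypothetical):

```python
def transit_times(distance, v_direct, v_ring, detour_factor=1.3):
    """Time to cross a populated area directly at a restricted speed
    versus via a longer but faster ring route. `detour_factor` is the
    relative extra path length of the ring route (assumed value)."""
    return distance / v_direct, distance * detour_factor / v_ring

# Hypothetical numbers: a 5 km crossing at 10 m/s over rooftops versus
# 20 m/s on the ring road with 30% extra distance.
direct, ring = transit_times(distance=5000, v_direct=10, v_ring=20)
```

Here the ring route wins despite the detour; sweeping `detour_factor` against the speed ratio maps out where that stops being true.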

6 MCMC methods for stochastic robot swarm control


Supervisor: Edmund Hunt

Project ID: 6 Contact: edmund.hunt@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
The philosophy of swarm robotics is that large numbers of simple agents can achieve significant
tasks, overcoming simplicity through strength in numbers. Such a task could be exploring an
environment to find concentrated patches of pollution, for example. Markov chain Monte Carlo
(MCMC) methods are a class of algorithms used to sample from probability distributions. A
“walker” moves through the distribution and obtains samples as it goes. Given enough time,
these samples resemble the target distribution. We have compared this movement [1] to the
movement of real animals, ants, which are a prime source of inspiration for swarm robotics.
Effective MCMC methods tend to combine randomness (stochasticity) in walker movement
with some form of gradient following (i.e. tending to prefer movement toward higher
probability). In this project, we will consider whether MCMC methods can be used to inspire
movement controllers for minimal, stochastic robot swarms. We will use the CoppeliaSim
simulator (https://www.coppeliarobotics.com/). [1] Baddeley, R. J., Franks, N. R., & Hunt, E.
R. (2019). Optimal foraging and the information theory of gambling. Journal of The Royal
Society Interface, 16(157), 20190162.
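The combination of randomness with a preference for higher probability can be made concrete with random-walk Metropolis, one of the simplest MCMC methods. A sketch (the 1-D Gaussian target is purely illustrative):

```python
import math
import random

def metropolis_walk(log_p, x0, steps, step_size=0.5, seed=0):
    """Random-walk Metropolis: propose a uniform random move, accept it
    if it raises the (log-)probability, otherwise accept with
    probability p(x')/p(x). Returns the visited states."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(steps):
        proposal = x + rng.uniform(-step_size, step_size)
        a = log_p(proposal) - log_p(x)
        if a >= 0 or rng.random() < math.exp(a):
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: a standard normal (log-density up to a constant).
samples = metropolis_walk(lambda x: -0.5 * x * x, x0=3.0, steps=5000)
```

A robot controller inspired by this would replace the abstract state `x` with a position in the arena and `log_p` with sensed environmental quality.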

7 Environmental sensing through minimal yes/no robot swarms


Supervisor: Edmund Hunt

Project ID: 7 Contact: edmund.hunt@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
Swarm systems such as ant colonies can make collective decisions about their environment
even though individuals only make simple yes/no decisions [1]. This project will investigate
whether swarms of minimal robots that have an on/off threshold for some environmental
quality (e.g. a level of pollution) can indicate its spatial distribution by spreading out
randomly throughout the space. We will consider the optimal distribution of activation
thresholds in the robots and the circumstances in which this approach might be practically
useful. We will use the CoppeliaSim simulator (https://www.coppeliarobotics.com/). [1]
Hasegawa, E. et al. (2017). Nature of collective decision-making by simple yes/no decision units.
Scientific Reports, 7(1), 1-11.
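The core mechanism of [1] can be illustrated in a few lines: binary units with spread-out thresholds produce a graded collective response. The uniform threshold spacing below is just one candidate; finding the optimal distribution is part of the project:

```python
def collective_response(intensity, thresholds):
    """Fraction of binary yes/no units that switch on at a given
    stimulus intensity. With thresholds spread across the expected
    range, this fraction is a graded estimate of the local quantity
    even though each unit only answers yes or no."""
    on = sum(1 for t in thresholds if intensity >= t)
    return on / len(thresholds)

# One candidate design: thresholds spread uniformly over a
# hypothetical 0-1 pollution scale.
thresholds = [i / 10 for i in range(10)]
```

Read locally, the fraction of "on" robots in a region then maps out the spatial distribution of the sensed quantity.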

8 Improving Physical Reservoir Computing [2 students]


Supervisor: Helmut Hauser

Project ID: 8 Contact: helmut.hauser@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
Reservoir computing is a machine learning technique that allows us to learn to emulate
complex nonlinear dynamical systems with simple linear regression. The key idea is to have a
so-called reservoir, which is itself a complex, nonlinear dynamical system that we simply exploit
as a computational resource. While traditionally the reservoirs are made up of abstract
nonlinear systems, in recent years, numerous research groups have shown that such a reservoir
can also be realised as real physical bodies ranging from setups using nonlinear effects in lasers,
capacity networks and mechanical structures (e.g., parts of a robot). In the context of robotics,
this means the soft body of the robot can be exploited for computational tasks [1],[2], including
for control and sensing. This is typically referred to as physical reservoir computing [3]. This
project explores how the computational power can be extended by including special types of
dynamics in the robot body, e.g. bifurcation, bi-stability or hysteresis. The project allows for a
number of different approaches (simulation, real robots, the use of machine learning, etc.) and
details can be discussed with the supervisor. Requirements: understanding of nonlinear
dynamical systems, Python programming.
[1] Hauser, H.; Ijspeert, A.; Fuchslin, R.; Pfeifer, R. & Maass, W. "Towards a theoretical
foundation for morphological computation with compliant bodies", Biological Cybernetics,
Springer Berlin / Heidelberg, 2011, 105, 355-370.
http://www.springerlink.com/content/j236312507300638/
[2] Hauser, H.; Ijspeert, A.; Fuchslin, R.; Pfeifer, R. & Maass, W. "The role of feedback in
morphological computation with compliant bodies", Biological Cybernetics, Springer Berlin /
Heidelberg, 2012, 106, 595-613. http://www.springerlink.com/content/d54t39qh28561271/
[3] Hauser, H. "Physical Reservoir Computing in Robotics", Nat. Comput. Ser., 169–190 (2021).
doi:10.1007/978-981-13-1687-6_8
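The reservoir computing idea, a fixed nonlinear dynamical system plus a trained linear readout, can be sketched with a small echo-state-style network. The delayed-copy task and all parameter values here are illustrative assumptions:

```python
import numpy as np

def run_reservoir(inputs, n=50, rho=0.9, seed=0):
    """Drive a fixed random recurrent network (the 'reservoir') with a
    1-D input and return the state trajectory. The recurrent weights
    are never trained; only the linear readout below is."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n, n))
    W *= rho / np.abs(np.linalg.eigvals(W)).max()  # set spectral radius
    w_in = rng.standard_normal(n)
    x = np.zeros(n)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + w_in * u)
        states.append(x)
    return np.array(states)

# Toy task: reproduce the input delayed by 5 steps, which forces the
# readout to exploit the reservoir's fading memory. Plain least squares
# trains the readout; the first 50 steps are discarded as washout.
u = np.sin(np.linspace(0.0, 20.0, 500))
X = run_reservoir(u)
target = np.roll(u, 5)
w_out, *_ = np.linalg.lstsq(X[50:], target[50:], rcond=None)
pred = X[50:] @ w_out
```

In physical reservoir computing the random network above is replaced by measurements taken from a soft body, while the trained part remains this same linear regression.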

9 Soft-robotics of the sperm tail


Supervisor: Hermes Gadelha

Project ID: 9 Contact: hermes.gadelha@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
The human sperm tail is an intelligent soft material composed of many interacting fibres
connected by elastic springs. This intelligent structure is able to share mechanical information
across long distances instantaneously and adapt under different loading conditions. Most
curiously, under deformation of one part of the tail, the other part can bend in the opposite
direction, something known as the “counterbend” phenomenon. This project aims to take this
biological architecture to design the next generation of articulated soft robots. The student will
build a soft-robotic arm by just using strips of a piece of paper, from which bending experiments
will be conducted. Images of the different deformations will be compared with a simulation
tool for this system (provided) so that the viability of the design may be assessed. This project
is based on the current need for cheap and intelligent materials that are able to communicate
mechanical information and self-adapt under different situations.

10 Soft-robotics spontaneous travelling waves locomotor (withdrawn)


Supervisor: Hermes Gadelha


Project ID: 10 Contact: hermes.gadelha@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
Molecular motors are force generators which can produce mechanical oscillations at the
cellular scale and play a central role in muscle contraction and cell motility. Insect fibrillar
muscles of wasps and bees are known to exhibit a wing thrust which is asynchronous to the
activating nervous impulses. Skinned skeletal and cardiac muscle fibres have also been shown
to exhibit spontaneous oscillations in vitro under various conditions. Other non-muscular
examples are the periodic motion of cilia and flagella used for cell motility and the
amplification of weak stimuli via oscillatory instabilities of mechanosensory hair bundles.
Early models proposed to understand such systems were grounded on experimental evidence
of muscle contraction and were subsequently generalised to the study of eukaryotic flagella.
In this project, we will build a generic artificial soft-robotic model for the collective dynamics
of molecular motors which generates spontaneous oscillations.

11 Microorganism inspired robots Bio-SoftRobotics (withdrawn)


Supervisor: Hermes Gadelha

Project ID: 11 Contact: hermes.gadelha@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
The idea is to construct a macroscopic bio-inspired robot that is able to swim via travelling
waves of contractions along a flexible tail. Other microorganisms may be used as inspiration,
such as a robo-bacterium, robo-paramecium, point-force robots, a flexible-sheet robot, a 3D-
printed sperm flagellum, a polymorphic bacterium flagellum, or Turing self-organised robots.

12 Worm-like robots (withdrawn)


Supervisor: Hermes Gadelha

Project ID: 12 Contact: hermes.gadelha@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
This project aims to devise a robot capable of moving in porous media. It is a challenging task
to achieve locomotion or propulsion in a porous environment, for example sand. This project
draws inspiration from worms that are capable of efficiently moving in complex environments.

13 Amphibious soft-robot (withdrawn)


Supervisor: Hermes Gadelha

Project ID: 13 Contact: hermes.gadelha@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
Terrestrial organisms evolved by migrating from their aquatic form to a terrestrial form. Very
little is known about how this actually took place over millions of years. The idea of this project
is to devise soft-robots capable of moving in both aquatic and terrestrial environments.

14 Environmental control of microswarms (withdrawn)


Supervisor: Hermes Gadelha

Project ID: 14 Contact: hermes.gadelha@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
Microorganisms swim at the microscale in all random directions. It is a difficult task to control
the swarming motion without externally interfering with their motion in a direct manner. This
project builds on recent developments that allow us to "print" light patterns at the microscale.
This allows local changes in temperature and viscosity of the environment, as well as guiding
phototactic microorganisms (micro-swimmers able to swim toward light), which in turn could
be used to guide and achieve high levels of control of a swarming population of micro-swimmers.

15 Modelling lateral sensory lines in fish (withdrawn)


Supervisor: Hermes Gadelha

Project ID: 15 Contact: hermes.gadelha@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics


Description:
Most fish are capable of sensing pressure waves in the fluid environment via specialised
lateral sensory lines. Recent developments in the Hauert laboratory include the design of the
first artificial sensory lines for robotic swimming. This project aims to tackle this problem using
mathematical modelling and simulation of the predicted behaviour of the sensory lines, built
as flexible filaments which are able to bend and buckle according to the external forces in the
fluid environment. This is an interdisciplinary project aimed at predicting the mechanical
behaviour of the sensory lines for later comparison with the experimental measurements of
the device.

16 Artificial motile cilia


Supervisor: Hermes Gadelha

Project ID: 16 Contact: hermes.gadelha@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
A cilium is a motile cell appendage with a whip-like structure that beats in a coordinated
fashion to drive fluid and propel microorganisms. In the lungs, cilia are responsible for clearing
our airways of mucus. This project aims to fabricate robotic cilia capable of driving fluid in a
coordinated fashion.

17 The artificial soft-robotic sperm flagellum (withdrawn)


Supervisor: Hermes Gadelha

Project ID: 17 Contact: hermes.gadelha@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
The human sperm tail is an intelligent soft material composed of many interacting fibres
connected by elastic springs. This intelligent structure is able to share mechanical information
across long distances instantaneously and adapt under different loading conditions. Most
curiously, under deformation of one part of the tail, the other part can bend in the opposite
direction, something known as the counterbend phenomenon. This project aims to take this
biological architecture to design the next generation of articulated soft robots. The student will
build a soft-robotic sperm tail by just using strips of a piece of paper, from which bending
experiments will be conducted. Images of the different deformations will be compared with
simulations so that the viability of the design may be assessed. This project is based on the
current need for cheap and intelligent materials that are able to communicate mechanical
information and self-adapt under different situations. See for example:
https://edition.cnn.com/2020/07/31/health/sperm-swimming-movement-
wellness/index.html

18 Microorganism inspired artificial swimmers


Supervisor: Hermes Gadelha

Project ID: 18 Contact: hermes.gadelha@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
The idea is to construct a macroscopic bio-inspired robot that is able to swim via travelling
waves of contractions along a flexible tail. Other microorganisms may be used as inspiration,
such as a robo-bacterium, robo-paramecium, point-force robots, a flexible-sheet robot, a 3D-
printed sperm flagellum, or a polymorphic bacterium flagellum. The most innovative part of this
project is that a few "point forces" can be used to mimic the fluid motion of complex moving
objects, so one can avoid constructing the complex structure of the microorganisms. See
for example: https://edition.cnn.com/2020/07/31/health/sperm-swimming-movement-
wellness/index.html

19 Resiliency to Byzantine agents in the best-of-n decision problem


Supervisor: Chanelle Lee

Project ID: 19 Contact: c.l.lee@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
Swarm robotic systems are often claimed to be highly fault-tolerant; however, few works have
considered their resiliency in the presence of Byzantine agents, i.e. agents with benign faulty
or malicious behaviours. In consensus achievement problems, resilience becomes a trade-off
between effectively exploiting information provided by reliable agents while mitigating the
influence of that from faulty or malicious agents. In this project, we consider the best-of-n
decision problem wherein the swarm of agents must choose between n options each with an
associated quality which is received as feedback by members of the swarm. We will use a
probabilistic model of opinion propagation and agent-based simulations to tackle two
questions: (1) What limits should be imposed on agent interactions when there are Byzantine
agents? (2) How can we identify faulty or malicious agents in the system? This project would
suit a student looking for a balance of theoretical work and programming simulations. It will
involve writing large scale simulations in python (or Matlab) as well as developing theoretical
ideas about Bayesian updating in this context. Extensions to the project could include using
more realistic robotic simulation software such as V-rep. The findings of this project will be
useful for improving the security of swarm robotic systems and thus of interest to swarm
robotic engineers and those with a strong interest in reliable multi-agent systems, e.g. Thales.
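One simple mitigation the project could start from is robust aggregation of quality feedback, e.g. a trimmed mean, which bounds the influence of a Byzantine minority. A sketch (the trim fraction and report values below are illustrative assumptions):

```python
def best_of_n(reports, trim=0.2):
    """Choose the best of n options from per-option quality reports,
    scoring each option by a trimmed mean so that a minority of
    Byzantine (arbitrarily wrong) reports cannot drag the estimate
    far from the honest consensus."""
    scores = {}
    for option, values in reports.items():
        vals = sorted(values)
        k = int(len(vals) * trim)                  # reports cut per tail
        kept = vals[k:len(vals) - k] if k else vals
        scores[option] = sum(kept) / len(kept)
    return max(scores, key=scores.get)
```

With two Byzantine agents inflating the worse option and deflating the better one, the trimmed estimate still ranks the options correctly.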

20 Morphing wing UAV development


Supervisor: Shane Windsor

Project ID: 20 Contact: shane.windsor@bristol.ac.uk

Programme: MSc Aerial Robotics

Description:
The way birds morph their wings to achieve manoeuvrable yet efficient flight offers inspiration
for alternative designs for fixed wing uncrewed air vehicles (UAVs). This project will look at
designing and flight-testing UAVs with morphing wings based on research we are conducting
on how birds of prey use wing morphing for flight control.

21 Multimodal robotic design for conservation applications


Supervisor: Tom Richardson

Project ID: 21 Contact: thomas.richardson@bristol.ac.uk

Programme: MSc Aerial Robotics

Description:
Design a robot that can both drive/crawl on the ground as well as fly - without too much
compromise in either mode. Research areas of interest are in the design optimisation and the
control of the vehicle (my background is in control theory).

22 Neural Model Predictive Control


Supervisor: Sebastian East


Project ID: 22 Contact: sebastian.east@bristol.ac.uk

Programme: MSc Aerial Robotics, MSc BioRobotics, MSc Robotics

Description:
This project is an investigation into the use of neural networks for synthesising model predictive
control policies. There are two possible directions to take with this project: either a theoretical
approach that considers stability and constraint satisfaction of the learned controller (would be
suitable for a student with a strong background in mathematics, including real analysis), or an
applied approach looking at the experimental implementation of neural MPC (in either
hardware or simulation).
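The applied direction can be illustrated by imitation learning: solve the control problem offline over sample states, then fit a network to the resulting state-action pairs. In this sketch a clipped linear law stands in for the MPC solver, and only the network's output layer is trained; both are simplifications for illustration:

```python
import numpy as np

def expert_policy(x, k=1.8, u_max=1.0):
    """Stand-in for an MPC solve: for a 1-D integrator with an input
    constraint, the optimal action reduces to a clipped linear law.
    A real pipeline would call an MPC solver here instead."""
    return np.clip(-k * x, -u_max, u_max)

class NeuralPolicy:
    """One-hidden-layer tanh network fitted to expert actions by least
    squares. The hidden layer is random and fixed, so only the output
    weights are 'trained', which keeps the sketch short."""

    def __init__(self, hidden=50, seed=1):
        rng = np.random.default_rng(seed)
        self.w_in = rng.standard_normal(hidden)
        self.b = rng.standard_normal(hidden)
        self.w_out = None

    def features(self, x):
        return np.tanh(np.outer(np.atleast_1d(x), self.w_in) + self.b)

    def fit(self, x, u):
        self.w_out, *_ = np.linalg.lstsq(self.features(x), u, rcond=None)

    def __call__(self, x):
        return self.features(x) @ self.w_out

# Imitation learning: query the "expert" on sample states, fit the net.
x_train = np.linspace(-2.0, 2.0, 200)
policy = NeuralPolicy()
policy.fit(x_train, expert_policy(x_train))
```

The theoretical direction of the project asks what can be guaranteed about stability and constraint satisfaction once the learned `policy` replaces the online solve.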

23 Morphology of a robotic swarm of cubic storage units


Supervisor: Sabine Hauert

Project ID: 23 Contact: sabine.hauert@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
CHALLENGE: Use artificial evolution to optimise the morphology of a robotic swarm of cubic
storage units. Robots can move within a 3D Cartesian grid and also stack on top of other robots.
Optimal morphologies will allow for as many robots to be stored as possible, whilst also
allowing any arbitrary robot to be retrieved using the minimum amount of energy. CONTEXT:
Current shelf-storage systems require shelving infrastructure to be installed before use, which
makes them less suitable for scenarios that need rapidly deployable storage solutions. Instead,
a swarm of robotic cubes, inspired by M-cubes, could function as storage units and self-organise
in a 3D Cartesian grid by stacking on top of one another. The morphology into which the swarm
arranges itself should be optimised to allow any arbitrary robot to be retrieved using the
minimum amount of total energy. However, the number of possible morphologies scales very
quickly with volume (proportional to the factorial of the storage volume!), so an intelligent
approach to exploring the morphological search space is needed. Note that robots that are
stacked on top of a target robot will first need to expend energy to move before the target
robot will be able to move. So, a representative method for ranking and comparing the
efficiency of different morphologies should also be considered.
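An evolutionary loop for exploring this search space might look like the following sketch; the bit-count fitness is a placeholder for the real retrieval-energy objective, and the operators and rates are illustrative assumptions:

```python
import random

def evolve(fitness, genome_len=20, pop_size=30, generations=60,
           mutation=0.05, seed=0):
    """Minimal evolutionary loop: binary-tournament selection plus
    per-bit mutation. `fitness` would score a candidate morphology,
    e.g. by the total energy needed to retrieve an arbitrary robot."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)                  # tournament of two
            parent = a if fitness(a) >= fitness(b) else b
            child = [1 - g if rng.random() < mutation else g
                     for g in parent]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# Placeholder objective: maximise occupied cells. The real project
# would substitute the retrieval-energy cost model here.
best = evolve(fitness=sum)
```

Because the number of morphologies grows factorially with volume, an intelligent search of this kind matters far more than exhaustive enumeration.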


24 Robot swarms for warehouse stacking


Supervisor: Sabine Hauert

Project ID: 24 Contact: sabine.hauert@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
To increase efficiency, more and more warehouses are automated using mobile robots (mostly
centralized systems). To make best use of the space, robots will need to stack items. This project
looks at how a swarm of robots with robot arms can grip, transport, and stack objects in a
warehouse. Experiments will be done in simulation and on the Industrial Swarm Testbed.

25 Swarming microsystems
Supervisor: Sabine Hauert

Project ID: 25 Contact: sabine.hauert@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
This project focusses on the control of microswarms using the Dynamic Optical Micro
Environment described here:
https://www.biorxiv.org/content/10.1101/2020.12.28.424547v1?ijkey=2bc48ed58907b3c6ee
5f72b26fb0b6e2da970930&keytype2=tf_ipsecsha. Focus will be on designing a simulator to
explore interesting swarm behaviours, and testing results on off-the-shelf Volvox (algae).

26 Augmented reality for swarming intralogistics


Supervisor: Sabine Hauert

Project ID: 26 Contact: sabine.hauert@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
This project focuses on the implementation of an augmented reality system using the HoloLens
to monitor and control swarm intralogistics scenarios. This will require programming of the


HoloLens, ROS, and an understanding of how best to interface with the DOTS robots in the Industrial
Swarm Arena at BRL.

27 Robot swarms to monitor Basking Sharks


Supervisor: Sabine Hauert

Project ID: 27 Contact: sabine.hauert@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
Basking sharks have amazing filter systems that allow them to process food. In this project
you will design hardware that could be used to monitor these sharks in the wild, including an
underwater robot platform, or a mechanism to release and image artificial plankton.

28 Swarm Post-its for Brainstorming


Supervisor: Sabine Hauert

Project ID: 28 Contact: sabine.hauert@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
This project explores how a swarm of 100 screen-robots can be used as smart post-its in
brainstorming sessions. Each robot can record the thoughts of a participant, and organise with
other robots to highlight trends.

29 Visual navigation and object tracking for industrial swarm robots performing a stacking task in a warehouse
Supervisor: Sabine Hauert

Project ID: 29 Contact: sabine.hauert@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics


Description:
This project will design a vision system for multi-agent industrial swarm robots with robot arms
that can move around a warehouse and perform object manipulation and stacking tasks. The vision
system shall be based on dynamic SLAM for localisation, mapping, and dynamic object tracking.
The robots can use onboard cameras or the pixel-processor-array-based SCAMP-5 vision sensor for
visual odometry and object tracking. Each swarm robot can have dexterous manipulation
capability using a tactile end-effector called the TacTip. The vision system can be used for
locating and tracking the objects that need to be manipulated, tracking other moving agents, and
navigating the robot's path around the warehouse. Each robot can work together with the other
robots in the swarm to manipulate objects and move around the warehouse alongside
human staff. The experiments shall be performed in simulation and on the industrial swarm
robot (X-puck) testbed.

30 Swarm Spatial Density Estimation


Supervisor: Paul O'Dowd

Project ID: 30 Contact: paul.odowd@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
Swarms of robots are often designed to operate for a specific task and task environment. In
the natural world we can observe a high degree of adaptability of swarm agents across varied
environments. For example, we can observe egg-sorting behaviours of ants within the close
confines of a nest, and foraging behaviours of ants in large spaces outside of the nest. These
two tasks have very different environments. Inside the nest, ants operate alongside each other
with a high spatial density. Outside of the nest, ants forage with a lower spatial density.
However, the ants are able to adapt their behaviour appropriately depending on their
environment. This observation motivates research to find a technological solution for a swarm
of robots to estimate how many robots exist within a task environment. Such information could
be useful for a swarm to adapt or regulate its behaviour across different task environments.
For example, it may be useful for some robots to stop their operation when there are too many
robots within a space. Conversely, a swarm of robots may need to maintain a specific degree
of spatial-density when they are in unbounded environments, such as outdoors. Because this
is a swarm, the algorithm must remain decentralised and self-organising. This study will utilise
a swarm of Pololu 3Pi+ robots fitted with Infra-red (IR) communication boards. The project
should look to implement a form of complementary filter to combine proprioceptive and
exteroceptive information available to the robot, which can then be propagated across the
swarm. For example, messaging frequency between robots may be a useful indication of
swarm density, as well as collisions experienced by individual robots. This study will look to
evaluate how well a collective-belief on spatial-density can be established and communicated
across a swarm. If time permits, demonstration behaviours of maintaining specific spatial-
densities of robots will be evaluated.
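The complementary-filter idea described above could be sketched as follows. This is a minimal illustration only: the function names, the IR encounter-rate measurement, and the blending weight are assumptions for the sketch, not part of any existing 3Pi+ codebase.

```python
# Decentralised density-estimation sketch (all names and values illustrative).

def local_density_estimate(encounters, window_s, sense_area_m2):
    """Convert the number of IR message encounters seen in a time window
    into a rough local density estimate (robots per m^2, per second of
    observation)."""
    return encounters / (window_s * sense_area_m2)

def complementary_update(own_est, neighbour_ests, alpha=0.7):
    """Complementary-filter blend of the robot's own measurement with the
    collective belief relayed by neighbours; alpha weights the local
    measurement against the propagated swarm estimate."""
    if not neighbour_ests:
        return own_est
    collective = sum(neighbour_ests) / len(neighbour_ests)
    return alpha * own_est + (1 - alpha) * collective
```

Each robot would periodically broadcast its blended estimate over the IR boards, so that a collective belief diffuses across the swarm.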

31 Telehaptic avatar
Supervisor: Nathan Lepora

Project ID: 31 Contact: n.lepora@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
This project will investigate how you can immerse yourself into a haptic avatar that relays haptic
sensations from a dexterous robot to a human, inspired by the Shadow Tactile Telerobot and
the ANA Avatar XPRIZE.

32 [co-created] Tactile deep learning for human-like dexterous robots


Supervisor: Nathan Lepora

Project ID: 32 Contact: n.lepora@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
Co-created project with Shipeng Xiong (MSc student).

33 Swarm manipulation in industry


Supervisor: Nathan Lepora

Project ID: 33 Contact: n.lepora@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
In this project, we aim to combine work from two research groups (Tactile dexterity - Lepora,
Swarms - Hauert) to create a novel swarm with dexterous manipulation capabilities. Each robot
in the swarm will have a robot arm, which then team together to form a swarm of tactile grippers
that can manipulate and flexibly move items around the warehouse. This is a completely novel
application of swarms that emerges from leading developments in the two research groups at
Bristol Robotics Laboratory.

Videos:
https://www.toshiba.eu/common/css/ltr/ToshibaNEP/video/Toshiba_Swarm_Robotics_case_study.mp4

34 [Co-created] Next-generation biomimetic optical tactile sensing


Supervisor: Nathan Lepora

Project ID: 34 Contact: n.lepora@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
Co-created project with Xuyang Zhang (MSc student).

35 Neuromorphic tactile perception


Supervisor: Benjamin Ward-Cherrier

Project ID: 35 Contact: b.ward-cherrier@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
Spiking Neural Networks (SNN) are considered the new generation of neural networks that
closely emulate the spike-based information processing undertaken by our own nervous
system. This project will aim to investigate the similarities which may be drawn between
artificial and human sensing, building upon previous work on the optical neuromorphic tactile
sensor (NeuroTac) developed at Bristol. The project will involve adapting the NeuroTac to
investigate its potential as a neuromorphic sensor integrated with spiking neural
networks. This work may result in a greater understanding of tactile sensing and information
processing in biologically plausible models.

36 Tactile feedback for prosthetic hands


Supervisor: Benjamin Ward-Cherrier

Project ID: 36 Contact: b.ward-cherrier@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics



Description:
This project involves exploring tactile sensation in prosthetics, with the aim of providing
high-quality haptic information to amputees. Our system will use Open Bionics' Brunel hand
and provide vibrotactile feedback through a haptic armband. Work on this project will involve
investigating feedback quality and comfort with able-bodied participants. The project could
inform our understanding of tactile sensation and help develop a real-world solution to deliver
feedback to amputees.

37 Gut feeling: Exploring the use of microbes for personality generation in robots
Supervisor: Hemma Philamore, Martin Garrad, Jiseon You

Project ID: 37 Contact: hemma.philamore@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
A microbial fuel cell (MFC) can be described as a biological battery that supplies an electrical
charge due to electrons released through respiration, as anaerobic bacteria break down organic
matter inside the MFC. The electrical output of the MFC depends on the energy content of
the organic matter and the health of the bacterial biofilm, and therefore provides a) the
capability to sense the composition of the organic matter and b) a random individual response
of each MFC due to the self-organization of the bacterial colony. The “gut feeling” project has
so far looked at using this coupled random and deterministic output to build a personality
generator for a robot through simulation. This approach to Artificial Life is a means of
introducing the seemingly random variation observed in nature to artificially created agents.
The student will build on this existing work by developing a proof-of-concept MFC “artificial
gut” of fluidically connected MFCs and a computer program to read the MFC electrical output
and translate this into behaviours such as memory, exploration, and personality.
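As a purely illustrative sketch of the final step above (reading the MFC electrical output and translating it into behaviour parameters), assuming voltage samples are already available as a list; the trait names and the mean/variability mapping are invented for illustration:

```python
import statistics

def personality_from_mfc(voltages, gain=1.0):
    """Hypothetical mapping from MFC output to behaviour parameters:
    the mean voltage (reflecting the energy content of the feedstock)
    sets a deterministic 'exploration' drive, while the sample
    variability (reflecting self-organised biofilm noise) sets an
    individual 'boldness' trait."""
    mean_v = statistics.mean(voltages)
    noise = statistics.pstdev(voltages)
    return {"exploration": gain * mean_v, "boldness": gain * noise}
```

In a real system the voltage stream would come from an ADC reading fluidically connected MFCs, and the traits would modulate the robot's controller.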

38 Bio-inspired underwater navigation in challenging environments


Supervisor: Hemma Philamore, Elliot Scott, Hendrik Eichhorn

Project ID: 38 Contact: hemma.philamore@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics


Description:
Navigation is a key requirement for any autonomous mobile robot. Navigating the underwater
environment can pose several challenges. Occlusion, visual disturbance and low light in deep
water can make vision-based systems difficult to use. Time-of-flight sensing can suffer from
false depth readings caused by light scattering and reflection. Possible solutions, inspired by
the natural world, include pressure sensing that mimics the behaviour of the lateral line, which
allows fish to detect pressure differentials in the surrounding fluid. Sensor fusion may
also be used to optimise the mutual information gain from multiple sensory signals. This project
will investigate novel bio-inspired approaches to improving underwater navigation of confined
environments by autonomous robots.

39 Softly-non-spoken
Supervisor: Hemma Philamore, Alice Haynes

Project ID: 39 Contact: hemma.philamore@bristol.ac.uk

Programme: MSc BioRobotics, MSc Robotics

Description:
Softly non-spoken is a project about using soft robotics to develop non-verbal communication
aids (also known as augmentative and alternative communication (AAC)). AAC refers to a set of
tools or strategies that range from simple paper-based systems to electronic or digital
technologies that enable people to communicate their needs and engage in conversation.
Approximately 300,000 people in the UK have complex communication difficulties that can
benefit from external devices or AAC. People can experience difficulties in communication as
a result of a range of conditions; some acquired at birth or in early childhood, such as cerebral
palsy, others as a result of a medical event or acquired condition that occurs in adulthood, such
as a stroke, brain injury or motor neuron disease. Current AAC devices tend to support the
delivery of specific content or information from the person using the device, rather than
facilitating richer social interaction and dialogue, or allowing for unscripted, spontaneous
communication. This project will explore the potential of soft robotics to deliver superior AAC
by investigating ways to better understand communication through movement, gesture, or
touch, which may also enable more autonomous, unscripted ways of interacting. The project
will run alongside an ongoing EPSRC-funded project and there is potential scope to work with
industry partners on the project which include speech and language therapists and Air Giants
https://www.airgiants.co.uk/.

40 Visual navigation with a Pixel Processor Array


Supervisor: Walterio Mayol-Cuevas


Project ID: 40 Contact: Walterio.Mayol-Cuevas@bristol.ac.uk

Programme: MSc Aerial Robotics, MSc BioRobotics, MSc Robotics

Description:
"This project deals with the development of new computer vision methods suitable for pixel
processor arrays (PPAs). These devices combine sensing and processing and have massive
parallel computation ability by having a processor for every pixel. These devices offer the ability
of very fast processing speeds with reduced energy consumption, but their operation is
substantially different from that of conventional visual pipelines. This means that algorithms that
usually work well for traditional computer vision do not work well on PPAs. In particular the
massively parallel nature of the array benefits from highly parallel computational methods.

In this project the objective is to 1) identify and implement visual algorithms for visual
navigation that are suitable for PPAs and 2) evaluate them on a robot visual navigation setting.

The project will start with a literature review on existing visual navigation methods such as
those that use small images or few bits for representation, and then identify suitable ones for
implementation and/or modification for PPAs. The aim is then to evaluate these algorithms on
a navigation task, by either localising the camera based on the images it observes, or actually
controlling a vehicle based on these images."

41 Visual perception for handheld robots


Supervisor: Walterio Mayol-Cuevas

Project ID: 41 Contact: Walterio.Mayol-Cuevas@bristol.ac.uk

Programme: MSc Aerial Robotics, MSc BioRobotics, MSc Robotics

Description:
In this project the objective is to develop and evaluate methods that are useful for the visual
perception of objects that a handheld robot needs to interact with. The principal objective is
to develop the computer vision models for the recognition of objects of a few classes from the
perspective of a handheld robot. If time permits, this information would then be used to control
the robot for simulated action. One potential scenario is the recognition of
objects to be recycled into categories. For simplicity, the objects will be simulated on a TV screen,
and the handheld robot's position will be determined within a motion tracking system. The
result of this work will develop understanding about what models for visual perception can
work when in cooperation with a human in the loop.


42 Computer Vision with Deep Learning for Wearable or Robotic systems


Supervisor: Walterio Mayol-Cuevas

Project ID: 42 Contact: Walterio.Mayol-Cuevas@bristol.ac.uk

Programme: MSc Aerial Robotics, MSc BioRobotics, MSc Robotics

Description:
We can work with you on ideas for cameras that are carried by people, such as GoPro or
Google Glass type cameras, or mounted on robots, virtual or real.
In my team we have developed computer vision with deep learning for understanding
what people are doing, or how well they are doing it, which can be used to teach robots or
intelligent assistants:
https://www.youtube.com/watch?v=GO5pBQA5PhI&feature=emb_logo
We can also recognise objects or hands. We can work on this topic together to find an
alignment of interests.

43 Visual-based semantic SLAM for navigation in dynamic environment


Supervisor: Walterio Mayol-Cuevas

Project ID: 43 Contact: Walterio.Mayol-Cuevas@bristol.ac.uk

Programme: MSc Aerial Robotics, MSc BioRobotics, MSc Robotics

Description:
"Although current traditional visual SLAM performs well at mapping and 3D
reconstruction in static environments, relocalisation based on feature points alone may not be
enough, so extracting semantic information from the scene may help the system relocalise.
RGB-D cameras are suitable for indoor use, so if we want to use such a camera, we need to
simulate an indoor scene and add dynamic objects or people. Semantic SLAM can use object
detection, semantic segmentation, or instance segmentation to detect landmarks. In addition,
other algorithms can be used, such as an optical-flow-based method to remove the feature points
in segmented dynamic landmarks, because the calculation of the geometric relationship of the
pose assumes a rigid-body environment. After the 3D map is built, the feature points and
semantic information can be used for navigation. Adding other methods of localisation
can also be considered.


RGB-D cameras are poorly suited to outdoor environments due to unstable illumination and weather,
so a stereo (binocular) camera can be used instead to realise semantic SLAM on the KITTI dataset."
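The optical-flow/segmentation step described above ultimately amounts to discarding feature points that fall on dynamic objects before pose estimation. A minimal sketch, assuming a per-frame boolean segmentation mask is available (the function name and interfaces are illustrative, not from any particular SLAM system):

```python
import numpy as np

def filter_dynamic_features(keypoints, dynamic_mask):
    """Drop feature points that land on segmented dynamic objects
    (people, vehicles) so that pose estimation uses only the static,
    rigid part of the scene.

    keypoints   : (N, 2) float array of (x, y) pixel coordinates
    dynamic_mask: (H, W) boolean array, True on dynamic objects
    """
    xs = keypoints[:, 0].astype(int)
    ys = keypoints[:, 1].astype(int)
    keep = ~dynamic_mask[ys, xs]  # mask is indexed (row=y, col=x)
    return keypoints[keep]
```

The surviving static features would then feed the usual epipolar/PnP pose computation.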

44 Visual localisation in the wild


Supervisor: Walterio Mayol-Cuevas

Project ID: 44 Contact: Walterio.Mayol-Cuevas@bristol.ac.uk

Programme: MSc Aerial Robotics, MSc BioRobotics, MSc Robotics

Description:
"Robust localisation is a necessary sub-problem for robots to operate in the real world.
This project is inspired by the fact that animals can effectively navigate dynamic
environments with sparse feedback and in some cases low-resolution sensor data.
The objective of this project is to explore what constitutes good visual feature
representations for place recognition in diverse environments. Some preliminary
questions are:
- How small is too small? The optimal input image size for recognition.
- The role of attention and saliency maps.
- Are semantics relevant for efficient representations?
- Can a vision system recursively explore good representations based on
the model of uncertainty of its predictions?
- Is place recognition with directions enough for navigation? "

45 [co-created] Occluded face recognition


Supervisor: Wenhao Zhang

Project ID: 45 Contact: wenhao.zhang@uwe.ac.uk

Programme: MSc Robotics

Description:
While generic face recognition techniques are gaining accuracy and reliability, the
challenges brought by large areas of occlusion still have a significant impact. For example,
numerous face recognition systems fail to detect or recognise a face when a person is
wearing a face mask. This project therefore investigates a deep neural network to recognise
occluded faces. Appropriate data augmentation methods will be used to simulate severe face
occlusions.
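One simple form of the augmentation mentioned above is to black out the lower part of each face image to mimic a surgical mask. A minimal sketch with an invented function name and parameters (real pipelines would use more varied occluder shapes and textures):

```python
import numpy as np

def simulate_mask_occlusion(face, fraction=0.5):
    """Hypothetical augmentation: occlude the lower `fraction` of a
    face image (H x W x C array) to mimic a face mask before training.
    Returns a new array; the input is left untouched."""
    occluded = face.copy()
    h = face.shape[0]
    occluded[int(h * (1 - fraction)):, :] = 0
    return occluded
```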


46 [co-created] Biometric recognition from eye movements


Supervisor: Wenhao Zhang

Project ID: 46 Contact: wenhao.zhang@uwe.ac.uk

Programme: MSc Robotics

Description:
Eye movement patterns are a novel type of biometric data that is unique to an individual
and more spoof-proof than conventional biometrics such as faces and fingerprints.
In this project, different machine learning models will be designed and compared to explore
spatial-temporal features that can encode the uniqueness of eye movement patterns of
individuals. Publicly available datasets will be employed to validate experiment results.

47 [co-created] Eye tracking in reality and virtual reality


Supervisor: Wenhao Zhang

Project ID: 47 Contact: wenhao.zhang@uwe.ac.uk

Programme: MSc Robotics

Description:
Conventional psychological and medical studies using an eye tracking device take place in a
real-world environment, where a person's eye movements can be limited by physical
constraints such as the size of the screen. When a virtual reality environment is created around
a user, the immersive experience and flexible setting have the potential to induce different user
behaviours, which may provide significant cues that do not exist in a real-world environment. In
this project, a VR headset is employed and compared to a webcam based eye tracker in
exploration of the additional advantages that a VR eye tracker can offer.

48 [co-created] Depth estimation from a single camera


Supervisor: Wenhao Zhang

Project ID: 48 Contact: wenhao.zhang@uwe.ac.uk

Programme: MSc Robotics


Description:
The ability to estimate the depth of an object using a single camera can be a cost-efficient
alternative to different types of 3D cameras. This project investigates a method that uses an object
whose size is automatically detected by a deep learning model to calibrate the estimated depth.
Comparison to alternative approaches and existing 3D cameras will be carried out.
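The underlying geometry is the pinhole model: an object of known physical width W that appears w pixels wide under focal length f (in pixels) lies at depth Z = f·W/w. A minimal sketch of that calibration step (names are illustrative):

```python
def depth_from_known_width(focal_px, real_width_m, pixel_width):
    """Pinhole-camera depth estimate: an object of known physical width
    `real_width_m` appearing `pixel_width` pixels wide in an image taken
    with focal length `focal_px` (in pixels) lies at Z = f * W / w."""
    return focal_px * real_width_m / pixel_width
```

For example, a 0.5 m wide object spanning 100 px at f = 800 px lies roughly 4 m from the camera; the deep learning model's role in the project is to detect the object and supply `pixel_width` automatically.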

49 Oscillatory Neural Networks for Robotic Manipulations


Supervisor: Roshan Weerasekera

Project ID: 49 Contact: roshan.weerasekera@uwe.ac.uk

Programme: MSc Robotics

Description:

50 Modelling of Stiff Behaviours in Spacecraft Systems with Model Based Systems Engineering (MBSE) Framework
Supervisor: Yaseen Zaidi

Project ID: 50 Contact: yaseen.zaidi@uwe.ac.uk

Programme: MSc Robotics

Description:
The space engineering community prefers the model-based systems engineering (MBSE)
methodology for systems modelling. MBSE largely uses discrete-event models for
abstracting behaviours. More often than not, the desired physical behaviours in space
applications are stiff, with highly non-linear dynamics that are difficult to model in discrete-
event simulators. This work aims to co-model/co-simulate continuous-time behaviours in an
MBSE framework (hybrid discrete-event and continuous-time). Problems such as Robertson's
kinetics of different time-constants, bouncing ball root detection, Van der Pol non-steady-state
and Jiles-Atherton hysteresis loops will be taken to be simulated with a differential-algebraic
solver integrated into SysML activity diagrams developed in an MBSE tool (Modelio,
MagicDraw, Cameo, Capella, Enterprise Architect). Optionally, orbit propagators in AGI Systems
Tool Kit may be integrated. http://sysml-models.com/spacecraft/models.html The expectation
is to share the results with the INCOSE UK MBSE Interest Group and publish them in the IEEE
Systems Journal or Wiley Journal of Systems Engineering.
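As a minimal, self-contained illustration of why stiff behaviours defeat explicit discrete-step simulation and call for implicit (differential-algebraic style) solvers: this is a toy linear test equation, not the Robertson or Jiles-Atherton problems named above, and is independent of any MBSE tool.

```python
import math

# Stiffness in one line: y' = -lam*(y - sin t) + cos t has the smooth
# solution y = sin t, but explicit Euler is only stable for h < 2/lam,
# while backward (implicit) Euler remains stable for any step size.
def backward_euler(lam=1000.0, h=0.01, t_end=1.0):
    t, y = 0.0, 0.0
    while t < t_end - 1e-12:
        t += h
        # The implicit update is solved in closed form for this linear RHS:
        # y_new * (1 + h*lam) = y_old + h*(lam*sin t + cos t)
        y = (y + h * (lam * math.sin(t) + math.cos(t))) / (1.0 + h * lam)
    return y  # should closely track sin(t_end)
```

Here h = 0.01 is five times larger than the explicit stability limit 2/lam = 0.002, yet the implicit step stays accurate; the named problems would need a Newton iteration or a DAE solver instead of the closed-form division.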


51 Rover Mission Design, Analysis and Telemetry Visualisation with OpenMCT


Supervisor: Yaseen Zaidi

Project ID: 51 Contact: yaseen.zaidi@uwe.ac.uk

Programme: MSc Robotics

Description:
The student should understand the current research and open problems in the OpenMCT rover
mission simulator, and agree with the supervisor on the software development of interesting
mission scenarios for improving telemetry collection, terrain navigation, imagery, etc., such as
the Perseverance rover capturing the Mars Ingenuity rotorcraft in flight.

52 Steerable Thrusters on a CubeSat


Supervisor: Yaseen Zaidi

Project ID: 52 Contact: yaseen.zaidi@uwe.ac.uk

Programme: MSc Robotics

Description:
Agility in maintaining the orbit or attitude of a CubeSat spacecraft may be achieved by avoiding
fixed thrusters. A mechanism steerable in 2 or more degrees of freedom would orient the
thruster and control the direction of thrust. Possible technologies are gimbals, the linear actuators
of a Stewart platform, or one-way or two-way shape memory alloys. The task is to develop one
of these mechanisms to change the thrust vector of a minimum 3U CubeSat inertia (~4 kg).

53 Digital Factory Twinning in VR Environment


Supervisor: Yaseen Zaidi, Chris Toomer

Project ID: 53 Contact: yaseen.zaidi@uwe.ac.uk

Programme: MSc Robotics

Description:
Industrial partner: CIMPA PLM. The project aims to solve several problems raised in detection
and imaging with a VR headset in the digital factory. The digital factory twin is a reference solution


that simulates the manufacturing environment that creates the product and implements bi-
directional data exchange between real and simulated environments. The solution addresses
end-to-end integration by gathering data coming from IoT layers (sensors, systems, smart tools,
smart objects) and connecting it to the Product Lifecycle Management (PLM) process with
configuration management and Product Data Management (PDM) software. The student
should be willing to travel to the partner premises in Filton.

54 GANs for synthetic art production


Supervisor: Mark Hansen

Project ID: 54 Contact: mark.hansen@uwe.ac.uk

Programme: MSc Robotics

Description:
There has been a lot of interest in recent years in bridging the gap between artists and
computer scientists in the area of art generation. The project itself is very open-ended, and
could look, for example, at generating specific types of objects for inclusion in larger images, or
how to combine them in an aesthetically pleasing way, or even creating a robot that can
interpret instructions and generate something new.

55 Machine Vision in Agriculture


Supervisor: Mark Hansen

Project ID: 55 Contact: mark.hansen@uwe.ac.uk

Programme: MSc Robotics

Description:
There are many possible projects in this domain, such as: developing a means of performing in-field
phenotyping; automated cow or pig recognition; in-field potato measurement using a portable
rig via 3D reconstruction/photogrammetry.

56 CNNs learning to ignore


Supervisor: Mark Hansen

Project ID: 56 Contact: mark.hansen@uwe.ac.uk



Programme: MSc Robotics

Description:
CNNs are excellent classifiers, but there are cases where they might learn to classify objects
correctly based on incorrect information, for example when the training set contains cues that
the validation set does not. Is there a way to force the network to instead find other features
that are reliable?

57 Insect sexing and counting


Supervisor: Mark Hansen

Project ID: 57 Contact: mark.hansen@uwe.ac.uk

Programme: MSc Robotics

Description:
Insect farming is likely to become more common in the future as a source of livestock feed.
Finding ways of optimising conditions for the insects in order to automate the process is likely
to be extremely useful. Sexing and counting the insects (such as crickets) are two such areas
that are important parameters to measure and seem to lend themselves to machine vision, but
there are many others.

58 BRL Reception Support Robot: Long-term operation and visitor interaction


Supervisor: Manuel Giuliani

Project ID: 58 Contact: manuel.giuliani@uwe.ac.uk

Programme: MSc Robotics

Description:
The aim of this dissertation project is to investigate a robot that supports the reception at the
Bristol Robotics Laboratory (BRL). The tasks and specification of the robot are as follows:

• Interacting with the BRL receptionist, visitors, lab staff and students, e.g. by displaying
short spoken messages.
• Leading visitors to spaces in BRL.
• Assisting visitors with the registration process at BRL.
• The robot will have a touch screen that acts as input modality for humans to give
commands to the robot.


• The main output of the robot will be spoken language through a text-to-speech system.
• The robot needs to be able to run long-term, ideally it should be running during office
hours.
• The robot needs to monitor its own state and e.g. be able to return to its charging
station (or alternatively alert a human to attach it to a charging station).
• Optionally, the robot should also be able to produce usage statistics (e.g. how long it is
running each day, how many interactions and type of interactions it had during the day).

The skills needed to be able to complete this project are as follows:


• Knowledge in ROS and Python to program the robot.
• Knowledge in simultaneous localization and mapping (SLAM) and path planning.
• Knowledge in software engineering tools, e.g. Git for code versioning, tools to create
code base documentation.

59 Swarm dynamic task allocation


Supervisor: Matthew Studley

Project ID: 59 Contact: Matthew2.Studley@uwe.ac.uk

Programme: MSc Robotics

Description:
A set of tasks, such as pruning plants in a horticulture application, needs to be accomplished
by a swarm of robots that have minimal communications and no central controller. This project
will investigate the mechanisms and communications requirements for a swarm of simulated
robots to dynamically self-allocate to optimise time and motion efficiency, using a variety of
biologically inspired approaches.

60 Remote Robotics Outreach


Supervisor: Matthew Studley

Project ID: 60 Contact: Matthew2.Studley@uwe.ac.uk

Programme: MSc Robotics

Description:
Robots will have an enormous impact on all our lives, and it is vital that all sectors of our
society can shape, and benefit from, this new technology.


Kids love robots. In this project you will develop and run an outreach activity, monitoring its
success remotely using robots that we will deliver to the school.

The purpose of the project will be to assess the impact of role models on children's perceptions
of robotics and STEM as a career; can we widen the discipline to include more female and BAME
voices?

For further information about this project, which will take place alongside DETI
(https://www.digicatapult.org.uk/for-large-businesses/commercial-solutions/deti-digital-engineering-technology-and-innovation), see:
https://docs.google.com/document/d/1ZhuZOWMsVihOVGmGG4kXKaoLaiIF3hgCQzFabvCCG34/edit?usp=sharing

61 Robotic Assisted Bending of Ceramic Extrusions


Supervisor: Tavs Jorgensen

Project ID: 61 Contact: tavs.jorgensen@uwe.ac.uk

Programme: MSc Robotics

Description:
This project will contribute to the development of a novel robotic system to shape extruded
clay sections, with the aim of creating curved forms that correspond to the dimensions of
CAD designs. We are looking to use Universal Robots arms for the project, but are also open to the
possibility of using other types of computer-controlled actuators. The research is located at
UWE's Centre for Fine Print Research (CFPR) and is part of an externally funded project.

The main challenge is likely to be the development of bespoke code and scripts that can
translate CAD drawings into machine control data and facilitate the production of physical
pieces in corresponding forms. The project may also involve the development of the physical
actuators, electronics, mechanics and CAD. The development of the system is likely to involve
notions of Control Theory and Compliance. The coding environment is not predetermined, but is
likely to involve URScript, Python and Rhino Grasshopper.

62 Development of Robotic 3D printing for Architectural Ceramics


Supervisor: Tavs Jorgensen


Project ID: 62 Contact: tavs.jorgensen@uwe.ac.uk

Programme: MSc Robotics

Description:
This project is focused on the further development of 3D printing with ceramic paste for large
scale ceramic parts for architectural applications. The project will look to create bespoke code
and scripting for the control of complete 3D printing systems, and potentially also the creation
of bespoke hardware and practical tests with the system.

63 An Industry 4.0 demonstrator for product storage systems


Supervisor: Farid Dailami

Project ID: 63 Contact: Farid.Dailami@uwe.ac.uk

Programme: MSc Robotics

Description:

64 Hand over from robot to humans in an industrial context


Supervisor: Farid Dailami

Project ID: 64 Contact: Farid.Dailami@uwe.ac.uk

Programme: MSc Robotics

Description:

65 Vision assisted Collaborative manufacture of jewelry


Supervisor: Farid Dailami

Project ID: 65 Contact: Farid.Dailami@uwe.ac.uk

Programme: MSc Robotics


Description:
The project intends to create an industrial collaborative solution for the manufacture of jewelry.
The aim of the project is to deploy a collaborative robot with a vision system (ABB YuMi) for
the manufacture of jewelry, assisting the associate by performing tasks like grinding, polishing
and quality checks alongside them. The project targets rings as the form of
jewelry for manufacturing purposes.

66 Path generation for repairing a damaged part


Supervisor: Farid Dailami

Project ID: 66 Contact: Farid.Dailami@uwe.ac.uk

Programme: MSc Robotics

Description:
Main aim:

The goal of this project is to generate a welding path for repairing a damaged aerospace part. The part is first scanned using a vision system, which must locate the damaged region and generate a welding path to restore the part to its original shape.

Background:

The aerospace part operates in a very harsh environment, particularly because of a very strong air stream. The part is very expensive to build from scratch and much cheaper to repair. The existing repair process is still carried out by manual welders and can be optimized using robotic technology. There are three main ways to program a robot; the first two, off-line programming and teaching-playback, are not effective for the repair process because parts may differ in damage shape and location. Thus, welding path generation using a vision system needs to be designed for this process.

67 Support removal in metallic additive layer manufacturing


Supervisor: Farid Dailami

Project ID: 67 Contact: Farid.Dailami@uwe.ac.uk

Programme: MSc Robotics


Description:

68 Autonomous painting robot in construction industry


Supervisor: Farid Dailami

Project ID: 68 Contact: Farid.Dailami@uwe.ac.uk

Programme: MSc Robotics

Description:

69 Robotic cutting of large oil platform structures


Supervisor: Farid Dailami

Project ID: 69 Contact: Farid.Dailami@uwe.ac.uk

Programme: MSc Robotics

Description:

70 Classification of waste using machine learning


Supervisor: Farid Dailami

Project ID: 70 Contact: Farid.Dailami@uwe.ac.uk

Programme: MSc Robotics

Description:
This project aims to automate the final sorting stage of waste recycling, in order to provide a cheaper, more efficient alternative to a task which is commonly completed by a team of human labourers. The project will first require accurate real-time classification of the different waste materials present on the picking line, overcoming challenges such as occlusion between objects. To achieve this, a variety of different sensors will most likely have to be used; this will form the initial research and literature review, with novel sensor solutions also being considered. Once objects are classified, a robotic system will be designed and implemented to move them, across various weights, shapes and sizes, to their respective goal points; different manipulators will have to be considered and the performance of different end-effector tools on different types of objects analysed.

71 Motor operation prototype


Supervisor: Farid Dailami

Project ID: 71 Contact: Farid.Dailami@uwe.ac.uk

Programme: MSc Robotics

Description:
"DiGiPulse Ltd is a start-up company aiming to position itself as a specialist technology leader
in the market for high-efficiency motor software and hardware. It will offer clients software
and hardware that improves motor efficiency, especially in part-load operation, commensurate
with new system-level control systems being adopted into domestic and mid-sized commercial
plants (refrigeration, HVAC, and process industries)
This project will deliver a first physical prototype to demonstrate improved part-load
operation of a motor, using DiGiPulse’s control software - and will provide a back-to-back
comparison of a commercially available PWM-controlled motor, with and without DiGiPulse’s
control software, in at least one operating condition.
The project is seeking a researcher capable of implementing DiGiPulse control logic onto a
hardware platform (Digital Signal Processor or simple PC, e.g. Raspberry Pi) and interrupting
the control signals of a commercially available PWM-controlled motor at the point of GDA input
(Gate Driver Array).
This researcher will also need to turn and calibrate the self-built motor controller (for at least
one operating point), so that the controller’s performance can be assessed on a like-for-like
basis with the fully tuned and calibrated commercial motor that constitutes the baseline for
this study."

72 Sparse training of a Deep Predictive Coding network


Supervisor: Martin Pearson

Project ID: 72 Contact: martin.pearson@brl.ac.uk

Programme: MSc Robotics

Description:
"Deep learning has revolutionized data science and is transforming many aspects of robotics
particularly perception. However, it has fundamental algorithmic differences to known brain
function and can not serve as a complete model of biological significance. Neurorobotics is an
40
BRL MSc Robotics 2022 January 17, 2022

area of robotics research that is motivated toward understanding the brain through the
evaluation of situated, robotic embodiment of brain models. For this we have been
investigating the next generation of deep machine learning algorithms, namely, Deep Predictive
coding Networks (DPN) that have a closer affinity to biologically plausible algorithmic models
of the cortical microcircuit. DPN are similar to Variational AutoEncoders in that they learn a
generative model without extensive labelled training sets, instead using the patterns and
structure of the sensory input itself.

In this project you will be evaluating the performance of a DPN at learning a simple 1D state
estimation problem (head direction) from visual scenes taken from a mobile robot. The primary
research question will be to find the minimum size of training set that is required to maintain
a baseline performance. This could involve filtering the visual scene for salient features
(landmarks), influencing the behavior of the robot during learning, or perhaps a revision of the
network learning rules themselves. The secondary research question will be to evaluate how
new environments can be accommodated into this model without incurring catastrophic
forgetting of previously learnt environments.

This will be a software based project using the Tensor Flow library for Deep learning and a ROS
controlled Gazebo simulation of a biomimetic robot ""WhiskEye"" to gather data sets for
training and validation. Existing code for DPN will be provided along side starter data sets for
familiarization and initial experimentation. You will be expected to collect your own data sets
from the simulator and to modify the controller for the robot as necessary. Proficiency in the
Python programming language is essential and familiarity with ROS would be beneficial."
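The predictive-coding idea behind DPNs can be illustrated with a toy single-layer sketch in NumPy (a Rao-and-Ballard-style model, not the project's TensorFlow codebase; all sizes and learning rates are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy predictive coding layer: a latent state r predicts the input x through
# generative weights W; r and W are both updated to reduce prediction error.
n_in, n_latent = 16, 4
W = rng.normal(scale=0.1, size=(n_in, n_latent))

def infer(x, W, steps=50, lr=0.1):
    """Settle the latent state r by gradient descent on ||x - W r||^2."""
    r = np.zeros(n_latent)
    for _ in range(steps):
        err = x - W @ r          # prediction error at the input layer
        r += lr * (W.T @ err)    # error drives the latent state
    return r

def learn(X, W, epochs=100, lr=0.05):
    """Hebbian-style weight update driven by the residual prediction error."""
    for _ in range(epochs):
        for x in X:
            r = infer(x, W)
            err = x - W @ r
            W = W + lr * np.outer(err, r)
    return W

# Train on a few random patterns; reconstruction error should shrink.
X = rng.normal(size=(5, n_in))
err_before = np.mean([np.linalg.norm(x - W @ infer(x, W)) for x in X])
W = learn(X, W)
err_after = np.mean([np.linalg.norm(x - W @ infer(x, W)) for x in X])
```

A DPN stacks many such layers, with each layer predicting the activity of the one below, so only the unexplained residual propagates upward.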

73 [co-created] Encoding robot motion using spike based hardware


Supervisor: Martin Pearson

Project ID: 73 Contact: martin.pearson@brl.ac.uk

Programme: MSc Robotics

Description:
"Neuromorphic hardware has the potential to revolutionize AI and robotics in the future
through its greatly reduced energy requirement and brain inspired, massively parallel compute
architecture. To best leverage this hardware we look to neuroscience for inspiration, here we
have learnt that information is represented through myriad spatio-temporal spike based
encoding schemes which have yet to be fully understood. One of the research projects in the
BRL is developing a spike based model of the rodent spatial navigation system to test
hypotheses from biology in an embodied, situated biomimetic robotic model of the rat. One of
the inputs to this model comes from an analogue measure of the robot's own motion which
we convert into a spike rate through a simple transfer function. We have found that this naive
""hack"" is insufficient for stable localization outside of a limited range of velocities when
applied to our spiking neural network model.

41
BRL MSc Robotics 2022 January 17, 2022

In this project you will address 2 research questions; first, how best to encode robot motion
into spikes that can represent and respond to a broad range of velocities; and secondly, how
can that encoding be adapted in response to systematic changes in the characteristics of robot
motion over time. The second question leads from results taken from animal behaviour
experiments that demonstrate how rats modify their representation of self motion in response
to environmental changes.

This will be a primarily software based project using tools such as NEST, a software spiking
neural network simulator, and a ROS controlled Gazebo simulated model of the whiskered
biomimetic robot called ""WhiskEye"". You will be working with researchers on the project who
will provide test data sets and guidance to understand the prior model. Proficiency in the
Python programming language and a keen interest in neuroscience is essential."
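The naive transfer-function "hack" described above is, in essence, a rate code. A minimal sketch (with illustrative parameters, not the project's NEST model) shows the idea and its saturation problem:

```python
import numpy as np

rng = np.random.default_rng(42)

def poisson_spikes(velocity, dt=0.001, t=1.0, gain=50.0, v_max=2.0):
    """Naive rate code: map |velocity| linearly to a Poisson firing rate.
    The rate saturates at v_max, which is one source of the limited
    velocity range such a scheme can represent."""
    rate = gain * min(abs(velocity), v_max) / v_max   # spikes per second
    n_steps = int(t / dt)
    return rng.random(n_steps) < rate * dt            # boolean spike train

slow = poisson_spikes(0.2).sum()   # ~5 Hz
fast = poisson_spikes(1.8).sum()   # ~45 Hz
```

Velocities above `v_max` all produce the same rate and so become indistinguishable, one motivation for exploring richer population or temporal codes in this project.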

74 Olfaction for place recognition


Supervisor: Martin Pearson

Project ID: 74 Contact: martin.pearson@brl.ac.uk

Programme: MSc Robotics

Description:
The aim is to integrate olfactory cues into a multi-sensory place recognition system based on the principles of predictive coding neural networks. This project will feed into a model of the cortico-hippocampal loop as the basis for spatial cognition in mammalian brains and as the inspiration for next-generation spike-based robotic control.

75 Driver intent prediction for smart driving assistance


Supervisor: Ning Wang, Tony Pipe

Project ID: 75 Contact: Ning2.Wang@uwe.ac.uk

Programme: MSc Robotics

Description:
Driving is a complex, dynamic and time-varying process, and drivers vary in driving habits and operating style. In order to enhance mutual communication and encourage cooperation between a driver and the vehicle, we are motivated to build a smart driving assistance system with self-learning capability, built on sensing, signal processing techniques and intelligent learning methodology. An intelligent vehicular system would have to infer the intentions of the driver and help or intervene only when needed. For the development of a novel smart assistance system we need a prediction mechanism that combines information from both sources: vehicle and user. Multiple information sources, e.g., steering forces, speed and position of the vehicle, will be combined to analyse the state of the car and the intent of the driver in this study.

76 Generalizable driving skills transfer from driver to smart vehicle


Supervisor: Ning Wang, Tony Pipe

Project ID: 76 Contact: Ning2.Wang@uwe.ac.uk

Programme: MSc Robotics

Description:
In achieving the ultimate goal of autonomous driving, it is expected that at the current stage smart vehicles can assist the human driver in part of the driving task when needed, namely co-driving. In this project, we will look at the skills that could be transferred from the human driver to the smart vehicle as part of a modern driving assistance strategy. Endowing the vehicle with human-like driving skills can not only improve autonomous driving performance, but also make the co-driving experience more user-friendly. Motor skills such as steering motions, and force-related skills such as pressing the pedals, could be considered as the driving skills to be transferred. It is also expected that the learned skills can be generalized to broader scenarios when necessary, meaning the smart vehicle can flexibly adopt different yet similar skills according to varying driving environments.

77 Remote assistance for smart vehicles


Supervisor: Ning Wang, Tony Pipe

Project ID: 77 Contact: Ning2.Wang@uwe.ac.uk

Programme: MSc Robotics

Description:
Teleoperation has already been used to explore the world's oceans and defuse bombs. Amid the COVID-19 crisis, however, teleoperation, whether taking control of a vehicle remotely or offering indirect "remote assistance", could take on greater importance, as it minimizes social contact. However, remotely controlled vehicles rely heavily on a high-speed cellular connection for long durations, which is never 100% stable. Remote assistance offers an alternative approach: let the vehicle drive autonomously and only intervene when there is a need, e.g., when a lane is blocked as a result of a road accident. In this project, we are going to explore a remote assistance method that can detect and identify conditions that are unknown and unsolvable by self-driving, and then issue an alarm to the remote standby human operator to provide the necessary assistance. This involves issues such as scenario analysis, object detection and distance estimation.

78 Mobile robot for elderly care in home environments


Supervisor: Ning Wang, Chenguang Yang

Project ID: 78 Contact: Ning2.Wang@uwe.ac.uk

Programme: MSc Robotics

Description:
Elderly people are often not able to perform all activities of daily life without the help of caregivers and face a higher risk of experiencing a medical emergency in unattended situations. Therefore, they usually have to move to assisted living facilities where they are looked after by nursing staff. Due to the limited number of nursing staff and the increasing costs of such facilities, many research groups use modern technologies to assist elderly people at home. The aim is not only to lower nursing costs but also to increase the quality of life of senior citizens. In this project, an indoor mobile robot will navigate autonomously through narrow corridors and closely placed furniture in the living environment. It will be able to follow the elderly person at home, detect emergencies such as falls, and be remotely controlled for simple in-house tasks such as delivering small items.

79 Markerless 3D Human Motion Capture for Home Environments


Supervisor: David Western

Project ID: 79 Contact: david.western@uwe.ac.uk

Programme: MSc Robotics

Description:
Recently developed computer vision tools such as OpenPose allow detailed human pose
information to be automatically extracted from ordinary videos. This project aims to build on
such tools for human health applications. Specific tasks to focus on could include automated
extrinsic calibration, fall detection, monitoring of movement disorders, or others.
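As one example of building on pose output, a deliberately simple fall-detection heuristic might threshold the downward velocity of a single hip keypoint; the track format and threshold below are assumptions for illustration only, not a validated detector:

```python
import numpy as np

def detect_fall(hip_y, fps=30.0, drop_thresh=1.2):
    """Toy fall heuristic on a 1-D track of hip height (normalised image
    coordinates with y increasing downward, as in OpenPose output).
    Flags a fall when the hip descends faster than drop_thresh
    (frame-heights per second). A real system would fuse multiple
    keypoints and apply temporal smoothing."""
    v = np.diff(hip_y) * fps          # downward velocity per second
    return bool(np.any(v > drop_thresh))

# Synthetic tracks: slight sway while standing vs. a rapid drop.
standing = np.linspace(0.5, 0.52, 60)
falling = np.concatenate([np.full(30, 0.5), np.linspace(0.5, 0.9, 10)])
```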

80 Augmented Reality System to Guide Electrode Placement for Clinical Electrophysiology

Supervisor: David Western

Project ID: 80 Contact: david.western@uwe.ac.uk

Programme: MSc Robotics

Description:
Various important medical diagnostic and therapeutic procedures involve measuring electrical
activity generated by the brain, nerves, muscles, or heart. The reliability and quality of results
in these procedures depends largely on the accuracy in placing electrodes over anatomical
regions of interest. Conventional practice for measuring these target locations by hand is time
consuming and error prone. In this project, the student will develop an Augmented/Virtual
Reality system that automatically identifies target locations on the patient’s body and enables
the clinician to ‘see’ these targets superimposed on the body as they position the electrodes,
so that they can be positioned quickly and accurately.

81 Analysis of Brain Networks to Quantify Consciousness and Cognitive Function

Supervisor: David Western

Project ID: 81 Contact: david.western@uwe.ac.uk

Programme: MSc Robotics

Description:
Graph theory and network analysis represent a fast-growing area of mathematics, with
numerous applications including the analysis of brain activity. In this project, the student will
develop software to apply existing and/or novel measures of network structure and activity to
real and/or simulated brain networks. The project could lead to much needed improvements
in our ability to assess levels of consciousness during general anaesthesia (i.e. during surgery)
or to monitor changing cognitive functioning in conditions such as dementia. Improving our
ability to measure such traits is crucial in enabling improvements in how we treat them.
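As an example of the kind of network measure involved, the sketch below computes global efficiency, a common integration measure in brain-network analysis, for a small binary graph; it is illustrative only and uses no neuroimaging data:

```python
import numpy as np

def global_efficiency(adj):
    """Global efficiency of a binary undirected graph: the mean of 1/d(i,j)
    over all node pairs, with shortest paths d from Floyd-Warshall. Higher
    values indicate more integrated information flow across the network."""
    n = adj.shape[0]
    dist = np.where(adj > 0, 1.0, np.inf)
    np.fill_diagonal(dist, 0.0)
    for k in range(n):                       # Floyd-Warshall relaxation
        dist = np.minimum(dist, dist[:, [k]] + dist[[k], :])
    inv = 1.0 / dist[~np.eye(n, dtype=bool)]  # exclude the diagonal
    return inv.mean()

# A 4-node ring vs. a fully connected 4-node graph.
ring = np.roll(np.eye(4), 1, axis=1) + np.roll(np.eye(4), -1, axis=1)
full = np.ones((4, 4)) - np.eye(4)
eff_ring = global_efficiency(ring)
eff_full = global_efficiency(full)
```

The fully connected graph attains the maximum efficiency of 1.0, while the ring scores lower because opposite nodes are two hops apart.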

82 Deep Learning for Clinical Neurophysiology


Supervisor: David Western


Project ID: 82 Contact: david.western@uwe.ac.uk

Programme: MSc Robotics

Description:
The NHS’s neurophysiology departments provide vital diagnostic and prognostic information
that informs the treatment of a wide range of conditions, including neurodegenerative
diseases, traumatic brain injuries, stroke, and epilepsy to name only a few. A particular
investigative tool, electroencephalography (EEG), is well established as a rich source of
neurophysiological information. However, the extent of current use of this tool within the NHS
falls well short of guidelines, due to a shortage of trained neurophysiologists to interpret the
data. Researchers have begun to attempt to develop AI systems to support neurophysiologists
in their work, but there is much progress to be made. In this project, the student will aim to
advance the state-of-the-art, working with an open-access EEG dataset.
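As a concrete example of a classical EEG feature that deep models are often benchmarked against, band power can be computed from an FFT periodogram; the signal below is synthetic, not patient data:

```python
import numpy as np

def band_power(eeg, fs, band):
    """Mean spectral power of a 1-D EEG epoch within a frequency band (Hz),
    estimated via the FFT periodogram - a standard hand-crafted feature
    used as a baseline before deep models."""
    freqs = np.fft.rfftfreq(len(eeg), 1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2 / len(eeg)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

fs = 250.0                                   # typical EEG sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
alpha_epoch = np.sin(2 * np.pi * 10.0 * t)   # synthetic 10 Hz 'alpha' rhythm
```

For this 10 Hz signal, power in the alpha band (8-13 Hz) dominates power in the beta band (13-30 Hz), the sort of contrast a classifier would exploit.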

83 Intelligent Infrared Beamforming: the Future of Home Heating


Supervisor: David Western

Project ID: 83 Contact: david.western@uwe.ac.uk

Programme: MSc Robotics

Description:
The aim of this project is to explore the feasibility of a new concept for domestic home heating:
infrared panels that can selectively focus their energy on room occupants and/or cold spots,
e.g. where mould is likely to form. This could be achieved in various ways, such as ‘phased
arrays’ for beamforming and readily available computer vision software (e.g. OpenPose) for
human motion tracking. With the addition of facial recognition or other ID techniques, it could
even be possible to allow different occupants of a single household to experience different
temperatures according to their preferences or health requirements.

84 Where Are My Glasses? Machine Vision and Speech Interaction in the Smart Home to Support People with Dementia

Supervisor: David Western

Project ID: 84 Contact: david.western@uwe.ac.uk


Programme: MSc Robotics

Description:
In this project, the student will build upon a previous project that combined object recognition (e.g. YOLO) and speech interaction (e.g. SNIPS) software tools to develop a system that provides useful information about the home for people with memory problems. For example, if someone asks the smart home "Where did I leave my reading glasses?", the home should be able to reply based on tracking the location of those glasses with cameras around the house ("I see them. They're on your head."). More advanced variants of this system could incorporate activity recognition algorithms to provide richer contextual information.

85 Development of an Interactive Projector to Provide Ubiquitous Graphical Interfaces for Smart Homes

Supervisor: David Western

Project ID: 85 Contact: david.western@uwe.ac.uk

Programme: MSc Robotics

Description:
Tablets, smartphones, and audio assistants such as Amazon’s Alexa have made it easier than
ever to access information quickly from anywhere in the home, but there is room for
improvement. For example, if you’re following a recipe in the kitchen, the audio version can
be difficult to navigate intuitively while the on-screen version risks getting flour and oil all over
your precious tablet. An alternative is to project the screen onto the walls / furniture and use
a camera to identify hand gestures / pointing as a mode of interaction. Interactive projectors
already exist e.g. for classroom use, but various technical challenges have prevented them from
being successfully applied to in-home use. In this project, the student will aim to overcome
some of these challenges and produce a working prototype of some part of this system. The
student may choose challenges in electronic / robotic design, 3D geometry, and/or
programming to suit their particular skills and interests.

86 Opto-genetic Tremor Suppression: Simulation and Feasibility Analysis


Supervisor: David Western

Project ID: 86 Contact: david.western@uwe.ac.uk

Programme: MSc Robotics



Description:
"Tremor (involuntary rhythmic movement of a body part) is a highly disabling symptom of
various neurological diseases. Various treatments exist, but none is universally successful and
all have significant drawbacks. The newly emerging field of opto-genetics offers a potential
solution that has not yet been explored. In short, a specially engineered virus may be delivered
to a cell such that it’s behaviour may be controlled by shining light on the cell. When applied
to motor neurons, this may allow a specific muscle to be temporarily weakened or paralysed
to prevent unwanted movement.
The objective of this project is to develop a simulation of an opto-genetically controlled muscle
(making use of established neural simulators), then use that simulation to explore the potential
of this approach from a control theory perspective (e.g. what are the limits of the extent to
which movement dynamics may be controlled?). "
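A deliberately crude sketch of the simulation idea: a sinusoidal tremor drive, a second-order limb model and a one-sided "weakening" actuator that can only reduce muscle output, never add force. All parameters are illustrative, not physiological, and no neural simulator is involved:

```python
import numpy as np

def simulate_tremor(k_supp, t_end=5.0, dt=0.001, f=5.0):
    """Simulate a damped second-order limb driven by a 5 Hz tremor torque.
    Opto-genetic suppression is modelled as a velocity-gated gain that
    attenuates the muscle drive (bounded at full suppression).
    Returns the peak displacement over the run."""
    n = int(t_end / dt)
    x, v, amp = 0.0, 0.0, 0.0
    for i in range(n):
        drive = np.sin(2 * np.pi * f * i * dt)   # tremor drive
        supp = min(k_supp * abs(v), 1.0)         # light-gated weakening
        force = drive * (1.0 - supp)             # attenuated muscle output
        a = force - 2.0 * v - 20.0 * x           # damping + stiffness
        v += a * dt                              # semi-implicit Euler step
        x += v * dt
        amp = max(amp, abs(x))
    return amp

baseline = simulate_tremor(0.0)      # no suppression
suppressed = simulate_tremor(5.0)    # with suppression gain
```

Even this toy model lets one ask the control-theory question posed above: how much can a one-sided, weakening-only actuator attenuate the tremor amplitude?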

87 Using a Smartwatch to Detect and Disrupt Obsessive Compulsive (OCD) Hand Washing

Supervisor: David Western

Project ID: 87 Contact: david.western@uwe.ac.uk

Programme: MSc Robotics

Description:
"Obsessive compulsive disorder (OCD) is a highly debilitating mental health condition that can
manifest in various ways, but one of the most common forms is compulsive hand-washing. The
obsessive/compulsive nature of the patient’s thought processes make it difficult for them to
independently disrupt their behaviour.

The aim of this project is to develop software to automatically recognise compulsive


handwashing using smartwatch sensor data. This could ultimately be implemented in a
smartwatch app to trigger alerts or even establish live communication links to a therapist in
order to help the patient disrupt their compulsion and practice their therapeutic techniques."

88 COVID-Cam: Using computer vision to track viral contamination in shared spaces

Supervisor: David Western

Project ID: 88 Contact: david.western@uwe.ac.uk


Programme: MSc Robotics

Description:
One way to address infectious diseases/viruses such as COVID-19 is to help people visualise
which surfaces or spaces are most likely to be contaminated, so that they can be cleaned and/or
avoided. The aim of this project is to enable this approach using computer vision techniques.
Existing human motion tracking software (e.g. OpenPose) may be used with conventional or
depth-sensing cameras to monitor people’s behaviour in a space. Touch-based contamination
may be monitored simply by recording and visualising which surfaces have been touched in the
last X hours. In more advanced implementations, simple conservative models could be applied
to visualise airborne contamination by automatically recognising behaviours such as speech or
sneezing. Ultimately, such a system could be used in various ways, such as: 1) to inform efficient
deployment of cleaners to de-contaminate priority areas in hospital wards, 2) to inform public
behaviour, e.g. choosing a seat in a GP waiting room based on a live heatmap showing
contamination risk spots in the room, 3) informing the general public or specific institutions
with visualisations showing how a virus can be transmitted around any given space, to
encourage and support more hygienic behaviour.
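The "touched in the last X hours" rule could be prototyped as a decaying grid of touch timestamps; the class below is a hypothetical sketch (a real system would drive `touch` events from detected hand contacts in the video):

```python
import numpy as np

class TouchHeatmap:
    """Grid of 'last touched' timestamps over a surface. Contamination
    weight decays linearly from 1.0 to 0.0 over `window` hours, giving
    a live risk heatmap of the space. All parameters are illustrative."""

    def __init__(self, shape, window=4.0):
        self.t_touch = np.full(shape, -np.inf)  # never touched => no risk
        self.window = window

    def touch(self, cell, t):
        """Record a touch of grid cell `cell` at time `t` (hours)."""
        self.t_touch[cell] = t

    def risk(self, t):
        """Risk in [0, 1] per cell at time `t`, decaying with touch age."""
        age = t - self.t_touch
        return np.clip(1.0 - age / self.window, 0.0, 1.0)

hm = TouchHeatmap((4, 4))
hm.touch((1, 2), t=0.0)
```

Rendering `hm.risk(now)` as a colour overlay on the camera view would give the visualisation described above.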

89 Robot-assisted Composite Sheet Layup Skill Learning and generalization through Perception Feedback

Supervisor: Chenguang Yang

Project ID: 89 Contact: Charlie.Yang@uwe.ac.uk

Programme: MSc Robotics

Description:
Composites are increasingly becoming a material of choice in the aerospace and automotive industries. Currently, many composite parts are produced by manually laying up sheets on complex molds. Composite sheet layup requires executing three main tasks: (1) grasping a sheet, (2) draping it on the mold, and (3) laying up the sheet. Automating the layup process requires automating these three tasks. Human-robot skill transfer has proved an effective method for automating the layup process; however, it is still difficult to generalize to novel situations, such as variance in object shape. Online perception information, such as visual feedback, is vital for reactive online modification to tackle these challenges. Therefore, this project will focus on robotic grasping, online perception processing, and integration with the layup skill model to improve reactivity, safety and generalization. Deep learning and machine learning techniques will be used, so the student is expected to have basic knowledge of Python, computer vision and machine learning.


90 Robot-assisted ultrasound scanning skill learning to optimize ultrasound image quality

Supervisor: Chenguang Yang, Ning Wang

Project ID: 90 Contact: Charlie.Yang@uwe.ac.uk

Programme: MSc Robotics

Description:
Ultrasound (US) imaging is commonly employed for the diagnosis and staging of abdominal aortic aneurysms (AAA), mainly due to its non-invasiveness and high availability. Robot-assisted tele-echography is an effective method for conducting such medical examinations. This project will mainly study how to optimize ultrasound image quality using adaptive compliant control, computer vision and robot skill learning techniques. Robot learning techniques, such as imitation learning and reinforcement learning, will be used in this project. An experimental phantom and a Franka Panda platform are available for experiments. The student is expected to have basic knowledge of Python, computer vision and machine learning.

91 Development of a multi-functional mobile robot manipulator


Supervisor: Chenguang Yang, Ning Wang

Project ID: 91 Contact: Charlie.Yang@uwe.ac.uk

Programme: MSc Robotics

Description:
A mobile manipulator is usually built from a robotic manipulator arm mounted on a mobile platform. This project will investigate combinations of various manipulators and different types of mobile platform, e.g., wheeled or tracked, differential-drive or omni-directional. The mobile manipulator can be fully autonomous or teleoperated and is intended to access spaces not easily reached by human users. The developed mobile manipulator can be multi-functional, with applications in various scenarios such as disposal of dangerous objects, remote inspection and mobile charging stations.

92 Intelligent mobile robot control platform based on smart phone


Supervisor: Chenguang Yang, Ning Wang


Project ID: 92 Contact: Charlie.Yang@uwe.ac.uk

Programme: MSc Robotics

Description:
"1) Design a human-computer interface to realize remote control of mobile robot, with
acquisition and displayment of visual streams collect by the mounted camera.
2) Realize basic functions of the intelligent car, such as remote steering, automatic tracking,
autonomous obstacle avoidance, light seeking and automatic parking.
3) Using the car mounted camera to realize face detection and face recognition to identify
human use; to model the environment and to detect accidents, and to realize patrolling
functions;
4) Wireless local area network (WLAN) can be used to realize the remote communication
between mobile phone and car."

93 Mobile phone enabled smart car


Supervisor: Chenguang Yang

Project ID: 93 Contact: Charlie.Yang@uwe.ac.uk

Programme: MSc Robotics

Description:
"1) Design an Android based app for interactive remote control of a mobile car, e.g., steering
and speed control.
2) Combine machine vision and machine learning to realize automatic tracking and
autonomous obstacle avoidance
3) Develop SLAM for path planning to realize autonomous navigation and positioning
4) Create a database for user identity and personalized data storage.
platform, design of human-computer interaction interface, remote control of intelligent car
end, acquisition and display of image data of car mounted camera, acquisition of sensor data
of car end, and Realization of interactive control with car;;
2) Realize the basic functions of the intelligent car: such as remote control forward,
backward, left and right rotation, automatic tracking and driving, autonomous obstacle
avoidance, light seeking and automatic parking, multi-level speed regulation and other
functions; the mobile phone terminal uses screen buttons and mobile phone gravity induction
to control the car movement and add transportation
3) Realize the advanced application of intelligent car: realize autonomous positioning, object
detection, path planning; complete the goods handling work, transport the specified items to
the designated location at the designated place;
4) Feed back the information of the goods and the running environment of the car to the
mobile terminal;
51
BRL MSc Robotics 2022 January 17, 2022

5) Bluetooth is used to realize the remote communication between mobile phone and car."

94 Robotany: An artificial ecosystem of robots [3 students]


Supervisor: Hemma Philamore, Elliot Scott

Project ID: 94 Contact: hemma.philamore@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics

Description:
"Hydroponic growing systems allow for plants to be grown without the use of soil. Typically,
systems are layered, with the upper layer holding the plants that have been rooted into a
substrate, and the lower layer acting as the water reservoir. Benefits of these systems include
increased yields and reduced water consumption compared to conventional farming, as well as
allowing plants to be grown in almost any location, and often in the absence of natural light,
meaning that currently unusable land can be repurposed. Closed system hydroponics also help
to reduce agricultural run-off, which can be very damaging to the environment. As such,
hydroponics is being touted as a potential way to move towards more sustainable farming.
However, without soil to act as a buffer, a more hands-on approach is often required to
maintain a delicate balance of oxygen levels, pH, and fertiliser; tipping too far in any one
direction can lead to unsatisfactory yields, damaged roots, or algal blooms. A potential method
of both monitoring nutrient levels and ensuring that the system remains in balance involves
robotics. An artificial ecosystem of simple robots that live and operate within the hydroponics
systems would allow for constant feedback, as well as providing the opportunity to affect
changes in an otherwise difficult to access space. This project aims to explore the potential uses
of robotics for improving the way that hydroponic systems work and has scope for multiple
students to work in parallel on different research topics. Example research themes :
• using novel bio-inspired sensing for navigation of a swimming robot within the tank
• energy-harvesting from light and biomass to power a robot operating in the hydroponic
system
• development of a self-powered bio-sensor for plant health or nutrient level using
microbial fuel cells
The project will run alongside an ongoing EPSRC-funded project titled ‘Robotany’. There is
potential scope to test findings in real hydroponic set-ups, working with industry partners on
the project. "

95 Finding Nemo: Simulating an underwater robot for environmental monitoring

Supervisor: Hemma Philamore, Hendrik Eichhorn

Project ID: 95 Contact: hemma.philamore@bristol.ac.uk

Programme: MSc Robotics

Description:
A major challenge when developing underwater robots is protecting the internal parts of
prototype robots, such as electronics and computation hardware, from damage due to water
ingress. This can make iterative design and testing an extremely difficult process. Simulation
can be a useful tool for developing the robot design prior to testing in an underwater
environment. This project will build on an existing underwater robot developed for
environmental monitoring. For the monitoring, the robot must follow a given set of points
at which measurements will be taken. To trial different control and path-planning
algorithms for this scenario, the student will develop a model of the robot within an
existing simulation framework. Either UUV Simulator or UWSim in combination with ROS is
suggested, but the choice is ultimately up to the student. Factors to consider in the
control and path planning are the efficiency and length of the path, as well as the
disturbance caused by the robot.
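As a rough, hypothetical illustration of the waypoint-following behaviour the simulated robot would need (not code from the project; the controller structure, function names, and gains are all invented), a planar proportional-heading controller can be sketched as:

```python
import math

def pure_pursuit_step(pos, heading, waypoint, speed=0.5, k_heading=1.5, dt=0.1):
    """One control step steering a planar vehicle towards a waypoint.

    Applies a proportional heading controller at constant forward speed and
    returns the updated (x, y) position and heading.
    """
    dx, dy = waypoint[0] - pos[0], waypoint[1] - pos[1]
    desired = math.atan2(dy, dx)
    # wrap the heading error to [-pi, pi] before applying the gain
    err = (desired - heading + math.pi) % (2 * math.pi) - math.pi
    heading += k_heading * err * dt
    x = pos[0] + speed * math.cos(heading) * dt
    y = pos[1] + speed * math.sin(heading) * dt
    return (x, y), heading

def follow_waypoints(start, waypoints, tol=0.1, max_steps=10000):
    """Drive through a list of waypoints; return final position and path length.

    Path length is one of the factors the project asks to evaluate.
    """
    pos, heading, travelled = start, 0.0, 0.0
    for wp in waypoints:
        for _ in range(max_steps):
            if math.hypot(wp[0] - pos[0], wp[1] - pos[1]) < tol:
                break
            new_pos, heading = pure_pursuit_step(pos, heading, wp)
            travelled += math.hypot(new_pos[0] - pos[0], new_pos[1] - pos[1])
            pos = new_pos
    return pos, travelled
```

In a real study the same interface (state in, actuation out) would be driven by the simulator's vehicle model rather than this toy kinematic update.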

96 Design and Stabilisation of an Amphibious Drone (Underwater Drone)


Supervisor: Hamidreza Nemati

Project ID: 96 Contact: Hamidreza.Nemati@uwe.ac.uk

Programme: MSc Robotics, MSc Aerial Robotics

Description:
Currently, different drones must be used for different missions and applications. An
innovative and cost-effective solution is therefore required to assist the end-user: a
cross-domain vehicle that merges the benefits of operating in both the aerial and
underwater domains. The attached file shows the Loon Copter, created by the Embedded
Systems Research Laboratory at Oakland University.

97 Design and Stabilisation of Distributed Flight Arrays (DFAs)


Supervisor: Hamidreza Nemati

Project ID: 97 Contact: Hamidreza.Nemati@uwe.ac.uk

Programme: MSc Robotics, MSc Aerial Robotics


Description:
A distributed flight array (DFA) is a modular single-propeller aerial robot, first
developed at ETH Zurich by Prof Raffaello D’Andrea in 2008. Each unit can generate enough
thrust to take off, but is unstable in flight on its own. To fly, each unit drives
autonomously on the ground and docks with its peers. The DFA configuration and components
are presented in the attachment [1]. In this project, a new mechanical chassis design and
configuration (feasibility studies) are required, using modelling and motion-analysis
software such as SOLIDWORKS and MSC ADAMS (or ROS).

98 Cooperative Multi-Missile Systems


Supervisor: Hamidreza Nemati

Project ID: 98 Contact: Hamidreza.Nemati@uwe.ac.uk

Programme: MSc Robotics, MSc Aerial Robotics

Description:
Recently, research on cooperative multi-missile systems (MMS) has attracted increasing
attention due to advances in anti-missile technology. In other words, from the missile's
point of view, mission success in the conventional one-to-one engagement scenario is
degraded because a missile can be destroyed by the target's defence system. An effective
alternative is to employ a group of well-organised, low-cost missiles rather than an
individual high-technology, high-cost missile.

99 Reinforcement learning control algorithm applied to an Autonomous Underwater Vehicle (AUV) in simulation [2 students]
Supervisor: Quan Zhu, Prof William Holderbaum (University of Reading)

Project ID: 99 Contact: quan.zhu@uwe.ac.uk

Programme: MSc Robotics

Description:
Sliding mode control and AI techniques are traditionally used to control underwater
vehicles. Most of these methods focus on lower-dimensional systems, for instance yaw and
surge, and constrain the other possible motions along the remaining dimensions.
Implementing control of an underwater vehicle requires a dynamic model of the vehicle's
movement between initial and target points, which entails a navigation problem, or
high-level control. This project will investigate a control system for an AUV based on a
model-free reinforcement learning control algorithm. The control input will be calculated
from measured (or estimated) states, with reward shaping of the signal to maximise the
reward. The project will be carried out in simulation using Matlab Simulink.
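The reward-shaping idea can be illustrated with a toy example. The sketch below is author-invented (a one-dimensional "surge towards a target" task with made-up states and parameters, not the project's AUV model): tabular Q-learning with the reward shaped as the negative distance to the goal, so progress is rewarded at every step rather than only on arrival.

```python
import random

def train_q_learning(n_states=10, goal=9, episodes=500, alpha=0.5, gamma=0.95,
                     eps=0.2, seed=0):
    """Tabular Q-learning on a 1-D surge task: move left/right to a goal state.

    Returns the greedy policy (0 = left, 1 = right) for each state.
    """
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        for _ in range(50):
            # epsilon-greedy action selection
            a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda a: q[s][a])
            s2 = max(0, min(n_states - 1, s + (1 if a == 1 else -1)))
            r = -abs(goal - s2)            # shaped reward: closer is better
            done = s2 == goal
            target = r + (0.0 if done else gamma * max(q[s2]))
            q[s][a] += alpha * (target - q[s][a])
            s = s2
            if done:
                break
    return [max((0, 1), key=lambda a: q[s][a]) for s in range(n_states)]
```

The learned greedy policy moves right (towards the goal) from every non-goal state; in the project the same loop would sit on top of the simulated AUV dynamics instead of a toy grid.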

100 Robotics FES for Under-actuated systems [2 students]


Supervisor: Quan Zhu, Prof William Holderbaum (University of Reading)

Project ID: 100 Contact: quan.zhu@uwe.ac.uk

Programme: MSc Robotics

Description:
Underactuated robotics is an emerging research field. In an underactuated system, the
number of control inputs is smaller than the number of degrees of freedom. This can be for
economic reasons or due to constraints arising from the nature of the system. Biped robots
may have such a configuration, where the actuators are restricted to the upper part of the
legs while the lower part is not independently actuated. This is the case in
rehabilitation engineering, where the muscles of the thighs are controlled via electrical
stimulation. The project will consist of modelling and controlling, via simulation tools,
a walking biped robot under the constraint of being an underactuated system. Control
methods could range from linear control to deep learning.

101 Autonomous Landing [2 students]


Supervisor: Quan Zhu, Prof William Holderbaum (University of Reading)

Project ID: 101 Contact: quan.zhu@uwe.ac.uk

Programme: MSc Robotics

Description:
The project will focus on a vision-based control approach to enable autonomous landing of
vertical take-off and landing (VTOL) capable unmanned aerial vehicles (UAVs). A simulation
tool will showcase some of the control methods that can accomplish the landing task. A
control system using a reinforcement learning algorithm will learn from computer vision
data to perform the landing. It will be compared with a gain-scheduled
proportional-integral-derivative (PID) control system using the same vision system, both
implemented on a quadrotor UAV dynamic model in a realistic simulation program. Extensive
simulations will be conducted to demonstrate the ability of the controller to achieve
robust landing. Repeated flight tests could be used for comparison between the algorithms.
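As a hedged sketch of the gain-scheduled PID baseline (a hypothetical one-dimensional vertical-landing model with invented gains and switching altitude, not the project's quadrotor dynamics or vision pipeline):

```python
def simulate_landing(z0=10.0, dt=0.02, t_max=60.0):
    """1-D vertical landing under a gain-scheduled PID height controller.

    Gains switch between a 'descent' set far from the pad and a softer
    'touchdown' set below 2 m, mimicking a gain-scheduled autoland.
    Returns final height, touchdown speed, and elapsed time.
    """
    g = 9.81
    z, vz, integ, prev_err = z0, 0.0, 0.0, -z0
    t = 0.0
    while t < t_max and z > 0.05:
        # schedule: softer gains near the ground for a gentle touchdown
        kp, ki, kd = (1.2, 0.05, 1.8) if z > 2.0 else (0.6, 0.02, 2.5)
        err = 0.0 - z                      # target height is the pad (0 m)
        integ += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        # commanded vertical acceleration, gravity-compensated, thrust >= 0
        a_cmd = max(0.0, g + kp * err + ki * integ + kd * deriv)
        vz += (a_cmd - g) * dt
        z = max(0.0, z + vz * dt)
        t += dt
    return z, abs(vz), t
```

The reinforcement learning controller in the project would replace the `a_cmd` line with a learned policy acting on vision-derived state estimates.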

102 Mobile robot control via voice activation [2 students]


Supervisor: Quan Zhu, Prof William Holderbaum (University of Reading)

Project ID: 102 Contact: quan.zhu@uwe.ac.uk

Programme: MSc Robotics

Description:
In the near future, the use of mobile robots should become common practice, and spoken
language is a convenient interface for commanding such devices. The project therefore
intends to develop an interface between human speech and robot control. The simulated
robot's movements will be controlled according to the given instructions. Matlab Simulink
will be the starting point of the project, but it is expected that the methods will be
embedded, using an Arduino, in a small wheeled robot.

103 U-model Based Sliding Mode Control (SMC) to Increase Robustness of Various Robots [2 students]
Supervisor: Quan Zhu, Prof Charlie Yang

Project ID: 103 Contact: quan.zhu@uwe.ac.uk

Programme: MSc Robotics

Description:
This dissertation project aims to integrate the U-model (UM) design procedure and Sliding
Mode Control (SMC) to improve the robustness of robots. U-model based control (U-control)
is a model-independent control system design approach that simplifies the design of
nonlinear control systems without linearisation, bridging the gap between linear and
nonlinear system design. Through UM, linear polynomial model-based methodologies, such as
pole assignment, can be applied directly to nonlinear control systems, and the system
performance can be tuned to actual requirements. However, one notable shortcoming of the
UM methodology is that it relies heavily on a precise system model, which is very hard to
obtain. SMC is therefore considered as its complement to improve the robustness of the
system. The U-model is used to simplify the control design procedure for the nonlinear
system, specifying the desired transient and steady-state behaviour via a linear transfer
function for the resulting closed-loop system; combining this with a sliding mode control
scheme then improves robustness against errors induced by the U-model based dynamic
inversion.
The main objectives of this project include:
• propose an integrated methodology of U-model and Sliding Mode Control.


• complete a basic tracking task for a 2-DOF manipulator with disturbance using the
proposed methodology.
• analyse the proposed methodology and present a sound evaluation.
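To make the SMC side concrete, the sketch below shows a minimal classical sliding mode controller on a double integrator rather than the 2-DOF manipulator, and without the U-model layer; the plant, disturbance, and all gains are invented for illustration.

```python
import math

def smc_track(t_max=10.0, dt=0.001, lam=2.0, eta=3.0):
    """Sliding mode control of a double integrator x'' = u + d(t), tracking
    x_ref = sin(t) despite a bounded disturbance d unknown to the controller.

    Sliding surface s = e' + lam*e; control u = feedforward - lam*e' - eta*sign(s).
    Choosing eta larger than the disturbance bound guarantees sliding.
    Returns the final tracking error.
    """
    x, v = 0.5, 0.0                        # start with a nonzero state error
    t = 0.0
    while t < t_max:
        xr, vr, ar = math.sin(t), math.cos(t), -math.sin(t)
        e, ed = x - xr, v - vr
        s = ed + lam * e
        d = 0.8 * math.sin(5 * t)          # unknown disturbance, |d| <= 0.8 < eta
        u = ar - lam * ed - eta * (1 if s > 0 else -1 if s < 0 else 0)
        a = u + d                          # true plant acceleration
        v += a * dt
        x += v * dt
        t += dt
    return abs(x - math.sin(t))
```

The discontinuous `sign(s)` term is what gives SMC its robustness (and its chattering); in the project the equivalent-control part would come from the U-model dynamic inversion instead of the exact feedforward used here.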

104 Rendezvous of a Refuelling Space Tug with OneWeb Constellation


Supervisor: Yaseen Zaidi

Project ID: 104 Contact: yaseen.zaidi@uwe.ac.uk

Programme: MSc Aerial Robotics

Description:
The present work has reduced the terminal guidance problem of a chaser spacecraft
approaching a single target spacecraft to 1 m accuracy, using a genetic algorithm and
gradient descent methods for trajectory design. We wish to bring the accuracy to
centimetre level using the currently implemented algorithms or the Monte Carlo method.
Once achieved, the final aim is to rendezvous, one after another, with multiple OneWeb
spacecraft in a constellation of ~650 and refuel them.

105 Nukopter [3 students]


Supervisor: Yaseen Zaidi, Vilius Portapas

Project ID: 105 Contact: yaseen.zaidi@uwe.ac.uk

Programme: MSc Aerial Robotics

Description:
A copter drone with a high lift-to-drag ratio, carrying a significant avionics payload for
inspection and scanning before personnel move in and for external health monitoring of
equipment, will be designed and 3-D printed. The drone will be built with smart material
in case of a stall or safety abort. The payload will be a ZED 3-camera, with images sent
over a wireless link. The dense integrated avionics will be protected against radiation,
which will be detected with an on-board dosimeter. The design requirements include
facility dimensions, inspection and operator safety, heights, lighting, environmental
conditions (temperature, pressure, humidity), etc. The use case is a drone operating in
extreme conditions, containment buildings, and draft wet cooling towers.

106 Parts Tracking during Spacecraft Assembly, Integration and Test with HoloLens 2
Supervisor: Yaseen Zaidi, Chris Toomer

Project ID: 106 Contact: yaseen.zaidi@uwe.ac.uk

Programme: MSc Robotics

Description:
Industrial partner: Thales Alenia Space. The project was offered last year and excellent results
were achieved. Now, we aim to expand the dataset and improve training, detection
(including continuous object detection), latency, blur handling, UI, and usability, and to
uniquely identify parts. A further aim is to
fully digitise the Assembly, Integration, and Test (AIT) process with holographic support. A big
area of interest for AIT across all of Thales Alenia Space is parts tracking. The idea is that during
assembly, parts can be automatically tracked either by external cameras or an AR headset as
they are placed to record them leaving a store's database and to have a record of exactly where
they end up. Putting RFID or QR codes on the parts is the easy solution, but fast and reliable
image recognition with HoloLens to determine size as well as the shape would be far better.

107 eVToL Digital Twin Demonstrator


Supervisor: Yaseen Zaidi, Vilius Portapas

Project ID: 107 Contact: yaseen.zaidi@uwe.ac.uk

Programme: MSc Aerial Robotics

Description:
Industrial partner: Smart Ports Systems, Neoptera. In the dock2dock project, the current
development is a digital flight profile of an electric aircraft between Bristol and Cardiff docks.
The digital twin features include equivalent fuel consumption, Traffic Collision Avoidance
System (TCAS) radar, availability of the GPS constellation over the route, and detection of an
approaching aircraft. We aim to enhance the air situational awareness capability with an on-
board camera, performance characteristics such as cruise airspeed, roll rate, and bank angle,
access times, GPS jamming locations, emergency landing and adding realism to the aircraft's
fixed antenna pattern when rolling. Communications interference analysis and access to the 5G
network will be additional targets. The goal of the digital twin is to support the development
of the regulatory framework of the UK Civil Aviation Authority.

108 Robot Manipulators for Space Debris Removal


Supervisor: Hamidreza Nemati

Project ID: 108 Contact: Hamidreza.Nemati@uwe.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
There are various examples of robots being used to assist humans in space. Recently, there
has been global concern about the growing population of orbital debris, which has the
potential to collide with operational spacecraft and cause massive disruption. Robotic
manipulators can be employed to avoid such collisions and the catastrophic failure of
space missions.

109 Spacecraft Attitude Control with Internal Fuel Sloshing


Supervisor: Hamidreza Nemati

Project ID: 109 Contact: Hamidreza.Nemati@uwe.ac.uk

Programme: MSc Robotics, MSc Aerial Robotics

Description:
Propellant slosh in partially filled fuel containers during spacecraft manoeuvres,
upper-stage separation, and orbit injection may affect the stability of the spacecraft's
motion or even disintegrate the whole system. This research aims to model and control
propellant slosh dynamics during spacecraft attitude manoeuvres.

110 Spacecraft Station-Keeping around Libration Point Orbits in the Circular Restricted Three Body Problem (CR3BP)
Supervisor: Hamidreza Nemati

Project ID: 110 Contact: Hamidreza.Nemati@uwe.ac.uk

Programme: MSc Aerial Robotics

Description:
The problem of spacecraft station-keeping for libration point orbits has received great attention
recently for space exploration missions such as ARTEMIS. In this research, an appropriate
control method must be derived for spacecraft station-keeping (or formations) near the
libration points of the CR3BP.
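For reference, the spacecraft dynamics in question are governed by the standard nondimensional CR3BP equations of motion in the rotating frame, with mass parameter \(\mu\) and distances \(r_1, r_2\) from the spacecraft to the two primaries:

```latex
\begin{aligned}
\ddot{x} - 2\dot{y} &= x - \frac{(1-\mu)(x+\mu)}{r_1^{3}} - \frac{\mu\,(x-1+\mu)}{r_2^{3}},\\
\ddot{y} + 2\dot{x} &= y - \frac{(1-\mu)\,y}{r_1^{3}} - \frac{\mu\,y}{r_2^{3}},\\
\ddot{z} &= -\frac{(1-\mu)\,z}{r_1^{3}} - \frac{\mu\,z}{r_2^{3}},
\end{aligned}
\qquad
r_1 = \sqrt{(x+\mu)^2 + y^2 + z^2},\quad
r_2 = \sqrt{(x-1+\mu)^2 + y^2 + z^2}.
```

The libration points are the equilibria of these equations; station-keeping amounts to controlling small deviations of the spacecraft state about a reference orbit near one of them.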

111 Soft Robotics: Novel components, Modelling and Simulation


Supervisor: Hamidreza Nemati

Project ID: 111 Contact: Hamidreza.Nemati@uwe.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
Compliance allows soft robots to adapt their shapes to objects with unknown and uncertain
geometry. Therefore, a new class of components is required to deal with various
environments. Furthermore, creating an accurate mathematical formulation remains an open
challenge due to the infinite dimensionality of the robot's state space.

112 Soft Robotics: Intelligent Control


Supervisor: Hamidreza Nemati

Project ID: 112 Contact: Hamidreza.Nemati@uwe.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
Considerable effort has recently been dedicated to building soft bodies for robots.
However, relatively little attention has been given to the problem of developing their
intelligent brains, particularly for harsh environments.

113 Soft Robotics: Noise & Vibration


Supervisor: Hamidreza Nemati

Project ID: 113 Contact: Hamidreza.Nemati@uwe.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
Since soft robotic actuators have low Young’s modulus and low structure stiffness, they may
vibrate back and forth drastically once they are excited or unactuated. These undesired
oscillations can greatly decrease production efficiency and accuracy in industrial applications.
On another front, many intelligent robots utilise commercial off-the-shelf sensors that are
noise-, bias- and drift-prone and hence inaccurate. Since these sensors are always part of a
control system, measurement noises, biases and drifts can degrade the performance of the
controller and may lead to instability. Therefore, suitable algorithms are required to counteract
the above problems.

114 Stabilisation of Under-actuated Mechanical Systems


Supervisor: Hamidreza Nemati

Project ID: 114 Contact: Hamidreza.Nemati@uwe.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
Under-actuated mechanical systems (UMSs) are mechanical control systems in which the
number of configuration variables is greater than the number of control inputs.
Stabilisation of UMSs is still an active field of research because of their broad
applications in robotics, aerospace, and marine vehicles.

115 New Soft Conjunction Design and Test of Human-Like Robot Hand
Links
Supervisor: Zhenyu Lu, Chenguang Yang

Project ID: 115 Contact: zhenyu.lu@uwe.ac.uk

Programme: MSc Robotics

Description:
This theme belongs to the mechanical design and soft robotics area. Humans use their hands
for grasping and manipulation thousands of times a day, and the connection between hand
bones is key to ensuring the flexibility and smoothness of these operations. Compared with
a solid linkage, a soft conjunction has a good cushioning effect and helps to reduce
friction during the relative movements of hand links. This project will improve the
current robot hand structure by replacing the solid conjunction with a soft one, and will
test and compare the differences in performance.


116 Improved Variable Impedance Actuator System for Tendon-driven Robot Hand
Supervisor: Zhenyu Lu, Chenguang Yang

Project ID: 116 Contact: zhenyu.lu@uwe.ac.uk

Programme: MSc Robotics

Description:
Improved Variable Impedance Actuator System for Tendon-driven Robot Hand

117 Exoskeleton Structure Improvement for Free-space Teleoperation


Supervisor: Zhenyu Lu, Chenguang Yang

Project ID: 117 Contact: zhenyu.lu@uwe.ac.uk

Programme: MSc Robotics

Description:
This theme belongs to the areas of mechanical design (10%), electronic design (10%), and
teleoperation technology (80%). Traditional teleoperation relies on a joystick or
gesture-recognition equipment to acquire demonstration information, and then uses feedback
such as force and video to decide the next step. The participant in this project will be
provided with a newly designed exoskeleton to improve the hardware performance, and will
then design a software toolkit for completing teleoperation and manipulation through a
physical twin system. The applicant should be familiar with ROS and C++ or Python.

118 Robot Primitive Skill Learning from Few-shot Demonstrations based on Meta-Learning
Supervisor: Zhenyu Lu, Ning Wang

Project ID: 118 Contact: zhenyu.lu@uwe.ac.uk

Programme: MSc Robotics


Description:
This theme belongs to the areas of robotic technology and artificial intelligence.
Meta-learning is a state-of-the-art technique for learning from multiple few-shot cases,
but it is seldom used for robot skill learning. In practice, demonstrations of robot
manipulation are hard to acquire directly, so there are many distributed demonstrations
acquired from various scenes. This project will explore robot skill learning from multiple
few-shot demonstrations based on Dynamic Motion Primitives and meta-learning. The
technology will be verified on the robot operational platform. The applicant should have a
basic knowledge of machine learning and be familiar with C++ or Python.
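As a hedged illustration of the Dynamic Motion Primitive building block (a minimal one-dimensional discrete DMP with invented parameters; the project's formulation and basis-function choices may differ):

```python
import math

def dmp_rollout(y0, goal, weights, tau=1.0, dt=0.01, alpha=25.0, beta=6.25, alpha_x=4.0):
    """Roll out a 1-D discrete Dynamic Motion Primitive.

    Transformation system: tau*v' = alpha*(beta*(goal - y) - v) + f(x), with a
    phase variable decaying via tau*x' = -alpha_x*x. The forcing term f is a
    normalised weighted sum of Gaussian basis functions; the weights (learned
    from demonstrations in practice) shape the motion between start and goal.
    """
    n = len(weights)
    centers = [math.exp(-alpha_x * i / (n - 1)) for i in range(n)] if n > 1 else [1.0]
    widths = [n ** 1.5 / c for c in centers]
    y, v, x = y0, 0.0, 1.0
    traj = [y]
    t = 0.0
    while t < tau:
        psi = [math.exp(-h * (x - c) ** 2) for h, c in zip(widths, centers)]
        f = x * (goal - y0) * sum(w * p for w, p in zip(weights, psi)) / (sum(psi) + 1e-10)
        a = (alpha * (beta * (goal - y) - v) + f) / tau
        v += a * dt
        y += (v / tau) * dt
        x += (-alpha_x * x / tau) * dt
        t += dt
        traj.append(y)
    return traj
```

With zero weights the rollout is just the critically damped spring-damper converging to the goal; meta-learning would adapt the weight vector across the few-shot demonstration sets.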

119 Robust Robot Primitive Skill Programming Network for Object Grasping and Manipulation
Supervisor: Zhenyu Lu, Ning Wang

Project ID: 119 Contact: zhenyu.lu@uwe.ac.uk

Programme: MSc Robotics

Description:
This theme belongs to the areas of robotic technology and artificial intelligence.
Imitation learning is a modern technique for rapid robot programming. After primitive
skills have been acquired, the research topic of this project is how to analyse
differences and evaluate manipulation performance in order to improve the robustness and
flexibility of the network structure. The technology will be verified on the robot
operational platform. The applicant is required to have a basic knowledge of machine
learning and robot skill learning (e.g. imitation learning), and to be familiar with C++,
Python or MATLAB.

120 What my robot knows about me and who owns that knowledge: User
data rights in service robotics [2 students]
Supervisor: Anna Chatzimichali, Anouk van Maris

Project ID: 120 Contact: Anna.chatzimichali@uwe.ac.uk

Programme: MSc Robotics

Description:
In the field of social and service robots, user data are crucial in order to design a meaningful
connection with the user. The quality of such user data, as well as a clear model of their
ownership is also extremely valuable when it comes to designing future features in user
interaction with the robot. Such data are the new currency of the digital economy. The rise of
inter-connected digital goods and services marked the end of an era of local data storage in
user owned equipment. In the new era of remote computing, internet of things, personalised
machines and robots, user data are almost always stored in third party remote resources. This
shift made ownership of digital assets and data privacy increasingly abstract. However, the
landscape of ownership of such user generated data records (or personal data) is unclear.
Currently, no organisation has the obligation or the capacity to protect digital personal data,
while data protection usually happens only as a result of personal interest. The main idea
behind this work is to investigate data right architectures in social/service robots and especially
within a collaborative personalisation space. Data is power, but in a collaborative personalised
space who really owns the power?

121 Robots in circular economy: The role of robots and cobots in product
disassembly [2 students]
Supervisor: Anna Chatzimichali

Project ID: 121 Contact: Anna.chatzimichali@uwe.ac.uk

Programme: MSc Robotics

Description:
Circular economy and design for disassembly as an emerging trend promise to create a
sustainable impact in our world. In the past products were used for a considerable number of
years, were frequently repaired and were not discarded easily. The rapid advance of
technology, the consecutively more effective manufacturing and the automation of production
have led to the insertion of new products in the market. Modern products are enhanced in
capabilities, while cheaper than the ones they replace. Assessing the quality of product design
from the disassembly point-of-view is emerging as a key issue in end-of-life treatment options
of electrical and electronic waste. Design for Disassembly (DfD) has been developed as a
framework for defining disassembly goals, applying specific guidelines, reducing disassembly
complexity and evaluating product ease of disassembly in the design phase. However, the
disassembly process is still in its infancy when it comes to automation or the use of
robots within it. This project will explore the role of robotics within existing DfD
frameworks and the part robots can play in the circular economy.

122 Open-set (re-identification) recognition of pigs


Supervisor: Mark Hansen


Project ID: 122 Contact: mark.hansen@uwe.ac.uk

Programme: MSc Robotics

Description:
Being able to recognise animals without having to enroll them first is potentially extremely
useful for managing welfare. Using our own datasets, and any others available, you will be able
to improve your knowledge and experience of deep learning architectures such as siamese
networks and triplet loss to provide new insights into the problem.
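The triplet-loss idea, and the open-set decision it enables, can be sketched as follows (illustrative only: the embeddings, animal IDs, margin, and threshold are all invented, and a real system would compute embeddings with a trained network):

```python
import math

def _dist(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Triplet loss: pull the anchor towards the positive (same animal),
    push it away from the negative (different animal) by at least `margin`."""
    return max(0.0, _dist(anchor, positive) - _dist(anchor, negative) + margin)

def identify(embedding, gallery, threshold=0.5):
    """Open-set decision: return the closest enrolled ID, or None when the
    best match is farther than `threshold` (i.e. an unseen animal)."""
    best_id, best_d = None, float("inf")
    for pid, emb in gallery.items():
        d = _dist(embedding, emb)
        if d < best_d:
            best_id, best_d = pid, d
    return best_id if best_d <= threshold else None
```

The open-set property comes from the threshold: unlike closed-set classification, the system is allowed to answer "unknown" for animals it has never enrolled.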

123 Primitive Skill Library Building and Learning for Robot Composites
Layup though Teleoperation and Kinesthetic Demonstration
Supervisor: Chenguang Yang, Ning Wang

Project ID: 123 Contact: Charlie.Yang@uwe.ac.uk

Programme: MSc Robotics

Description:
Composite layup is challenging for robots. A flat fibre-reinforced material that is relatively
inextensible along the fibre direction must be draped onto a complex (often doubly curved)
mould geometry while retaining desired fibre orientations and avoiding damage. This project
will focus on building the primitive skill library and realizing primitive skill learning based on the
classification of operational scenes through both teleoperation and kinesthetic
demonstrations. The final deliverable is a library of primitive skills covering at least
three kinds of operational cases.

124 Language games for language evolution


Supervisor: Conor Houghton

Project ID: 124 Contact: conor.houghton@bristol.ac.uk

Programme: MSc Bio-Robotics

Description:
Language games: can a collection of reinforcement learning agents learn to communicate? In
the past, attempts to do this using signalling games, such as the Lewis signalling game
[Lewis 1969], have been bedevilled by the agents `cheating': using non-compositional
communication strategies to communicate without developing an effective language. I think,
with more powerful computers and better board games, it is time to revisit this! This
project would involve reinforcement learning approaches to communication games such as
Dixit.
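A minimal version of the Lewis signalling game with simple Roth-Erev-style reinforcement (propensity updates on success) can be sketched as below; the state/signal counts, update rule, and parameters are invented for illustration, and the project would use far richer games and learners.

```python
import random

def lewis_signalling(n=2, rounds=5000, seed=1):
    """Reinforcement learning on the Lewis signalling game.

    A sender observes a state and emits a signal; a receiver observes the
    signal and picks an act. Both choices are drawn in proportion to
    accumulated propensities, and a successful round (act == state)
    reinforces the choices that produced it. Returns the empirical
    communicative success of the learned convention.
    """
    rng = random.Random(seed)
    send = [[1.0] * n for _ in range(n)]   # send[state][signal] propensities
    recv = [[1.0] * n for _ in range(n)]   # recv[signal][act] propensities

    def draw(weights):
        r = rng.uniform(0, sum(weights))
        for i, w in enumerate(weights):
            r -= w
            if r <= 0:
                return i
        return len(weights) - 1

    for _ in range(rounds):
        state = rng.randrange(n)
        signal = draw(send[state])
        act = draw(recv[signal])
        if act == state:                    # success reinforces both choices
            send[state][signal] += 1.0
            recv[signal][act] += 1.0

    trials = 2000
    wins = sum(draw(recv[draw(send[s])]) == s
               for s in (rng.randrange(n) for _ in range(trials)))
    return wins / trials
```

Non-compositional `cheating' becomes visible in richer versions of this setup, where agents can succeed on the task without the signal-to-meaning map generalising.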

125 A Robot Swarm to Assist British Rewilding


Supervisor: Sabine Hauert, James Wilson

Project ID: 125 Contact: sabine.hauert@bristol.ac.uk

Programme: MSc Bio-Robotics

Description:
In recent years there have been multiple rewilding projects taking place across the UK. The
introduction of beavers, buffalo, pine marten, and other species across the country has had
significant positive environmental impact. A key feature of re-wilding in other countries comes
from trophic cascade via the introduction of predators; a notable example of this is the effect
of wolf reintroduction on the rivers of Yellowstone national park in the USA. In this project, we
look to develop a swarm technology to support rewilding in the UK, regulating movement of
animals to prevent the predation of livestock, monitoring new species, or mapping the effects
of introduced flora and fauna. We expect you to develop a swarm robotic solution for livestock
and predator separation, creating a simulation to demonstrate your ideas.

126 Collective Memory for a Trustworthy Swarm


Supervisor: Sabine Hauert, James Wilson

Project ID: 126 Contact: sabine.hauert@bristol.ac.uk

Programme: MSc Bio-Robotics

Description:
Understanding swarm behavior is a key element in building trustworthiness and widespread
user adoption of swarm technology. By developing effective means of extracting and visualizing
swarm information we can create more understandable and more trustworthy swarms. We
have recently designed a swarm that is capable of self-monitoring, providing swarm
information to a user in a fully distributed manner. To supplement this work, we are looking to
create an effective mechanism for distributing memory amongst a swarm. This system will
leverage the total memory available to the swarm, rather than just individual agents, while still
ensuring that users have access to the data they require in reasonable time. The effective
distribution of memory across a swarm should enhance the scalability and provide a simple but
powerful method for monitoring swarm activities.

127 Digital Twin for a Welding Cell


Supervisor: Farid Dailami

Project ID: 127 Contact: farid.dailami@uwe.ac.uk

Programme: MSc Robotics

Description:
This project will aim to produce a Digital Twin of a welding cell currently being
developed at Rolls-Royce. The work will entail the development of a model to examine the
robot and work-handling equipment needed to repair components using the welding process.

128 Unitree A1 Dog Surveyor


Supervisor: Farid Dailami

Project ID: 128 Contact: farid.dailami@uwe.ac.uk

Programme: MSc Robotics

Description:
The aim of this project is to use a Unitree A1 Dog to survey a building site and monitor safety
infringements. This work requires both robotics, in particular SLAM and also scene
interpretation from a vision system.

129 Digital twin of a motor and disc assembly


Supervisor: Farid Dailami

Project ID: 129 Contact: farid.dailami@uwe.ac.uk

Programme: MSc Robotics

Description:
The aim of the project is to develop a digital twin, where the loop between the model and the
real system is closed and various ‘what if’ type analysis can be carried out.


130 Modelling of a cable robot


Supervisor: Farid Dailami

Project ID: 130 Contact: farid.dailami@uwe.ac.uk

Programme: MSc Robotics

Description:
Develop a model of a cable robot and simulate its performance in a number of industrial
processes requiring precision motion.

131 Data sonification for robot teleoperation


Supervisor: Paul Bremner, Alex Smith

Project ID: 131 Contact: paul2.bremner@uwe.ac.uk

Programme: MSc Robotics

Description:
In data sonification, data are mapped to the parameters of different sounds so that those
sounds can be used to ‘display’ the data. This has advantages over visual display:
previous research has shown that multisensory integration of vision with sound can improve
an operator's awareness during tasks with high visual load. Importantly, the human
auditory system has high temporal resolution and wide bandwidth, and can localise and
isolate concurrent audio streams. In this project we propose using data sonification of
operation-critical data for teleoperation of a robot arm. It builds on existing work on
sonification in VR, and utilises our existing leader-follower teleoperation system. The
project will involve programming for ROS and Unity (building on existing work, so prior
in-depth knowledge of these environments is an advantage but not required), and designing
and executing a user study to evaluate the utility of sonified data.
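The core mapping step in sonification can be sketched in a few lines (an illustrative parameter mapping with invented frequency bounds, not the project's actual design, which would also map to amplitude, timbre, and spatial position):

```python
def sonify(values, f_min=220.0, f_max=880.0, v_min=0.0, v_max=1.0):
    """Map each data sample to a tone frequency (Hz) by linear interpolation,
    so rising values are heard as rising pitch; out-of-range values clamp."""
    tones = []
    for v in values:
        x = (v - v_min) / (v_max - v_min)   # normalise into [0, 1]
        x = min(1.0, max(0.0, x))
        tones.append(f_min + x * (f_max - f_min))
    return tones
```

In the project, the resulting frequencies would drive a synthesiser in Unity while the operator teleoperates the arm, with the mapped quantity being some operation-critical signal such as gripper force.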

132 Automated gesturing for a telepresence robot


Supervisor: Paul Bremner

Project ID: 132 Contact: paul2.bremner@uwe.ac.uk

Programme: MSc Robotics


Description:
A chief advantage of telepresence robots is their ability to improve operator presence and
interaction quality. However, a common feature of human communication that is not easily
relayed via a telepresence robot is conversational gesture. This project aims to investigate
whether conversational gestures might be autonomously generated from an operator's speech,
such that interaction quality is improved. The project will involve programming a gesture
generation system (likely in Python) for a NAO robot, including a method to predict gesture
frequency from speech, and design and execution of a user study to evaluate it.

133 [co-created] Reinforcement Learning for Robotic Manipulation Based on Multi-Sensor Fusion
Supervisor: Dandan Zhang, Nathan Lepora

Project ID: 133 Contact: ye21623@bristol.ac.uk

Programme: MSc Bio-Robotics

Description:
For sensing techniques used for robotic dexterous manipulation, there are two primary
modalities: vision and tactile. The robot can be guided to approach the target objects based on
visual feedback, while tactile feedback provides useful information about the interaction
between objects and the environment. These two modalities are complementary for
contact-rich tasks. To this end, this project aims to explore multimodal representation learning,
which could be a useful tool for developing a robust and task-related representation to facilitate
reinforcement learning with high efficiency.

134 [co-created] Design of a two-finger robotic hand with tactile feedback


Supervisor: Dandan Zhang, Nathan Lepora

Project ID: 134 Contact: ye21623@bristol.ac.uk

Programme: MSc Robotics

Description:
This project aims to develop a two-finger robotic hand, which can be used to grasp deformable
target objects such as plastic bottles with various shapes. The robotic hand should integrate
tactile sensors to provide feedback to the controller, which paves a way for the development
of an adaptive control algorithm to determine the best grasping strategy.

135 Soft robotic hand for in-hand manipulation


Supervisor: Dandan Zhang, Jonathan Rossiter

Project ID: 135 Contact: ye21623@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics

Description:
This project aims to develop a soft robotic hand and demonstrate its potential for in-hand
manipulation, that is, manipulating an object within one hand. This is a vitally important
capability for robots accomplishing real-world tasks, so it is worthwhile to investigate robotic
dexterous in-hand manipulation to reconfigure an object. Soft robotic hands enable robots to
grasp delicate or soft objects of varying shape, size, and pose without explicit knowledge, and
are promising for applications such as manipulating eggs or fruit. Machine learning approaches
can be explored to mitigate the need for complex planning and control when using a soft
robotic hand with dexterous fingers for in-hand manipulation.

136 Surgical Gesture Segmentation and Recognition for Medical Robotic Applications

Supervisor: Dandan Zhang, Weiru Liu

Project ID: 136 Contact: ye21623@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics

Description:
Minimally invasive surgery mainly consists of the execution of a series of specific sub-tasks,
which can be further decomposed into basic gestures or contexts. As a prerequisite for
autonomous operation, surgical gesture recognition can assist motion planning and decision-
making and construct context-aware knowledge to improve surgical robot control quality,
which may lead to better clinical outcomes. This project aims to investigate Bayesian inference
and explainable machine learning approaches for surgical gesture segmentation and
recognition across multiple robotic platforms. The student is expected to have basic knowledge
of machine learning and to be familiar with Python.


137 [co-created] Human-Robot Cooperative Control for Medical Robotics


Supervisor: Dandan Zhang, Chenguang Yang

Project ID: 137 Contact: ye21623@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics

Description:
Teleoperation is widely used for medical robotic platforms. To ensure seamless and intuitive
control, an optimized leader-follower control model should be built to assist the operator in a
cooperative manner: the operator retains precise control when conducting delicate or complex
manipulation, while accelerated movement to the next distant target can be achieved.
Moreover, haptic guidance should be included in the control framework. Learning-from-expert-
demonstration algorithms can be explored to obtain a pre-trained model, which can be used to
provide haptic guidance signals for instructing trainees during teleoperation or supervising
novice surgeons to enhance their surgical skills. The student is expected to have basic
knowledge of machine learning and to be familiar with Python/C++ as well as simulation.

138 Sim-to-Real Transfer Learning for Robotic Manipulation


Supervisor: Dandan Zhang, Nathan Lepora, Daniel Freer (external)

Project ID: 138 Contact: ye21623@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics

Description:
Training a deep learning model for a robotic manipulator to locate and grasp a target object is
notoriously costly and time-consuming. It requires a large amount of data, which is not feasible
for most real-world applications. This project aims to train the deep learning model in a
simulation environment and apply sim-to-real transfer learning techniques to transfer the
pre-trained model from the simulator to the physical environment. This project will involve
collaboration with industry.

139 Robotic Skill Learning for Cooking



Supervisor: Dandan Zhang, Virginia Ruiz Garate

Project ID: 139 Contact: ye21623@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics

Description:
With advances in robotic manipulation, machine learning and sensing, intelligent robots are
expected to assist humans in kitchens and restaurants. This project aims to develop assistive
robots for cooking, envisioned to replicate human skills in order to reduce the burden of the
cooking process.

140 Robot Odometry using Sparse LIDAR


Supervisor: Arthur Richards, Mickey Li

Project ID: 140 Contact: arthur.richards@bristol.ac.uk

Programme: MSc Robotics, MSc Aerial Robotics

Description:
There have been awesome breakthroughs in SLAM, where dense sensors like cameras, depth
cameras or LIDARs are used to identify and track features in the world and hence track motion.
However, those systems have extensive size, weight and power (SWaP) needs. This project will
explore a navigation idea using just a few lightweight narrow-beam rangefinders, e.g. LIDARs,
which have much reduced SWaP needs. The key is to exploit the orthogonal nature of most
built environments, i.e. that most rooms are cuboids. This observation reduces the
navigation problem to one of fault-tolerant estimation, forming a set of hypotheses about what
each rangefinder is detecting: north-south wall, east-west wall, ceiling, or nothing useful. The
project will involve a mix of analysis, simulation and hardware experiments to suit individual
interests. It will demand good software skills plus a strong background in estimation or a
related topic: sensor fusion, control, dynamics, etc. As it will be easier to start in 2D, the initial
scenario will be based on a rover.
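The hypothesis-forming step described above can be sketched in 2D, under the simplifying assumptions of a known axis-aligned rectangular room and a known robot pose; function names, the tolerance, and the geometry are all illustrative, not the project's actual estimator.

```python
import math

def wall_ranges(x, y, theta, width, height):
    """Predicted range from (x, y) along heading theta to each wall the
    beam is heading toward, in an axis-aligned rectangular room."""
    dx, dy = math.cos(theta), math.sin(theta)
    hyps = {}
    if dx > 1e-9:
        hyps["east wall"] = (width - x) / dx
    if dx < -1e-9:
        hyps["west wall"] = -x / dx
    if dy > 1e-9:
        hyps["north wall"] = (height - y) / dy
    if dy < -1e-9:
        hyps["south wall"] = -y / dy
    return hyps

def classify_beam(measured, x, y, theta, width, height, tol=0.2):
    """Pick the wall hypothesis whose predicted range best matches the
    measurement, or 'nothing useful' if no hypothesis fits within tol."""
    hyps = wall_ranges(x, y, theta, width, height)
    if not hyps:
        return "nothing useful"
    best = min(hyps, key=lambda h: abs(hyps[h] - measured))
    return best if abs(hyps[best] - measured) < tol else "nothing useful"

# Robot at (1, 1) in a 4 m x 3 m room, beam pointing along +x:
print(classify_beam(3.0, 1.0, 1.0, 0.0, 4.0, 3.0))  # east wall
```

In the project itself the pose is the unknown, so these hypotheses would feed a fault-tolerant estimator rather than assume the pose as this toy does.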

141 Behavior-based modelling and simulation for streaming drones


Supervisor: Arthur Richards, Yuan Gao, Hirad Goudarzi

Project ID: 141 Contact: arthur.richards@bristol.ac.uk

Programme: MSc Aerial Robotics


Description:
This topic focuses on scheduling arriving drones using pre-set sequencing lines. Such a system
is particularly useful in crowded landing situations. The lines can be designed straight or
curved, but they should not be unique. Key questions include: how to design the sequencing
lines; where to set the merge point; and how newly arriving drones merge into the existing
sequence. To model this sequencing system, behaviour-based concepts (e.g., behaviour trees,
finite state machines) can be used as the design basis. All arriving drones can use the same
behaviour-based model, but their control models may differ: a behaviour-based controller or
decentralized control without a controller. The streaming-drone system can be modelled
directly in code; an alternative is to use multi-agent platforms such as AnyLogic or Unity3D,
which provide convenient functions for behaviour modelling, state transitions, message
transmission, and data collection.
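As a sketch of the behaviour-based modelling suggested above, a finite state machine for one arriving drone might look like the following; the states, events and transitions are hypothetical, chosen only to illustrate the pattern.

```python
# Allowed (state, event) -> next-state transitions for one arriving drone.
# All names are hypothetical illustrations of a sequencing-line FSM.
TRANSITIONS = {
    ("cruise", "line_assigned"): "follow_line",
    ("follow_line", "reached_merge_point"): "in_sequence",
    ("in_sequence", "cleared_to_land"): "landing",
    ("in_sequence", "conflict"): "hold",
    ("hold", "conflict_resolved"): "in_sequence",
}

class ArrivalDrone:
    def __init__(self):
        self.state = "cruise"

    def handle(self, event):
        """Advance the state machine; events with no matching
        transition from the current state are ignored."""
        self.state = TRANSITIONS.get((self.state, event), self.state)

d = ArrivalDrone()
for e in ["line_assigned", "reached_merge_point", "cleared_to_land"]:
    d.handle(e)
print(d.state)  # landing
```

A behaviour-tree formulation would replace the flat transition table with composable sequence/fallback nodes, which scales better as the arrival logic grows.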

142 Speaking microbial language [2 students]


Supervisor: Jiseon You

Project ID: 142 Contact: jiseon.you@uwe.ac.uk

Programme: MSc Robotics

Description:
A certain type of microorganism, known as electrochemically active bacteria, responds to an
external electrical load. By changing the external load, the corresponding change in their
response signal (i.e., electrical current) can be observed. In this project, this microbial language
is translated into electronic signals to help us understand the world of microorganisms.
Students are expected to build an electronic controller with a built-in function for dynamic
reconfiguration of the electrical connections.

143 Eating, sensing, dying like a beetle [2 students]


Supervisor: Jiseon You

Project ID: 143 Contact: jiseon.you@uwe.ac.uk

Programme: MSc Robotics

Description:
Beetles (order Coleoptera) are the most common insects in the world. They interact with their
ecosystems in several ways: beetles often feed on plants and fungi, break down animal and
plant debris, and eat other invertebrates. In this project, we want to create a ‘living’ robot
beetle that can break down organic substances, sense its surroundings, move as needed, and
eventually complete its life and then be decomposed by nature.

144 Biological robotic tongue


Supervisor: Jiseon You

Project ID: 144 Contact: jiseon.you@uwe.ac.uk

Programme: MSc Robotics

Description:
In this project, students are expected to build biological robotic tongue using the microbial fuel
cell (MFC) technology. MFCs convert chemical energy locked inside organic matters into
electricity. Their electrical output can be affected by several factors, including the properties of
organic feedstock. The robot tongue will detect certain changes in feedstock, such as
concentration change and inflow of unwanted substances.

145 Design of an interface for lower-limb rehabilitation using the Franka arm

Supervisor: Virginia Ruiz Garate

Project ID: 145 Contact: virginia.ruizgarate@uwe.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics

Description:

146 Upper-limb rehabilitation using Franka robotic arm


Supervisor: Virginia Ruiz Garate, Dan Withney

Project ID: 146 Contact: virginia.ruizgarate@uwe.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics


Description:

147 Self-powered pH-responsive mechanical meta-materials [2 students]


Supervisor: Amir Bolouri

Project ID: 147 Contact: amir.bolouri@brl.ac.uk

Programme: MSc Robotics

Description:
The aim is to develop 4D printing of pH-responsive structures that work with pH-responsive
polymer gel actuators. In general, these gels show limited actuations and work capacity from
the swollen to the unswollen states. By combining multi-material 3D printing and advanced
geometrical modelling, we will develop new, bio-inspired actuation mechanics and 3D
architectures to transform the small deformation in pH-responsive gels into a large
displacement. This will produce pH-responsive 4D soft structures.

148 Tactile-based learning for human-robot interaction


Supervisor: Nathan Lepora, Efi Psomopoulou

Project ID: 148 Contact: n.lepora@bristol.ac.uk

Programme: MSc Robotics

Description:
Robot interaction with the environment and humans highly depends on force and energy
exchange. Being able to accurately estimate the interaction forces is then essential. This project
will focus on building and comparing force/torque models from an optical tactile sensor.
Applications can potentially include robot grasping and manipulation or physical human-robot
interactions. Prior Python knowledge is essential. During the project, the student will need to
become familiar with machine learning libraries and possibly CAD software.
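As a hedged illustration of the force-model building mentioned above, the sketch below fits a linear map from tactile-sensor features to forces by least squares; the data are entirely synthetic stand-ins, and a real optical tactile sensor would typically need a nonlinear (e.g. learned) model.

```python
import numpy as np

# Synthetic stand-in for calibration data: rows of tactile-sensor
# features X paired with ground-truth forces F from a force/torque sensor.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))            # 10 image-derived features
true_map = rng.standard_normal((10, 3))       # hidden feature -> force map
F = X @ true_map + 0.01 * rng.standard_normal((500, 3))  # Fx, Fy, Fz

# Fit a linear force model by least squares on 400 samples and
# check the held-out prediction error on the remaining 100.
W, *_ = np.linalg.lstsq(X[:400], F[:400], rcond=None)
err = float(np.abs(X[400:] @ W - F[400:]).max())
print(round(err, 3))
```

The same train/held-out pattern carries over directly when the linear map is replaced by the machine-learning models the project will compare.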

149 Mechanical Metamaterials for Soft Robots


Supervisor: Jonathan Rossiter, Mohammad Naghavi Zadeh


Project ID: 149 Contact: jonathan.rossiter@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
Mechanical metamaterials exhibit properties not present in natural matter and therefore
provide a much broader design domain and capabilities than conventional materials. Some of
these properties, such as negative Poisson's ratio and negative stiffness, can be realized using
soft matter. Soft meta-structures can benefit soft-robotic applications such as grasping,
gripping and locomotion in operations like handling and packaging, search and rescue,
transport, and medical and health care. This project addresses the design, analysis, and
prototyping of soft robotic structures/mechanisms using mechanical metamaterials. The scope
of the project is broad, and the candidate will have the chance to delve further into the details
and focus on an application of interest. The project will be executed within the Soft Lab at the
BRL, and training in the use of simulation software and manufacturing equipment will be
provided. The candidate will also benefit from the state-of-the-art testing and manufacturing
facilities at the BRL.

150 Mechanical Metamaterials for Soft Robots


Supervisor: Mohammad Naghavi Zadeh, Jonathan Rossiter

Project ID: 150 Contact: m.naghavizadeh@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
Mechanical metamaterials exhibit properties not present in natural matter and therefore
provide a much broader design domain and capabilities than conventional materials. Some of
these properties, such as negative Poisson's ratio and negative stiffness, can be realized using
soft matter. Soft meta-structures can benefit soft-robotic applications such as grasping,
gripping and locomotion in operations like handling and packaging, search and rescue,
transport, and medical and health care. This project addresses the design, analysis, and
prototyping of soft robotic structures/mechanisms using mechanical metamaterials. The scope
of the project is broad, and the candidate will have the chance to delve further into the details
and focus on an application of interest. The project will be executed within the Soft Lab at the
BRL, and training in the use of simulation software and manufacturing equipment will be
provided. The candidate will also benefit from the state-of-the-art testing and manufacturing
facilities at the BRL.

151 Development of fully soft Bubble Artificial Muscles to create an assistive device for human body motion


Supervisor: Jonathan Rossiter, Richard Suphapol Diteesawat, Nahian Rahman

Project ID: 151 Contact: jonathan.rossiter@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
The bubble artificial muscle (BAM) is a flexible, lightweight, pneumatically driven actuator
created for human-body-assist applications. It can be customised to deliver either high
contraction or high tension, and its design can be optimised to maximise actuation
performance. The BAM has previously been used to assist lower-limb activities, such as knee
flexion for walking [1] and knee extension for sit-to-stand transitions [2]. However, the BAM
requires rigid rings to configure the actuator shape and produce contractile motion, and these
rigid components can cause injury. This project will explore a new approach to fabricating a
BAM made purely from fabric materials without additional rings, for example by creating
uniform folds in the material using a heat-sealing method to replace the rings. This will result
in a 100% soft, flexible fabric-based actuator, making the BAM safer and more user-friendly.
References: [1] https://ieeexplore.ieee.org/abstract/document/8404950 [2]
https://www.liebertpub.com/doi/full/10.1089/soro.2019.0157

152 Exploring designs and patterns of 2D pneumatic actuator for various robotic applications

Supervisor: Jonathan Rossiter, Richard Suphapol Diteesawat, Nahian Rahman

Project ID: 152 Contact: jonathan.rossiter@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
Pneumatic actuators have been widely used in robotics over several decades thanks to their
inherent flexibility, light weight, high force-to-weight ratio, and simple power source. This
project will explore different designs and patterns of pneumatic actuators made of 2D fabric
material. When inflated, fabricated 2D fabrics form pneumatic actuators with different 3D
structures and actuation behaviours. Combining or connecting different 2D patterns can yield
useful 3D structures, for example [1]. This concept can also embed morphological computation
into a structure, meaning that actuation control is delegated to the actuator's structure, which
passively changes its shape or actuation behaviour in response to very simple inputs. The
outcome of this project can benefit many robotic fields, such as medical robots, robotic
manipulators, and locomotion robots. Reference: [1]
https://dl.acm.org/doi/abs/10.1145/2984511.2984520

153 Rapid stiffening capability of flexible 2D and 3D electroactive structures


Supervisor: Jonathan Rossiter, Richard Suphapol Diteesawat, Nahian Rahman

Project ID: 153 Contact: jonathan.rossiter@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
This project is to create a stiffening structure by means of electrostatic actuation [1]. The
concept of instantly stiffening a 2D or 3D structure using an electrical power source is
intriguing and would be very useful for a wide range of robotic applications: general soft
robotics, locomotion robots, medical applications, etc. The project will initially investigate the
electroadhesion effect between two parallel conductive plates, followed by developing
actuators with different patterns or shapes in 2D or 3D. The initial actuator prototype can be a
ribbon-like shape. The outcome will be a flexible actuator that can be easily deformed into any
shape by external loads and can maintain its shape while electrical power is supplied.
Reference: [1] https://ieeexplore.ieee.org/document/8613892

154 Tendon-driven stiffening structure


Supervisor: Jonathan Rossiter, Nahian Rahman, Richard Suphapol Diteesawat

Project ID: 154 Contact: jonathan.rossiter@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
This project will investigate mechanical designs that enhance the stiffening of a tendon-driven
flexible structure at a given configuration. Tendon-driven actuators are quite common and
have been explored in robotics (e.g., medical catheters [1], teleoperation, grippers) over
several decades, thanks to their ability to steer and bend in different planes. They are often
under-actuated and provide a simple, low-cost solution for various applications. However, it is
difficult for a tendon-driven structure to hold a particular configuration or state in its
workspace. In this study, several tendon routing mechanisms (exploiting 3D printing),
tensegrity, and auxetic structures will be explored to stiffen the structure at a given instance by
activating a tendon actuation. Once a suitable design is obtained, a further investigation to
reduce friction, hysteresis, and coupling effects (between proximal and distal segments) will be
conducted. Reference: [1] Y. Kim, S. S. Cheng, M. Diakite, R. P. Gullapalli, J. M. Simard and J. P.
Desai, "Toward the Development of a Flexible Mesoscale MRI-Compatible Neurosurgical
Continuum Robot," in IEEE Transactions on Robotics, vol. 33, no. 6, pp. 1386-1397, Dec. 2017,
doi: 10.1109/TRO.2017.2719035.

155 Bistable soft structures inspired by bistable mechanisms


Supervisor: Jonathan Rossiter, Nahian Rahman, Richard Suphapol Diteesawat

Project ID: 155 Contact: jonathan.rossiter@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
Bistable mechanisms [1] have two stable states and switch between them when certain
conditions are met. In most bistable structures, force is applied to obtain the shift between
states. These structures are useful, and many robots exploit them to obtain features such as
locking and stiffening. Taking inspiration from rigid robots, this study will develop semi-rigid
and soft bistable structures that can potentially be used as soft locks, soft brakes, and soft
stiffening structures. For example, these mechanisms could benefit a soft locomoting robot
that changes its motion pattern when entering different environments, or a wearable device
that alters the assistance it provides at different body postures. Reference: [1]
https://tinyurl.com/yckw84ma

156 Intelligent Visuo-Tactile Robotic Skin with Adaptive Morphology


Supervisor: Jonathan Rossiter, Qiukai Qi

Project ID: 156 Contact: jonathan.rossiter@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics, MSc Aerial Robotics

Description:
Touch (tactile) sensing helps robots acquire rich information about the environment (e.g.,
hardness) during their interactions [1]. In principle, stimuli are first mediated by tactile sensors
before being further processed for decision making by the central nervous system (CNS). In
this perspective, the morphology of a sensor is of great importance, as it affects the sensor's
characteristics and properties [2-3]. A tactile sensor with an optimized or adaptive morphology
can not only improve perception performance under different circumstances but also reduce
the cost of computation. For example, in humans, the varying density of tactile sensing
elements (mechanoreceptors) across the hand (denser in the fingertips and sparser elsewhere)
results in different receptive fields and sensitivities. This ensures that our brain can focus on
processing data from the most important areas (the fingertips), thereby reducing the
computational burden. Skinflow [4] is a visuo-tactile sensor developed in our lab (SoftLab,
Bristol Robotics Laboratory). While it has been demonstrated successfully in tasks such as
shape reconstruction, its morphology is not yet optimized. This project will seek to optimize
the morphology of Skinflow, making it adaptive, and further demonstrate the effectiveness of
such a design in a range of robotic tasks boosted by prevailing artificial intelligence techniques.
Specifically, the student will: 1. Design and fabricate the Skinflow sensor with an adaptive
morphology. 2. Design an experimental setup to characterize the developed sensor. 3. Develop
techniques based on analytical modelling and machine learning to demonstrate the
effectiveness of the design in robotic applications, e.g., shape classification. Candidates are
expected to have a background in mechanical engineering, information science, computer
science or related disciplines, with enthusiasm for research and investigating new areas.
Preferred skills include 3D modelling, programming (Python, MATLAB, etc.), and finite element
method (FEM) modelling. References: [1] Q. Li, et al., "A Review of Tactile Information:
Perception and Action Through Touch", IEEE Transactions on Robotics, 2020. [2] A. B. Vallbo,
et al., "Properties of Cutaneous Mechanoreceptors in the Human Hand Related to Touch
Sensation", Human Neurobiology, 1984. [3] F. Iida, et al., "Adaptation of Sensor Morphology:
An Integrative View of Perception from Biologically Inspired Robotics Perspective", Interface,
2016. [4] G. Soter, et al., "Skinflow: A Soft Robotic Skin Based on Fluidic Transmission",
Proceedings of the IEEE International Conference on Soft Robotics, 2019.

157 Dynamic Motion Primitive realisation with mechanical springs for effective locomotion

Supervisor: Helmut Hauser

Project ID: 157 Contact: helmut.hauser@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics

Description:
Biological systems use central pattern generators (CPGs) to produce rhythmic behaviours like
locomotion. Based on this principle, Ijspeert et al. [1] proposed a clever learning framework
called Dynamic Movement Primitives (DMPs). At its basis it uses an abstract limit cycle and
superimposes a nonlinear function to learn rhythmic target trajectories. This project will
explore in simulation the exploitation of physical springs as a basis for DMPs. The project can
potentially be expanded to implement the results on a real robot. Requirements: programming
knowledge of Python; understanding of nonlinear dynamical systems. [1]
https://link.springer.com/chapter/10.1007/11008941_60
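Only the function-approximation half of a rhythmic DMP is sketched below, not the full transformation-system dynamics: normalized von Mises basis functions over a phase variable are fitted by least squares to reproduce a periodic target pattern. The basis width, number of centres, and target shape are illustrative assumptions.

```python
import numpy as np

def basis(phi, centers, h=10.0):
    """Normalized von Mises basis activations over phase, of the kind
    used as the forcing-term basis in rhythmic DMPs."""
    psi = np.exp(h * (np.cos(phi[:, None] - centers[None, :]) - 1.0))
    return psi / psi.sum(axis=1, keepdims=True)

# Learn weights that reproduce a target rhythmic pattern over one cycle.
phi = np.linspace(0, 2 * np.pi, 200, endpoint=False)
target = np.sin(phi) + 0.3 * np.sin(3 * phi)   # example target pattern
centers = np.linspace(0, 2 * np.pi, 15, endpoint=False)
Psi = basis(phi, centers)
w, *_ = np.linalg.lstsq(Psi, target, rcond=None)

reproduced = Psi @ w
err = float(np.abs(reproduced - target).max())
print(round(err, 4))
```

In a full DMP this weighted basis would force a stable limit-cycle system; the project's question is whether physical springs can play the role of that underlying dynamical substrate.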

158 Exploitation of Morphological Features with Predictive Information Maximisation

Supervisor: Helmut Hauser

Project ID: 158 Contact: helmut.hauser@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics

Description:
It has been shown that predictive information maximisation can be used to exploit existing
complex dynamics to obtain coordinated behaviour like locomotion, e.g. [1]. In addition, it has
been demonstrated that simple compliant systems can exhibit surprisingly complex dynamics
that can be exploited for computation and control [2]. This project combines both observations
and systematically explores in simulation how predictive information maximisation can be
used to find interesting and useful locomotion gaits in compliant robots. The project can be
extended to implement the findings on a real robot. Requirements: programming knowledge
of Python; interest in machine learning. [1] Martius, G., Jahn, L., Hauser, H. & Hafner, V.,
"Self-exploration of the Stumpy Robot with Predictive Information Maximization", in From
Animals to Animats 13 (eds. del Pobil, A. P. et al.), 8575, 32-42 (Springer International
Publishing, 2014). [2] Hauser, H., Ijspeert, A., Füchslin, R., Pfeifer, R. & Maass, W., "Towards a
theoretical foundation for morphological computation with compliant bodies", Biological
Cybernetics, 105, 355-370 (2011).
https://link.springer.com/article/10.1007/s00422-012-0471-0

159 Phonetic alphabet generation with Soft Structures


Supervisor: Jonathan Rossiter, Andrew Hinitt

Project ID: 159 Contact: jonathan.rossiter@bristol.ac.uk

Programme: MSc Robotics

Description:
People converse readily, sparing no thought for how they form words with their mouths. The
oral cavity is a complex soft structure with a mix of static semi-rigid surfaces, soft tissues, and
actuating muscles. The tongue is a muscular hydrostat that enables the shaping of many of the
phonetic sounds we produce. To form words, the tongue, lips and surfaces must interact
dynamically in concert. This project aims to develop a simulator to generate individual
phonetic sounds. Some modelling and real-world simulators have been produced with limited
success. We want to take this further and use state-of-the-art soft actuators to mimic one of
Nature's greatest tools.

160 Peristaltic Robot for Debris Clearance



Supervisor: Jonathan Rossiter, Karen Yue

Project ID: 160 Contact: jonathan.rossiter@bristol.ac.uk

Programme: MSc Robotics

Description:
Removal of debris is a key process in robotic systems (from medical to industrial). Soft robots
are of particular interest for tunnelling applications due to their ability to adapt and avoid
obstacles in ways current methods cannot. In a tunnelling robot, debris must be removed to
avoid jamming and loss of tunnelling capability. This has parallels in Nature with tunnelling
animals (mammals and invertebrates). One possible method of clearing debris is through
peristalsis - a series of wave-like contractions that propagate down a body. This action is found
within the human digestive tract and for locomotion in worms. This project aims to develop a
soft system to remove the debris formed by a tunnelling robot through a peristaltic action.

161 Coverage path planning of a marsupial UAV-ground robot [2 students]


Supervisor: Manuel Giuliani/Arthur Richards, Jennifer David

Project ID: 161 Contact: manuel.giuliani@uwe.ac.uk

Programme: MSc Robotics, MSc Aerial Robotics

Description:
The project proposes to use a UAV tethered to a ground robot (preferably a wheeled mobile
robot such as a TurtleBot) for inspecting a nuclear decommissioning plant. The tether could be
physical or virtual: for example, the UAV's battery limitation can be addressed with a physical
cable from the ground robot, or a formation controller can act as a virtual tether between the
robots. Some areas of the environment, such as a small tunnel, can be explored only by the
ground robot, whereas over-bridges can be explored only by the UAV, and other areas by both
robots. We want to map/inspect/coverage-plan the given environment with this setup.
Research questions: students can come up with their own questions based on the area they
want to explore. Some possible questions/expected tasks are: setting up the marsupial system,
either virtually or physically; localizing the UAV relative to the ground robot without GPS;
merging the UAV's camera-based 2.5D map with the ground robot's Lidar-based 2D map for
navigation; coverage path planning for the given area; and determining how many marsupial
robots are needed for coverage planning/mapping of a given area. See
https://www.wired.com/video/watch/marsupial-robot


162 Cat Swarm Optimization in Swarm Robotics


Supervisor: Chanelle Lee

Project ID: 162 Contact: c.l.lee@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics

Description:
Cat Swarm Optimization (CSO) is inspired by the behaviour of cats and was originally proposed
by Chu et al. (2006). Cats often appear lazy and spend much of their time resting; however, in
this state they remain aware of their surroundings, and once they identify a target they
pounce. CSO combines these seeking and tracing modes. This project proposes using Cat
Swarm Optimization in a multi-robot system to complete a foraging task. We predict that CSO
will minimise energy usage, an important consideration in scenarios where regular recharging
might be difficult. This project would suit a student looking for a balance of theoretical work
and programming simulations. It will involve writing large-scale simulations in Python (or
MATLAB). Extensions to the project could include using more realistic robotic simulation
software such as V-REP.
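A minimal, hedged sketch of CSO's two modes on a toy objective (the sphere function) is given below; the mixture ratio, seeking range and its decay, and the velocity inertia are illustrative choices, not parameters from Chu et al.

```python
import random

def sphere(x):
    """Toy objective: minimum 0 at the origin."""
    return sum(v * v for v in x)

def cso(dim=3, n_cats=10, iters=100, mix_ratio=0.2, seed=1):
    """Minimal Cat Swarm Optimization sketch: each iteration, a few cats
    'trace' (move with velocity toward the best-known position) while the
    rest 'seek' (greedy local search over perturbed copies of themselves)."""
    rng = random.Random(seed)
    cats = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_cats)]
    vels = [[0.0] * dim for _ in range(n_cats)]
    best = min(cats, key=sphere)[:]
    for it in range(iters):
        sigma = 0.3 * 0.97 ** it  # shrinking seeking range
        for i, cat in enumerate(cats):
            if rng.random() < mix_ratio:  # tracing mode
                for d in range(dim):
                    vels[i][d] = 0.7 * vels[i][d] + rng.random() * (best[d] - cat[d])
                    cat[d] += vels[i][d]
            else:                         # seeking mode: keep best of 5 copies
                copies = [[v + rng.gauss(0.0, sigma) for v in cat]
                          for _ in range(5)]
                cats[i] = min(copies + [cat], key=sphere)
        cand = min(cats, key=sphere)
        if sphere(cand) < sphere(best):
            best = cand[:]
    return best

best = cso()
print(sphere(best))
```

For the foraging task, the objective would be replaced by a measure of forage collected per unit energy, with each "cat" standing for a candidate robot behaviour.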

163 Evaluating the use of haptic feedback modality for human-in-the-loop multirobot teams [2 students]

Supervisor: Manuel Giuliani, Joseph Bolarinwa

Project ID: 163 Contact: manuel.giuliani@uwe.ac.uk

Programme: MSc Robotics

Description:
Haptic feedback modalities have been employed to enable effective robot tele-operation.
Haptic feedback may be used to provide force feedback, or other measured sensor parameters
via sensory substitution. Where a team of heterogeneous robots is used to carry out tasks, say
for nuclear decommissioning, introducing a human tele-operator may improve the efficiency
with which the tasks are carried out: the tele-operator may take control of any of the robots to
carry out a task. This project proposes the use of a wrist-worn haptic feedback device to
provide information about measured parameters to the tele-operator (via sensory
substitution) when the tele-operator takes manual control of a robot. Knowledge of C, C++ or
Python will be beneficial. A user study will be required to evaluate the usability of the haptic
feedback modality.


164 Determining human intervention for multirobot decision making [2 students]

Supervisor: Manuel Giuliani, Joseph Bolarinwa

Project ID: 164 Contact: manuel.giuliani@uwe.ac.uk

Programme: MSc Robotics

Description:
Several decisions are made when a team of heterogeneous robots carries out tasks. The
scenario changes when a human tele-operator is introduced into the setup. The tele-operator
may be able to intervene at different stages of a task to improve the efficiency with which the
team of heterogeneous robots carries it out. This project proposes a user study to investigate
the need for human intervention, the types of intervention, and how different tele-operators
perceive the need for intervention. The project may be carried out in simulation or as a
real-world implementation. Knowledge of Python or C++ and ROS will be an advantage.

165 Robot companions


Supervisor: Paul Bremner, Joe Daly

Project ID: 165 Contact: paul2.bremner@uwe.ac.uk

Programme: MSc Robotics

Description:
Many visions of the future feature people living with robot companions, but how do we get
there? Knowing how people are affected by interactions with a robot will be an essential part
of the successful emergence of this future.
The aim of this project is to explore how people interact with, and perceive a robot companion
with a focus on how emotions can be expressed by a robot, and what effect this has on people.
The project will include developing behaviours for the Vector robot using Python and Raspberry
Pi.
There are multiple possible lines of investigation to suit your interests/expertise. Some
example research areas your project could explore are:
• How ethical is it for robots to show emotions without actually feeling them?
• How well do people recognise and respond to robot emotion when they are not expecting it?
• How do robot emotions affect the mood of people interacting with the robot?

166 [co-created] Assessment of human-computer interaction using vision-based behavioural cues
Supervisor: Wenhao Zhang

Project ID: 166 Contact: wenhao.zhang@uwe.ac.uk

Programme: MSc Robotics

Description:
When actively or passively interacting with a computer, a human user might exhibit different
behaviours, via head movements and eye movements, which can indicate the state of an
interaction. For example, frequent head rotations and eye movements may be signs of
ineffective human-robot communication (active interaction), and frequent eye blinks by a
driver might indicate drowsiness (passive interaction). This project aims to use a deep learning
model to estimate head poses, eye centre positions, and eye openness from images and videos
in the context of HCI assessment.
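
Eye openness is commonly summarised by an eye aspect ratio (EAR) computed over eye landmarks. As an illustrative sketch only (not part of the project brief), the following assumes six per-eye landmark coordinates are already produced by some landmark-detection model; the 0.2 threshold and two-frame minimum are arbitrary tuning values:

```python
import math

def eye_aspect_ratio(landmarks):
    """EAR from six (x, y) eye landmarks ordered as in the common
    68-point face annotation: p1/p4 are the horizontal corners,
    p2/p3 the upper lid, p5/p6 the lower lid."""
    p1, p2, p3, p4, p5, p6 = landmarks
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    # Ratio of the two vertical openings to the horizontal width;
    # the value drops towards zero as the eye closes.
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def count_blinks(ear_series, threshold=0.2, min_frames=2):
    """Count blinks as runs of >= min_frames consecutive frames
    with EAR below the threshold."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:
        blinks += 1
    return blinks
```

A blink-frequency estimate over a video is then just `count_blinks` applied to the per-frame EAR series.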

167 Effect of Spatial Density on Swarm Problem Solving


Supervisor: Paul O'Dowd

Project ID: 167 Contact: paul.odowd@bristol.ac.uk

Programme: MSc Robotics, MSc Aerial Robotics

Description:
Swarm robotic systems are intended to utilise hundreds, thousands or larger numbers of robots
working cooperatively. Swarm robotic systems are unusual because they specify that a
centralised control scheme should not be used (e.g., no hierarchy, central server or database).
Swarm robotic systems are interesting because they rely on relatively simple local
communication between robots as an alternative to centralised control. Robots are also
physical devices, and therefore always occupy mutually exclusive space, and they must navigate
around one another. In these terms, a swarm of robots can be considered as a dynamic
network topology - all the robots in the swarm cannot be fully connected to one another, and
the nodes of the network are free to move and alter their connectivity. The aim of this project
is to model how the mobility of swarm robots affects the performance of a dynamic network
topology. To assess the performance of the network topology a collective problem is required
for the swarm. This project will implement a distributed evolutionary algorithm (EA) to operate
across a swarm of robots. The study will initially benchmark the performance of the distributed
EA on known fixed network topologies common to computer science, such as fully-connected,
ring, star and hypercube connectivity. Then, a comparison will be made to a swarm of robots
running the same distributed algorithm. The swarm will be experimentally validated across
different degrees of robot mobility. As an outcome of this research, it should be possible to
make recommendations on the degree of mobility (or spatial density) of swarms of robots
when communication underpins collective problem solving. This project will involve both
simulation and real robots, with a swarm of Pololu 3Pi+ robots fitted with Infra-red
communication as an available candidate.
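
To make the benchmarking step concrete, here is a minimal, hypothetical cellular EA on a OneMax benchmark with one individual per node, shown for two of the named fixed topologies (ring and fully connected). All parameters are illustrative, not the project's specification:

```python
import random

def topology(kind, n):
    """Neighbour lists for fixed benchmark topologies."""
    if kind == "ring":
        return {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
    if kind == "full":
        return {i: [j for j in range(n) if j != i] for i in range(n)}
    raise ValueError(kind)

def onemax(bits):
    return sum(bits)

def distributed_ea(kind, n_nodes=10, n_bits=20, gens=60, seed=1):
    """Each generation, every node recombines with its best
    neighbour, mutates one bit, and keeps the better of parent
    and offspring (elitist, fully decentralised update)."""
    rng = random.Random(seed)
    nbrs = topology(kind, n_nodes)
    pop = {i: [rng.randint(0, 1) for _ in range(n_bits)]
           for i in range(n_nodes)}
    for _ in range(gens):
        new = {}
        for i in range(n_nodes):
            mate = max(nbrs[i], key=lambda j: onemax(pop[j]))
            cut = rng.randrange(1, n_bits)
            child = pop[i][:cut] + pop[mate][cut:]
            m = rng.randrange(n_bits)      # single-bit mutation
            child[m] = 1 - child[m]
            new[i] = max(pop[i], child, key=onemax)
        pop = new
    return max(onemax(b) for b in pop.values())
```

Comparing convergence speed of `distributed_ea("ring")` against `distributed_ea("full")` is the kind of topology benchmark the project describes, before substituting the mobile swarm's time-varying neighbour lists.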

168 Self-Organised Heterogeneous Swarm Partitioning


Supervisor: Paul O'Dowd

Project ID: 168 Contact: paul.odowd@bristol.ac.uk

Programme: MSc Robotics, MSc Aerial Robotics

Description:
If the proposed benefits of swarm robotics hold true, we can imagine a future of thousands of
cheap robots being deployed to solve our problems. However, this would mean many swarms
operating and existing within the same environments. Swarms are understood to make
extensive use of their interactions with the environment and other robots to coordinate their
group-level behaviours. As such, many swarms are likely to interfere with each other’s
operation. Therefore, it would be valuable and useful if many swarms could identify which
swarm they belong to, and to organise themselves appropriately. This goal is challenging
because we attempt to avoid any centralised control or identifying attributes within swarm
systems. The algorithm that allows swarms to self-identify should therefore itself be a
self-organising and decentralised mechanism.

This study will investigate a probabilistic approach to allow individual swarm robots to develop
a belief about which swarm they belong to based on the success of their interactions with other
robots. We can roughly imagine this as analogous to perceiving “being accepted or rejected”
by a swarm. The project is likely to have a period of implementation and evaluation in
simulation before it can be progressed to an implementation on real robots. For real robots, a
swarm of Pololu 3Pi+ robots fitted with infra-red communication is an available candidate.
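
The probabilistic belief described above can be sketched as a simple Bayes update, where the likelihood of an interaction succeeding is assumed higher when the robot really belongs to the swarm. The 0.8/0.2 likelihoods are placeholder assumptions, not measured values:

```python
def update_belief(belief, success, p_succ_in=0.8, p_succ_out=0.2):
    """Bayes update of a robot's belief that it belongs to the
    swarm it just interacted with. Assumed likelihoods:
    interactions succeed with probability p_succ_in if the robot
    belongs, and p_succ_out otherwise."""
    like_in = p_succ_in if success else 1.0 - p_succ_in
    like_out = p_succ_out if success else 1.0 - p_succ_out
    post = like_in * belief
    return post / (post + like_out * (1.0 - belief))
```

Repeated successful interactions push the belief towards 1 ("accepted"), repeated failures towards 0 ("rejected"), which is the behaviour the project proposes to study on real robots.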

169 Low-cost Multi-Robot Tracking System


Supervisor: Chanelle Lee, Paul O'Dowd

Project ID: 169 Contact: c.l.lee@bristol.ac.uk

Programme: MSc Robotics, MSc Bio-Robotics

Description:
When multi-robot control architectures tested in simulation are transferred to physical robots
there is often a discrepancy in behaviour known as the reality gap. Often the reality gap
manifests as a drop in performance, meaning that to fully evaluate a control architecture it
needs to be tested on physical robots. However, multi-robot testbeds are currently expensive,
hard to access, and difficult to set up, with requirements such as a static arena, motion capture
and tracking systems, and multiple robots. Thus, there is a need for multi-robot testbeds that
are low-cost, accessible, and quick to build. In this project, we will first look at creating a
multi-robot testbed using low-cost hardware and open-source software alternatives. For
example, rather than a motion capture system we could instead use web cameras, OpenCV,
and QR codes. We will then evaluate the testbed on metrics such as the maximum number of
robots and robot pose accuracy/precision. Finally, as an extension, we could implement ways
to compensate for tracking loss or a projected information overlay. This project would suit a
student looking for a project focused on practical work. Familiarity with computer vision
libraries such as OpenCV would be useful.
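
The pose accuracy/precision evaluation mentioned above reduces to comparing tracked poses against ground truth. A minimal sketch of two such metrics (position RMSE and mean absolute heading error, with angle wrap-around handled), assuming poses are simple (x, y, theta) tuples:

```python
import math

def pose_errors(tracked, truth):
    """Accuracy metrics for a tracking system: RMSE of (x, y)
    position and mean absolute heading error, with the angular
    difference wrapped into [-pi, pi]."""
    n = len(tracked)
    sq, ang = 0.0, 0.0
    for (x1, y1, t1), (x2, y2, t2) in zip(tracked, truth):
        sq += (x1 - x2) ** 2 + (y1 - y2) ** 2
        d = (t1 - t2 + math.pi) % (2 * math.pi) - math.pi  # wrap
        ang += abs(d)
    return math.sqrt(sq / n), ang / n
```

The wrap step matters: a tracked heading of 3.1 rad against a true heading of -3.1 rad is a small error (~0.08 rad), not ~6.2 rad.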

170 Cross-Talk: Examining Local Communication in Swarm Robotics


Supervisor: Paul O'Dowd

Project ID: 170 Contact: paul.odowd@bristol.ac.uk

Programme: MSc Robotics, MSc Aerial Robotics

Description:
Swarm robotics has the governing principles of decentralised control and self-organised
behaviours. It is believed that these governing principles are facilitated through several key
mechanisms. One key mechanism is a necessity for local-communication - that communication
distance should be constrained to be local relative to the size of a swarm agent. This
mechanism has been adopted and implemented into many swarm models and swarm systems.
In this study, we will investigate the real-world complexities and performance of a local-
communication implementation. The findings will be used to improve future models and
implementations of swarm robotics. For example, a real world complexity is that we expect
there to be a degree of signal-interference when many robots are transmitting messages to
one-another asynchronously. These issues are confounded when we consider that the robots
are mobile, and the functionality of communication may depend on task and context. This
project will make a structured investigation of an Infra-red (IR) messaging technology for a
swarm of Pololu 3Pi+ robots. The platform will be quantitatively evaluated for performance
across several operational parameters. From this, recommendations for swarm
implementations will be proposed and evaluated.
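
One expected real-world complexity, asynchronous transmissions colliding, can be approximated with a slotted-ALOHA-style model (an assumption for illustration, not the 3Pi+ IR protocol): a slot carries a message only if exactly one in-range robot transmits in it.

```python
def delivery_probability(n, p):
    """Probability that a slot contains exactly one transmission
    among n in-range robots, each transmitting independently with
    probability p (collisions destroy all messages in the slot)."""
    return n * p * (1 - p) ** (n - 1)

def best_rate(n, steps=1000):
    """Grid-search the transmission probability that maximises
    delivery; analytically this is 1/n."""
    return max((delivery_probability(n, k / steps), k / steps)
               for k in range(1, steps))[1]
```

The model predicts that as robot density rises, each robot should transmit less often (roughly at rate 1/n), which is exactly the kind of density-dependent recommendation the project aims to validate empirically.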

171 Intention Recognition for Swarm Robotic Box Pushing


Supervisor: Paul O'Dowd

Project ID: 171 Contact: paul.odowd@bristol.ac.uk

Programme: MSc Robotics, MSc Aerial Robotics

Description:
Box pushing is a frequently explored area
of multi-agent systems, multi-robot systems, swarm robotics, and other areas. As a scenario,
box-pushing has an analogy to collective foraging behaviours in the natural world, such as when
many ants push or pull an object back to the nest. Developments in robotic box-pushing include
various ways to model, plan, predict and coordinate box-pushing behaviour. In swarm robotics,
such deliberative techniques and explicit communication strategies are often avoided to ensure
a decentralised and self-organising principle of operation. However, planning and
communication do not necessarily compromise decentralised control. This project will
investigate how these “higher level” robotic mechanisms can be implemented in self-organised
and decentralised ways, to benefit a swarm robotic approach. In particular, it is proposed that
swarm robots should be able to explicitly communicate their intention when engaged in box-
pushing behaviour, and the utility of this will be explored across multiple scenarios.

172 Attention Holding in Human-Swarm Interaction


Supervisor: Paul O'Dowd, Merihan Alhafnawi

Project ID: 172 Contact: paul.odowd@bristol.ac.uk

Programme: MSc Robotics

Description:
This project will use the new Mosaix Tile robots designed by Merihan Alhafnawi. These swarm
robots are fitted with cameras that allow them to make high-resolution observations of the
task environment. This study will focus on the research area of human-swarm interaction (HSI).
In HSI, we wish to investigate how a human can hold meaningful interactions with a swarm,
without compromising the core mechanisms of swarm robotic systems. These include
decentralised control and self-organisation. Therefore, it is expected that a human would not
necessarily make a direct communication to any one robot, or utilise any specific robot as a
centralised node. This raises the research question of how well a human can understand if they
are communicating with a swarm (or not). Does a human communicate with one robot, or the
swarm as a whole? How can a human know when they are interacting successfully with a whole
swarm? This project will utilise the camera of the Mosaix to implement face tracking. The
robots will therefore be able to individually perceive the presence and directed-gaze of a
human. The project will look to investigate and implement how a swarm can realise and
collectively focus their attention on a human in a self-organised way. The study will conduct
multiple experiment scenarios to understand whether a human consequently believes they can
capture and hold the meaningful attention of a swarm, and which swarm robotic mechanisms
underpin this.
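
One self-organised way for a swarm to collectively focus attention is decentralised averaging: each robot repeatedly nudges its own face-detection confidence towards the mean of its neighbours', so the whole swarm converges on a shared estimate without any central node. This is an illustrative sketch, not the project's prescribed mechanism:

```python
def consensus(values, neighbours, rounds=50, alpha=0.3):
    """Decentralised averaging consensus: each robot i moves its
    estimate a fraction alpha towards the mean of its neighbours'
    estimates. On a connected regular graph, all estimates
    converge to the swarm-wide average."""
    est = list(values)
    for _ in range(rounds):
        new = []
        for i, nbrs in enumerate(neighbours):
            avg = sum(est[j] for j in nbrs) / len(nbrs)
            new.append(est[i] + alpha * (avg - est[i]))
        est = new
    return est
```

For example, if one tile of a four-robot ring sees a face with confidence 1.0 and the others see nothing, every robot converges to 0.25 — a shared, decentralised signal that "someone in the swarm is being looked at".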

173 Objectively Interesting Robotic Drawings


Supervisor: Paul O'Dowd

Project ID: 173 Contact: paul.odowd@bristol.ac.uk

Programme: MSc Robotics

Description:
Whilst it is possible to build various machines and robots to execute (produce) drawings, it is a
much harder problem to decide what to draw. This question of “what to draw” implicates
intentionality - when we draw, we start with an idea and attempt to complete it to our
satisfaction. This becomes a process of trial-and-error - a search process - where we optimise
our actions and adjust our expectations. Furthermore, “how to draw” is a second question
that implicates reason - when we make marks, we utilise the characteristics of the drawing
materials to suggest or infer information about a subject we have observed - for example, bold
lines, hatching, or a wash of colour, all communicate different information. If a person is asked
to draw, “what” and “how” will change depending on the experience of the materials they are
given. Existing research shows that this includes a broad range of senses, including auditory,
tactile and visual perception. This makes drawing a good candidate to research embodied
artificial intelligence. This project looks to take a small step towards a robot which can
determine “what” and “how” when drawing. This project will use a load-cell sensor to allow a
robot to perceive tactile information whilst drawing. A heuristic search algorithm will be
applied to generate drawing-actions, which will then be evaluated for the resultant tactile
information produced. An implementation of Shannon’s Entropy measure of information (or
similar) will allow for a correlation between action and information rendered. The project will
evaluate whether this computational understanding of information, combined with an
optimisation of actions, is made apparent in the reality of robot drawings. Students in this
project must be proficient in C, Java and be comfortable with kinematics. The project will utilise
an existing 6dof Stewart platform built for robotic drawing.
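
The Shannon-entropy measure mentioned above can be computed directly from binned load-cell readings. A minimal sketch (bin count and signal range are arbitrary choices to tune against the real sensor):

```python
import math
from collections import Counter

def shannon_entropy(samples, n_bins=8, lo=0.0, hi=1.0):
    """Shannon entropy (in bits) of a load-cell signal after
    binning normalised readings into n_bins equal-width bins.
    A flat, unvarying tactile signal scores 0 bits; a signal
    spread evenly over all bins scores log2(n_bins)."""
    width = (hi - lo) / n_bins
    bins = Counter(min(int((s - lo) / width), n_bins - 1)
                   for s in samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in bins.values())
```

Scoring each candidate drawing-action by the entropy of the tactile signal it produced gives the heuristic search a scalar objective to maximise.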

174 Robotic fish


Supervisor: Andrew Conn, Jonathan Rossiter

Project ID: 174 Contact: a.conn@bristol.ac.uk

Programme: MSc Bio-Robotics, MSc Robotics


Description:
Autonomous underwater vehicles (AUVs) are increasingly required for harbour monitoring, to
inspect infrastructure for tidal energy harvesting and to track environmental factors such as
marine pollution. Fish-inspired robots have the potential to exploit the high hydrodynamic
efficiencies of biological swimmers, but they are typically restricted by inefficient actuation and
body form. This project aims to develop a fish-inspired robotic propulsion system that
combines high hydrodynamic and electromechanical efficiencies.

175 Embodied digital logic


Supervisor: Andrew Conn

Project ID: 175 Contact: a.conn@bristol.ac.uk

Programme: MSc Robotics

Description:
The principle of a transistor can be embodied as an electromechanical “switch” that switches
an electrical signal on and off in response to mechanical deformation. This will enable digital
logic, and hence computation, to be driven by mechanical stimuli in the bodies of robots. By
using soft robotic materials, the electromechanical switches could be flexible and fully
integrated into the robot’s body so that it can perform computation while interacting with its
environment without the need for additional sensors and centralised control. This would create
soft robotic systems that are more resistant to failure by removing the need for a single processor
(and point of failure). This project will focus on the design of a novel soft electromechanical
switch that requires no additional circuitry to respond to mechanical stimuli such as force or
strain, so that logic gates and other forms of computation can be embodied in a robotic
prototype.
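
The logic-from-deformation idea can be illustrated with a toy model: treat each electromechanical switch as a threshold on strain, and compose switches into a gate. The threshold value and the normally-closed convention are modelling assumptions, not a description of the physical device:

```python
def switch(strain, threshold=0.5, normally_closed=True):
    """Idealised electromechanical switch: the path conducts
    (returns True) depending on whether the local mechanical
    strain exceeds a threshold. A normally-closed switch conducts
    until it is 'pressed' past the threshold."""
    pressed = strain > threshold
    return pressed != normally_closed

def nand(strain_a, strain_b):
    """Two normally-closed switches in parallel: current flows
    unless BOTH are pressed, i.e. a NAND of the pressed states."""
    return switch(strain_a) or switch(strain_b)
```

Because NAND is functionally complete, a body instrumented with enough such switches could, in principle, embody any Boolean computation in response to its own deformation.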

176 Trustworthy machine learning based agile flight control of quadcopters


Supervisor: Shane Windsor

Project ID: 176 Contact: shane.windsor@bristol.ac.uk

Programme: MSc Aerial Robotics

Description:
Machine learning offers many potential advantages over conventional flight control algorithms
but it is very challenging to verify these systems and to guarantee their robustness and
performance as can be done with conventional control systems. This project will look at
approaches for using machine learning as part of a flight control system in a way that allows
robust performance to be guaranteed. The project would use the Crazyflie micro-quadcopter
flying in the BRL flight arena as the test platform for this study.

177 Bio-inspired soaring in urban environments


Supervisor: Shane Windsor

Project ID: 177 Contact: shane.windsor@bristol.ac.uk

Programme: MSc Aerial Robotics

Description:
Birds save substantial amounts of energy by using the wind flows in urban environments to
reduce their flight costs. This project will look at developing path planning algorithms which
use the wind fields around buildings to save energy for fixed wing UAV flight.

178 Bio-inspired distributed flow sensing


Supervisor: Shane Windsor

Project ID: 178 Contact: shane.windsor@bristol.ac.uk

Programme: MSc Robotics

Description:
Birds and insects have arrays of distributed force and flow sensors on their wings. We have
developed fixed wing UAVs with similar arrays of sensors to explore how this type of
information can be used as part of a flight control system. This project will look at conducting
outdoor flight testing with this platform to characterise the signal properties available from this
system and explore how these signals can be integrated into the flight control system.

179 Design through Stimulation - Designing Robot Morphologies through Mechano-stimulation of Stem Cells in Simulation
Supervisor: Helmut Hauser

Project ID: 179 Contact: helmut.hauser@bristol.ac.uk

Programme: MSc Bio-Robotics, MSc Robotics

Description:
Conventionally, robots are designed and assembled. However, this becomes non-trivial, even
unfeasible, for smaller scale robots. This project will explore an alternative approach by
obtaining robot morphologies through stimulation of stem cells. Specifically, we will exploit the
mechano-sensitivity of stem cells that drives differentiation to new phenotypes with different
morphological features, e.g., chondrocytes (“cartilage”), osteoblasts (“bones”), via responses
to different levels of mechanical stimulation. This provides a mechanism that translates
mechanical stimulation into emergent morphological designs (i.e., level of force --> level of
stiffness). The idea will be explored in simulation. We will focus on a specific proof-of-concept,
i.e., the tail for a tadpole inspired swimming robot, which will be exposed to water flow. The
level of mechanical stimulation and resulting strains will differ at different locations on the
body. As a result, at different body parts we will obtain different cell expressions and,
correspondingly, different levels of stiffness. Requirements: Programming skills (e.g. Python)

180 Machine learning-based automatic control of living cells [2 students]


Supervisor: Lucia Marucci, David Barton

Project ID: 180 Contact: lucia.marucci@bristol.ac.uk

Programme: MSc Bio-Robotics

Description:
Synthetic biologists aim to engineer new functions directly into living cells, both to better
understand endogenous processes and to repurpose cells for useful applications.
Cybergenetics, in particular, applies principles of feedback control engineering to control living
cell functions. We have developed cybergenetics platforms based on microfluidics and
microscopy to automatically steer gene expression and proliferation dynamics in mammalian
cells [Postiglione et al, ACS Synth. Biol. 2018, 7, 11, 2558-2565; Pedone et al., Nat Commun 10,
4481, 2019]. So far, we have used Relay and Model Predictive Control algorithms. The student
working on this project will attempt the use of reinforcement learning, starting with in silico
controllers and, if time allows, implementing the results experimentally in bacterial or
mammalian cells. Ideally, the student will have knowledge of advanced control theory and
Matlab/Python.
Stakeholders include members of the research groups of the supervisors, control engineers
with an interest in biological systems, and biopharma (Lucia Marucci is collaborating on related
research with AstraZeneca).
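
For orientation, the relay control used so far is the simplest of these schemes: switch the input on whenever the measured output is below the setpoint. A toy sketch on a first-order process (a crude stand-in for inducible gene expression; all dynamics and gains are invented for illustration):

```python
def simulate_relay(setpoint=5.0, u_on=2.0, decay=0.1,
                   steps=200, dt=0.1):
    """Relay (on/off) control of dx/dt = u - decay * x, a toy
    stand-in for an inducible gene-expression process: the
    inducer input u is switched fully on whenever the measured
    output x is below the setpoint, and off otherwise."""
    x, trace = 0.0, []
    for _ in range(steps):
        u = u_on if x < setpoint else 0.0
        x += dt * (u - decay * x)   # forward-Euler step
        trace.append(x)
    return trace
```

The output rises to the setpoint and then oscillates tightly around it, which is the characteristic relay-control behaviour a learned (reinforcement-learning) controller would be benchmarked against.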

181 Human-swarm interactions for intralogistics

Supervisor: Sabine Hauert

Project ID: 181 Contact: sabine.hauert@bristol.ac.uk

Programme: MSc Bio-Robotics, MSc Robotics

Description:
This project focusses on designing a user interface for a swarm of robots deployed in a
cloakroom scenario. The interface should allow a user to deposit their jacket and retrieve it.
The project will end with a small-scale user study.

182 Aerial robotics for conservation applications [5 students]


Supervisor: Tom Richardson

Project ID: 182 Contact: thomas.richardson@bristol.ac.uk

Programme: MSc Aerial Robotics, MSc Robotics

Description:

183 Safety braking and manoeuvrability of hospital beds with assisted drive
capabilities (Focus on Mechanical Engineering)
Supervisor: Tony Pipe, Snir Benedek

Project ID: 183 Contact: tony.pipe@brl.ac.uk

Programme: MSc Robotics

Description:
Hospital beds are moved frequently in hospitals, typically requiring 3 to 4 staff to be involved.
This impacts staff availability, impairs quality of service and subjects staff to contamination
risks. Decreasing the number of staff needed to transport beds can be an effective mitigating
measure.

New motorised beds are a very expensive option, while retrofitting existing beds to achieve the
same mobility function is more commercially viable.
Universal bed moving devices improve the ergonomics of mobilising beds and possibly other
types of heavy ambulatory equipment, such as bariatric hoists. Improvements in
semiconductor electronics and electric motor performance have enabled use of distributed
topology systems based on ‘intelligent wheels’ with integrated motor, electronics and self-
control.

These can be bolted onto existing platforms, giving them mobility.

Benedex Ltd is a BRL-based robotics and e-mobility company with an innovative, universal
mobility solution which can be attached to existing platforms and power their movement,
thereby decreasing the effort exerted by porters and medical staff by 70-80%.

To assess the viability of applying these safety measures, it is highly desirable to convert an
existing bed to inform the assessment of the requisite characteristics for the optimal adaptation
to the hospital environment and requirements.

This project is a redesign of the system which incorporates an active and responsive suspension
and braking system. The braking system, for example could be a cable disc brake with a servo
actuator. The Suspension system must be able to retract to enable lateral travel of the bed
without the friction of the drive wheel. This task is wide in scope and requires the newly
augmented platforms to work well together in the highly constrained ecosystem of hospitals
to truly improve staff working conditions and availability. Therefore, careful attention must be
paid to the usability of the system from a systems-engineering perspective: user interface
design, battery capacity, reliability of operation, and controller requirements.

In addition, risk assessments of operation need to be carried out, the manoeuvrability of the
bed needs to be considered, and other additions that could enhance the performance of the
system should be explored.

Benedex has conceptualised a design of such a system, shown in the rendering below. It is two
steps beyond a previous MSc Robotics project carried out in this topic area last year. It involves
an active linear actuator for applying monitored pressure of the wheel against the floor, for
traction, and also has a disc brake feature which depends on there being traction, since the
brake is useless if the wheel is not pressed against the floor. A productive task for a student on
the current project would be to suggest a way to keep the pressure constant and monitor it, to
assess how much pressure the wheel must exert against the floor to keep a 500 kg bed from
rolling down a 5° slope, and to design a powered mechanism to operate the disc brake.
Suggestions could include use of a different actuator, an air canister, or some manual lever.
Such a system would be constituted of:
• Benedex wheel and control system (that has been developed)
• Mounting bracket
• Battery mount
• Active suspension system comprising electric actuator, spring, swing arm or equivalent
mechanism
• Powered braking system
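
The slope-holding question above is a short statics calculation: friction at the braked wheel must balance the gravity component along the slope, mu * N >= m * g * sin(theta). A sketch follows; the friction coefficient is an assumed value (it must be measured for the actual wheel/floor pairing), and the model assumes the single braked drive wheel provides all the holding force while the bed's castors roll freely:

```python
import math

def required_wheel_force(mass_kg=500.0, slope_deg=5.0,
                         mu=0.6, g=9.81):
    """Minimum normal force (in newtons) the braked drive wheel
    must press against the floor so that friction alone holds the
    bed on the slope: N >= m * g * sin(theta) / mu.
    mu = 0.6 is an assumed rubber-on-vinyl coefficient."""
    return mass_kg * g * math.sin(math.radians(slope_deg)) / mu
```

With these assumptions the 500 kg bed on a 5° slope needs roughly 700 N of wheel-to-floor force, a useful first sizing figure for the suspension actuator.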
184 IoT conversion of self-powered hospital bed (Focus on Software Engineering)
Supervisor: Tony Pipe, Snir Benedek

Project ID: 184 Contact: tony.pipe@brl.ac.uk

Programme: MSc Robotics

Description:
Hospital beds are moved frequently in hospitals, typically requiring 3 to 4 staff to be involved.
This impacts staff availability, impairs quality of service and subjects staff to contamination
risks. Decreasing the number of staff needed to transport beds can be an effective mitigating
measure.

New motorised beds are a very expensive option, while retrofitting existing beds to achieve the
same mobility function is more commercially viable.

Improvements in semiconductor electronics and electric motor performance have enabled use
of distributed topology systems based on ‘intelligent wheels’ with integrated motor, electronics
and self-control.

These can be bolted onto existing platforms, giving them mobility.

Benedex Ltd is a BRL-based robotics and e-mobility company with an innovative, universal
mobility solution that can be attached to existing platforms and power their movement,
thereby decreasing the effort exerted by porters and medical staff by 70-80%.

As with other capital assets of a hospital, a fleet of bed movers needs to be monitored to
ensure units are operable and their whereabouts known, and so that control can be overridden
by the hospital IT centre. Implementing IoT in Benedex’s bed movers could be used to report
diagnostic information and location to the hospital’s monitoring system, as well as to remotely
disable all units or enact a safety protocol, such as retracting the powered wheels in case of a
fire so that the beds can be pushed without resistance, or, alternately, forcing the brakes on if
a bed leaves the area it must stay in, similar to remotely supervised supermarket trolleys in
some places.

This project requires characterisation of the data transfer system:


• Choice of wireless communication protocol, e.g., Zigbee, Wi-Fi, or Bluetooth
• Localisation Technology integration
• What parameters are being broadcast from the Hospital bed to a central server?
• Operation protocols (e.g. what to do if the battery is close to empty, or if the bed leaves an
expected area)
• Writing basic software that queries the parameters from the Benedex system and broadcasts
them
• Linking this with ROS or ROS2
• Optional extension of the project to include patient observations and monitoring in the data
packet, e.g., ECG, VO2, or a pressure sensor (to detect whether the patient is on the bed).

It should be noted that the existing platform already has a ROS node that the student engaged
in this project can subscribe to for relevant data, and publish to in order to effect behaviour,
e.g., to force the brakes to come on or to retract the motor wheel. Benedex have some
expertise in writing micro-ROS and ROS 2 nodes, so they can mentor the student when
required.
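
As a starting point for "what parameters are being broadcast", here is one candidate JSON status payload and a server-side rule covering two of the listed operation protocols (geofence breach, low battery). All field names and thresholds are illustrative assumptions, not a Benedex API:

```python
import json
import time

def status_message(unit_id, battery_pct, x, y, brake_on, wheel_down):
    """A candidate JSON payload for the bed mover's periodic
    status broadcast. Field names are illustrative placeholders."""
    return json.dumps({
        "unit": unit_id,
        "t": time.time(),
        "battery_pct": battery_pct,
        "pose": {"x": x, "y": y},         # from the localisation layer
        "brake_on": brake_on,
        "wheel_down": wheel_down,
    })

def needs_intervention(msg, allowed_area=((0, 0), (50, 30))):
    """Server-side protocol sketch: signal that the brakes should
    be forced on when a bed leaves its allowed rectangle or the
    battery is nearly empty (thresholds are arbitrary)."""
    m = json.loads(msg)
    (x0, y0), (x1, y1) = allowed_area
    outside = not (x0 <= m["pose"]["x"] <= x1
                   and y0 <= m["pose"]["y"] <= y1)
    return outside or m["battery_pct"] < 5
```

In a ROS 2 implementation, `status_message` would become the payload of a published topic and `needs_intervention` the logic behind a command published back to the platform's existing node.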

185 [co-created] Decision-making and control for simulated highway autonomous driving based on reinforcement learning
Supervisor: Ning Wang, Tony Pipe

Project ID: 185 Contact: Ning2.Wang@uwe.ac.uk

Programme: MSc Robotics

Description:
It is challenging to carry out autonomous driving safely on highways because of the numerous
dynamic vehicles. Compared with urban roads, highway structures are simpler, usually
composed of straight or gently curved roads. However, the high driving speeds require drivers
or the autonomous driving system to make decisions in a very short time. For either a driver or
an autonomous driving system, it is not easy to predict the motion of surrounding vehicles and
make correct driving decisions quickly. This project will use a reinforcement-learning-based
method to solve autonomous driving problems in highway situations such as car following,
lane changing, and overtaking. The project will use the Carla simulator to simulate the above
scenes. The ego vehicle will use multiple cameras as its data sources. The reinforcement
learning model will plan a path according to the image scenes. Then, a lower-level controller
will control the accelerator, brake and steering wheel of the vehicle based on the path. Model
evaluation will use the collision-free driving distance and average speed of the car as metrics.
The goal of this project is to achieve 10 km of collision-free driving on a highway at an average
speed of 80 km/h.
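
The two stated metrics suggest the shape of a per-step reward for the learning agent. The following is one illustrative formulation (the weights and crash penalty are assumptions the student would tune, and 80 km/h is approximated as 22.2 m/s):

```python
def step_reward(speed_mps, collided, target_speed=22.2,
                w_speed=1.0, crash_penalty=100.0):
    """Illustrative per-step reward for the stated goals:
    reward driving near the 80 km/h (~22.2 m/s) target speed and
    heavily penalise any collision, which also terminates the
    collision-free-distance metric."""
    if collided:
        return -crash_penalty
    # 1.0 at the target speed, falling off linearly with deviation.
    return w_speed * (1.0 - abs(speed_mps - target_speed) / target_speed)
```

Summed over an episode, this trades off the two evaluation metrics directly: sustained near-target speed accumulates reward, while a single collision wipes out roughly 100 good steps.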

Project Supervisors
1. Arthur Richards

Contact: arthur.richards@bristol.ac.uk
Autonomous navigation, primarily for drones. Aiming to make drones more useful in civil
applications by improving their ability to handle uncertain and/or cluttered environments.
Techniques include smart mission-planning, on-board autonomy, or both. Skills you ideally have
to work with me:
• Mathematical modelling, including any of: optimisation; control; estimation; geometry.
• Programming, including any of: C, C++, Python, MATLAB, Arduino
Projects supervised by Arthur
• Safely Automating Drone Handling [1]
• Automated filming of indoor drone experiments [2]
• Self-Replenishing Drone Team [3]
• Distributed Heartbeat based Multi-UAV Health Monitoring using Bluetooth [4]
• Drone traffic management in urban environments[5]
• Robot Odometry using Sparse LIDAR [140]


• Behavior-based modelling and simulation for streaming drones [141]

2. Edmund Hunt

Contact: edmund.hunt@bristol.ac.uk
I am a research fellow working in “swarm engineering”. My main interest is in bio-inspired swarm
robotics – I have a PhD background in biology (collective animal behaviour) and complexity (“the
whole is greater than the sum of its parts”). The main platform that I’ve used to test swarm
robotics ideas so far is the Kilobot (1000 robot swarm) at the BRL. However, this year I am aiming
to take swarm robotics into the outside world, by building a robot that works successfully in field
environments (Fenswood Farm, Long Ashton). If you are interested in participating in the design
and simulation of a cheap & easy to manufacture robot that can work in large numbers, consider
my project.
Projects supervised by Edmund
• MCMC methods for stochastic robot swarm control [6]
• Environmental sensing through minimal yes/no robot swarms [7]

3. Helmut Hauser

Contact: helmut.hauser@bristol.ac.uk

Research Area
Our research group works in the field of Morphological Computation, a concept inspired by
observations in nature, where we can see that morphological features play a crucial role in the
emergence of intelligent behaviour. We work together with biologists and chemists to
understand the underlying principles and employ them to develop fundamentally novel design
approaches to build more robust and intelligent robotic systems and sensors. Our field is closely
connected to Soft Robotics, since you need soft bodies to leverage the concept of Morphological
Computation.
Note that our group is very interdisciplinary, and group members come from a number of
different backgrounds. Furthermore, we also have strong collaborations at the international
level (USA, Japan, Europe) as well as the national level (e.g., Oxford, Cambridge and Imperial).
Skills you ideally have to work with me:
Our research is carried out in many different domains. We work on theoretical models, use
simulation tools, apply machine learning, and last but not least, we also build various prototypes.
Hence, any background is welcome as long as you are really excited about the research project.
Projects supervised by Helmut
• Improving Physical Reservoir Computing [8]
• Dynamic Motion Primitive realisation with mechanical springs for effective locomotion
[157]
• Exploitation of Morphological Features with Predictive Information Maximisation [158]
• Design through Stimulation - Designing Robot Morphologies through Mechano-
stimulation of Stem Cells in Simulation [179]

4. Hermes Gadelha

Contact: hermes.gadelha@bristol.ac.uk
Our research focuses on novel questions and unexplained phenomena in a broad range of
physical systems found in nature. We are inspired by engineering and mathematical problems
found in biology and industry, encompassing areas in small-scale fluid mechanics, solid
mechanics and elasticity, biological soft-matter, transport phenomena and diffusion, pattern
formation and self-organization, and biomimetic devices. We use a combination of analytical
derivations, numerical computations and empirical investigations in areas ranging from animal
and human fertilization, swimming of microorganisms, especially human spermatozoa and
bacterial swimming, cancer mechanics, molecular-motor dynamics, fluid-structure interactions
of intelligent structures, plus synthetic bio-inspired propellers and swimmers. Other areas
include automatic bio-imaging processing and mathematical data analysis, and unorthodox
problems in arts and humanities, such as machine learning applied to literature, mathematical
modelling of storytelling and the design of robot-writers. Interdisciplinarity is thus at the heart
of our research, and students will work within a worldwide network which spans an umbrella of
universities and industrial partners across 12 countries, from the oil industry and Nestlé to fertility
clinics and livestock companies. Group members will also be exposed to our public engagement
activities, which have generated significant worldwide recognition (BBC, Science, The Discovery
Channel, The New Scientist and TV interviews).
Skills you ideally have to work with me:
• Basic knowledge of fluid and solid mechanics
• Basic knowledge of statistical methods
• Some experience in empirical methods.
Projects supervised by Hermes
• Soft-robotics of the sperm tail [9]
• Soft-robotics spontaneous travelling waves locomotor [10]
• Microorganism inspired robots Bio-SoftRobotics [11]
• Worm-like robots [12]
• Amphibious soft-robot [13]
• Environmental control of microswarms [14]
• Modelling lateral sensory lines in fish [15]
• Artificial motile cilia [16]
• The artificial soft-robotic sperm flagellum [17]
• Microorganism inspired artificial swimmers [18]

5. Chanelle Lee
Contact: c.l.lee@bristol.ac.uk
Projects supervised by Chanelle
• Resiliency to Byzantine agents in the best-of-n decision problem [19]
• Cat Swarm Optimization in Swarm Robotics [162]
• Low-cost Multi-Robot Tracking System [169]

6. Shane Windsor

Contact: shane.windsor@bristol.ac.uk
Bio-inspired flight dynamics, sensing and control
My research looks at understanding the mechanics of animal flight and applying this
understanding to develop novel technologies for aerial robots, also known as unmanned air
vehicles (UAVs).
My research group, in Aerospace Engineering at the University of Bristol, currently has a focus
on bird flight and investigating how birds are able to fly so robustly in all wind conditions, as well
as how they can harvest energy from the wind. From this understanding of birds’ flight mechanics
and sensory processing we are developing bio-inspired technologies for aerial robots such as
wings with distributed sensors for “feeling” how an aircraft is interacting with the wind, morphing
wing air vehicles, and bird-inspired wind harvesting algorithms based on GPS tracking of gulls.
My lab uses a wide variety of techniques including high speed video analysis, wind tunnel testing,
outdoor flight vehicle testing, simulation and computational modelling.
Skills you ideally have to work with me:
• Ability to design and construct experimental systems
• Experience in programming (MATLAB)
Projects supervised by Shane
• Morphing wing UAV development [20]
• Trustworthy machine learning based agile flight control of quadcopters [176]
• Bio-inspired soaring in urban environments [177]
• Bio-inspired distributed flow sensing [178]

7. Tom Richardson
Contact: thomas.richardson@bristol.ac.uk
Projects supervised by Tom
• Multimodal robotic design for conservation applications [21]
• Aerial robotics for conservation applications [182]

8. Sebastian East

Contact: sebastian.east@bristol.ac.uk
Projects supervised by Sebastian
• Neural Model Predictive Control [22]

9. Sabine Hauert

Contact: sabine.hauert@bristol.ac.uk
We engineer swarms across scales, from nanoparticles for cancer treatment, to robots for
environmental monitoring. Central to our work is an interest in swarms that work in huge
numbers (1,000+). In fact, we have 1,000 Kilobot robots at the Bristol Robotics Laboratory. Swarm
strategies either use bio-inspiration or are automatically designed using artificial evolution. In
the future we aim to make swarms a reality in real world applications.
Skills you ideally have to work with me:
We are looking for students with a variety of backgrounds (software, hardware). The key skills
required are:
• Motivation for the project
• Independent thinking and working
• Creativity
Projects supervised by Sabine
• Morphology of a robotic swarm of cubic storage units [23]
• Robot swarms for warehouse stacking [24]
• Swarming microsystems [25]
• Augmented reality for swarming intralogistics [26]
• Robot swarms to monitor Basking Sharks [27]
• Swarm Post-its for Brainstorming [28]
• Visual navigation and object tracking for industrial swarm robots performing a stacking
task in a warehouse [29]
• A Robot Swarm to Assist British Rewilding [125]
• Collective Memory for a Trustworthy Swarm [126]
• Human-swarm interactions for intralogistics [181]

10. Paul O'Dowd

Contact: paul.odowd@bristol.ac.uk
Projects supervised by Paul
• Swarm Spatial Density Estimation [30]
• Effect of Spatial Density on Swarm Problem Solving [167]
• Self-Organised Heterogenous Swarm Partitioning [168]
• Cross-Talk: Examining Local Communication in Swarm Robotics [170]
• Intention Recognition for Swarm Robotic Box Pushing [171]
• Attention Holding in Human-Swarm Interaction [172]
• Objectively Interesting Robotic Drawings [173]


11. Nathan Lepora

Contact: n.lepora@bristol.ac.uk
I lead the Tactile Robotics research group at Bristol Robotics Laboratory. The group is
supported by a Research Leadership Award on 'A Biomimetic Brain for Robot Touch'. We work
closely with industrial partners such as Ultrahaptics, DeepMind and Shadow Robotics. Website:
www.lepora.com
The Tactile Robotics group has three main themes: (i) development and fabrication of novel 3D-
printed tactile sensors and hands; (ii) algorithms for imparting intelligence to these tactile robots;
(iii) interpretation and inspiration for these algorithms and tactile hardware from the
computational neuroscience of human and animal perception.
• Biomimetic 3D-printed tactile sensors and hands: Advances in multi-material 3D-printing
enable the design and fabrication of state-of-the-art biomimetic tactile robot sensors and hands.
Our lab’s interests primarily focus around an optical biomimetic tactile sensor (the TacTip) and
its integration into 3D-printed multi-fingered robotic grippers and hands (Figure below).
• Intelligent tactile perception: The development of robust and intelligent artificial touch is
needed for robots to interact physically with complex environments. Our lab focuses on
methods for active touch that combine artificial intelligence and tactile control. The application
of deep learning methods is having a revolutionary impact on the capabilities of these robots.
• Neuroscience of perception and action: Our research group is interested in how human and
animal perception relates to accounts of decision making from mathematics, statistics, computer
science, neuroscience, psychology and philosophy. We are particularly interested in perception
from the view of embodied cognition - that the brain and body act together to enact intelligence.
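One common formalisation of active tactile perception (illustrative here, not necessarily the group's exact method) is sequential Bayesian evidence accumulation: each tactile sample updates a posterior over percept classes until a belief threshold triggers a decision. A minimal sketch with simulated one-dimensional tap data, where the two classes and their noise model are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypothetical percept classes, each a Gaussian over one tactile feature
# (e.g. mean pin deflection on an optical tactile sensor). Illustrative only.
means, sigma = np.array([0.0, 1.0]), 0.8

def likelihood(sample):
    """p(sample | class) for each class, up to a shared constant."""
    return np.exp(-0.5 * ((sample - means) / sigma) ** 2)

def perceive(true_class, threshold=0.99, max_taps=100):
    """Accumulate evidence tap by tap until one posterior crosses threshold."""
    log_post = np.log(np.array([0.5, 0.5]))      # uniform prior
    for tap in range(1, max_taps + 1):
        sample = rng.normal(means[true_class], sigma)  # simulated tap
        log_post += np.log(likelihood(sample))         # Bayes update
        post = np.exp(log_post - log_post.max())
        post /= post.sum()                             # normalised posterior
        if post.max() >= threshold:
            return post.argmax(), tap                  # confident: decide
    return post.argmax(), max_taps

decision, taps = perceive(true_class=1)
print(f"decided class {decision} after {taps} taps")
```

The same accumulate-until-threshold structure extends naturally to active control, where the robot also chooses where to tap next to gather the most informative evidence.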
Skills you ideally have to work with me:
Willingness to combine hands-on robotics with state-of-the-art algorithms in artificial
intelligence applied to tactile perception and interaction.
PhD projects: scope for algorithm development for complete robotic systems, e.g. deep learning
applied to tactile robotic hands; algorithms from computational neuroscience could also be
considered, e.g. for goal-based learning.
MSc projects: Projects can be based primarily around hardware development. Familiarity with
CAD would be useful.
Projects supervised by Nathan
• Virtual telehaptic avatar [31]
• Human-like dexterity in robots using deep reinforcement learning [32]
• Swarm manipulation in industry [33]
• Next-generation biomimetic optical tactile sensing [34]
• Tactile-based learning for human-robot interaction [148]

12. Benjamin Ward-Cherrier

Contact: b.ward-cherrier@bristol.ac.uk
Most commercial upper-limb prosthetics do not include sensory feedback, an essential
component for regaining full functionality of the missing limb. My research addresses this
issue by developing an “Intelligent neuromorphic tactile prosthetic hand”: a low-cost 3D-printed
prosthetic hand that restores a sense of touch, which could transform the lives of 2 million hand
amputees worldwide.
My research takes a novel approach to prosthetic research, exploiting recent developments in
3D printing and event-based vision to answer an ambitious research question: How can a
functional neuromorphic (mimicking neuro-biological architectures) tactile prosthetic hand be
developed, and what are the most effective control and feedback strategies for upper-limb
prosthetics?
I adopt a tactile-centric approach separated into three main phases. Initial focus will be on
biomimetic hardware development in collaboration with iniVation (neuromorphic tactile
sensors) and OpenBionics (prosthetic hands). I will then explore shared autonomy control
strategies for the prosthetic hand in partnership with academic collaborators in robotics and
neuroscience. Finally, the neural encoding of incoming tactile signals will be investigated in
collaboration with experts in that area (Hannes Saal, University of Sheffield; Jim Dunham, UoB)
for eventual integration of the tactile feedback with the human nervous system.
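Event-based (neuromorphic) sensors such as those from iniVation emit an event only when a pixel's log intensity changes by more than a contrast threshold, rather than streaming full frames. The sketch below is a simplified software model of that encoding on a synthetic stimulus; the threshold and stimulus are illustrative, not the actual hardware's behaviour.

```python
import numpy as np

def to_events(frames, threshold=0.2, eps=1e-6):
    """Convert a frame sequence to (t, y, x, polarity) events.

    An event fires whenever a pixel's log intensity has moved more than
    `threshold` since that pixel's last event (DVS-style encoding).
    """
    ref = np.log(frames[0] + eps)              # per-pixel reference level
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        logi = np.log(frame + eps)
        diff = logi - ref
        for pol, mask in ((1, diff > threshold), (-1, diff < -threshold)):
            ys, xs = np.nonzero(mask)
            events += [(t, y, x, pol) for y, x in zip(ys, xs)]
            ref[mask] = logi[mask]             # reset reference where we fired
    return events

# Synthetic stimulus: a bright bar sweeping across an 8x8 sensor.
frames = np.full((10, 8, 8), 0.1)
for t in range(10):
    frames[t, :, t % 8] = 1.0
events = to_events(frames)
print(f"{len(events)} events from {frames.size} raw pixel samples")
```

Only the moving edge generates data, which is what makes event streams attractive for low-latency, low-power tactile and visual processing.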
Skills you ideally have to work with me:
• Experience in Python programming
• Interest in computational neuroscience

Projects supervised by Benjamin
• Neuromorphic tactile perception [35]
• Tactile feedback for prosthetic hands [36]

13. Hemma Philamore
Contact: hemma.philamore@bristol.ac.uk
Projects supervised by Hemma
• Gut feeling: Exploring the use of microbes for personality generation in robots. [37]
• Bio-inspired underwater navigation in challenging environments [38]
• Softly-non-spoken [39]
• Robotany: An artificial ecosystem of robots [94]
• Finding Nemo: Simulating an underwater robot for environmental monitoring [95]

14. Walterio Mayol-Cuevas

Contact: Walterio.Mayol-Cuevas@bristol.ac.uk
Please note: I am currently on industry secondment at Amazon headquarters in Seattle,
USA, but remain active in my research with my team at Bristol. I will be happy to consider
co-created proposals only via co-supervision with people from my team. Last year we
successfully supervised three projects in this way.
Robot vision: How to allow robots to understand the world and act on it while using visual
perception. My recent research in this area includes object pose recovery and deep models to
determine skill in people.
Visual SLAM/Mapping: I have been involved in visual mapping since 2002 and my team has been
fortunate to have worked on and transferred technology in this area to international companies,
such as Samsung (for its flagship project RoboRay). Recent work includes more intelligent
mapping and map compression.
Handheld Robotics: This is an area my group kickstarted. Handheld robots are robots that look
and behave like hand tools but are aware of the task and act to help and/or predict what the user
will do or needs help doing. Check out the videos from handheldrobotics.org or this article from
The Economist https://tinyurl.com/y9r8yr4f
Deep learning for Robotics: We are looking at developing efficient deep learning models to
recover body pose and hand pose with multiple DoF to help understand what people do; this
is intended to support better human-robot interaction and robot teaching.
New sensors: Together with colleagues from Manchester University, we are working on new
sensors and new Computer Vision methods for a new type of sensor-processor device. These we
call Pixel Processor Arrays since for every pixel we have a CPU. This is very efficient for processing
and energy consumption. Check out this ICCV paper: https://tinyurl.com/y8dvfvwp
Visual Drones: We have developed drones that can do teach-and-repeat in industry scenarios or
use cameras to build maps and track objects using the pixel processor arrays mentioned above.
Check one of the project websites: http://project-agile.org
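The efficiency of a Pixel Processor Array comes from every pixel executing the same local instruction in parallel. The hypothetical NumPy sketch below emulates that programming model, with each whole-array operation standing in for one per-pixel instruction; the real devices use analogue registers and a far more constrained instruction set.

```python
import numpy as np

def shift(img, dy, dx):
    """Neighbour access: every pixel-processor reads the PE (dy, dx) away."""
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

def ppa_edges(img, thresh=0.2):
    """Per-pixel gradient magnitude + threshold, written PPA-style.

    Each line below corresponds to one instruction executed simultaneously
    by every pixel-processor in the array.
    """
    gx = shift(img, 0, 1) - shift(img, 0, -1)   # horizontal difference
    gy = shift(img, 1, 0) - shift(img, -1, 0)   # vertical difference
    mag = np.abs(gx) + np.abs(gy)               # cheap gradient magnitude
    return (mag > thresh).astype(np.uint8)      # 1-bit edge map

# A synthetic frame: dark background with a bright square.
frame = np.zeros((64, 64))
frame[20:40, 20:40] = 1.0
edges = ppa_edges(frame)
print("edge pixels:", int(edges.sum()))
```

Because every step is a constant-time parallel operation, the cost of this whole edge detector is a handful of instruction cycles regardless of image size, which is what makes PPAs attractive for high-frame-rate drone perception.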
Skills you ideally have to work with me:
• Depending on the project: good programming skills (C/C++/C#) or good robot integration and
design (electromechanical/3D) skills.
• For a Deep Learning project, some familiarity with DL libraries and/or concepts.
• Willingness to learn and work hard, and a strong commitment to push yourself towards new
ideas and challenges.

Projects supervised by Walterio
• Visual navigation with a Pixel Processor Array [40]
• Visual perception for handheld robots [41]
• Computer Vision with Deep Learning for Wearable or Robotic systems [42]
• Visual-based semantic SLAM for navigation in dynamic environment [43]
• Visual localisation in the wild [44]

15. Wenhao Zhang
Contact: wenhao.zhang@uwe.ac.uk
My primary research is in Computer Vision. My research focus includes 3D face recognition, eye
tracking, high-resolution 3D reconstruction and pattern recognition.
A few relevant projects that I’m involved with include face recognition for ticketing at railway
stations, eye gesture recognition for human computer interaction, eye saccade analysis for
studying healthy aging and early diagnosis of disease, multispectral 3D imaging of plants, weed
detection in pasture, underfloor navigation and recognition.
Skills you ideally have to work with me:
Some experience in image processing, machine learning and programming (C++, Python,
MATLAB).

Projects supervised by Wenhao:
• Occluded face recognition [45]
• Biometric recognition from eye movements [46]
• Eye tracking in reality and virtual reality [47]
• Depth estimation from a single camera [48]
• Assessment of human-computer interaction using vision-based behavioural cues [166]

16. Roshan Weerasekera

Contact: roshan.weerasekera@uwe.ac.uk
I am a Senior Lecturer in Electronic Engineering within the Department of Engineering Design
and Mathematics at UWE. The primary focus of my research is the design and implementation of
Integrated Sensor and Computing Platforms. At UWE, I work closely with the International Centre
of Unconventional Computing and the Health Technology Hub (HTH). I am a Senior Member of
the IEEE, a Senior Fellow in the Higher Education Academy, a Member of the editorial board of
the Elsevier Microelectronics Journal, and a Member of the technical program committee of a
number of conferences.

Prior to joining UWE Bristol in May 2016, I was a Research Scientist at the A-STAR Institute of
Microelectronics (IME), the premier research organization in Singapore. At IME, I led
public/industry consortium-funded projects in the areas of 2.5D/3D IC packaging for
mobile/network applications and Integrated Cancer Cell Analysis platforms.

My teaching includes Digital Logic Design, Analogue Electronics and High-Speed Digital Electronic
Systems.

I have also worked at Lancaster University, UK (from 2008 to 2011), and at the University of
Peradeniya, Sri Lanka (from 1998 to 2000 and from 2001 to 2003).

Both my PhD and MSc degrees are from KTH, Sweden, and my BScEng degree is from the University
of Peradeniya, Sri Lanka.
Projects supervised by Roshan:
• Oscillatory Neural Networks for Robotic Manipulation [49]

17. Yaseen Zaidi
Contact: yaseen.zaidi@uwe.ac.uk
Yaseen Zaidi is a senior lecturer in the aerospace engineering cluster. He has served in various
roles on the teams of three agency-level and two academic space missions—all successfully
operational in orbit. During 7 years at Space and Upper Atmosphere Research Commission, he
was extensively trained in space engineering. Previously, he worked at the University of Cape
Town, Cape Peninsula University of Technology, and Grafton College. He was a Higher Education
Commission-approved PhD supervisor at the Institute of Space Technology. He obtained a
bachelor's degree at Louisiana Tech University, a master's at Western New England College, and
a PhD at Vienna University of Technology. He was a research fellow at the French South African
Institute of Technology. He is an associate editor of the SAE Aerospace journal. He is a fellow of
HEA, a senior member of ACM, AIAA, and IEEE, a member of INCOSE and SIAM, and a member
of the College of Reviewers of the UK Space Agency. In the Mathematics Genealogy Project, he
is a 9th generation descendant of Bessel and 10th of Gauß.
Projects supervised by Yaseen:
• Modelling of Stiff Behaviours in Spacecraft Systems with Model Based Systems
Engineering (MBSE) Framework [50]
• Rover Mission Design, Analysis and Telemetry Visualisation with OpenMCT [51]
• Steerable Thrusters on a CubeSat [52]
• Digital Factory Twinning in VR Environment [53]
• Rendezvous of a Refuelling Space Tug with OneWeb Constellation [104]
• Nukopter [105]
• Parts Tracking during Spacecraft Assembly, Integration and Test with HoloLens 2 [106]
• eVTOL Digital Twin Demonstrator [107]

18. Mark Hansen
Contact: mark.hansen@uwe.ac.uk
Applied machine vision using machine/deep learning in areas such as biometrics and agri-tech.
Also designing and building 3D acquisition systems.
Skills you ideally have to work with me:
• Enthusiasm!
• Basic Python programming
• Basic understanding of machine learning and image processing
Projects supervised by Mark:
• GANs for synthetic art production [54]
• Machine Vision in Agriculture [55]
• CNNs learning to ignore [56]
• Insect sexing and counting [57]
• Open-set (re-identification) recognition of pigs [122]

19. Manuel Giuliani

Contact: manuel.giuliani@uwe.ac.uk
I am interested in building robots that are able to understand and to collaborate with humans.
One part of my research is to investigate how the architecture of a human-robot interaction
system needs to be implemented to integrate input channels like speech recognition, gesture
recognition, and person tracking with decision making and output generation components. The
other part of my research is to study how humans perceive interactions with robots. For that,
I plan and execute user studies in which the participants interact with a robot. As part of the user
studies, I also analyse the results with statistical methods. Below you can find pictures of different
human-robot interactions that I have studied.
At the moment, I mostly work on human-robot interaction applications in nuclear
decommissioning and digital manufacturing.
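A common way to structure the kind of system described above is a shared blackboard that each input channel writes its percepts into and a decision component reads from. The sketch below is a hypothetical minimal version: the stub channels, percept names and fusion rule are all invented for illustration, not a real HRI architecture.

```python
from dataclasses import dataclass, field

@dataclass
class Blackboard:
    """Shared state that each input channel writes its percepts into."""
    percepts: dict = field(default_factory=dict)

# Stub input channels standing in for real speech/gesture/tracking modules.
def speech(bb):  bb.percepts["speech"] = "hand me the wrench"
def gesture(bb): bb.percepts["gesture"] = "pointing_left"
def tracker(bb): bb.percepts["person"] = (0.8, 1.2)   # (x, y) in metres

def decide(bb) -> str:
    """Trivial fusion rule: combine what was said with where they pointed."""
    if "hand me" in bb.percepts.get("speech", ""):
        return f"fetch object at {bb.percepts['gesture']}"
    return "idle"

bb = Blackboard()
for channel in (speech, gesture, tracker):  # a real system runs these in parallel
    channel(bb)
print(decide(bb))   # → fetch object at pointing_left
```

The design question in a real system is exactly where this sketch cheats: the channels arrive asynchronously, with noise and conflicting evidence, and the fusion and output-generation components have to handle that.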
Skills you ideally have to work with me:
• Programming experience in one of the following languages: Python, C/C++, Java
• Basic knowledge in machine learning would be beneficial
• Experience in executing user studies would be good, but you can also learn this skill
while doing a project with me.
Projects supervised by Manuel:
• BRL Reception Support Robot: Long-term operation and visitor interaction [58]
• Coverage path planning of a marsupial UAV-ground robot [161]
• Evaluating the use of haptic feedback modality for human-in-the-loop multirobot teams
[163]
• Determining human intervention for multirobot decision making [164]

20. Matthew Studley

Contact: Matthew2.Studley@uwe.ac.uk
Area of expertise: Robots, Machine Learning, Artificial Intelligence, Computer Programming,
Project Management, Starting a Business, Ethics

Projects supervised by Matthew:


• Swarm dynamic task allocation [59]
• Remote Robotics Outreach [60]

21. Tavs Jorgensen

Contact: tavs.jorgensen@uwe.ac.uk
Tavs Jorgensen is Associate Professor and AHRC Leadership Fellow at the Centre for Fine Print
Research: cfpr.uwe.ac.uk.

Tavs's core research interests are focused on exploring the design and innovation potential
presented by new digital fabrication technologies, particularly with the glass and ceramics
mediums.

He is currently in receipt of a Leadership Fellow grant from the Arts and Humanities Research
Council (AHRC), with a project investigating the use of digital tools to develop new approaches
with the ceramic extrusion process for architectural applications:


https://cfpr.uwe.ac.uk/project/smart-tooling-for-ceramic-profile-extrusion-new-approaches-to-industrial-focussed-interdisciplinary-practice-based-research/. He is also developing
interdisciplinary research into the use of 3D printing technology for glass production:
https://cfpr.uwe.ac.uk/project/glassworks-an-interdisciplinary-research-sandpit/. This research
has been funded by UWE's Collaborative Research Award.

Tavs has a background in ceramics, having initially worked as a designer for the tableware industry
with leading companies such as Wedgwood and Lenox as clients. He taught for 10 years on the
Ceramic and Glass programme at the Royal College of Art, and was also a research fellow at the
Autonomatic Research Cluster at Falmouth University. Before joining UWE he taught on the 3D
Design course at Plymouth University. He also has a very active visiting lecturing career and has
delivered lectures at leading international institutions including Massachusetts Institute of
Technology (MIT), The School of the Art Institute of Chicago, University College London,
Chulalongkorn University, and The Royal Danish Academy in Copenhagen.

Tavs currently supervises three PhD students and has undertaken multiple external research degree
examinations. He is currently external examiner for the BA 3D Design and Craft course at the
University of Brighton. He is also a member of the AHRC peer review college.

Tavs also maintains an active creative practice, with work exhibited at leading venues including the
Saatchi Gallery, Pushkin Museum, Museum of Art and Design and Salone del Mobile.
Projects supervised by Tavs:
• Robotic Assisted Bending of Ceramic Extrusions [61]
• Development of Robotic 3D printing for Architectural Ceramics [62]

22. Farid Dailami
Contact: Farid.Dailami@uwe.ac.uk
Area of expertise:
• Mechatronics
• Design
• Manufacturing
• Knowledge exchange
Projects supervised by Farid
• An Industry 4.0 demonstrator for product storage systems [63]
• Handover from robot to humans in an industrial context [64]
• Vision-assisted collaborative manufacture of jewellery [65]
• Path generation for repairing a damaged part [66]
• Support removal in metallic additive layer manufacturing [67]
• Autonomous painting robot in the construction industry [68]
• Robotic cutting of large oil platform structures [69]
• Classification of waste using machine learning [70]
• Motor operation prototype [71]
• Digital Twin for a Welding Cell [127]
• Unitree A1 Dog Surveyor [128]
• Digital twin of a motor and disc assembly [129]
• Modelling of a cable robot [130]

23. Martin Pearson

Contact: martin.pearson@brl.ac.uk
Biomimetic and Neurorobotics: I build embodied models of neural structures to gain insight
into their function and to find novel solutions to hard engineering problems. I am particularly
interested in whisker tactile sensing as a model of active touch, its role in perception and the
development of spatial memory.
Skills you ideally have to work with me:
• Computing skills: Python, C/C++, Matlab
• Tools: Gazebo, Solidworks
• Interests: Neuroscience, animal behaviour, robot platform design
Projects supervised by Martin
• Sparse training of a Deep Predictive Coding network [72]
• Encoding robot motion using spike based hardware [73]
• Olfaction for place recognition [74]

24. Ning Wang

Contact: Ning2.Wang@uwe.ac.uk
Ning Wang has extensive research and project experience in Intelligent Systems and Robotics, in
particular on Human-Robot Interaction (HRI). She received her PhD degree in Electronics
Engineering from The Chinese University of Hong Kong in 2011. She worked as a post-doctoral
research fellow on machine learning at The Chinese University of Hong Kong, and as a research
fellow on HRI with the Centre for Robotics and Neural Systems, University of Plymouth. Currently,
she is a Senior Lecturer in Robotics with the Bristol Robotics Laboratory at the University of the
West of England, United Kingdom. Ning has rich project experience: she has been a key member
of the EU FP7 project ROBOT-ERA, the EU Regional Development funded project ASTUTE 2020
and industrial projects with UK companies. Her awards include the best paper award of ICIRA’15,
a best student paper award nomination at ISCSLP’10, and an award of merit at the 2008 IEEE
Signal Processing Postgraduate Forum.
Projects supervised by Ning
• Driver intent prediction for smart driving assistance [75]
• Generalizable driving skills transfer from driver to smart vehicle [76]
• Remote assistance for smart vehicles [77]
• Mobile robot for elderly care in home environments [78]
• Decision-making and control for simulated highway autonomous driving based on
reinforcement learning [185]

25. David Western

Contact: david.western@uwe.ac.uk
My research spans informatics, mechatronics, sensing and signal processing for health
applications. I have developed new techniques for automated interpretation of various signals
from the human body, including electrophysiology (EEG, ECG, intracardiac electrograms) and
motion capture (IMUs, markerless camera-based systems). I also have experience in robotics and
software development.

Based at UWE's Health Tech Hub, I am custodian of our Living Lab, a facility to support
commercial and academic development of in-home health technologies. The Living Lab is a fully
furnished one-bed apartment that serves as a test-bed for evaluating human behaviour and/or
novel systems. It is equipped with a wide range of sensors to capture detailed information about
the environment, its occupants, and their activity. The data generated can inform research into
device usage, human behaviour, ergonomics/biomechanics, and other fields.

I am always happy to discuss potential collaborations with clinicians, companies, academics, and
prospective PhD students.
Projects supervised by David
• Markerless 3D Human Motion Capture for Home Environments [79]
• Augmented Reality System to Guide Electrode Placement for Clinical Electrophysiology
[80]
• Analysis of Brain Networks to Quantify Consciousness and Cognitive Function [81]
• Deep Learning for Clinical Neurophysiology [82]
• Intelligent Infrared Beamforming: the Future of Home Heating [83]
• Where Are My Glasses? Machine Vision and Speech Interaction in the Smart Home to
Support People with Dementia [84]
• Development of an Interactive Projector to Provide Ubiquitous Graphical Interfaces for
Smart Homes [85]
• Opto-genetic Tremor Suppression: Simulation and Feasibility Analysis [86]
• Using a Smartwatch to Detect and Disrupt Obsessive Compulsive (OCD) Hand Washing
[87]
• COVID-Cam: Using computer vision to track viral contamination in shared spaces [88]

26. Chenguang Yang

Contact: Charlie.Yang@uwe.ac.uk
Professor Chenguang (Charlie) Yang is with Bristol Robotics Laboratory as a Theme Leader of
Immersive Teleoperation. He received his PhD degree from the National University of Singapore
(2010) and performed postdoctoral research at Imperial College London. He has been awarded
EU Marie Curie International Incoming Fellowship, UK EPSRC UKRI Innovation Fellowship, and
the Best Paper Award of the IEEE Transactions on Robotics as well as over ten conference Best
Paper Awards. His research interests lie in human-robot interaction and intelligent system design.
You can find more of his research work on the YouTube channel:
https://www.youtube.com/user/chenguangyang
Projects supervised by Charlie
• Robot-assisted Composite Sheet Layup Skill Learning and generalization through
Perception Feedback [89]
• Robot-assisted ultrasound scanning skill learning to optimize ultrasound image quality
[90]
• Development of a multi-functional mobile robot manipulator [91]
• Intelligent mobile robot control platform based on smart phone [92]
• Mobile phone enabled smart car [93]
• Primitive Skill Library Building and Learning for Robot Composites Layup through
Teleoperation and Kinesthetic Demonstration [123]

27. Hamidreza Nemati

Contact: Hamidreza.Nemati@uwe.ac.uk
I obtained my PhD at the Guidance and Control Laboratory (GCL), Department of Aeronautics
and Astronautics, Kyushu University, Fukuoka, Japan (2015). I was then a postdoctoral researcher
at the Department of Aerospace Engineering, Amirkabir University of Technology (Tehran
Polytechnic), Tehran, Iran (2017-2018), supported by Iran's National Elites Foundation, and a
research associate at the Engineering Department, Lancaster University, Lancaster, UK (2018-
2020), where I worked on an EPSRC-funded project via the National Centre for Nuclear Robotics.
I am always interested in assisting industries, organisations and start-ups in deploying artificial
intelligence and robust control approaches to solve challenges in engineering.

My research aims at enabling innovations, advanced algorithms and the art of design in Cyber-
Physical Systems for unprecedented science and exploration. My efforts lie at the intersection of
autonomous navigation and intelligent control to meet the tight requirements posed by the novel
applications in robotics to address real-world problems. My recent work has been focused on
control of a dual-arm hydraulic manipulator and robust stabilisation of micro aerial vehicles. I
try to bring great curiosity to my work and to fit my diverse experience into a clear
understanding of the world.
Projects supervised by Hamidreza
• Design and Stabilisation of an Amphibious Drone (Underwater Drone) [96]
• Design and Stabilisation of Distributed Flight Arrays (DFAs) [97]
• Cooperative Multi-Missile Systems [98]
• Robot Manipulators for Space Debris Removal [108]
• Spacecraft Attitude Control with Internal Fuel Sloshing [109]
• Spacecraft Station-Keeping around Libration Point Orbits in the Circular Restricted Three
Body Problem (CR3BP) [110]
• Soft Robotics: Novel components, Modelling and Simulation [111]
• Soft Robotics: Intelligent Control [112]
• Soft Robotics: Noise & Vibration [113]
• Stabilisation of Under-actuated Mechanical Systems [114]

28. Quan Zhu

Contact: quan.zhu@uwe.ac.uk
Projects supervised by Quan
• Reinforcement control learning algorithm applied to Autonomous Underwater Vehicle
(AUV) in simulation [99]
• Robotics FES for Under-actuated systems [100]
• Autonomous Landing [101]
• Mobile robots control via voice activation [102]
• U-model Based Sliding Mode Control (SMC) to Increase Robustness of Various Robots
[103]

29. Zhenyu Lu
Contact: zhenyu.lu@uwe.ac.uk
Projects supervised by Zhenyu
• New Soft Conjunction Design and Test of Human-Like Robot Hand Links [115]
• Improved Variable Impedance Actuator System for Tendon-driven Robot Hand [116]
• Exoskeleton Structure Improvement for Free-space Teleoperation [117]
• Robot Primitive Skill Learning from Few-shot Demonstrations based on Meta-Learning
[118]
• Robust Robot Primitive Skill Programming Network for Object Grasping and
Manipulation [119]

30.Anna Chatzimichali

Contact: Anna.chatzimichali@uwe.ac.uk
In the fields of social and service robots, user data are crucial to designing a meaningful
connection with the user. The quality of such user data, as well as a clear model of their
ownership, is also extremely valuable when designing future features for user interaction
with the robot. Such data are the new currency of the digital economy.
The rise of interconnected digital goods and services marked the end of an era of local data
storage in user-owned equipment. In the new era of remote computing, the internet of things,
personalised machines and robots, user data are almost always stored on third-party remote
resources. This shift has made ownership of digital assets and data privacy increasingly abstract.
However, the landscape of ownership of such user generated data records (or personal data) is
unclear. Currently, no organisation has the obligation or the capacity to protect digital personal
data, while data protection usually happens only as a result of personal interest. The main idea
behind this work is to investigate data right architectures in social/service robots and especially
within a collaborative personalisation space. Data is power, but in a collaborative personalised
space who really owns the power?
Skills you ideally have to work with me:
• Literature review skills
• Critical thinking
• Previous knowledge/understanding of social/service robots (not mandatory)
Projects supervised by Anna
• What my robot knows about me and who owns that knowledge: User data rights in
service robotics [120]
• Robots in circular economy: The role of robots and cobots in product disassembly
[121]

31.Conor Houghton

Contact: conor.houghton@bristol.ac.uk
Conor Houghton is a reader in mathematical neuroscience. His research interest is in
understanding information processing and coding in the brain and, generally, in mathematical
and computational approaches to neuroscience.

Projects supervised by Conor


• Language games for language evolution [124]

32.Paul Bremner

Contact: paul2.bremner@uwe.ac.uk
Social human-robot interaction (HRI): social interaction is an important capability for robots
designed to operate in human environments. My research involves how such robots should
behave (often applying models from the psychology literature) and the effect of different
behaviours on the people who interact with the robot.
Intuitive interfaces for robot tele-operation: whether a tele-operated robot is designed for task
performance (e.g., nuclear decommissioning) or social interaction (tele-presence), an intuitive
interface is essential for effective operation and to minimize operator skill requirements. One
technology I am interested in for such interfaces is virtual reality. My research involves designing
and testing such interfaces.
Skills you ideally have to work with me:
• Python and/or C++ programming


• Unity game engine development (if you want to work in VR)
• Some experience of experimental design and data analysis beneficial

Projects supervised by Paul


• Data sonification for robot teleoperation [131]
• Automated gesturing for a telepresence robot [132]
• Robot companions [165]

33.Dandan Zhang

Contact: ye21623@bristol.ac.uk
My research interests include robot learning (machine learning for robotic manipulation),
human-robot shared control, teleoperation, computer vision, micro-manipulation, and tactile
robotics.

Projects supervised by Dandan


• Reinforcement Learning for Robotic Manipulation Based on Multi-Sensor Fusion [133]
• Design of a two-finger robotic hand with tactile feedback [134]
• Soft robotic hand for in-hand manipulation [135]
• Surgical Gesture Segmentation and Recognition for Medical Robotic Applications [136]
• Human-Robot Cooperative Control for Medical Robotics [137]
• Sim-to-Real Transfer Learning for Robotic Manipulation [138]
• Robotic Skill Learning for Cooking [139]

34.Jiseon You

Contact: jiseon.you@uwe.ac.uk
I am a research fellow at the Bristol BioEnergy Centre, located in the Bristol Robotics
Laboratory. I hold an M.Eng. in Environmental Engineering and a PhD in Microbial Fuel Cell
technology.

Projects supervised by Jiseon


• Speaking microbial language [142]
• Eating, sensing, dying like a beetle [143]
• Biological robotic tongue [144]

35.Virginia Ruiz Garate

Contact: virginia.ruizgarate@uwe.ac.uk
I am a Wallscourt Fellow in Intelligent Assisted Robotics, researching adaptive controllers for
assistive robots at BRL. I lead the MSc module in Assistive Robotics and co-lead the
Robotics Engineering And Computing for Healthcare (REACH) research group. I am also the
Robotics Programme Cluster coordinator for Open day, Offer holder day & Induction, as well as
EDM representative for the FET RKE Research Culture Working Group.

For the previous 4 years, I was a PostDoc at the Human-Robot Interfaces and Physical Interaction
(HRII) laboratory of the Istituto Italiano di Tecnologia (IIT), where I started my research under the
EU project SOMA regarding new bio-inspired grasping stiffness controls for robotic hands. In the
last year I moved to investigating the application of robotic grasping controls in multi-robot
tele-impedance control frameworks, working under the SOPHIA project. Previously, I obtained my
PhD in 2016 from the Universite Catholique de Louvain (UCL) in Belgium, with my thesis devoted
to the control of lower-limb exoskeletons by means of the bio-inspired concept of artificial motor
primitives. This work took place within the framework of the EU project CYBERLEGs.

I have been serving as a reviewer for IEEE journals and conferences, and co-organised the RSS
2019 workshop on “Emerging Paradigms for Robotic Manipulation: from the Lab to the
Productive World” (from which a RAM SI developed) and the workshop “Human factors in the
design and control of robots: what are we missing?” at the 2020 I-RIM 3D conference.

My current research interests include grasping and manipulation, bio-inspired control, assistive
robotics, and human-robot collaboration.
Projects supervised by Virginia
• Design of an interface for lower-limb rehabilitation using the Franka arm [145]
• Upper-limb rehabilitation using Franka robotic arm [146]

36.Amir Bolouri

Contact: amir.bolouri@brl.ac.uk
I have over nine years of industrial and academic research experience in the processing and
characterisation of metallic alloys: shape castings, semisolid forming, solidification, physical
metallurgy, heat treatment, mechanical properties, and microscopy.

Projects supervised by Amir


• Self-powered pH-responsive mechanical meta-materials [147]

37.Jonathan Rossiter

Contact: jonathan.rossiter@bristol.ac.uk
My research interests cover all aspects of Soft Robotics, from smart materials to robotic
organisms. I am particularly interested in robots that interact with delicate environments and
objects, including the human body (e.g. implantable robotics and soft exoskeletons) and the natural
environment (e.g. biodegradable, edible, energy-autonomous, and environmentally remediating
and monitoring robots). Core to all these are the underlying technologies we develop including
artificial muscles, smart polymers, mechanisms of embodied intelligence and tactile stimulators.
For example, we are using tactile stimulators to generate a new paradigm of physical
communication using wearable soft devices that enable the remote communication of emotion,
feeling and personal connection. We are exploring how artificial robotic organisms can be seeded
and grown, and how they can then be used to enhance farming and ecology. There are many
opportunities for interesting projects in our lab!

Skills you ideally have to work with me:


Skills depend on the particular project, and typically the project can be adapted to fit the
experience and needs of the student. Above all you need to have imagination and enthusiasm!
Projects supervised by Jonathan
• Mechanical Metamaterials for Soft Robots [149]
• Development of fully soft Bubble Artificial Muscles to create an assistive device for
human body motion [151]
• Exploring designs and patterns of 2D pneumatic actuator for various robotic applications
[152]
• Rapid stiffening capability of flexible 2D and 3D electroactive structures [153]
• Tendon-driven stiffening structure [154]
• Bistable soft structure inspired from bistable mechanism [155]
• Intelligent Visuo-Tactile Robotic Skin with Adaptive Morphology [156]
• Phonetic alphabet generation with Soft Structures [159]
• Peristaltic Robot for Debris Clearance [160]

38.Mohammad Naghavi Zadeh


Contact: m.naghavizadeh@bristol.ac.uk
Projects supervised by Mohammad
• Mechanical Metamaterials for Soft Robots [150]

39.Andrew Conn

Contact: a.conn@bristol.ac.uk
My research revolves around the application of biological inspiration to solve challenging
engineering problems. The integration of elasticity and compliance into systems such as robots is
central to this, driven by the potential for great improvements in human-robot interaction,
prosthetic devices and robot mobility in unstructured environments. The primary focus of my
research involves the development of soft actuation technologies such as electro-active
polymers, which possess actuation characteristics comparable to biological muscle. Recent
research highlights have included soft pumps, worm-like robots and flapping wing mechanisms.
Projects supervised by Andrew
• Robotic fish [174]
• Embodied digital logic [175]

40.Lucia Marucci

Contact: lucia.marucci@bristol.ac.uk
Lucia's research aims at quantitatively understanding and directly engineering cell functions, with
applications spanning biotechnology and healthcare.

Projects supervised by Lucia


• Machine learning-based automatic control of living cells [180]

41.Tony Pipe

Contact: tony.pipe@brl.ac.uk
I work in three main technical areas:
1. Connected Autonomous Vehicles. I have worked in a number of areas, including the technology
of the vehicles themselves, especially their sensing and control capabilities (both in simulation
and the real world), and the factors that could impact their usability, especially for a large
sector of users: those who used to be able to drive but now cannot, perhaps simply as a result
of the passage of time, or because of an acquired disability.
2. Remote diagnosis of robot operational faults. For example, if a welding robot is not working
properly, how can I use multi-modal feedback to diagnose the fault remotely?
3. Nuclear Decommissioning. This is a dangerous environment for people to work in, but there are
a great many “nuclear assets” that now require dismantling. I am interested in immersive
teleoperation of robots at differing levels of control autonomy, i.e., the concepts of “shared
control” between the remote human and the robot’s own controller with dynamic autonomy,
i.e., changing levels of remote control from detailed joint control to mission control. In all of this
I am interested in the safety of humans and robots working in close proximity to each other.

Skills you ideally have to work with me:


• General mechatronics capability
OR
• Software design and high-level programming capability
OR
• Simulation technologies
Projects supervised by Tony
• Safety braking and manoeuvrability of hospital beds with assisted drive capabilities
(Focus on Mechanical Engineering) [183]
• IoT conversion of self-powered hospital bed (Focus on Software Engineering) [184]
