Synthetic Viewing

Inside Challenging Environments

Synthetic Viewing application

RACE has worked on many innovative projects for challenging environments. One of these is Synthetic Viewing, a project that seeks a solution for viewing inside environments where humans cannot go. One example of such a space is a hot cell, but many other applications, such as decommissioning, are possible.


The term ‘Synthetic Viewing’ refers to combining information from a variety of sensor types with prior knowledge (e.g. CAD models) to create the best possible estimate of the state of the environment under investigation. The aim of the project is to maintain an accurate, live 3D model of the remote environment. This is achieved by processing remote sensor data, including camera views and other sensor modalities, to track changes in object positions. The operator can then rely on views of the live 3D model to manage collision-free operations in the remote environment.
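As a minimal illustration of tracking an object's position against a CAD prior, the sketch below uses the classic Kabsch/SVD method to recover the rigid transform that maps known CAD model points onto their observed positions. This is a hypothetical, simplified example with known point correspondences, not the project's actual method, which in practice would involve full point-cloud registration and change detection.

```python
import numpy as np

def kabsch(source, target):
    """Estimate the rigid transform (R, t) aligning source points to target
    points with known correspondences, via the Kabsch/SVD method."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (source - src_c).T @ (target - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Hypothetical scenario: four CAD reference points observed after the object
# rotated 30 degrees about z and shifted by (0.5, -0.2, 0.1).
cad = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([0.5, -0.2, 0.1])
observed = cad @ R_true.T + t_true

R_est, t_est = kabsch(cad, observed)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

The recovered pose can then be applied to the CAD model to update the live 3D scene without re-scanning the whole object.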


Project motivation and objectives


There will be minimal viewing capability in the ITER Hot Cell; it is therefore necessary for IO (the ITER Organization) to provide alternative means of visual feedback for remote maintenance. This collaborative project aims to tackle that challenge with a Synthetic Viewing solution. The project will create a mock-up Synthetic Viewing system in an environment that simulates the ITER Hot Cell, namely the RACE work-hall. Using this platform, the project will investigate and demonstrate the feasibility of the combined elements of the system.

Visualisation and sensor processing


An Unreal Engine-based visualisation will be provided as the primary user interface of the system. The project uses various forms of sensing, including:

  • LIDAR
  • Stereo Vision
  • Structured Light Cameras


Beyond visualisation, the sensor data will also be used for control and planning. Various deep learning and traditional computer vision techniques will be combined with point cloud processing. RACE is working on state-of-the-art sensor systems to achieve the most complete visualisation of the environment possible. A key aspect of the project is that multiple sensor types can be combined.
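One basic step in combining multiple sensor types is merging their point clouds into a single, evenly sampled model. The sketch below, a hypothetical illustration rather than the project's implementation, assumes the clouds are already registered in a common frame and uses voxel-grid averaging so that a dense sensor (e.g. LIDAR) does not dominate a sparser one (e.g. stereo).

```python
import numpy as np

def merge_and_downsample(clouds, voxel_size=0.05):
    """Merge point clouds from several sensors (already in a common frame)
    and thin the result with a voxel grid: points falling in the same voxel
    are averaged into a single point."""
    points = np.vstack(clouds)
    keys = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel key and average each group.
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()  # keep shape consistent across NumPy versions
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]

# Hypothetical readings: a LIDAR scan and a stereo reconstruction of the
# same surface, with one overlapping point pair.
lidar = np.array([[0.00, 0.00, 1.00], [0.10, 0.00, 1.00]])
stereo = np.array([[0.01, 0.01, 1.01], [0.30, 0.00, 1.00]])
merged = merge_and_downsample([lidar, stereo], voxel_size=0.05)
print(merged.shape)  # three occupied voxels -> (3, 3)
```

The two overlapping measurements fall into the same 5 cm voxel and are averaged, while the distinct points are kept, giving one consistent cloud for visualisation and planning.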

Synthetic Viewing applications – ITER and beyond


The project is part of the IRTF (ITER Robotics Test Facility), a RACE-IO collaboration jointly funded by IO and BEIS (the Department for Business, Energy & Industrial Strategy). This enables RACE to tackle R&D work with a significant impact on ITER's remote operations capability, such as hot cell productivity and task feasibility. In the process, RACE will develop technologies with much broader impact, applicable to projects such as STEP, DEMO, decommissioning and beyond.


