Meta Liverpool
Meta Liverpool is a 210 sq km Digital Twin of Liverpool and much of the Wirral, created using aerial scans from a light aircraft. By combining LiDAR and photogrammetry, we achieved accuracies of ±3 cm for the photogrammetry and ±15 cm for the geometry.
This project has been my primary focus for the past 2.5 years. It began as a solo effort to research how to integrate a massive 2-terabyte dataset into Unreal Engine 5. The first major milestone was successfully importing the entire scan into Unreal using the Cesium for Unreal plugin.
From there, my work shifted to optimising performance and exploring ways to visualise real-world datasets. Working closely with a data scientist responsible for restructuring the data, we integrated layers such as building details, Lower Layer Super Output Areas (LSOAs), and air quality.
As the project gained significant traction and outgrew our initial scope, we decided to rebuild it from the ground up after about a year. This time, we developed a more focused plan and a stronger framework designed to handle larger datasets efficiently.
During this rebuild, we also pursued more advanced visualisation techniques by partnering with an external company, Phoboz Interactive. I advocated for moving the project from Blueprints to C++ to improve performance and maintainability. We also adopted Perforce for version control, and I have managed its administration since.
With this new foundation, our team expanded to five or six members, split across three core areas: Unreal development, backend data science, and GPU cluster engineering. Access to a GPU cluster enabled us to implement Pixel Streaming, letting us share this intensive project widely without compromising performance.
Since merging and managing the Perforce integration with Phoboz Interactive, the workflow for implementing new datasets has become seamless, allowing us to quickly add and visualise new information. We are actively refactoring code to improve performance and introducing new functionality, such as asynchronous processing, to speed up key operations.
I have recently transitioned to other projects but continue to oversee Meta Liverpool to ensure it runs smoothly. I regularly perform bug fixes and testing, and I’m responsible for reviewing completed JIRA tasks to verify they are bug-free before each deployment. While it is no longer my main focus, I am still actively contributing by adding new features and supporting the ongoing development of the project.
St George's Hall
St George’s Hall is a National Heritage site that we had the privilege of scanning and importing into Unreal Engine 5 as part of a small-scale Digital Twin experience.
My role in this project focused on implementing the core interactive functionality, including:
Camera jumps
User interface
Sub-level swapping
In-game gizmo controls
Video projection features
Due to the project’s focused scope and contained scale, all development was completed using Blueprints, with no need for C++. The primary objective was to showcase the hall’s architectural detail through high-fidelity visuals, cinematic camera pans, and interactive elements like floor swaps and dynamic lighting controls.
While my focus was on the core interactive functionality, the project also featured a VR component developed by other team members, as well as an interactive MetaHuman powered by a large language model tailored to the scene.
VR Demo (Passthrough)
This VR demo was created for a small-to-medium-sized enterprise (SME) exploring how environmental context affects taste perception. The goal was to simulate a high-fidelity coffee shop in VR, using passthrough to allow users to see their real hands, pick up a physical coffee cup, and experience different sensory conditions—first without and then with added ambient smells. (Note: The demo video replaces passthrough visuals with a black circle, as passthrough can't be recorded.)
Working on this project taught me a great deal about building performant VR environments. I gained hands-on experience with materials, lighting, and baking lights to achieve realism without sacrificing performance. I also learned how to handle VR limitations, including framerate constraints, and how to optimise assets for smooth interaction. Additionally, I worked with MetaHumans, learning the basics of rigging and facial animation to create a more immersive presence within the scene.
This was a technically and creatively rewarding project that helped me grow my skills in VR development, environment design, and performance optimisation within Unreal Engine 5. It also allowed me to establish a solid framework for future VR projects, improving how I approach interaction, asset management, and scene setup in immersive applications.
For more information on the Virtual Engineering Centre: Click Here