15-year multi-million-pound scheme
Medipark Nottingham scheme with the Queen's Medical Centre.
The Nottingham Medipark is part of the Nottingham Science City agenda. The Medipark is a multi-million-pound scheme covering at least 15 years. It will provide 3.7 hectares of grounds and 40,000 sq m of high-tech development facilities for clinical research, life sciences and healthcare research, biomedical businesses, medical device development and practice innovation. Medipark and the Nottingham University Hospitals Trust will enable knowledge transfer and inward investment across the QMC site as a hub of NHS, academia and industry collaboration. The QMC site currently hosts the University of Nottingham Medical and Nursing Schools, mental health wards and the privately run Nottingham Treatment Centre. Extensions to the award-winning Nottingham tram system are another key component of the 15-year plan.
Investment and Planning
The augmented reality presentation was produced to drive investment and to aid design, review and planning discussions across the scheme.
Stylised modelling & rendering for optimised real-time frame rates
The preview images here give an indication of the augmented reality real-time 3D interactive presentation system. The modelling and texturing are heavily optimised to the technical requirements. Optimisation of textures and face counts is crucial for maintaining a solid frame rate even on quite low-powered tablets and phones.
CPU vs Memory vs Cloud
For some time yet, the challenge will be balancing processing power (on-screen face counts and effects) and memory (texturing and total scene poly counts) against the battery life and cooling of people's handheld devices. Experiments with offloading the processing to cloud facilities are interesting, but they aren't quite there yet in terms of upload/download lag, smooth frame rates, and a solid, fast signal, especially across varied locations and installations.
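As a rough feel for why that lag matters, here is a hedged back-of-envelope sketch; all the timings are illustrative assumptions, not measurements from this project.

```python
# Illustrative frame-time budget for cloud-offloaded rendering.
# All figures below are assumptions for the sake of the sketch.

def cloud_budget_ms(target_fps, network_rtt_ms, encode_decode_ms):
    """Return the server-side render time (ms) available per frame, or a
    negative number if transport alone already blows the frame deadline."""
    frame_budget = 1000.0 / target_fps            # total time per frame
    overhead = network_rtt_ms + encode_decode_ms  # round trip + codec cost
    return frame_budget - overhead

# At 30 fps a frame must complete in ~33 ms; a 40 ms mobile round trip
# alone misses the deadline before any rendering happens.
print(cloud_budget_ms(30, 40, 10))   # → negative, not viable
print(cloud_budget_ms(30, 5, 5))     # a local low-latency link leaves headroom
```

The point of the sketch is that the whole capture-upload-render-download loop has to fit inside a single frame's budget, which mobile networks rarely guarantee.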
Real time augmented reality visualisation
Designed for use with handheld tablets and phones, the reference tracker images were implemented as flat table displays, both printed and as touchscreen interactive LCD plan tables. The system allows simultaneous 2D plan and interactive animated 3D viewing. The interactive presentation contains numerous elements within the face and texture restrictions.
Animated buildings, trees, tram lines and vehicles
It is particularly challenging to streamline more complex and organic shapes like trees and vehicles to work within the real-time 3D rendering capabilities of quite basic tablets and mobiles. Much use was made of prebaked textures, UVW unwrapping techniques, multi-object texture atlases that reuse parts of single textures to retexture different elements, highly optimised low-poly modelling and other techniques to meet the render speed constraints.
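The texture-atlas idea can be sketched in a few lines: each asset keeps standard 0-1 UVs, which are then rescaled into the sub-rectangle of the shared atlas page holding its image. This is a minimal illustration, not the project's actual tooling, and the region values are made up.

```python
# Minimal sketch of texture-atlas UV remapping: several elements share one
# texture page, so each mesh's original 0-1 UVs are rescaled into the
# sub-rectangle of the atlas that holds its image.

def remap_uv(u, v, region):
    """Map a (u, v) pair from 0-1 texture space into an atlas region.

    region is (x0, y0, width, height), all in 0-1 atlas coordinates."""
    x0, y0, w, h = region
    return (x0 + u * w, y0 + v * h)

# Two assets reusing one 2x2 atlas: tree bark in the top-left quarter,
# tram livery in the bottom-right quarter (illustrative layout).
bark_region = (0.0, 0.5, 0.5, 0.5)
livery_region = (0.5, 0.0, 0.5, 0.5)

print(remap_uv(1.0, 1.0, bark_region))    # → (0.5, 1.0)
print(remap_uv(0.5, 0.5, livery_region))  # → (0.75, 0.25)
```

Packing many small textures into one page like this cuts texture binds per frame, which is one of the cheapest wins on low-powered mobile GPUs.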
The user's experience of Augmented Reality
The recipients' experience is of viewing the 2D plans, either as printed papers or on a horizontal touchscreen LCD table. They can then activate an app on their tablets or phones and point the camera at those 2D plans. The screen shows the plans in the background of the app display while real-time 3D overlays of buildings, trees and vehicles appear to grow out of the plans at the correct points. All of the animated assets can be viewed from user-controlled angles in real time, simply by moving the tablet to a new position for a new view, or moving it nearer or further away to zoom in and out. The custom app uses the tablet's camera data and image tracking software to determine which part of the plans is being viewed and at what angle, then renders the appropriate parts and angles of the 3D assets and animates them on top of the flat plans. Different elements are animated in different ways, and extra information can be discovered and areas of interest focused on.
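Under the hood, planar image tracking of this kind typically estimates a homography between the flat plan and the camera image; anchor points on the plan can then be projected into the frame so each overlay appears at the right spot. A minimal sketch, assuming a tracker has already supplied the 3x3 matrix; the matrix below is hand-picked for illustration, not tracker output.

```python
# Apply a 3x3 planar homography H (nested lists) to a plan point (x, y),
# producing the pixel where the 3D overlay should be anchored in the frame.

def project_point(H, x, y):
    """Project plan coordinates through homography H into image pixels."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xh / w, yh / w)   # homogeneous divide

# Illustrative homography: plan shifted 100 px right and 50 px down,
# with no rotation or perspective (hence the identity-like rows).
H = [[1.0, 0.0, 100.0],
     [0.0, 1.0,  50.0],
     [0.0, 0.0,   1.0]]

print(project_point(H, 20.0, 30.0))  # → (120.0, 80.0)
```

A real tracker re-estimates H every frame from the camera feed, which is what lets the overlays stay locked to the plans as the tablet moves.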
Augmented Reality implementation and uses
The presentation system is designed for international use at conferences and exhibitions, at various forums and bodies, for display to individual key investors, and for review and discussion of the scheme in a number of other contexts.
Augmented Reality, Realtime vs. 3D Pre-rendering
Augmented on handheld devices
With handheld devices you are already consuming a lot of the device's power capturing the real-time video, running the code that provides real-time registration of the tracking elements, and then keying in the 3D elements on top. It needs to be smooth and fast with a solid lock and a good frame rate, so the CPU and memory left for the 3D elements on display necessarily have to be very streamlined, especially if you cover a wider range of devices including older ones. However, it gives a great feeling of immediacy, and you can manipulate elements and react to the user's sphere of interest in real time, which is a great way of condensing and providing information with ongoing engagement.
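That power split can be made concrete with an illustrative frame-time budget; the timings are assumptions for the sake of the sketch, not profiled figures from this project.

```python
# Illustrative split of a handheld frame budget between camera capture,
# marker tracking/registration and 3D rendering. Timings are assumptions.

def render_budget_ms(target_fps, capture_ms, tracking_ms):
    """Milliseconds left for drawing the 3D overlay each frame."""
    return 1000.0 / target_fps - capture_ms - tracking_ms

# On an older tablet, capture plus tracking may eat most of a 30 fps
# frame, leaving only a few milliseconds for geometry -- hence the
# aggressively low-poly assets described above.
print(round(render_budget_ms(30, 12.0, 15.0), 1))  # → 6.3
```

The remainder shrinks further on older hardware, which is why asset budgets are set for the weakest target device rather than the average one.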
Pre-rendered 'renderfarm' visuals
Pre-rendered, you get greater visual quality: more polygons, bigger textures, vastly more complex global illumination and ray tracing, and more effects and processing, because the rendering has been done beforehand by (teams of) much more powerful workstation computers. But you do lose that interactivity and movement. So it's a trade-off, and will remain so for some time to come despite tech advances.
Future handhelds and wearable impacts for #AR
- Year by year we are of course going to see more brute power in the CPU and memory of our handheld devices, and hopefully complementary cooling and battery life.
- We may also see smarter registration workloads, for example utilising pixel depth via multiple lenses, as in Apple's dual-lens iPhone 7 Plus or Huawei's dual-lens P9.
- Offloading some of the workload to a cloud facility may become practical once it gets the rock-solid connection and ping times it needs.
- We should also see implementations of local handshaking to offload some of the workload to a low-ping, high-bandwidth, more powerful (and better cooled) local machine. Nvidia's GameStream to Shield, where a desktop PC streams its greater performance to lower-powered local devices, is an existing example.
- We may see even greater constraints on the local CPU power vs memory vs cooling vs battery life trade-offs as we move to lighter and more compact wearables, e.g. display glasses that are comfortable, don't run hot and have batteries that last some time.
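On the dual-lens point above: two horizontally offset cameras yield a per-pixel disparity, and depth then falls out of the standard pinhole stereo relation, depth = focal length × baseline / disparity. A minimal sketch with illustrative numbers, not the specifications of any particular phone.

```python
# Why dual lenses help registration: disparity between the two views
# gives metric depth via the pinhole stereo relation.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth in metres of a point seen with the given pixel disparity."""
    return focal_px * baseline_m / disparity_px

# ~1000 px focal length, 1 cm lens baseline, 4 px disparity → 2.5 m away.
print(depth_from_disparity(1000.0, 0.01, 4.0))  # → 2.5
```

Even coarse depth like this lets a tracker reject false matches and stabilise its lock, which is exactly the registration workload the bullet points describe.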