MO-SYS VP Pro XR User manual

Contents
About the system 2
System overview 3-4
  Topology and requirements 3
  StarTracker outputs 3
  Connections diagram 4
  Network drives 4
Quick start guide and geometry alignment 5-10
  Summary flowchart 10
Render nodes hardware setup (nDisplay) 11
  Network 11
  Synchronisation 11
NVIDIA software synchronisation 12-13
Render nodes setup 14-17
  nDisplay actor setup in screenshots 14
  Livelink 15
  nDisplay level 15
  Defining LED screens 15
  Mesh builder 15
  Launching 16
  Project distribution 17
Cinematic Focus XR 18-19
  System 18
  Prerequisites 18
  Connection 18
  Preston FIZ setup 18
  Other Render Nodes setup 19
XR engine setup 20-23
  Overview 20
  Connection 20
  Mo-Sys VP Pro settings 20
  VP Pro XR software setup 20
  Livelink setup 21
  Level setup 21
  Placing XRMaskBP 21
  MoSysCameraXR 21
  Test patterns 21
  Colour calibration procedure 22
  XR controller panel 23
  Add AR objects 23
FAQ 24

About the system
Mo-Sys VP Pro XR is a pre-configured, multi-node system, joining Unreal Engine's native nDisplay with unique Mo-Sys features such as Cinematic XR Focus and XR compositing tools.
The system comprises:
• A number of rendering nodes driving the image on the LED background, which create the depth/parallax effect using Unreal's nDisplay plugin.
• An XR Engine, which virtually extends the LED set beyond its boundaries by blending the real image with a virtually rendered image. Once set up, the system automatically detects which area is the real LED set and which is to be replaced by a virtual camera output. Additionally, the user can add AR objects to the composite and perform tasks like recording tracking data or interfacing with selected cameras.
• A StarTracker for reliable and precise camera tracking, essential for XR to work.
Before installation
To speed up the installation, it is recommended to take the following steps:
• Measure the height of the LED volume (from the bottom row of pixels to the top).
• Usually the number of rendering nodes corresponds to the number of LED processors. The maximum resolution available from one node is 3840x2160 px. Divide the LED volume into equal-sized parts if possible; every part is then driven by one LED processor (see the sketch after this list).
• Prepare an FBX model of the volume at real scale. It should be split into meshes that correspond to the way the wall was split (see the previous point).
• Rendering nodes have DisplayPort outputs, so ensure the LED processor has the appropriate input or that conversion is available. If there is no DP input on the LED processor, it is recommended to use a matrix switcher (e.g. DP to HDMI), which guarantees that video delays are consistent on all parts of the volume.
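As an illustration of the node-count point above, here is a minimal sketch (the wall dimensions are hypothetical) of dividing a wall into node-sized slices, assuming each node outputs at most 3840x2160 px:

```python
import math

MAX_NODE_WIDTH, MAX_NODE_HEIGHT = 3840, 2160   # max output of one render node

def nodes_needed(wall_width_px: int, wall_height_px: int) -> int:
    """Number of render nodes/LED processors if the wall is divided
    into equal column/row slices of at most 3840x2160 px each."""
    cols = math.ceil(wall_width_px / MAX_NODE_WIDTH)
    rows = math.ceil(wall_height_px / MAX_NODE_HEIGHT)
    return cols * rows

# Example: an 11520 x 2160 px wall splits into three 3840 x 2160 slices
print(nodes_needed(11520, 2160))   # -> 3
```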
Camera frustum tracked
Set extension example

System Overview
Requirements
When shipped, the system is already pre-configured as follows:
XR Engine:
1. Everything listed for VP Pro in MoSysVPProManual.pdf
2. Unreal Engine 4.27
3. Remote Control Web Interface plugin (requires installing Node.js from https://nodejs.org/en/)
4. VP Pro XR plugin
Render nodes:
1. Unreal Engine 4.27
2. NVIDIA Quadro Sync Card II, if using hardware sync (included)
3. VP Pro XR plugin, only on the Primary Render Node
Global Settings
After creating a new project in Unreal Editor, open the XR Controller panel and click “Auto Configure Project”, then restart the editor.
StarTracker data output
Set up the StarTracker to send data to two outputs: the XR Engine (here with IP 192.168.99.72) and the Primary Render Node (IP 192.168.99.83). It is best to set static IP addresses on all devices used.
Toggle the LED button on next to the PRN
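To verify that tracking packets actually reach a machine, you can listen on the configured port with a short script. This is a generic check, not part of VP Pro, and it assumes the tracking data is sent over UDP; the port number is an example and must match the StarTracker output settings:

```python
import socket

PORT = 8001  # example only - use the port configured on the StarTracker output

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))     # listen on all local interfaces
sock.settimeout(5.0)

try:
    data, sender = sock.recvfrom(2048)
    print(f"received {len(data)} bytes from {sender}")   # tracking is arriving
except socket.timeout:
    print("no packets - check the StarTracker outputs and IP/port settings")
```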

Connections diagram
The diagram below represents a setup with
three Render Nodes (RN) corresponding to
the LED processors.
Notes:
1. Set the StarTracker to send data only to the XR Engine and the Primary Render Node (PRN). Tracking is passed on to the other Render Nodes automatically. Note that tracking is not simply passed through Unreal's Multi-User when using a configuration with a dedicated editor computer.
2. The network is solely used for tracking updates, triggering nDisplay, and events. There is no image streaming over the network.
Network drives
Render nodes are mapped as network drives on the XR Engine, so it is easier to distribute the project across them. Normally, most of the editing can be done on the XR Engine and the project simply copied to the rendering nodes.
Drives visible from XR Engine

Quick start guide and geometry alignment
This guide is only meant to serve as an
introduction and is not definitive. There are
many ways to set up the system and this is
only one of them.
First steps on both XR Engine and Primary Render Node
1. Install VP Free, or verify it is updated to the latest version, on every computer: https://www.unrealengine.com/marketplace/en-US/product/mo-sys-tracking
2. Install the plugin on the XR Engine and the Primary Render Node (pages 1-4 in the VP Pro doc).
3. Configure the livelink preset (page 8 in VP Pro).
Prepare
We assume the StarTracker is already set up and auto-aligned, and the lens is fully calibrated! Refer to the StarTracker manual if that is not clear.
It is a crucial task to find the transform (position, rotation and scale) of the LED screen(s) in relation to the tracking coordinate system, as it is used for rendering the correct perspective on the screens. When working with set extension this is even more pronounced, as the transition between the graphics visible through the camera and the extension rendered by the XR Engine must be seamless.
The layout of the screen geometry will be used both on the XR Engine, to mask the LEDs, and on the rendering nodes, to define the nDisplay configuration.
New UE project
Let's start on the XR Engine. Create a new empty UE project. Once the plugin is successfully installed, make sure the following steps are completed (pages 12-13):
• VP Pro project settings are set to the correct framerate, HasTimecodeSync is unchecked and the compositing mode is set to XR
• Video input is coming into the engine
• A livelink preset is set up for receiving tracking data
Open the AligningXR level, which can be found in /MoSysVPPro/Content/Compositing/XR/Alignment.
Pilot the camera to get the composite view. Use the Alignment manager actor to toggle visibility of the grid and to show video only. If the camera image does not show up, then in the MoSysCameraXR details select the input texture to provide the video feed, under VideoInput -> Input Texture.
Geometry alignment
For the next steps, hide the grid and show the video, so there is feedback on where the camera is looking. Depending on the shape of the LED screen/volume, the alignment procedure will vary. If there is just a flat, rectangular wall, or multiple of these, then the Mo-Sys Mesh Builder panel can put the geometry in place. First, we will use it to verify the StarTracker and lens calibration. Go to page 15 for the details of Mesh Builder.

Fill in the two input fields: Bottom edge and Height of the wall. Use the three Store... buttons to spawn three cones on three corners of the LED volume/wall: top right, bottom left and bottom right. If the cones are not exactly on the corners of the LED, move them manually using only the x and y axes, as we assume the height is set correctly. When that is done, test with camera pan, tilt, zoom and position changes whether the cones stick to the corners in all circumstances. If not, verify the quality of the StarTracker auto-align procedure and the lens calibration.
Mind that there will probably be a delay between the cones and the video. You can adjust the delays using the Timed Data Monitor, Unreal's tool for managing delays.
Rectangular LED wall
If the screen is simply a rectangle you can use Mesh Builder to generate the mesh. Select the XRMaskBP object and click Assign LED plane. Go through the steps indicated in the Mesh Builder section on pages 15 and 16. If there are multiple screens, just place more XRMaskBP objects and repeat the procedure.
Delays correction with Timed Data Monitor
Open the panel from Window/Developer Tools. If it is not present, enable the Timed Data Monitor plugin. Use the Time Correction value (in seconds) to delay the tracking. Watch the Timing Diagram and make sure the vertical bar is green and safely inside the buffer. If it is not, set the Buffer Size to a higher value.
Use the augmented cones as a reference to set the delays. Pan the camera in a series of short, sharp movements to see if the cones stay on the corners of the LED. Stop between the movements to distinguish whether the cones or the video moves first. Adjust the Time Correction accordingly.
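Time Correction is entered in seconds, while delays are usually observed in frames. A trivial conversion, assuming the project frame rate:

```python
def frames_to_seconds(frames: float, fps: float) -> float:
    """Convert a delay measured in frames into the seconds value
    entered as Time Correction."""
    return frames / fps

# Example: a 3-frame delay at 25 fps needs a 0.12 s correction
print(frames_to_seconds(3, 25))   # -> 0.12
```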
Before using Mesh Builder
Go to Project Settings -> Collision. Verify two custom Object Channels exist: RefPlane and LED.
The final composited image is achieved by replacing all pixels that are outside of the XRMaskBP with computer-generated pixels.
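The callout above describes a straightforward per-pixel blend. A minimal sketch of that idea (an illustration, not the actual VP Pro implementation; a soft-edged mask gives the Feathering effect mentioned later):

```python
import numpy as np

def composite(camera: np.ndarray, cg: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Keep camera pixels inside the LED mask (mask=1) and replace
    everything outside it (mask=0) with the rendered extension."""
    mask = mask[..., np.newaxis]           # HxW -> HxWx1 for broadcasting
    return camera * mask + cg * (1.0 - mask)

# Toy 2x2 frame: left column is the LED area, right column is extended
camera = np.full((2, 2, 3), 0.8)
cg = np.zeros((2, 2, 3))
mask = np.array([[1.0, 0.0], [1.0, 0.0]])
print(composite(camera, cg, mask)[:, :, 0])   # [[0.8 0.], [0.8 0.]]
```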

Cones test
For the system to work, it is essential that the spawned cones stay at the corners of the LED screen at all times. Try moving the camera around; use pan and tilt, and zoom in at the corners to test if the cones stick to them.
• A big displacement of the cones near the image edges usually indicates problems with the lens calibration.
• Displacement of a cone when the camera has moved position usually indicates problems with the StarTracker auto-alignment (best to centre the camera on the cone).
Mind that when the cones are selected, the yellow outline shows their undistorted position, which is not of interest here. It is the actual red cones that should be looked at.
Lens zoomed in with cone next to the edge of frame – good calibration
Cone not sticking to the corner – wrong calibration

Complex LED geometry
If the mesh is more complex, it must be modelled in a third-party 3D program like Blender, Maya or 3ds Max. Import the mesh into the project and make sure the scaling is correct.
Define the height, the bottom edge height and 3 corners of the mesh, and store them in Mesh Builder as bottom left, bottom right and top right. Three marker cones (anchors) should be spawned. Move the real camera around the space and make sure the cones are stuck to the corners of the LED volume. If they are not, adjust the horizontal placement of the cones (do not change the height, which was found by measurement). If the cones are still not bound to the corners, verify the StarTracker alignment.
Now set the Static Mesh on XRMaskBP to the one that was modelled beforehand. To visualise the mask, go to MoSysCameraXR and slide Screen Alpha to 1. It is most practical to have the pivot point on one of the corners of the mesh. Make an empty actor, place it under MoSysStage and snap its position to the bottom left of the LED mesh. To do this, click on the sphere gizmo of the empty actor and press the V key; it will then offer snap points. Now parent the XRMaskBP to the empty actor. Snap the empty actor to the position of the bottom left cone, or copy its position. Use the pan rotation of the empty actor to fit the mesh into the layout given by the other two cones.
Verify and save the layout
Hide the video and show the grid on the Alignment manager. Move the real studio camera around while piloting it, to see if the mask covers the LED screen precisely and the cones stay at the corners. If they do not, you might have to verify the StarTracker alignment and the lens calibration.
Save the marker cones' positions with the Save to file button on Mesh Builder, so the layout can be retrieved later during nDisplay setup. After saving you can hit Clear all to remove the cones. Control the blur on the mask's edges with Feathering under the MoSysCameraXR details.
XRMaskBP parented under an empty
actor in lower left corner
Pan the empty actor to fit the geometry
between the cones

Make a copy of the LED screen geometry for nDisplay
The position, rotation and scale of the screen found in the previous paragraphs are also used to fill in the data for the nDisplay config. You can write them down and copy them, or alternatively use a level to hold the data. The following section describes how to use it.
Duplicate the parent actor and the mesh attached to it. With both copied actors selected, go to the Levels panel, right-click on AlignementSharedGeometry and click Move selected actors to level. This will transfer them to the sublevel containing the layout of the LED volume. It might be necessary to toggle the duplicated actors as not “visible” in the details. Now, in the World Outliner, drag the copies and attach them to StageShared. Make sure the parenthood is preserved. Save all to save both AligningXR and AlignementSharedGeometry.
nDisplay
Move to the AligningNDisplay level. Edit the AlignmentRootActor. Set the Static Mesh on nDisplayScreen to the one used for the XR mask alignment. If working with simple planes, make sure the static mesh is set to Plane and not plane_1x1. Copy the parent empty actor's transform onto nDisplayTransform and the XRMaskBP's transform onto nDisplayScreen.
Go back to the main viewport and, on Mesh Builder, load the marker cones by clicking Load from file with Load parent transform checked. Check whether the cones are aligned with the nDisplay screen corners.
Configure the rest of the nDisplay configuration. For a quick check, you can connect one of the LED screens to the XR Engine video output and launch the nDisplay instance from there. Otherwise, distribute the project across the rendering nodes.
Remember to create an extra livelink preset that listens to tracking data on a different port than the one configured before for XR. Apply the LiveLinkPreset on the MoSysCameraNDisplayPP and select the livelink subject representation, so the virtual camera receives tracking.
Go back to the AligningXR level and load the non-nDisplay preset. Launch nDisplay through Switchboard and pilot the MoSysCameraXR. Verify that the grid is consistent throughout the whole frame, i.e. on both the LED screen and the CG-extended area.
XR Engine full setup
See pages 20-23
Primary Render Node full setup
See pages 14-17

Summary flowchart
The flow chart shows the process of setting up nDisplay with set extensions.

Render Nodes Hardware Setup
Network
All the rendering nodes, as well as the XR Engine, must be connected to a fast local network, preferably dedicated to them and the StarTracker. All the devices should have a static IP address for easy maintenance. Prepare a list of the addresses, as it will be necessary for the software setup. Install TightVNC for remote access and set the project up in Perforce.
Drivers
Install the latest drivers. Daisy-chain the system by having an external sync source come into the master Sync Card via a BNC cable (genlock) and spread the signal further down the chain using RJ45 cables. Ideally, each render node should have only one display connected to it (the LED processor). If more than one is necessary, configure them via NVIDIA Mosaic.
Synchronisation
Synchronisation of the LED screen output is key to a seamless shoot. It is crucial that all the sections of the volume are rendered at the same time, so there is no tearing between sections. This is achieved by using an NVIDIA Quadro Sync Card II. Every render node has one of them connected to the GPU to guarantee synchronous rendering. Refer to Frame Lock Configuration in the User Guide for more information.
Quick setup
1. Connect the on-set genlock to the House Sync input on the Primary Render Node.
2. Daisy-chain the rest of the rendering nodes with CAT-5 Ethernet cables.
3. Enable VSync in the NVIDIA control panel.
4. Configure the synchronisation in Synchronise Displays. Define whether each machine is a timing server (only the Primary Render Node should be).

NVIDIA software synchronisation
Nvidia Driver Utility
Set a specific configuration on the NVIDIA drivers via the NVIDIA driver configuration utility, ConfigureDriver.exe, which can be obtained here. Download and run ConfigureDriver.exe as administrator via the Windows command prompt. Once it runs, type 11 and press Enter. This will enable the prePresentWait setting and improve performance without compromising sync.
nDisplay config
In the nDisplay config actor select Nvidia (2) in Cluster > Render Sync Policy > Type. The Node > Window should cover the entire desktop resolution; however, the viewport should only cover the resolution required by the LEDs.
In order for nDisplay to lock to the synchronisation from NVIDIA sync, it must run as a fullscreen foreground Windows application. No other application can sit on top of the nDisplay instance while it is running.
To manage this, ensure that:
1. The NVIDIA control panel is not open.
2. No virtual desktop is running (no TeamViewer, Zoom, etc.).
3. Desktop-based notifications and pop-ups are disabled.
4. Windows desktop resolution scaling is set to 100%.
5. Fullscreen optimisation is disabled on the Unreal executable (using Fix ExeFlags from Switchboard).
You can verify that it is running in the correct mode by enabling option 8 in ConfigureDriver.exe (“Enable the SwapGroupPresentIndicator for DirectX”).
Setting up EDID
Once you have everything set up in the NVIDIA control panel with regards to sync, resolutions, colour space etc., it is useful to export the current EDID and then load it from a file; find the instructions here. Alternatively, you can set your EDID via your switcher, such as a Lightware matrix.
It is important to note that incorrectly configured EDIDs can halve the performance of nDisplay when using sync policy 2. To avoid this, ensure you have an EDID which allows you to select a PC resolution at the frequency you wish to shoot at (e.g. 24 Hz) and which is marked as “(native)”. Another solution is to create a custom resolution based on the standard 3840x2160 60 Hz PC resolution and then set it to the appropriate frequency (e.g. 24 Hz, 25 Hz, 30 Hz).
Sync validation
Use a bouncing ball test to validate the synchronisation. Place the ball on the edge between two parts of the LED wall, so it is visible on two separate segments of the wall. The ball will bounce up and down. If the ball is consistent, the nDisplay setup is synced. If you see tearing of the ball, the synchronisation is failing.
The ball blueprint can be found under MoSysVPPro Content -> nDisplay -> Blueprints: BP_BouncingBall.uasset.
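For intuition, the test motion is just a fast vertical oscillation; if two wall segments draw the ball at different heights in the same frame, they are not rendering in lockstep. A standalone sketch of such a motion (the frequency and amplitude here are made up, not read from the blueprint):

```python
import math
import time

FREQUENCY_HZ = 3.0    # quick up/down bounce
AMPLITUDE_CM = 50.0

start = time.monotonic()
for _ in range(10):
    t = time.monotonic() - start
    height = AMPLITUDE_CM * abs(math.sin(2 * math.pi * FREQUENCY_HZ * t))
    print(f"t={t:.2f}s  ball height={height:5.1f} cm")
    time.sleep(1 / 24)   # advance roughly one 24 fps frame
```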
Additionally, Switchboard shows the NVIDIA driver and sync status of the nDisplay nodes. For successful sync, the “PresentMode” column should indicate that each node is in “Hardware Composed: Independent Flip”. If it states “Composed: Flip”, check that nothing is overlapping the nDisplay fullscreen application on the nodes.

Bouncing ball test
Prepare the nDisplay level so the ball is positioned across the boundary between the nDisplay screens. After launching in Switchboard, the ball will bounce quickly up and down.

Render nodes setup (nDisplay)
Example nDisplay config actor
setup, described on next page

Livelink Setup
Start with the Primary Render Node and set up the livelink connection with the StarTracker. Refer to our VP Free manual for setup.
Remember to stop the livelink in the editor before launching the project through the nDisplay Launcher, as the tracking can only go to one application.
nDisplay Level
Open the “nDisplayExample4_27” level in MoSysVPPro Content/nDisplay/Maps. We will be following the “In-Camera VFX Template”, which can be found in the In-Camera VFX Quick Start guide here: https://docs.unrealengine.com/en-US/WorkingWithMedia/InCameraVFX/InCameraVFXQuickStart/index.html
MoSysCameraNDisplay is used for the inner frustum. Refer to the VP Free manual, pages 9-11, for more information on the nDisplay integration.
The content should have a dedicated level, which is a sublevel of the main nDisplay level. The nDisplay level should only store the objects related to the nDisplay architecture and those necessary to track the camera. You can add or remove the content levels there, so it fits the current shoot. Below is an example of the levels setup:
Defining LED Screens
Depending on the number and shape of the LED volumes, the virtual layout needs to be adjusted. Usually one node is responsible for one viewport. Edit nDisplayRootActorST to define the viewports. Refer to Step 3 and the screenshots on the previous page.
The top part of the nDisplay config actor defines the LED screen transform (where it is in space) and the bottom part describes the pixel mapping in screen space (in other words where, in relation to the top right corner of the screen, the image should be rendered).
Every viewport responsible for rendering to a part of the LED volume is represented as a mesh in Unreal. This can take any shape: it can be a simple plane or a pre-prepared mesh like a curved model of a screen. Refer to Step 2 - Create LED Panel Geometry in the In-Camera VFX Quick Start guide for building a complex mesh. See the screenshots on the previous page.
Mesh Builder
You can use Mesh Builder to generate this representation automatically, by looking through a camera with the crosshair at the corners of the screen. Three observations are used for the calculations.

The tool was designed for a rectangular screen or a combination of rectangular screens; however, the spawned corners can facilitate placing a more complex mesh as well.
The procedure for finding the corners and generating a mesh is as follows (a geometric sketch of the calculation follows the list):
1. Select a tracked camera used for observations.
2. Enter the height of the bottom edge of the LED screen and the height of the screen from the bottom to the upper edge.
3. Select a plane to fit as the LED screen representation and click Assign LED plane. The plane has to be present in the level, as it is not spawned.
4. Position the camera so it is looking centrally at consecutive corners of the screen (through the crosshair) and click the corresponding Store buttons.
5. Click Calculate and move mesh to fit the plane in place.
6. Optionally, check LED screen levelled to only use the 2 bottom corners. The calculation then assumes the screen is perfectly vertical.
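The underlying geometry is simple: three corners define the plane's axes, size and centre. A minimal sketch of such a calculation (an illustration only, not necessarily the exact maths Mesh Builder uses):

```python
import numpy as np

def plane_from_corners(bottom_left, bottom_right, top_right):
    """Derive a plane transform from three observed corners: returns
    the centre position, a 3x3 rotation matrix and (width, height)."""
    bl, br, tr = (np.asarray(p, dtype=float)
                  for p in (bottom_left, bottom_right, top_right))
    x_axis = br - bl                      # along the bottom edge
    width = np.linalg.norm(x_axis)
    x_axis /= width
    z_axis = tr - br                      # up the right-hand edge
    height = np.linalg.norm(z_axis)
    z_axis /= height
    y_axis = np.cross(z_axis, x_axis)     # plane normal
    centre = bl + 0.5 * (br - bl) + 0.5 * (tr - br)
    rotation = np.column_stack((x_axis, y_axis, z_axis))
    return centre, rotation, (width, height)

# Example: a 6 m wide, 3 m tall wall with its bottom edge on the floor
centre, rotation, size = plane_from_corners((0, 0, 0), (6, 0, 0), (6, 0, 3))
print(centre, size)   # -> [3. 0. 1.5] (6.0, 3.0)
```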
Launching
See Step 4 - Launching Your Project with nDisplay in the In-Camera VFX Quick Start guide to launch the project on the LED screens. Use the Switchboard plugin to launch:
1. Enable it in Edit -> Plugins.
2. Install SwitchboardListener on all the rendering nodes and then start Switchboard on the Primary Node.
3. Configure the setup in Configs -> New config, by selecting the project and Engine folder.
4. Add an nDisplay device from the Add device dropdown and specify the config file.
5. Connect the devices.
6. Start all connected devices.
After you launch, nDisplay should appear on the primary screen.

Project Distribution
The project must be copied to all of the rendering nodes, into the same directory as on the Primary Render Node. All the rendering nodes should store the same version of the project, so it is recommended to use version control software like Perforce.
Free File Sync
A good alternative, especially for quick distribution on the local network, is to take advantage of the mapped network drives and distribute the project through FreeFileSync. Set it up on the XR Engine, so the source (on the left) is the local directory of the UE project and the destination (on the right) is the network drive. Set the synchronisation option to “update”. Click “Compare” to see the differences and “Synchronize” to update the destination.
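If you prefer scripting the same “update” behaviour, a minimal one-way sync can be written in a few lines (the paths are hypothetical examples; FreeFileSync itself does considerably more, e.g. conflict detection):

```python
import filecmp
import shutil
from pathlib import Path

def update_copy(source: Path, destination: Path) -> None:
    """One-way 'update' sync: copy files that are new or changed,
    never deleting anything on the destination."""
    for src in source.rglob("*"):
        dst = destination / src.relative_to(source)
        if src.is_dir():
            dst.mkdir(parents=True, exist_ok=True)
        elif not dst.exists() or not filecmp.cmp(src, dst, shallow=True):
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)

# Example: push the local UE project to a mapped render-node drive
update_copy(Path(r"D:\Projects\XRStage"), Path(r"Z:\Projects\XRStage"))
```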
Loading from network drive
Alternatively, especially for testing, a user can use a project that is located in a Windows shared folder visible to all rendering nodes. In that way there is no need to distribute the project, as it is loaded on start from the shared folder. This is not recommended for production, as the stability cannot be guaranteed and the start time is usually longer. This approach requires the user to assign a path to the network location in the Switchboard settings.

Cinematic Focus XR
System
Cinematic Focus XR provides seamless focus control on LED stages. The system takes advantage of the existing geometry defined for nDisplay and the camera pose given by StarTracker, allowing focus pullers to work naturally regardless of whether an object is real or virtual. It is currently compatible with the Preston FIZ system.
Prerequisites
Preston FIZ system with:
• Preston MDR3
• Hand unit 3 or 4
- Mo-Sys additionally provides the cabling needed for the control. The MDR3 needs a firmware update to version 1.130.
- Install the FTDI drivers, so a USB Serial Port appears in the Device Manager: https://ftdichip.com/drivers/
Connection
Connect the serial cable to the Preston MDR 4-pin Lemo serial port. Plug the other end into a USB port on the Primary Render Node. Configure the serial port in Device Manager and take note of the COM port number. Apply the following settings:
• Bits per second: 115200
• Data bits: 8
• Parity: none
• Stop bits: 1
• Flow control: Xon/Xoff
Advanced settings:
• BM Options, Latency Timer: 2 msec
The sketch below shows the same settings applied in code.
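Expressing the settings with pyserial is a convenient way to verify the port opens before involving Unreal (the COM number is an example; use the one noted in Device Manager):

```python
import serial   # pip install pyserial

port = serial.Serial(
    port="COM3",                     # example - use your noted COM number
    baudrate=115200,                 # bits per second
    bytesize=serial.EIGHTBITS,       # data bits: 8
    parity=serial.PARITY_NONE,       # parity: none
    stopbits=serial.STOPBITS_ONE,    # stop bits: 1
    xonxoff=True,                    # flow control: Xon/Xoff
    timeout=1.0,
)
print(port.is_open)                  # True once the port opens successfully
port.close()
```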
Preston FIZ Setup
Setting up the FIZ system is outside the scope of this manual. Please refer to Preston's documentation for more information. It can be found here: https://prestoncinema.com/#!/downloads/firmware
• After power-up the system will try to find the end stops of the lens. This is part of the calibration.
• Make sure to map focus to the lens-reading distances. This can be done in both imperial and metric units.
VP Pro XR Software Setup
Open the Unreal project on the Primary Render Node. Verify a collision profile is set up in Project Settings -> Collision -> Object Channels. There should be a custom Object Channel called LED; create one if missing. Add a new component on MoSysCameraNDisplay called PrestonFocusPullForLED.

Configure the component:
1. Enter the COM port used for communication.
2. Check Emit Focus Event if using multiple rendering nodes.
3. Specify a distance in [cm] from the LED screen. The physical focus should stop there while pulling focus into the virtual scene, to mitigate the moiré effect, which appears when the LED array is in focus. The entered distance should allow the system to keep the LED wall just outside of the depth of field (see the sketch after this list).
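The distance in step 3 effectively clamps the physical lens focus short of the wall. A sketch of the idea (an illustration only, not the component's actual logic):

```python
def clamp_physical_focus(requested_cm: float,
                         led_distance_cm: float,
                         margin_cm: float) -> float:
    """Stop the physical lens focus margin_cm short of the LED wall
    when the focus puller racks into the virtual scene, keeping the
    panel outside the depth of field to avoid moire."""
    stop_at = led_distance_cm - margin_cm
    return min(requested_cm, stop_at)

# Example: wall 400 cm away, 50 cm margin; a 600 cm focus target
# holds the physical lens at 350 cm while the virtual focus continues
print(clamp_physical_focus(600, 400, 50))   # -> 350
```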
In the nDisplay config blueprint, set the collision presets in the Collision section on the mesh(es) used with the viewports as follows: Collision Presets set to Custom, and below it Object Type set to LED. Finally, set Manual Focus to true to enable the hand unit to set the focus distance.
There is a sample level with the component already on the camera: nDisplayLensControl in MoSysVPPro Content -> nDisplay -> LensControl.
Other Render Nodes setup
If used with multiple Render Nodes, an extra component has to be added to the camera to receive the focus updates. Find “nDisplayFocusReceiver” in MoSysVPFree Content/Blueprints and drag it onto the MoSysCameraNDisplay's components.