ExR-2 Robot
Operating Guide

ExRobotics B.V.
Document No.: 20220412IP1
Version No.: 2
Owner: Ian Peerless
Date: 2022-04-30

This document is considered an uncontrolled copy when printed. Always ensure that you print and use a current version.
Copyright 2022 ExRobotics B.V.
Responsible (author of the document)
Name: Ian Peerless
Job Title: Director
Signature:
Date:

Accountable (Process Owner: accountable for the quality, validity and timeliness of the document)
Name: Ian Peerless
Job Title: Director
Signature:
Date:

Consulted on this document version (persons who have endorsed the document)
Names: Stefan Kohlbrecher, Jeroen Mostert, Alberto Romay, Dorian Scholz, Ronald Schreurs, Daan Hitzbleck

Informed (persons who have been informed of this version of the document)
Names: All ExR employees, ER employees, first line support, and fleet managers.

Document Change Log:

Version 1 (2022-04-12): First draft derived from Robot System Operating Guide 20190122IP1 Version 12. Pages: All. Author: Ian Peerless.
Version 2 (2022-04-30): Included Energy Robotics comments and additional details about battery replacement. Pages: Various. Author: Ian Peerless.

Contents
1. Introduction
2. Robots
3. Control Stations & Communication
4. Docking Stations & Charging
5. LiDAR Based Navigation, Object Detection and Object Avoidance
6. Cloud Software
6.1. Fleet Management and Fleet Status
6.2. Driver Screen
6.3. Mission Editor
6.4. Mission Report
6.5. Engineer Screen
7. Autonomous Missions
7.1. Overview
7.2. Line Following Navigation
7.3. Tag Based Inspections
7.4. Teach and Repeat Navigation
7.5. Skills
7.6. Click and Inspect
8. Operating Robots
8.1. Operative Training
8.2. Authorisation and Authentication
8.3. Customer support
8.4. Software releases
8.5. Routine Maintenance
8.6. Replacing tracks
8.7. Opening the hull
8.8. Calibrating gas detectors
8.9. Replacing batteries
8.10. Changing SIM cards
8.11. Security
9. Questions and Answers
9.1. What is my robot's status?
9.2. Why can't I connect to a robot?
9.3. Why has my robot stopped working in cold weather?
9.4. Why doesn't my controller work?
9.5. How much data will the robot transfer over its wireless connection?
9.6. How long will the robot's batteries last?
9.7. Why am I surprised by the battery level reported by the cloud software?
9.8. How do I refresh my control screen?
9.9. Why does my robot "drift" to the left or right?
9.10. Why have I experienced unexpected behaviour while operating a robot?
9.11. How do I wash a contaminated robot?
10. Robot and Docking Station Specifications
10.1. Environment
10.2. Terrain and Manoeuvrability
10.3. Availability
10.4. Standard Hardware & Software
10.5. Sensing & Audio Options
10.6. Charging Options
10.7. Connectivity & Navigation Options
10.8. Cloud-Based Reporting & Analysis Options
10.9. Noses for Honeywell 3000 Mk II

1. Introduction
This document is one of three that will help operators to use their ExR-2 robots safely and
effectively:
▪ ExR-2 and Docking Station Operating Instructions.
▪ ExR-2 Robot Operating Guide.
▪ ExR-2 Robot Deployment (Quick Start) Guide.
All of these documents and some explanatory videos can be downloaded from Downloads | ExRobotics.
The instructions focus on the safe deployment, operation and servicing of robots and docking stations. The guides provide additional information about robot systems and their use. The guides don't usually replicate important information that's in the instructions, so it's essential that robot users read and understand the instructions. If there's a conflict between the documents, the instructions will always prevail.
This operating guide:
▪ Describes each key part of the robot system:
o Robots.
o Control stations that are used to communicate with the robot.
o Docking stations that recharge the robots' batteries.
o Cloud software that enables users to interact with robots.
o Object detection and avoidance.
▪ Explains how to set up autonomous missions.
▪ Recommends how robot operatives should be trained.
▪ Provides advice for operating robots.
The guide will be updated and made available to robot fleet managers and first line support as new
robot hardware and software is deployed.

2. Robots
ExR-2 robots are assembled from a number of modules mounted on a skeleton that is clad in sheet
steel. Many of these modules are optional. The full range of modules is identified in these clad and
unclad pictures. More detailed robot specifications are included in Section 10.

The only controls mounted on each robot are the red emergency stop switch and the black on/off switch:
▪ When the red emergency stop switch is pressed downwards it immobilises the drive motors. The robot can't be driven until the switch is released by rotating it and letting it spring upwards.
▪ When the black switch is rotated anticlockwise to the "off" position the power supply to all components (except some circuits in the electronics box) is shut off.
▪ When the emergency stop switch is released and the robot is switched on, the robot is held in position by its motors.
▪ When the emergency stop switch is pressed down and/or the robot is switched off, there's no power to the motors so the robot moves freely. This means it will roll down a slope under the influence of gravity.
3. Control Stations & Communication
Customers are responsible for providing control stations. By default, robots are controlled with:
▪ A PC with a screen resolution of at least 1920 x 1080, running Google Chrome or Microsoft Edge.
▪ An Xbox gamepad for driving the robot.
▪ A mouse or trackpad for operating the cursor on the PC's screen.
Should a customer prefer to control their robots with "tablets" (without gamepads), they should perform a risk assessment because there will be no physical emergency stop on the control station. In the near future customers will be able to adapt the user interface for use with tablets; for now they should contact their account manager.
It’s important that there are no intrusive firewalls between the control station and the cloud
software (see below).
Port
Protocol
IP
3478
UDP
Will be given to you by Energy Robotics
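If connection problems are suspected, a quick check of this requirement is possible from the control station. The sketch below assumes the endpoint supplied by Energy Robotics is a standard STUN/TURN service (the usual use of UDP 3478); the IP address shown is a placeholder and the script is only an illustrative connectivity check, not part of the robot software.

```python
# Minimal sketch: check that outbound UDP traffic to port 3478 is not blocked, assuming
# the endpoint speaks standard STUN/TURN (a common assumption for UDP 3478).
import os
import socket
import struct

SERVER_IP = "203.0.113.10"   # placeholder; use the address supplied by Energy Robotics
PORT = 3478

# RFC 5389 STUN Binding Request: type 0x0001, length 0, magic cookie, 96-bit transaction ID
request = struct.pack("!HHI12s", 0x0001, 0, 0x2112A442, os.urandom(12))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(5.0)
try:
    sock.sendto(request, (SERVER_IP, PORT))
    data, _ = sock.recvfrom(2048)
    msg_type = struct.unpack("!H", data[:2])[0]   # 0x0101 indicates a Binding Success Response
    print("Reply received, message type 0x%04x - UDP 3478 appears open" % msg_type)
except socket.timeout:
    print("No reply within 5 s - the port may be blocked or the endpoint is not a STUN service")
finally:
    sock.close()
```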
Robots transfer data and receive instructions via wireless networks. This might be a public 4G
network, a private 4G network, or a WiFi network. They also have a short-range WiFi network that
can be used to correct faults if the main communications network fails. 4G networks are secured
using VPN technology.
The wireless network will connect the robot to a server which might be a private server, or a service
provided by a third party company such as Amazon (AWS) or Microsoft (Azure). This server will host
the cloud software (see Section 6) and transfer data to other servers.

4. Docking Stations & Charging
A robot automatically charges itself using the induction charger built into its docking station. It can
also autonomously dock and undock as described in Section 7. If the robot is being remotely
controlled, the operative should approach the docking station in “slow speed” mode and in the
direction shown in the picture below. Provided the robot is reasonably straight and central when
approaching the docking station, the robot will automatically align the induction charging plates
using the plastic strips under its hull. To facilitate robot alignment, the robot should approach the
docking station in a straight line for at least 3 meters. Once the front of the robot’s hull is pressing
against the front of the docking station the operative should stop driving forwards and switch off
the motors by pressing the red button on the gamepad.
Once the induction charger is connected, the “Wireless Charger” and “Charging” boxes on the cloud
software will be checked. Hovering the cursor over the charging box will reveal the charging
current. It takes approximately 8 hours to recharge a battery pack from 10% to 90% charge.
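As a rough planning aid, the charging time can be estimated from the figure above (10% to 90% in approximately 8 hours), assuming the rate is roughly linear over that range. Real charging tapers near full charge, so treat the sketch below as illustrative only; the function name and rate constant are not part of the robot software.

```python
# Rough induction-charging time estimate, assuming ~10 percentage points per hour
# (derived from "10% to 90% in approximately 8 hours").
INDUCTION_RATE_PCT_PER_HOUR = 80 / 8.0

def estimate_hours_to_charge(current_pct: float, target_pct: float = 90.0) -> float:
    """Return an approximate number of hours to reach target_pct on the docking station."""
    if target_pct <= current_pct:
        return 0.0
    return (target_pct - current_pct) / INDUCTION_RATE_PCT_PER_HOUR

print(estimate_hours_to_charge(35.0))  # ~5.5 hours from 35% to 90%
```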
When the robot is fully docked it will “go to sleep” and begin to charge using the induction charger.
There can be a delay of up to 120 seconds before this happens. When the robot is asleep, many of
its components are switched off to reduce power consumption and speed charging. However:
▪ The gas detectors remain on so that they don't need to "warm up" before commencing the next mission.
▪ It continues to communicate so that it can quickly be woken up.
The docking station should be powered up at all times, since charging may be disrupted if it's switched on when the robot's already docked. It's also best not to switch off the robot (using the black on/off switch) while it's in the docking station, since this too can disrupt charging. If charging is disrupted, pull the robot back about 30 cm, then switch it off and on again.

When a driver checks the “Wake Up” box the robot video streams will typically appear within 1
minute. The robot is then ready to use.
A power socket module (when fitted) enables the robot to be charged within 4 hours. This requires
the robot to be manually plugged into a power supply using a quick-charger that’s supplied with the
socket. The quick-charger’s lead is typically 3 meters long. If the robot is docked, you should first
switch off power to the induction charger. When inserting the quick-charge plug, rotate the entire
body clockwise before tightening the ring around its base. You can check the status of the charging
using the LEDs and instructions on the quick charger. This video provides more detail:
https://exrobotics.global/media/uploads/mp4/5/8/58_instruction-how-to-connect-the-quick-charger.mp4
5. LiDAR Based Navigation, Object Detection and Object Avoidance
ExR-2 robots are usually supplied with a LiDAR module. This is at the heart of autonomous
navigation (see Section 7.4), object detection (which also works during remote controlled missions)
and object avoidance.
It’s useful to understand how the LiDAR works before performing missions. It scans the
environment around the robot as indicated in this diagram.

The LiDAR is tilted down towards the front of the robot so that it can detect the ground close to the
robot in its usual direction of travel.
The tilted LiDAR also means that it can see higher objects behind the robot so it can build a 3D
model of the surroundings with more relief than a horizontally mounted LiDAR. The robot builds
this model as it performs its first remote controlled mission. The robot then uses this model to
navigate during subsequent autonomous missions.
Sometimes there might be temporary items in the surroundings, e.g. people, cars or scaffolding.
These should be minimised as far as possible because they will be incorporated into the 3D model.
However, provided there aren’t too many temporary items the robot will still be able to localise its
position using the other information in the map.
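The toy sketch below is not the robot's actual localisation algorithm, but it illustrates why a few temporary items don't prevent localisation: as long as most of the live scan still matches the stored model, the match remains well supported. The points, tolerance and numbers are invented for illustration.

```python
# Toy illustration: a minority of unmatched (temporary) points does not break map matching.
import math

def match_fraction(live_scan, stored_map, tolerance=0.2):
    """Fraction of live scan points lying within `tolerance` metres of a stored map point."""
    matched = 0
    for lx, ly in live_scan:
        if any(math.hypot(lx - mx, ly - my) <= tolerance for mx, my in stored_map):
            matched += 1
    return matched / len(live_scan)

stored_map = [(x * 0.5, 0.0) for x in range(20)]   # e.g. a wall captured during teaching
live_scan = stored_map[:16] + [(3.0, 1.5), (3.2, 1.5), (3.4, 1.5), (3.6, 1.5)]  # plus a parked car
print(match_fraction(live_scan, stored_map))        # 0.8 -> plenty of static structure still matches
```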
Temporary items do not affect object detection and avoidance because these functions are based on live LiDAR data gathered during the ongoing mission. However, it should be remembered that object detection and avoidance isn't perfect. The following conditions can hamper performance:
▪ Drop-offs: These are more difficult to detect than obstacles because very reflective ground (for instance in heavy rain) can produce false positives, and steep downward ramps cannot be reliably distinguished from drop-offs using LiDAR alone. Work-around: drop-offs are usually permanent features and it's very serious if the robot drops from a height, so robot routes should avoid drop-offs or barriers should be erected around them.
▪ Upwards ramps: The robot will usually interpret these ramps as an obstacle and stop. Work-around: contact your account manager to see if the software can be adjusted.
▪ Narrow gaps: The robot is designed to navigate 1 meter wide gaps. However, some conditions might prevent this and/or customers may wish to navigate narrower gaps. Work-around: contact your account manager to see if the software can be adjusted.
▪ Thin and small obstacles such as vertical pipes: Avoidance is optimised to avoid collisions with a variety of objects. Very small objects are not considered, as increased sensitivity increases the likelihood of false-positive avoidance. Work-around: if there's a risk of collision, choose a different route or enclose such items in a larger "box". Avoid leaving small, temporary items on the robot's routes.
▪ Sharp turns: The LiDAR cannot see low objects to the side of the robot. If the robot turns sharply and the object is within 0.85 m of the front of the robot it may not be detected. Work-around: take this into account when planning robot routes, and avoid leaving low, temporary items near sharp turns.
▪ Reversing: The LiDAR cannot detect objects lower than 0.5 m behind the robot, and there's a blind spot behind the panning inspection module. Work-around: only reverse over short distances over which the robot has recently driven forwards.
▪ Overhead obstacles: The LiDAR might not detect low overhead items that are close to the robot and might hit the panning inspection module. This is especially significant during sharp turns and for thin overhead items like cables. Work-around: keep robot routes clear of such items.
▪ Shiny and low-reflection items: The Ex enclosure of the LiDAR causes significant attenuation, which means shiny and low-reflectivity objects might be detectable only close to the robot. Work-around: paint or cover these items, and avoid leaving shiny, temporary items on the robot's routes.
▪ Rain, steam and snow: These will obstruct the LiDAR's beam, so localisation and object detection will be less effective. Work-around: do not perform missions in these conditions, or have a human remotely supervise the mission.
▪ Clear materials: The LiDAR will not detect objects like glass doors. Work-around: facilities seldom contain such items.
▪ High-contrast environments and inadequate lighting: The LiDAR is not affected by light in the visible range, so no work-around is needed.

6. Cloud Software
Robots are operated and data is collected using the “cloud”. Access is granted as described in
Section 8.2. Six types of screens are available as described below.
6.1. Fleet Management and Fleet Status
Fleet Management is the first screen that appears when users log on. It allows them to connect to
any robot to which they’ve been granted access. Scrolling down reveals more robots.
The Fleet Status screen (see below) is accessed by clicking on the “navigation menu” icon to the top
left of the fleet management screen.

6.2. Driver Screen
Once a user has connected to a robot, most of the display information is intuitive:
▪ The major functions are summarised in the picture below.
▪ For safety reasons, only one operative can control a robot at any given time. The logo towards the top right of the screen will show who that is. To take over control, click on that icon.
▪ All video and LiDAR streams are displayed on the control station. Clicking on the "Expand" icon of any video stream window moves it to the largest window.
▪ To take a snapshot, hover the cursor over any video stream and then click on the Point of Interest (POI) icon that appears in the top-left corner. Then use the cursor to select the area of interest and click on the "Accept" button to capture the image. Alternatively, you can take a full-size picture without selecting the POI icon.
▪ Snapshots are displayed in the "Media Log" once they have been uploaded to the server (this happens automatically after taking them) and can be viewed in large scale by clicking on them. From there they can be saved to the local machine by right-clicking the full-sized picture and selecting "Save As...".
▪ To take a video, a mission must be active. This is done by undocking the robot and driving to the location of interest (videos are currently only recorded if the robot status is "Mission Active"). Once ready, hover the cursor over any video stream and click on the "Video" icon (small circle) that appears in the top-left corner. The message "recording" will appear. To stop recording, click the "Video" icon again.
▪ Videos will be available only after the robot is back in the docking station (to save bandwidth when driving, videos are not uploaded immediately). Videos are displayed in the Mission Report website under "Recorded Media" and can be viewed in large scale by clicking on them. From there they can be saved to the local machine by right-clicking the full-sized picture and selecting "Save As...".

▪ The icons at the top of the "Media Log" can be used to filter events by time, e.g. to show only images from the last 4 hours. Additionally, the "cloud download" icon to the right can be used to download a complete set of recordings as a zip file.
▪ The audio stream of the microphone can be started and paused by clicking on the "Microphone" window.
▪ The screen shows the gas levels for those gas detectors that are fitted to the robot. The gas alarm levels for the robot can be adjusted by clicking on the icon to the top right of each gas display window. An audio/visual alarm is emitted when an alarm level is exceeded.
▪ The autonomy controls are grouped together. When the "Keep Awake" box is ticked the robot won't sleep. If it's unticked the robot will save 4G costs and battery power by sleeping (whether or not it's docked).
▪ Select a mission using the drop-down box. To launch the mission, click on the "Play" button to the left of the mission drop-down box; to cancel the mission, press the button again. Missions will usually be started when the robot is in a docking station. However, it's also possible to launch "line-following" missions when the orange line is visible in the down-facing camera.
▪ Using the cursor to activate the "Stop" button has the same effect as pressing the emergency stop button on the gamepad. The drive motors are isolated until the stop is released. This is done using "Auto" or the green button on the gamepad.
▪ The gas detector and light controls are accessed by clicking on "More".
▪ The top right of the screen shows the robot's status. A tick adjacent to each item indicates the following (these flags are summarised in the sketch after this list):
o Mission Active – the robot has been commanded to move and it's no longer charging.
o Wireless Charger – the robot's coil is connected to the coil in the docking station.
o Charging – current is flowing into the battery pack.
o E-Stop Released – the robot's emergency stop has been released, ready to drive.

o Motors Enabled – the robot's motors are no longer isolated, so the robot can be driven manually or autonomously (this takes a few seconds to change after checking the "Auto" box or pressing the green gamepad button).
o Driving – the robot is in motion.
o Gas Detectors On – the gas detectors are powered up (they have individual warm-up times and only after this time will the gas displays start showing measurement values).
o Manual Control – the robot is being controlled from this control station by a driver.
▪ Hovering over an icon or text will often provide more information.
▪ Drivers can change the robot location on the screen by clicking on the "Site Config" option in the navigation menu to the top left of the screen. They can report issues to our engineers using the "Feedback" option in the user menu to the top right of the screen.
The gamepad controls are as follows:
▪ The green "A" button activates remote control by enabling the gamepad and drive motors. It also deactivates the emergency stop button (see below). When this is done it will typically take 30 seconds for the motors to be activated.
▪ The blue "X" button switches from remote control to autonomous driving.
▪ The yellow "Y" button deactivates remote control by disabling the gamepad.
▪ The red "B" button stops the robot and isolates the motors, identical to the "Stop" icon.
▪ The analogue joysticks are used to drive (LSB) and steer (RSB) the robot.
▪ Holding down the LB/RB buttons changes the speed mode (slowest with no buttons pressed, fastest with both buttons pressed).
▪ If the LT button is held down, the RSB controls the inspection module panning and the elevating mast (if fitted). Push the RSB:
o Left to rotate the panning module anticlockwise.
o Right to rotate the panning module clockwise.
o Forwards to raise the mast.
o Backwards to lower the mast.

For safety reasons, robots cannot be remotely controlled when the operative is using another
application on the control station. A pop-up will appear with a warning. Clicking on the pop-up will
re-enable remote control.
6.3. Mission Editor
This screen enables planners to construct and edit autonomous missions as described in Section 7.
6.4. Mission Report
A typical mission report screen is as follows.
The data for a mission is displayed by clicking on the relevant block to the left of the screen.
▪ The gas detector and acoustic analyser values are displayed on the adjacent graphs.
▪ The plan will be green where the values were below the lower alarm level, amber where they were between the two alarm levels, and red where they were above the upper alarm level (a sketch of this colour rule follows the list below). Section 6.2 describes how to set the alarm levels.
▪ Other information that has been gathered at Waypoints is displayed in the right-hand part of the screen.
▪ Snapshots and gas readings are uploaded immediately. Other recordings are uploaded when the robot returns to its docking station. Video and sound recordings are limited to 2 minutes for each action.
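The colour rule described above can be summarised in a few lines. The sketch below is only an illustration of that rule; the threshold values are examples, and the treatment of readings exactly equal to an alarm level is not specified by this guide.

```python
# Small sketch of the plan colouring: green below the lower alarm level, amber between
# the two levels, red above the upper level. Thresholds are example values.
def alarm_colour(reading: float, lower_alarm: float, upper_alarm: float) -> str:
    if reading < lower_alarm:
        return "green"
    if reading <= upper_alarm:
        return "amber"
    return "red"

print(alarm_colour(8.0, lower_alarm=10.0, upper_alarm=20.0))   # green
print(alarm_colour(15.0, lower_alarm=10.0, upper_alarm=20.0))  # amber
print(alarm_colour(25.0, lower_alarm=10.0, upper_alarm=20.0))  # red
```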
To study a chart in more detail:
▪ Hover the cursor over a point on the chart to get a digital reading.
▪ Zoom in with the mouse wheel or by left-clicking and dragging.
▪ Pan by using the shift key, left-clicking and dragging.
6.5. Engineer Screen
This screen is used by ExRobotics and Energy Robotics and will not usually be used by customers.
7. Autonomous Missions
7.1. Overview
A robot mission is typically a circuit that starts and finishes at a docking station. During the circuit the robot performs actions at points of interest when it is located at a waypoint (the structure is sketched after this list):
▪ A typical action is to record a video, snapshot, sound, or sensor reading.
▪ Actions are targeted at points of interest (POIs). A POI is a 3D location at which the appropriate camera or sensor is targeted. Examples of POIs are valves, flanges and pumps. To target the POI the robot will usually need to change its azimuth (rotate) and in some cases will require a camera to lift its field of view (elevate). There can be more than one action at a POI.
▪ Waypoints are 2D locations from which POIs are observed. There can be multiple POIs at a waypoint. In line-following navigation, waypoints are defined by an array of chili-tags.
▪ There can be multiple waypoints on a mission.
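The hierarchy described above (mission, waypoints, POIs, actions) can be pictured as a simple data model. The sketch below only mirrors the terminology of this guide; it is not Energy Robotics' actual mission format.

```python
# Illustrative data model: a mission holds waypoints (2D), each waypoint holds POIs (3D),
# and each POI holds one or more actions.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Action:
    kind: str            # "video", "snapshot", "sound" or "sensor reading"
    name: str            # e.g. an equipment tag number

@dataclass
class PointOfInterest:
    position: Tuple[float, float, float]   # 3D target for the camera or sensor
    actions: List[Action] = field(default_factory=list)

@dataclass
class Waypoint:
    position: Tuple[float, float]          # 2D location the robot drives to
    pois: List[PointOfInterest] = field(default_factory=list)

@dataclass
class Mission:
    name: str
    waypoints: List[Waypoint] = field(default_factory=list)

valve = PointOfInterest((12.0, 4.5, 1.2), [Action("snapshot", "V-101")])
mission = Mission("Daily round", [Waypoint((12.0, 3.0), [valve])])
print(len(mission.waypoints[0].pois[0].actions))  # 1
```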
When in autonomous mode, by default the robot will stop if the connection to the driver’s control
station is broken for more than 5 seconds. This means that an active control station is required for
the robot to be operational in autonomous mode. The robot will also stop if it loses sight of the
orange line. In this situation an audio/visual alarm will be triggered on the robot control screen.
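The 5-second rule above amounts to a connection watchdog. The toy sketch below illustrates the idea only; the heartbeat source, timing and stop decision are stand-ins and do not represent the real robot interface.

```python
# Toy watchdog: in autonomous mode, stop if the control-station link has been broken
# for more than 5 seconds.
import time

CONNECTION_TIMEOUT_S = 5.0

def should_stop(last_heartbeat: float, now: float) -> bool:
    """Return True if the robot should stop because the control station link is stale."""
    return (now - last_heartbeat) > CONNECTION_TIMEOUT_S

last_heartbeat = time.monotonic()
time.sleep(0.1)                                          # pretend a little time passes
print(should_stop(last_heartbeat, time.monotonic()))     # False: link still fresh
print(should_stop(last_heartbeat, last_heartbeat + 6.0)) # True: stop the robot
```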
If an autonomous mission is interrupted (by a person or the software), human intervention is required to restart the mission. For line-following navigation the robot will need to be re-positioned over the line; for other forms of autonomy it will need to be positioned close to the autonomous route. In both situations a human operator will then need to re-click the start mission icon.
Robots perform their missions using orange lines on the ground (line-following navigation), chili-tags (tag-based inspections) and/or a virtual model created by the robot's LiDAR module ("teach & repeat" or "click and inspect" navigation). The deployment guide describes how to establish the required infrastructure. This manual describes how to use that infrastructure via the mission editor screen.

7.2. Line Following Navigation
For undocking, define the exit direction the robot should take when leaving the docking station. When the robot "undocks" it drives in reverse to the point midway between the "Dock 1" and "12:00" tags. The robot then rotates and exits the docking station to start following the line in one of three possible directions: 03:00 (robot rotates 90 degrees clockwise), 06:00 (robot rotates 180 degrees), or 09:00 (robot rotates 90 degrees counter-clockwise). This mapping is sketched after the steps below.
▪ From the Mission Editor screen select the action "Undocking".
▪ Select the Exit Direction (e.g. 09:00).
▪ Write a name in Select Action Name, e.g. "Undock".
▪ Click "Save".
For waypoints that are used as junctions, where the robot selects between alternative routes:
▪ From the Mission Editor screen select the action "Junction".
▪ Select the Exit Direction (e.g. 06:00).
▪ Select the waypoint number for the junction (the number on the chili-tag).
▪ Give the action a name, e.g. "Junction 1".
▪ Click "Save".
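The exit-direction convention used for undocking and junction actions can be summarised as a simple mapping, sketched below. Only the three directions named in this guide are listed; the table and helper are illustrative, not part of the mission editor.

```python
# Clock-position exit directions expressed as signed rotations (positive = clockwise),
# as described in the text above.
EXIT_ROTATION_DEG = {
    "03:00": 90,    # rotate 90 degrees clockwise
    "06:00": 180,   # rotate 180 degrees
    "09:00": -90,   # rotate 90 degrees counter-clockwise
}

def rotation_for_exit(direction: str) -> int:
    """Return the signed rotation for a supported exit direction."""
    return EXIT_ROTATION_DEG[direction]

print(rotation_for_exit("09:00"))  # -90
```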
7.3. Tag Based Inspections
1. Launch an autonomous mission from the Robot Control screen and stop the mission a few meters before the waypoint at which actions are to be performed (with the orange line visible in the floor camera).
2. Open the Mission Editor screen.
3. Click on "Create Mission" and give it a name.
4. Click on "Add new action" and press the play button. The robot will drive to the tag and stop between it and the 12:00 tags.
5. Rotate the robot (and elevate the camera) using the left/right icons in the mission editor until the sensor is pointing at the POI.
6. From the drop-down menus select the desired action, sensor and waypoint number, and give the action a recognisable name (e.g. an equipment TAG number).
7. If "Full-size" is not selected, a region of interest (ROI) needs to be selected and a photo must be taken. To do this, hover the mouse over the image and use the buttons that appear in the top-left corner.
8. Click on "Add Action".
9. Repeat from step 5 for all actions that are required at that waypoint.
10. Drive manually to the line and repeat from step 4 for other waypoints.
11. Finally, click on the yellow "Disk" icon at the bottom right to save the mission.
All inspection actions (e.g. Photo) can also be defined and executed at junction waypoints. For this
you just need to add the actions that you require to the waypoint number as described previously
and then add a “Junction” action to the same waypoint number.
Actions are listed by waypoint in the right-hand column of the mission editor. Planners can create
missions by ticking the actions to be performed and saving the mission using a recognisable name.
When a mission is to be performed, select and launch the mission from the Robot Control screen
(see Section 6.2). Actions will be recorded in the mission report.
7.4. Teach and Repeat Navigation
The robot can drive without the need for physical guidance such as a line. Using its 3D sensors, the robot extracts geometric information about the surrounding environment. The robot can be used to record this environment so that it can learn how to reach points of interest.
To teach a mission, the following steps need to be taken:
1. Open the Mission Editor, which you can find by clicking on the burger menu in the top-left corner.
2. Press the green button on your gamepad to enable the motors.