MiR250 Technical Guide

Technical Guide (en)
Date: 07/2020
Revision: v.1.0

MiR250 Technical Guide (en) 07/2020 - v.1.0 ©Copyright 2020: Mobile Industrial Robots A/S.
Copyright and disclaimer
All rights reserved. No parts of this manual may be reproduced in any form without the
express written permission of Mobile Industrial Robots A/S (MiR). MiR makes no warranties,
expressed or implied, in respect of this document or its contents. In addition, the contents of
the document are subject to change without prior notice. Every precaution has been taken in
the preparation of this manual. Nevertheless, MiR assumes no responsibility for errors or
omissions or any damages resulting from the use of the information contained.
Copyright © 2020 by Mobile Industrial Robots A/S.
Contact the manufacturer:
Mobile Industrial Robots A/S
Emil Neckelmanns Vej 15F
DK-5220 Odense SØ
www.mobile-industrial-robots.com
Phone: +45 20 377 577
Email: support@mir-robots.com
CVR: 35251235

Table of contents
1. About this document
2. Robot sub-systems
2.1 Navigation and control system
2.2 Safety system
2.3 Motor and brake control system
3. Robot components
3.1 Safety laser scanners
3.2 3D cameras
3.3 Proximity sensor modules and indicator lights
3.4 Drive train
3.5 Power board
3.6 Safety contactors
3.7 Robot computer
3.8 Safety PLC
3.9 Router and access point

1. About this document
This document describes the components, sub-systems, and connections in the MiR250
robot, providing an overview of how the robot works.
This guide provides additional information about how MiR250 robots and their key
components work, focusing on information that may be used for troubleshooting the robot.

2. Robot sub-systems
The following sections describe these robot sub-systems:
•The navigation and control system determines the path the robot should follow to reach
its goal destination.
•The safety system monitors the robot's components and surroundings through several
functions and brings the robot to a stop if an unsafe situation occurs.
•The motor and brake control system is part of the two previous systems and is used either
to move the robot along its path or to bring the robot to a stop.
2.1 Navigation and control system
The navigation and control system is responsible for driving the robot to a goal position
while avoiding obstacles.
System overview
The purpose of the navigation and control system is to guide the robot from one position on
a map to another position. The user provides the map and chooses the goal position the
robot must move to. The diagram in Figure 2.1 describes the processes in the navigation and
control system. The main processes involved in the navigation system are:
•Global planner
The navigation process starts with the global planner determining the best path for the
robot to get from its current position to the goal position. It plans the route to avoid walls
and structures on the map.
•Local planner
While the robot is following the path made by the global planner, the local planner
continuously guides the robot around detected obstacles that are not included on the
map.
•Obstacle detection
The safety laser scanners, 3D cameras, and proximity sensors are used to detect obstacles
in the work environment. These are used to prevent the robot from colliding with
obstacles.
•Localization
This process determines the robot's current position on the map based on input from the
motor encoders, inertial measurement unit(IMU), and safety laser scanners.

•Motor controller, motors, and brakes
The motor controller determines how much power each motor must receive to drive the
robot along the intended path safely. Once the robot reaches the goal position, the brakes
are engaged to stop the robot.
Each part of the process is described in greater detail in the following sections.
Figure 2.1. Flow chart of the navigation and robot system. The user provides the necessary input for the robot
to generate a path to the goal position. The robot executes the steps in the navigation loop until it reaches the
goal position and stops by engaging the brakes.

User input
To enable the robot to navigate autonomously, you must provide the following:
•A map of the area, either from a .png file or created with the robot using the mapping
function.
•A goal destination on that map.
•The current position of the robot on the map. This usually only needs to be provided when
a new map is activated.
Figure 2.2. On the map, the current position of the robot is identified by the robot icon, and
the goal destination by a robot position marker. The robot computer then determines a path
from the current position to the goal position.
Once the robot computer has a map with the robot's current position and a goal destination,
it begins planning a route between the two positions on the map using the global planner.
Global planner
The global planner is an algorithm in the robot computer that generates a path to the goal
position. This path is known as the global path.

Figure 2.3. The global path is shown with the blue dotted line that leads from the start to the goal position.
The global path is created only at the start of a move action or if the robot has failed to
reach the goal position and needs to create a new path. The generated path only avoids the
obstacles the robot detected when the path was made and the obstacles marked on the map.
The global path can be seen in the robot interface as a dotted line from the robot's start
position to the goal position.
Figure 2.4. The dotted line from the start position of the robot to the goal position is the global path generated
by the robot computer.
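The guide does not specify which algorithm the global planner uses, but grid-based graph search is a common approach to this kind of planning. The sketch below uses A* on a small occupancy grid purely as an illustration of how a path can be planned around mapped walls; the grid format, uniform step cost, and function names are assumptions, not MiR internals.

```python
import heapq

def plan_global_path(grid, start, goal):
    """Minimal A* search on an occupancy grid.

    grid: list of strings where '#' marks a mapped wall and '.' is free space.
    start, goal: (row, col) tuples. Returns a list of cells or None if no
    path avoids the mapped obstacles.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan-distance heuristic: never overestimates on a 4-connected grid.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]  # (priority, cost, cell, path)
    best_cost = {start: 0}
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cell[0] + dr, cell[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] != '#'):
                new_cost = cost + 1
                if new_cost < best_cost.get(nxt, float('inf')):
                    best_cost[nxt] = new_cost
                    heapq.heappush(
                        frontier, (new_cost + h(nxt), new_cost, nxt, path + [nxt]))
    return None
```

As with the real global planner, the result is a single path computed once from the map alone; obstacles that appear later are left to the local planner.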

Local planner
The local planner is used continuously while the robot is driving to guide it around obstacles
while still following the global path.
Figure 2.5. The global path is indicated with the dotted blue line. The local path is indicated with the blue arrow,
showing the robot driving around a dynamic obstacle.
Whereas the global planner creates a single path from start to finish, the local planner
continues to create new paths that adapt to the current position of the robot and the
obstacles around it. The local planner only processes the area that is immediately
surrounding the robot, using input from the robot sensors to avoid obstacles.
The local path is not displayed in the robot interface. The arrows in the
images here are a visual aid used in this guide only.

Figure 2.6. The local planner usually follows the global planner, but as soon as an obstacle gets in the way, the
local planner determines which immediate path will get the robot around the obstacle. In this case, it will likely
choose the path indicated with a green arrow.
Once the local path is determined, the robot computer derives the desired rotational
velocity of each drive wheel to make the robot follow the local path, and sends the desired
velocities for each motor to the motor controllers—see Motor controller and motors.
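The conversion from a desired body velocity to per-wheel speeds follows standard differential-drive kinematics, sketched below. The relationship itself is textbook geometry; the wheel radius and track width used in the example are placeholders, not MiR250 specifications.

```python
def wheel_velocities(v, omega, wheel_radius, track_width):
    """Convert a body velocity command to wheel angular velocities.

    v: forward speed (m/s); omega: yaw rate (rad/s).
    Returns (left, right) wheel speeds in rad/s for a differential drive:
    the outer wheel must travel farther than the inner wheel during a turn.
    """
    v_left = v - omega * track_width / 2.0   # linear speed of the left wheel
    v_right = v + omega * track_width / 2.0  # linear speed of the right wheel
    return v_left / wheel_radius, v_right / wheel_radius
```

For example, a pure pivot (v = 0) commands the two wheels at equal speeds in opposite directions.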
Obstacle detection
The robot detects obstacles continuously while driving. This enables the robot to use the
local planner to drive around obstacles and to determine the robot's current position on the
map.
Three sensor types are responsible for detecting obstacles:
•The safety laser scanners
•The 3D cameras
•The proximity sensors
The following illustrations show how the robot sees the surrounding environment and how it
is portrayed in the robot interface.

•What a human sees: A chair placed in the corner of a room is detectable by the robot.
•What the laser scanners see: In the robot interface, the red lines on a map are obstacles
detected by the laser scanners, and the purple clouds are an aggregate of the 3D camera
and laser scanner data. The scanners only detect the four legs of the chair.
•What the 3D cameras see: The 3D cameras detect more details of the chair when the robot
gets close enough to it. This view cannot be seen in the robot interface.
Safety laser scanners
The safety laser scanners on MiR250 are of the type AOPDDR (active opto-electronic
protective device responsive to diffuse reflection). AOPDDR is a protective device that uses
opto-electronic transmission and reception elements to detect the reflection of the optical
radiation generated by the protective device. The reflection is generated by an object in a
defined two-dimensional area. This is a type of ESPE (electro-sensitive protective
equipment). In this guide, the term safety laser scanner is used.
Two safety laser scanners, placed diagonally on the front and rear corners of the robot, scan
their surroundings. Each safety laser scanner has a 270° field of view, and the two
overlapping fields provide full 360° visual protection around the robot.

When in motion, the safety laser scanners continuously scan the surroundings to detect
objects.
Figure 2.7. The two safety laser scanners together provide a full 360° view around the robot.
The laser scanners have the following limitations:
•They can only detect objects that intersect a plane 200 mm above the floor.
•They do not detect transparent obstacles well.
•The scanner data can be inaccurate when detecting reflective obstacles.
•The laser scanners may detect phantom obstacles if they are exposed to strong direct
light.
If you are using the robot in an area with walls made of glass or reflective
material, mark the walls as Forbidden zones on the map and not as walls.
Walls in the map that the robot cannot detect will confuse the robot's
navigation system.

3D cameras
Two 3D cameras positioned on the front of the robot detect objects in front of the robot. The
3D cameras detect objects:
•Vertically up to 1800 mm at a distance of 1200 mm in front of the robot.
•Horizontally at an angle of 114°, with the first view of the ground 250 mm in front of the
robot.
The 3D cameras are only used for navigation. They are not part of the robot's safety system.
The camera readouts are used as 3D point cloud data. They do not record
recognizable images of objects or people.
Figure 2.8 shows the field of view of the cameras.

Figure 2.8. The two 3D cameras can see objects up to 1800 mm above floor height at a distance of 1200 mm in
front of the robot and have a horizontal field of view of 114°.
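As a rough illustration of the 114° horizontal field of view, the visible width at a given distance can be estimated with basic trigonometry. This treats the camera as an ideal pinhole; real camera geometry and lens distortion are not accounted for.

```python
import math

def visible_width(distance, fov_deg=114.0):
    """Approximate width of the horizontal field of view at a given distance.

    Half the field of view spans distance * tan(fov/2) to each side, so the
    full visible width is twice that. Units follow the input distance.
    """
    return 2.0 * distance * math.tan(math.radians(fov_deg / 2.0))
```

At the 1200 mm detection distance mentioned above, this estimate gives a visible width of roughly 3.7 m.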
The 3D cameras have the following limitations:
•They can only detect objects in front of the robot, unlike the full 360° view of the laser
scanners.
•They do not detect transparent or reflective obstacles well.
•They do not detect holes or descending stairways.
•The cameras are not reliable at determining depth when viewing structures with repetitive
patterns.
•The cameras may detect phantom obstacles if they are exposed to strong direct light.
Proximity sensors
Proximity sensors placed in all four corners of the robot detect objects close to the floor that
cannot be detected by the safety laser scanners.
Using infrared light, the proximity sensors point downward and ensure that the robot
does not run into low objects, such as pallets and forklift forks. They have a range of
5–20 cm around the robot.
Because of the proximity sensors' limited range, their data is only useful when the
robot is standing still or moving at reduced speed, for example, when the robot is pivoting
or docking.
Figure 2.9. The proximity sensors in the corners of the robot detect objects below the safety laser scanners'
plane of view.
The proximity sensors have the following limitations:
•They do not have a long range and are mainly used to detect obstacles missed by the
laser scanners and cameras.
•When the robot is driving fast, obstacles detected by the proximity sensors are already too
close for the robot to stop for or avoid.

Localization
The goal of the localization process is for the robot to determine where it is currently
located on its map. The robot has three inputs for determining where it is:
•The initial position of the robot. This is used as a reference point for the methods used to
determine the robot position.
•The IMU and encoder data. This is used to determine how far and how fast the robot has
traveled from the initial position.
•The laser scanner data. This is used to determine the likely positions of the robot by
comparing the data with nearby walls on the map.
This data is used by a particle filter to determine the most likely position of the robot on the
map.
IMU and motor encoders
Data from both the IMU (inertial measurement unit) and the motor encoders is used to
derive where and how fast the robot has traveled over time from its initial position.
Combining both sets of data makes the derived position more accurate.
The IMU measures the acceleration and pivot speed of the robot. From this, the robot can
derive the distance the robot has driven and how much it has turned.
The motor encoders measure how many times the motor of each drive wheel has rotated.
With each rotation of a drive wheel, the robot travels a distance equal to the circumference
of that wheel. Measuring from both encoders also enables the robot to determine when it is
turning.
If the drive wheels are worn down significantly or the robot is running with an
incorrect gear ratio, the robot will miscalculate how far it has traveled based
on the encoder data.
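The note above matters because encoder-based odometry integrates wheel travel step by step, so any error in the assumed wheel circumference or gear ratio accumulates over distance. The sketch below shows the standard differential-drive odometry update, not MiR's actual implementation; the pose representation and track width are assumptions for illustration.

```python
import math

def odometry_step(x, y, theta, d_left, d_right, track_width):
    """Update a 2D pose from the distance each drive wheel has traveled.

    d_left, d_right: wheel travel since the last update, i.e. measured
    rotations multiplied by the assumed wheel circumference.
    Returns the new (x, y, theta) pose estimate.
    """
    d_center = (d_left + d_right) / 2.0         # distance traveled by the robot center
    d_theta = (d_right - d_left) / track_width  # change in heading from the wheel difference
    # Advance along the average heading over the step.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```

If the wheel circumference fed into d_left and d_right is wrong (worn wheels, wrong gear ratio), every call overestimates or underestimates the travel, and the pose estimate drifts further from reality with each update.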
Laser scanners and particle filtering
The robot computer compares the input from the laser scanners with the walls on the map
to try to find the best match. This is done using a particle filter algorithm. The robot
computer only compares with the area where it expects the robot to be based on the
encoder and IMU data. This means it is important that the initial position of the robot is
correct.
The robot computer uses the comparison and the odometry data from the encoders and
IMU to produce a number of points where the robot is most likely to be. As the robot moves
and the sensors collect another set of data, the robot computes another set of likely positions
based on the new data and the previous estimates. This process is known as particle filtering.
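The predict-weight-resample cycle described above can be sketched in a few lines. The example below is deliberately simplified to one dimension with a single distance-to-landmark measurement; the real system works on 2D poses with full laser scans against the map, and every parameter value here is an illustrative assumption.

```python
import math
import random

def particle_filter_step(particles, motion, measurement, measure_fn,
                         motion_noise=0.05, meas_sigma=0.5):
    """One predict-weight-resample cycle of a simplified 1D particle filter.

    particles: list of pose hypotheses (positions along one axis).
    motion: odometry estimate of how far the robot moved since the last step.
    measurement: the value the sensor actually reported.
    measure_fn: maps a hypothetical pose to the value the sensor would read there.
    """
    # Predict: move every particle by the odometry estimate, plus noise.
    moved = [p + motion + random.gauss(0.0, motion_noise) for p in particles]
    # Weight: particles whose predicted reading matches the measurement score higher.
    weights = [math.exp(-((measure_fn(p) - measurement) ** 2)
                        / (2 * meas_sigma ** 2)) for p in moved]
    # Resample: draw a new set in proportion to the weights, so likely
    # hypotheses multiply and unlikely ones die out.
    return random.choices(moved, weights=weights, k=len(particles))
```

Particles that explain the measurement well are duplicated during resampling, so the cloud of hypotheses contracts around the true position as more data arrives, matching the convergence behavior described above.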
Figure 2.10. Left: a failed localization, where the robot cannot determine a position in which the red lines
(laser scanner data) align with the black lines on the map. Right: a successful localization, where the robot
determines a cluster of likely positions, indicated as blue dots.
To make sure the robot can localize itself well using particle filtering, consider the following
when creating a map:
•There must be unique and distinguishable static landmarks on the map that are easily
recognizable. A landmark is a permanent structure that the robot can use to orient itself,
such as corners, doorways, columns, and shelves.

No distinguishable landmarks (left) versus many distinguishing landmarks (right).
•The robot must be able to detect the static landmarks that are marked on the map to be
able to approximate its current position. Make sure there are not so many dynamic
obstacles around the robot that it cannot detect any static landmarks.
Cannot detect any landmarks (left) versus can detect enough landmarks (right).

•To improve the robot's localization, it can often help to divide long continuous walls on
the map. Even if the walls are connected in the actual work environment, it can help the
localization process if the walls on the map are divided into smaller sections.
Undivided walls (left) versus divided walls (right).
•The robot does not compare the laser scanner data with the entire map, only with the
area where it expects to be based on the IMU and encoder data and its initial position.
This is why the initial position you place the robot at on the map must be accurate.
•The robot can drive for a short distance without being correctly localized. As it drives, the
estimated positions should converge to a small area, indicating the robot has determined
an accurate estimate. If this does not occur within a set time limit, the robot reports a
localization error.
Motor controller and motors
The robot computer compares the desired velocity of the robot with its current velocity.
From this, it determines how far the rotational velocity of each motor is from the velocity
needed to make the robot follow the intended path, and it sends the necessary changes in
velocity for each motor to the motor controllers.

The motor controller translates the difference into the amount of power that must be sent to
each motor to achieve the desired velocity. It then verifies that the power sent to the motors
results in the correct velocity by translating the motor encoder data into the robot's velocity
and comparing this with the desired velocity—see Motor and brake control system.
The robot computer keeps checking that the position derived from the localization process is
following the intended path. If the robot begins to drive away from the path, the computer
corrects the desired velocity that it sends to the motor controllers to ensure that the robot
drives with the correct trajectory.
In this way, the robot uses its sensors to determine how far it is from achieving the desired
trajectory, enabling it to correct itself as it drives.
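This correct-as-you-drive behavior is a feedback loop: compare the desired value with the measured value and adjust the command in proportion to the error. The sketch below demonstrates the principle with a proportional controller and a toy first-order motor model; neither the gain nor the motor model reflects the actual MiR250 controller.

```python
def run_velocity_loop(desired, steps=60, gain=0.5):
    """Simulate a proportional feedback loop driving a motor to a velocity.

    Each step, the power command is corrected in proportion to the velocity
    error, and a toy first-order motor model responds to the power.
    Returns the final velocity, which converges toward `desired`.
    """
    power, velocity = 0.0, 0.0
    for _ in range(steps):
        power += gain * (desired - velocity)     # correct power by the velocity error
        velocity = 0.8 * velocity + 0.2 * power  # motor velocity lags behind the power
    return velocity
```

Because the correction is proportional to the error, a large deviation from the intended trajectory produces a large adjustment, and the error shrinks toward zero as the loop repeats.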
Brakes
Once the position of the robot determined through localization matches the goal position of
the path from the global planner, the robot stops using the dynamic brake function.
Figure 2.11. The robot has reached the goal position and stops by engaging the brakes.
The dynamic brake function stops the robot by short-circuiting the power that was used to
rotate the motors. The power that was used to drive the robot forward is then reversed to
stop the rotation of the drive wheels.
Once the robot has stopped, the mechanical brakes are engaged. These brakes keep the
robot in place once it has stopped; you can compare them to the parking brake or hand
brake in a car.