VRm AreaScan3D User manual

AreaScan3D
Manual

This manual (document version 1.3, date of issue 12/2013) is applicable
to the VRmagic AreaScan3D sensor. Subject to technical changes.
The document is protected by copyright. All rights reserved. No part of
this documentation may be reproduced or transmitted for any purpose
in any form or by any means, electronic or mechanical, without
express written permission.


Table of Contents
1 Product Specifications..................................................................5
1.1 Introduction .............................................................................5
1.2 Scope of Delivery....................................................................6
1.3 Intended Use...........................................................................6
1.4 Type Label...............................................................................6
1.5 Measuring Principle.................................................................7
2 Sensor Design............................................................................. 11
3 Mounting.....................................................................................12
4 Electrical Installation ..................................................................14
4.1 Connections and Indicators...................................................14
4.2 Connecting the Sensor..........................................................15
5 Operation ....................................................................................17
5.1 Measuring Field Requirements .............................................17
5.2 Positioning the Sensor ..........................................................18
5.3 Positioning the Measured Object..........................................19
6 Software / API..............................................................................21
7 Technical Specifications .............................................................22
7.1 Type-specific data .................................................................22
7.2 General data ..........................................................................23
7.3 Conformity to Standards .......................................................23
8 Troubleshooting..........................................................................24
8.1 Hardware...............................................................................24
8.2 Errors in 3D Image ................................................................25

1 Product Specifications
1.1 Introduction
With the AreaScan3D, you hold in your hands a groundbreaking
product in the field of optical 3D measurement, offering the following
highlights:
• Optical 3D sensor based on digital fringe projection,
• Metric calibrated measuring data,
• Export formats: 3D point cloud or height-encoded gray level image,
• GenICam transport layer compatible,
• Interfaces to Common Vision Blox & HALCON,
• Aluminum case conforming to IP65,
• M12 standard industrial connectors,
• 24V operation,
• Ethernet.

1.2 Scope of Delivery
• Sensor, factory calibrated,
• Quickstart manual,
• User guide (this document),
• USB stick with software and documentation.
After delivery, check that the contents of the package are complete.
1.3 Intended Use
The VRmagic AreaScan3D is an area sensor. The device is intended
to provide three-dimensional measurements of surfaces of a certain
area and depth from a predefined distance. The field of application is
industrial image processing. The measurements are transmitted via an
Industrial Ethernet interface as ready-to-use 3D data sets.
1.4 Type Label
Fig. 1: Type Label

1.5 Measuring Principle
The 3D measurement method implemented in the sensor is based on
triangulation.
Fringes (stripes) are projected onto an object. A camera views the
fringes from a different angle. The perspective deformation of the fringes,
as seen by the camera, encodes the 3D coordinates of the object points
along the fringes. With this basic principle alone, the distance resolution
would depend on the camera resolution and the triangulation angle.
Fig. 2: Triangulation
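The geometric relation behind this is simple: a change in object height shifts the observed fringe laterally by an amount proportional to the tangent of the triangulation angle. The following minimal Python sketch illustrates the conversion; the numbers are assumed example values, not the specifications of a particular sensor type.

```python
import math

def height_from_fringe_shift(shift_mm, triangulation_angle_deg):
    """Convert a lateral fringe shift (mm, measured in the object plane) into a
    height change, assuming an idealized geometry in which the camera looks
    perpendicularly onto the surface and the projector is tilted by the given angle."""
    return shift_mm / math.tan(math.radians(triangulation_angle_deg))

# Assumed example: a 0.5 mm fringe shift observed at a 30 degree triangulation angle
print(height_from_fringe_shift(0.5, 30.0))  # ~0.87 mm height change
```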

Fig. 3: Phase measurement
Phase measuring fringe projection, however, uses stripes that are not
crisp, but have a sine-shaped brightness modulation. These stripes
can be quite wide, as their position can be determined with high accuracy
from the gray levels of the stripe flanks. The distance resolution
here is no longer dependent on the camera resolution and can be more
than 10 times better than with mere triangulation.

In order to obtain phase values for any point on an object, at least 3
fringe patterns with a phase shift of 1/3 fringe width are required (see
following figure). In practice, at least 4 fringe patterns are used.
Fig. 4: Three patterns combined
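As an illustration only (not the sensor's internal implementation), the wrapped phase of an N-step phase-shifted fringe sequence can be computed per pixel with the standard phase-shifting formula, sketched here in Python with NumPy:

```python
import numpy as np

def wrapped_phase(images):
    """Per-pixel wrapped phase from N equally phase-shifted fringe images.

    images: sequence of N camera frames (2D arrays), each shifted by 2*pi/N
    relative to the previous one.
    Returns the phase in radians in the range (-pi, pi]; it still has to be
    made absolute with the help of the binary (stripe number) patterns
    described in the following paragraphs.
    """
    n = len(images)
    shifts = 2.0 * np.pi * np.arange(n) / n
    stack = np.stack([np.asarray(img, dtype=np.float64) for img in images])
    num = np.tensordot(np.sin(shifts), stack, axes=1)  # weighted sums over the N frames
    den = np.tensordot(np.cos(shifts), stack, axes=1)
    return np.arctan2(-num, den)

# Example with 4 synthetic patterns of a flat scene with phase 0.2 rad
frames = [128 + 100 * np.cos(0.2 + 2 * np.pi * k / 4) * np.ones((4, 4)) for k in range(4)]
print(wrapped_phase(frames)[0, 0])  # ~0.2
```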
Repeating patterns deliver no absolute distance indication, however.
On highly fragmented surfaces it would not be possible to clearly
identify the individual fringes.
In order to assign each fringe seen to a proper distance range, a
sequence of binary patterns is projected. Black and white patterns of
stripes with varying width are used for this (see following figure). The
sequence of brightness values results in a binary number which directly
represents the stripe number. This number also indicates which
particular stripe in the phase pattern is seen by a certain pixel of the
camera. Hence, by a combination of binary and phase patterns, even
highly fragmented objects are precisely measured in all detail.
Fig. 5: Binary coded stripes
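A simplified illustration of this decoding step is sketched below; the sensor's actual patterns and thresholds may differ (for example, Gray-coded rather than plain binary sequences may be used).

```python
import numpy as np

def stripe_index(binary_images, threshold):
    """Decode the per-pixel stripe number from a sequence of binary patterns.

    binary_images: list of camera frames, most significant bit first, in which
    each pixel is either brightly lit or dark.
    threshold: gray value separating "lit" from "dark" pixels.
    Returns an integer image whose value is the stripe number seen by each pixel.
    """
    index = np.zeros(binary_images[0].shape, dtype=np.int32)
    for frame in binary_images:
        bit = (np.asarray(frame) > threshold).astype(np.int32)
        index = (index << 1) | bit
    return index
```

Combining this stripe number with the wrapped phase from the sketch above yields an absolute fringe position, and hence an unambiguous distance range, for every camera pixel.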

Fig. 6: DLP projection chip, 2 tilting mirrors (detail drawing), electron microscope image
(Source: TI)
Because of the phase measurement, the projector used has to provide
precise gray levels. Digital micro mirrors are best suited for this.
Hundreds of thousands of microscopic tilting mirrors are arranged on
a chip, the Digital Micromirror Device (DMD). These micro mirrors can
switch positions within microseconds by electrostatic force. Gray
levels are generated by varying the on/off times. This results in digital
precision. The DMD is the main component of the projector.
DLP projectors (Digital Light Processing) offer high efficiency, low
temperature drift, and extreme durability (the mechanical components,
etched from pure monocrystalline silicon, show no material
degradation).
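As a rough illustration of this pulse-width principle (the exposure time used here is an assumed example, not the actual DMD timing), the on-time of a mirror for a desired gray level can be computed like this:

```python
def mirror_on_time_us(gray_level, pattern_exposure_us=1000.0):
    """On-time of a single micro mirror during one projected pattern.

    gray_level: desired 8-bit brightness (0..255).
    pattern_exposure_us: assumed exposure time of one pattern in microseconds.
    The mirror toggles so that its total "on" duration matches this fraction.
    """
    return pattern_exposure_us * gray_level / 255.0

print(mirror_on_time_us(128))  # ~502 us out of an assumed 1000 us exposure
```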

2 Sensor Design
Projection unit, camera, and signal processing components are enclosed
in a protective housing conforming to IP65.
Projector and camera each work through a protective glass window.
The viewing angle of the camera is perpendicular to the case. The
fringe projector is mounted at a slant and throws its light into the
camera's field of view. Therefore, an exact working distance is required.
Fig. 7: Sensor design (projector, camera)

3 Mounting
For mounting, the aluminum case of the sensor has six M6 threaded
holes on its bottom side. Depending on the mounting surface, you will
need suitable fastening material and tools.
The sensor should be mounted directly to a solid metal part. This
ensures optimum heat dissipation through the sensor case. Ensure
that the ambient conditions are met (see "Technical Specifications" on
page 22).
Mounting the AreaScan3D
1. Decide in which position the AreaScan3D is to be mounted (orientation,
distance to the measured surface, see "Technical Specifications"
on page 22).
2. Vibrations can disturb the measuring process substantially. If
necessary, use vibration-absorbing components for mounting the
sensor.
3. Mark the 6 mounting holes on the mounting surface. The spacing
of the hole centers has to match the spacing of the mounting hole
centers on the AreaScan3D case (see following figure).
4. Drill 6 holes with sufficient diameter into the mounting surface.
5. Mount the AreaScan3D.

Fig. 8: Dimensions and mounting holes

4 Electrical Installation
4.1 Connections and Indicators
The sensor has 3 standard screw connectors according to
IEC 61076-2-101 for M12 (see following figure).
Power: M12 A-coded 4-pin connector
Pin 1+2: 24 V (18 ... 30 V DC, ripple < 2 V)
Pin 3+4: GND
Input/Output: M12 A-coded 8-pin connector
Pin 1: Output 0 / +24 V, 100 mA
Pin 2+4+5+6: GND
Pin 3: Input 0 / +24 V
Pin 7: reserved
Pin 8: reserved
100 MBit Ethernet: M12 D-coded 4-pin connector
Pinout according to Industrial Ethernet standard
Status LED (green)
LED off: no power supply (also possible directly after connecting)
LED flashes: sensor is booting (duration: approx. 1 minute)
LED on: sensor ready

Fig. 9: Connections and indicators (Ethernet, Power, Status LED, Input/Output)
4.2 Connecting the Sensor
Fig. 10: Communications – block scheme
The sensor is controlled via the network (LAN) connection only. Any
number of sensors can be connected to a network, also via switches
or routers, just like PCs and other network-enabled devices.

Warning!
Possible damage to the sensor.
If the sensor is connected to a network by ODSCAD
or another application and hence is active, it must never
simply be switched off or disconnected from
the power source. Under unfavorable circumstances
this could damage the sensor.
Always deactivate the sensor beforehand by closing the
software application. Only then disconnect the sensor
from power.
Connecting the AreaScan3D
1. Connect the Ethernet port of the AreaScan3D to an existing
network or PC using an adequate Industrial Ethernet cable (see
"Connections and Indicators" on page 14).
2. The I/O lines (one input, one output) can be configured for various
functions via the software (e.g., the input as a trigger for starting a
measurement).
3. Connect the AreaScan3D to a suitable power supply.
→ After a short time, the status LED starts flashing. The sensor is
not ready for operation while the LED flashes.
→ After approx. 1 minute the LED stops flashing and stays on
continuously. The sensor is ready.
Further procedures such as
• Finding the sensor address,
• Receiving a camera image,
• Software setup,
• Measurement,
are described in the software documentation on the provided USB
stick.
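Since the sensor is GenICam transport layer compatible, device discovery can, for example, be scripted from Python with the open-source Harvester library. The following sketch is only an illustration under assumptions: the path to the GenTL producer (.cti file) is hypothetical, and the exact Harvester API may differ between library versions; refer to the software documentation on the USB stick for the supported workflow.

```python
from harvesters.core import Harvester  # third-party GenICam/GenTL consumer library

h = Harvester()
# Hypothetical path to a GenTL producer (.cti) shipped with your vision software
h.add_file('/opt/genicam_producers/producer.cti')
h.update()

# Each entry describes one reachable GenICam device, e.g. an AreaScan3D sensor
for info in h.device_info_list:
    print(info)

h.reset()
```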

5 Operation
5.1 Measuring Field Requirements
In contrast to 2D cameras, which can be focused to arbitrary distances,
the 3D sensor requires the measured object to be in the field of view
of both the camera and the projector. This defines a certain distance
range and a certain common field of view, or measuring area.
The measuring volume of the 3D sensor is roughly determined by its
lateral length and width, and by a depth range that lies around the
optimal measuring distance (see following figure). These parameters have
to be taken into account when positioning the sensor (see "Technical
Specifications" on page 22).
Fig. 11: Three-dimensional measuring volume
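As a simple illustration of this constraint, a point can be checked against an idealized box-shaped measuring volume as sketched below. The dimensions used are placeholders, not the values of a specific sensor type; the type-specific data in "Technical Specifications" applies.

```python
def in_measuring_volume(x_mm, y_mm, z_mm,
                        width_mm=100.0, length_mm=80.0,
                        optimal_distance_mm=300.0, depth_range_mm=50.0):
    """Check whether a point (x, y relative to the center of the measuring field,
    z as distance from the sensor) lies within an idealized measuring volume.

    All default dimensions are assumed example values, not sensor specifications.
    """
    return (abs(x_mm) <= width_mm / 2
            and abs(y_mm) <= length_mm / 2
            and abs(z_mm - optimal_distance_mm) <= depth_range_mm / 2)

print(in_measuring_volume(10, 5, 310))  # True for the assumed example volume
```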

5.2 Positioning the Sensor
Positioning the AreaScan3D
1. Align the measured object surface as perpendicular as possible to
the camera axis.
2. Project a crosshair onto the measured object. The crosshair can be
selected in the user software. This serves for adjusting both
distance and position.
→ The correct distance is reached when the projected crosshair
appears in the center of the camera live image (see user software,
live image display).
→ The available measuring area results from the camera live
image.
Note: The measuring area is smaller than the area on the object
illuminated by the projector. The projection always has a large
reserve area in order to guarantee complete coverage of the
camera field of view at any distance within the specified distance
range.
3. Because of the triangulation angle and the viewing angle of the
camera, shadowing (areas not seen and therefore not measurable)
can occur on steep edges. To minimize this effect, tilt the object
accordingly.

5.3 Positioning the Measured Object
After the sensor has been positioned, the measured object most probably
has to be aligned, too.
Positioning the measured object
1. Use the user software to project a static fringe pattern onto the
object.
2. Position the object so that the predominant or the most undulated
surface structures are aligned perpendicular to the stripes (see
following figure). Aligning those surface structures parallel to the
stripes would lead to larger triangulation angles, up to the occurrence
of shadowing, and can also result in artifacts (height peaks)
with strongly reflective surfaces.
→ A good alignment is given if both wide and narrow fringes deliver
a good contrast in the camera image over the entire surface
area and depth range.
3. Depending on surface reflectance and local angles relative to the
sensor, direct reflections of projector light into the camera can occur.
This may result in excessive brightness values and may cause
missing or wrong data at these points.
If this is the case, tilt the object slightly. Keep an eye on the measured
surface: it has to stay within the distance range of the central
measuring field.
For further instructions refer to the software documentation on the
provided USB stick.

Fig. 12: Example of good alignment