Preparing to Scan

Interface

Datasets

This section provides an overview of datasets. Familiarise yourself with how they work before beginning your scan.

  1. Once the scanner has been turned on, calibrated, and mounted, proceed with these steps.

  2. Create a new dataset.

Once a dataset has been closed, you will not be able to add to it again.

However, you can combine two datasets during post-processing, after extracting them from the scanner.
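
If you do end up with two datasets covering the same site, merging them after extraction is conceptually simple. Below is a minimal sketch of the idea with NumPy, assuming both clouds have already been exported as plain-text XYZ files and registered to a common coordinate system during processing (the file names are hypothetical):

```python
import numpy as np

# Hypothetical plain-text exports of two VLX datasets, one point per row (x y z).
# Both clouds must already share a coordinate system, e.g. after registration
# in your point-cloud processing software.
cloud_a = np.loadtxt("courtyard_part1.xyz")
cloud_b = np.loadtxt("courtyard_part2.xyz")

# Once registered, merging is simply stacking the points of both clouds.
merged = np.vstack([cloud_a, cloud_b])

# Optionally thin the combined cloud to keep the file size manageable.
rng = np.random.default_rng(0)
merged = merged[rng.random(len(merged)) < 0.5]

np.savetxt("courtyard_merged.xyz", merged, fmt="%.3f")
```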

Scanning Interface

Begin the scan only after you have understood the interface and spatial requirements of the VLX.

  • New dataset > name your dataset > start mapping

  • The mapping screen will now appear, showing a live LiDAR map of your environment.

Use the pause/start button when:

  • people or moving objects enter the environment, to minimise noise and alignment issues.

  • the device is being carried through an area that you do not want to scan or photograph, but you still want to continue capturing into the same dataset.

Note: In pause mode, only the 3D scan process is paused; localisation keeps running. To avoid impairing the quality of the recorded map, do not move any faster or less carefully than you would while scanning.

Caution: If the speed indicator turns red, it means that the user is moving so fast that the quality of the dataset might suffer. Move or rotate more slowly.

Photo Settings

  • Point cloud colour is achieved by projecting each photo onto the point data (a minimal sketch of this projection appears at the end of this section).

  • There are two methods for taking photos:

  1. Manual: Trigger the photo manually using the button below the screen / above the handle.

  2. Automatic: Once you travel the specified distance, the scanner will automatically take a photo (the distance is adjustable in settings). You can still take additional photos manually while automatic mode is switched on.

  • These options can be selected in settings, or, if you have already started scanning, by tapping the button displayed below.

  • You can review photos in the photo viewer. Here you can also delete photos containing unwanted elements to exclude them from the colour information.

It is recommended to take a photo roughly every 1 m for good point-cloud colouring. In more open spaces a photo every 3 m is sufficient.
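
For the curious, the colouring step described above is conceptually a camera projection: each 3D point is transformed into a photo's camera frame and, if it lands inside the image, picks up the colour of that pixel. Below is a minimal pinhole-camera sketch of the idea (the intrinsics K and pose R, t are placeholders, not actual VLX calibration values):

```python
import numpy as np

def colour_points(points_world, image, K, R, t):
    """Sample photo colours for 3D points via a pinhole projection.

    points_world: (N, 3) points in world coordinates.
    image: (H, W, 3) photo as a NumPy array.
    K: (3, 3) camera intrinsics; R, t: world-to-camera rotation and translation.
    """
    # Transform the points into the camera frame.
    p_cam = points_world @ R.T + t                      # (N, 3)
    in_front = p_cam[:, 2] > 1e-6                       # ignore points behind the camera

    # Project onto the image plane (perspective divide gives pixel coordinates).
    uvw = p_cam @ K.T
    uv = uvw[:, :2] / np.where(in_front, uvw[:, 2], 1.0)[:, None]

    h, w = image.shape[:2]
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    visible = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)

    # Points that fall inside the photo take that pixel's colour; the rest stay black.
    colours = np.zeros((len(points_world), 3), dtype=image.dtype)
    colours[visible] = image[v[visible], u[visible]]
    return colours, visible
```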


Plan Your Scan

SLAM

The VLX 3 uses LiDAR-based SLAM (Simultaneous Localisation and Mapping) technology. SLAM maps the environment and places the device spatially within it while moving. As you walk, the scanner continuously captures data and aligns it with what has already been captured, building up a digital model of the environment.

Drift

SLAM is therefore an incremental process, which means that small misalignments accumulate over large distances; this is called drift.

On the dashboard, drift may appear as curved forms where the geometry is actually straight, or as overlapping, misaligned point clouds.

  • If this occurs, return to a point of the scan that has been captured with higher accuracy, and make your way towards the problem area.

  • Alternatively, you can scan the problem area again in a different dataset.

  • Both options require post-processing.

  • Creating loop closures while scanning will reduce the likelihood of drift.
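
To get a feel for why drift matters, the toy sketch below chains small relative steps the way an incremental SLAM estimate does, but with a tiny constant heading error added at every step. The numbers are invented purely for illustration and are not VLX performance figures:

```python
import numpy as np

# Toy example: walk 30 m in a straight line in 0.5 m steps, but assume the
# estimated heading is off by a constant 0.05 degrees at every step.
step_len = 0.5                       # metres per step
n_steps = 60                         # 30 m in total
heading_error = np.radians(0.05)     # small per-step misalignment

heading = 0.0
position = np.zeros(2)
for _ in range(n_steps):
    heading += heading_error         # errors never cancel; they accumulate
    position += step_len * np.array([np.cos(heading), np.sin(heading)])

true_end = np.array([n_steps * step_len, 0.0])
print("estimated end point:", position.round(3))
print("drift after 30 m:", round(float(np.linalg.norm(position - true_end)), 3), "m")
```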

Loop Closures

Points where you overlap point-cloud data are called loop closures.

As you loop around the site during the scan, return to points you have already scanned to create reference points and better align your data.

Your original path is shown as a solid line on the user interface. Cross this line to create a loop closure.

Examples of how you might do this (in plan view) are below.

Aim to create a loop closure every 15–30 m of your scan. This can get complicated, so where possible, plan your route ahead of time.
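
A loop closure gives the software a hard constraint: the scanner is back at a place it has already mapped, so the accumulated error at that point is known and can be redistributed along the trajectory. Real SLAM back-ends solve a pose-graph optimisation; the sketch below simply spreads the end-point error linearly along the path to illustrate the principle:

```python
import numpy as np

def apply_loop_closure(estimated_path, closure_index, known_position):
    """Spread the error revealed by a loop closure back along the path.

    estimated_path: (N, 2) drifted trajectory estimate.
    closure_index: index of the pose where a previously scanned place is re-visited.
    known_position: (2,) the true position of that place (e.g. the start point).
    """
    # How far off the estimate is at the moment of the loop closure.
    error = known_position - estimated_path[closure_index]

    # Weights grow from 0 at the first pose to 1 at the closure pose, so early
    # (barely drifted) poses move little and late (most drifted) poses move most.
    weights = np.linspace(0.0, 1.0, closure_index + 1)[:, None]

    corrected = estimated_path.copy()
    corrected[: closure_index + 1] += weights * error
    return corrected
```

For instance, if a scan starts at the origin and ends back where it began, apply_loop_closure(path, len(path) - 1, np.zeros(2)) pulls the drifted end point back onto the origin and eases the rest of the path accordingly. The more closures a scan contains, the less each one has to correct.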


Time Limits

Each dataset is limited to 1 hour of capture to keep file sizes manageable. If your project is likely to exceed this, please refer to the section on Large Sites in the next article.

Battery life: 1.5 hrs per pair of batteries, hot-swappable. Charge time: 2 hrs. With all four batteries, you should be able to capture up to 3 datasets (1 hour each).

Preparing the Space

To ensure the quality of the scanned data, prepare the site so that you do not encounter any obstacles. Where possible, complete the steps below:

  • Prop all doors open with doorstoppers.

  • Turn on all lights and open window coverings, such as curtains and blinds.

  • Move obstacles and any confidential material out of view.

  • Ensure the batteries are fully charged.

  • Ensure the SSD is inserted correctly into its slot and is ready to receive data.

Do not use the scanner in the rain. If it starts raining during your scan, move to cover immediately.

Do not scan in high temperatures. The scanner can overheat if the ambient temperature exceeds 35 degrees Celsius.

Lighting conditions

Overcast light produces the best colour information in the photos; however, bright and dark conditions are also suitable to scan in.

  • Please note that in bright light, reflective surfaces may not scan properly.

  • If you are scanning in the dark, the photos will not be able to pick up colour information. Also be aware of your surroundings and do not risk falling.

Obstacles

More information on how to navigate obstacles and environmental factors while scanning is provided on the next page.

Image: Locating the manual photo button