Working with 3D Scan Data


3D scanning outcomes usually start out in a point cloud format. A point cloud is a collection of enriched data points: each point has geometry, which is a position (XYZ coordinates), and attributes called scalar fields, which can be any form of numerical data, such as colour (RGB) or surface orientation (normal vectors).

Point clouds usually need some form of post-processing, such as cleaning or alignment, before they can be used. They can also be converted into a mesh: a digital representation of a surface.
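To make that structure concrete, here is a minimal Python/NumPy sketch of a point cloud as data: positions plus per-point scalar fields. The values and field choices are placeholders rather than output from any particular scanner.

```python
# Minimal sketch of a point cloud as data: positions plus per-point scalar fields.
# NumPy only; the random values here stand in for real scan data.
import numpy as np

n_points = 5
xyz = np.random.rand(n_points, 3)                # geometry: XYZ position per point
rgb = np.random.randint(0, 256, (n_points, 3))   # scalar field: colour
normals = np.random.rand(n_points, 3)            # scalar field: surface orientation
normals /= np.linalg.norm(normals, axis=1, keepdims=True)  # make them unit vectors

for pos, colour, normal in zip(xyz, rgb, normals):
    print(f"pos={pos.round(3)}  colour={colour}  normal={normal.round(3)}")
```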

Some workflows, such as Photogrammetry and the Artec handheld 3D scanners, will take you through the whole process, from cleaning through to meshing. Other workflows may only produce a raw or semi-cleaned point cloud.

Most 3D packages, such as Rhino and the Autodesk suite, natively support point clouds, but how you can interact with them may be limited, so the right tool depends on what you want to do with the point cloud. The NExT Lab is familiar with, and able to advise or consult on, the following workflows:


Processing

CloudCompare

CloudCompare is a powerful, free and open-source software package dedicated to point cloud processing. The interface can feel a little clunky, but it is feature-rich: all the basic and expected geometric processing tools for cleaning, alignment and measurement are available, and it also lets you operate directly on the point cloud's scalar field attributes, so you can visualise, calculate, generate and manipulate them. CloudCompare also offers tools for running a variety of geometric and statistical calculations over a dataset, from meta attributes such as point cloud density to geometric attributes such as surface roughness, curvature and normals.

The NExT Lab recommends CloudCompare as the primary point cloud processing tool, even though you may not need all of its features.
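For repetitive jobs, CloudCompare also ships with a headless command-line mode. The sketch below drives it from Python to subsample a cloud and estimate normals; treat the executable path, input file and flag spellings as assumptions to verify against the CloudCompare command-line documentation for your version.

```python
# Hedged sketch: batch-processing a cloud with CloudCompare's command-line mode.
# Executable path, input file and flag spellings are assumptions; check them
# against the CloudCompare command-line wiki for your installed version.
import subprocess

cloudcompare = "CloudCompare"   # or the full path to the executable
input_cloud = "scan.e57"        # placeholder input file

subprocess.run([
    cloudcompare, "-SILENT",
    "-O", input_cloud,          # open the cloud
    "-SS", "SPATIAL", "0.01",   # subsample to roughly 10 mm point spacing
    "-OCTREE_NORMALS", "0.05",  # estimate normals with a 50 mm radius
    "-C_EXPORT_FMT", "PLY",     # save the result as PLY
    "-SAVE_CLOUDS",
], check=True)
```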

Rhino 8

Rhino can be used for basic cleaning and display of point clouds. They can be interacted with, to an extent, as one would with point objects: they can be used as a reference for modelling, sectioned for drawing production, and rendered to a limited degree.
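As a small illustration of treating a point cloud as reference geometry, the RhinoPython sketch below selects a point cloud and reports its point count and bounding box. It is a generic sketch, not a NExT Lab script, and assumes a point cloud object already exists in the open file.

```python
# Sketch: inspecting a point cloud inside Rhino with rhinoscriptsyntax.
# Run from Rhino's script editor; assumes a point cloud object is in the file.
import rhinoscriptsyntax as rs

cloud_id = rs.GetObject("Select a point cloud", filter=2)  # 2 = point cloud filter
if cloud_id:
    count = rs.PointCloudCount(cloud_id)
    bbox = rs.BoundingBox(cloud_id)  # eight corner points of the bounding box
    print("Points in cloud:", count)
    if bbox:
        print("Bounding box from", bbox[0], "to", bbox[6])
```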


Meshing

CloudCompare

CloudCompare uses the industry-standard Poisson Surface Reconstruction algorithm to fit a mesh surface onto the point cloud. For the best results, it requires a clean dataset with surface orientation (normal) attributes.
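The same Poisson approach can also be scripted outside CloudCompare. As an illustrative sketch (using the open-source Open3D library rather than CloudCompare itself), the example below estimates normals, the surface orientation attribute Poisson relies on, and then fits a mesh; the file names, radius and octree depth are placeholder values to tune per dataset.

```python
# Hedged sketch: Poisson surface reconstruction with the open-source Open3D library,
# as an alternative to running the same algorithm inside CloudCompare.
import open3d as o3d

pcd = o3d.io.read_point_cloud("scan_cleaned.ply")  # placeholder file name

# Poisson needs surface orientation, so estimate normals if the scan lacks them.
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30))

mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=9)  # higher depth = finer but slower reconstruction
o3d.io.write_triangle_mesh("scan_mesh.ply", mesh)
```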

Mesh Lab

MeshLab is a free, open-source software package dedicated to mesh processing. It offers a variety of meshing algorithms, some of which may suit particular types of point clouds better than others.

Rhino 8

Rhino 8 can mesh point clouds in a very forgiving, but inaccurate, way using its new ShrinkWrap feature: by 'inflating' each point, it merges them into a thickened surface that will deviate from the initial point cloud. Further standard mesh processing can then bring the result closer to the original point cloud.


Rendering

CloudCompare

CloudCompare provides viewport-based rendering, so the scene can be composed with all the display features available in CloudCompare: point size, colour view and scalar field view. The camera is limited to perspective and orthographic views, either as a static camera or as a fly-through. Artistically, there are not many options for altering the aesthetic. Furthermore, every point is rendered at the same size regardless of its distance from the camera, which tends to produce a flat, ghostly effect.

Rhino

In Rhino, point clouds are a special type of point object that renders natively but requires additional processing for full control. Natively, they are rendered in the same way as in CloudCompare, as fixed-size points regardless of distance, and they will not interact with Rhino's lights and shadows in an intuitive way. Some external render engines, such as V-Ray, do allow the more expected manipulations and interactions.

Blender

In Blender, points are treated as the vertices of a mesh, which allows point clouds to be manipulated as if they were any other object inside Blender. The cloud is then fully integrated into the rest of Blender's 3D production pipeline and suite of tools (cameras, modelling, animation and lighting), so there is complete control over the content and aesthetic. Scalar field attributes can be accessed via weight painting and Geometry Nodes.

Below is an example of a render in which each point is rendered as a simple physical sphere whose apparent size changes with depth. This mimics real-world perspective, allowing for a more intuitive perception of depth.
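A minimal bpy sketch of that sphere-instancing setup is shown below: it builds a vertex-only mesh from a list of point positions and instances a small sphere on every vertex. The point list, object names and sphere radius are placeholders; in practice the positions would come from an imported scan.

```python
# Minimal Blender (bpy) sketch: turn point positions into a vertex-only mesh,
# then instance a small sphere on every vertex so each point catches light
# and shrinks with distance. Run from Blender's Scripting workspace.
import bpy

points = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]  # placeholder: replace with scan data

# Build a mesh that has only vertices (no edges or faces).
mesh = bpy.data.meshes.new("ScanPoints")
mesh.from_pydata(points, [], [])
cloud_obj = bpy.data.objects.new("ScanPoints", mesh)
bpy.context.collection.objects.link(cloud_obj)

# Create a small sphere, parent it to the cloud and instance it on each vertex.
bpy.ops.mesh.primitive_ico_sphere_add(radius=0.01)
sphere = bpy.context.active_object
sphere.parent = cloud_obj
cloud_obj.instance_type = 'VERTS'
```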


[Figure] Point clouds in CloudCompare: on the left, each point is listed as an XYZ position with its attributes; in the preview, the XYZ positions are represented as points, shown here coloured by the colour attribute.
[Figure] 'Physical' points that can fully interact with light.