From Photos to 3D Spatial Data


Last updated 4 years ago


This guide will cover using Agisoft Metashape to produce a point cloud and/or mesh from a set of photos.

Agisoft Metashape Professional v.1.6.2 was used in this article.

Agisoft Metashape

User Interface

When you first open the program, it helps to become familiar with the layout of the software. In Figure 1, the highlighted sections are:

  1. Red: Workspace and Reference panels. These list your chunks, images and points, as well as markers and reference points.

  2. Blue: The model space. This is where you interact with the photogrammetry model. A gumball is always present to help you orient yourself, along with the axes gizmo in the bottom-right. A Photos tab also appears here when you edit a photograph.

  3. Yellow: Photos, Console and Jobs panels. The Photos panel is the important one here; it shows the status of each image (aligned, marked) and lets you add markers to each image. The Console tab is used for debugging, and the Jobs tab is not used in this guide.

  4. Orange: Quick-access toolbar. Various tools and options are located here.

Navigating the Model Space

  • Right Mouse Button (hold): Pan

  • Left Mouse Button (hold): Rotate around the gumball

  • Ctrl + Scroll Wheel: Change the perspective angle

Workflow

Metashape is fairly straightforward software to use; the majority of the workflow is covered by working down the [Workflow] menu at the top.

Enabling the GPU

Ensure the GPU is enabled; this will speed up processing times considerably.

Under [Tools > Preferences > GPU], ensure the GPU device is enabled. On University systems, these are the GeForce RTX 2080 and GeForce GTX 1080.
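These preferences can also be set from Metashape's Python console, which is handy for batch processing. A minimal sketch, assuming the Metashape Professional Python module for v1.6 (the device-dictionary key and the exact preset behaviour are assumptions worth checking against the API reference for your version):

```python
import Metashape

# List the GPU devices Metashape can see.
devices = Metashape.app.enumGPUDevices()
for i, device in enumerate(devices):
    print(i, device.get("name", device))

# Enable all detected GPUs via a bitmask (bit n corresponds to device n).
Metashape.app.gpu_mask = (1 << len(devices)) - 1

# Optionally keep the CPU out of GPU-accelerated stages.
Metashape.app.cpu_enable = False
```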

Adding Photos

Adding Chunks

The first step is to add the photos you have taken to the software.

Right-click in the Workspace and add a [New Chunk]. Metashape processes a set of images as a 'chunk'; think of it as a layer in Photoshop or similar software. Figures 2, 3 and 4 show how chunks and photos are added.

TIP: You can select and use/view any chunk by double-clicking it.

Use [Workflow > Add Photos] to open up your system's File Explorer, navigate to your photos and select all of them.
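If you prefer scripting, the same steps can be run from Metashape's Python console. A minimal sketch, assuming the v1.6 Python API and a hypothetical photo folder:

```python
import os
import Metashape

doc = Metashape.app.document            # the currently open project
chunk = doc.addChunk()                  # equivalent to Right-click > New Chunk
chunk.label = "Scan 01"                 # hypothetical chunk name

# Hypothetical folder containing the captured photos.
photo_dir = "C:/scans/object01"
photos = [os.path.join(photo_dir, f)
          for f in os.listdir(photo_dir)
          if f.lower().endswith((".jpg", ".jpeg", ".tif", ".png"))]

chunk.addPhotos(photos)                 # equivalent to Workflow > Add Photos
print(len(chunk.cameras), "photos added")
```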

Viewing Photos

Once the photos are added, they will appear in the Photos panel and will be listed in the chunk in the Workspace on the left (Figure 5).

Aligning Photos

Once all your photos have been added to Metashape, the next step is to instruct Metashape to align them.

Metashape generates feature points, called tie-points, by triangulating across the photos. Processing time for this step varies depending on the quality setting selected.

Align Photos

Use [Workflow > Align Photos] to begin the alignment process.

In general, the 'Medium' setting is a good compromise between processing time and output quality, and the other settings can be left at their defaults. If you require higher quality, adjust accordingly, but be wary of the extra processing time.

When alignment completes, you can see the capture position of each image mapped out around the tie-points. Metashape calls these positions cameras.
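The alignment step can likewise be scripted. A minimal sketch, assuming the v1.6 Python API; the downscale value stands in for the accuracy preset (2 is assumed here to correspond to 'Medium', which is worth verifying against the API reference for your version):

```python
import Metashape

chunk = Metashape.app.document.chunk    # the active chunk

# Detect and match features across the photos to build tie-points.
chunk.matchPhotos(downscale=2,
                  generic_preselection=True,
                  reference_preselection=False)

# Estimate the camera positions from the matched tie-points.
chunk.alignCameras()
```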

Checkpoint

If your photos do not align properly at this stage, it would be wise to reconsider your photo capture matrix, or refer to Advanced Techniques: Manual Markers for help with the alignment process.

Hiding/Showing Interface Elements

It is useful to know how to show and hide elements in the model space, as well as their shortcut controls. Open [Model > Show/Hide Items] to view a list of interface elements.

The important ones to take note of are:

  • [Show Cameras] controls the visibility of the cameras that surround the model.

  • [Show Thumbnails] controls the visibility of the cameras' thumbnail images.

  • [Show Markers] controls the visibility of the markers in the model space.

  • [Show Trackball] controls the visibility of the trackball at the centre of the model space. Note that the trackball is not visible while another tool, such as the Selection tool, is in use.

  • [Show Grid] controls the visibility of the grid.

Some of these visibility controls are also located on the toolbar.

Cleaning Up Tie-Points

The photo-alignment process tends to generate stray tie-points floating far away from and around the model. Cleaning these up will greatly assist the legibility of the object in later steps.

In the toolbar, look for the Selection tool (see image below). The drop-down menu offers several selection options, such as [Rectangular], [Circular] and [Free-Form]; choose whichever is applicable.

Then, select the stray or unwanted tie-points in the model space by marquee-selecting them (left-click and drag), and hit Delete on the keyboard to remove the selected points.
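Alongside manual deletion, stray points can also be thinned with Metashape's gradual-selection filters from the Python console. A minimal sketch, assuming the v1.6 API; the 0.5 reprojection-error threshold is purely illustrative, not a recommendation from this guide:

```python
import Metashape

chunk = Metashape.app.document.chunk

# Filter the sparse (tie-point) cloud by reprojection error and
# remove every point above the chosen threshold.
point_filter = Metashape.PointCloud.Filter()
point_filter.init(chunk, criterion=Metashape.PointCloud.Filter.ReprojectionError)
point_filter.removePoints(0.5)
```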

Generating Dense Cloud

A dense cloud is essentially a high-count (and therefore high-detail) point cloud, generated by interpolating and point-matching the existing tie-points with data from the images.

It enables mesh generation and detailed reproduction of the object. This part of the process takes a lot of time, so do plan ahead.

To do this, go to [Workflow > Build Dense Cloud].

In the Build Dense Cloud dialogue box, choose your quality setting. 'Medium' is recommended for a balance of processing time and quality; keep the default settings for the rest.
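The equivalent scripted calls, assuming the v1.6 Python API, where depth maps are built as a separate step before the dense cloud (downscale=4 is assumed to correspond to the 'Medium' quality preset for depth maps; check this against your version):

```python
import Metashape

chunk = Metashape.app.document.chunk

# Depth maps are computed per image, then fused into the dense cloud.
chunk.buildDepthMaps(downscale=4, filter_mode=Metashape.MildFiltering)
chunk.buildDenseCloud()
```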

Viewing the Dense Cloud

Once the dense cloud has been built, it does not appear on the screen immediately. You can control the visibility of the dense cloud through the toolbar. Locate the Dense Cloud icon (see image below) and click on it to make the dense cloud visible.

Generating a Mesh

Agisoft Metashape can produce a mesh from the dense cloud that can be exported to other software for fabrication purposes.

To do this, go to [Workflow > Build Mesh].

When the Build Mesh dialogue box appears, select the mesh quality that you desire.
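A minimal scripted equivalent, assuming the v1.6 Python API, with the dense cloud as the source data (the surface type and face count shown are common defaults, not settings prescribed by this guide):

```python
import Metashape

chunk = Metashape.app.document.chunk

# Build a mesh from the dense cloud.
chunk.buildModel(source_data=Metashape.DenseCloudData,
                 surface_type=Metashape.Arbitrary,
                 face_count=Metashape.HighFaceCount)
```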

Exporting

All export options are found under [File > Export]. Agisoft Metashape can export to a variety of standard file types.

Exporting Point Clouds

Point clouds contain exact points and information pertaining to each point, such as colour and the point normal.

In the Export Options, you can choose which of these attributes to save. Usually you will want the Source Data set to [Dense Cloud] for a complete point cloud.

Do not convert colour to 8-bit.

TIP: .PLY, .XYZ and .E57 are common Point Cloud filetypes. Use .PLY if you intend to use this object with Rhinoceros 3D.
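A scripted export sketch, assuming the v1.6 Python API and a hypothetical output path (the attribute flags shown mirror the checkboxes in the Export Points dialogue and are assumptions to verify for your version):

```python
import Metashape

chunk = Metashape.app.document.chunk

# Export the dense cloud as a .PLY with colours and normals preserved.
chunk.exportPoints("C:/scans/object01_dense.ply",
                   source_data=Metashape.DenseCloudData,
                   save_colors=True,
                   save_normals=True)
```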

Exporting Meshes

Meshes are exported in a similar way.

TIP: .OBJ and .PLY are common Mesh filetypes.
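And a similar sketch for the mesh, again assuming the v1.6 Python API and a hypothetical output path (the file format is typically inferred from the extension):

```python
import Metashape

chunk = Metashape.app.document.chunk

# Export the generated mesh as an .OBJ for use in other software.
chunk.exportModel("C:/scans/object01_mesh.obj")
```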

Taking Measurements + Visualisation

With a point cloud or mesh generated, you can now take measurements. This is covered in the Taking Measurements + Visualisation guide.

Miscellaneous Functions

Show Images allows you to view the captured images so you can double-check the alignment.

[Figure captions from the original page: Agisoft Metashape layout; adding new, empty chunks; adding photos and browsing for them in File Explorer; the populated photo library and chunk; the Workflow menu and Align Photos dialogue; aligned photos with tie-points positioning the cameras in 3D space; selection tools and marquee point deletion; the Build Dense Cloud workflow and dialogue; dense cloud visibility controls and preview; the Build Mesh workflow and generated mesh.]