Model Recognition

This is a Work in Progress article based on materials available on the Vuforia Library.

Model Targets

Introduction to Model Targets in Unity (Source: Vuforia Library)

Model Targets utilise 3D model data of an object (for example a 3D Scan or Digital Model).

Model Targets enable applications to track physical objects by using this digital data as reference information. Model Targets can represent rigid, opaque objects at a range of scales, from architectural landmarks to small figurines.

Requirements: Model Data

To create a Model Target, you first need 3D model data of the object; 3D CAD models and 3D scans are both suitable data sources.

The physical object itself must be rigid and must not have transparent or shiny surfaces. It must also remain in a fixed location.

Creating Model Targets

Model Targets are created using the Model Target Generator. This software converts a 3D model into a Vuforia Engine dataset, and is only available on Windows.

Figure 1.0 - Vuforia Model Target Generator

The Model Target Generator grades the quality and suitability of the mesh, and allows you to add "Guide Views". A Guide View is an outline of the 3D model from a particular perspective.

From the Model Target Generator, you can export your Model Target dataset as a .unitypackage.

Import the .unitypackage into your Unity Project.

Using Model Targets in Unity

Add an ARCamera to the Environment and enter an appropriate license key.

Model Targets can be added into a Unity Scene as GameObjects:

GameObject > Vuforia Engine > Model Targets > Model Target

Select the ModelTarget in the Hierarchy Window and designate it a child of the ARCamera.

Ensure that the Model Target Behaviour script settings are correct: Inspector Window > Model Target Behaviour (Script)

  • The Database and Model Target fields should match those of the recently imported database.

  • The Physical Length, Physical Width and Physical Height fields should match the physical dimensions of the real-world object that the project references.

  • The Guide View Mode field should be set to Guide View 2D.

Debug the application and verify that the Guide View outline disappears when it aligns with the physical object.

When this occurs, the application is tracking the physical object.
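
Beyond visual confirmation, you can respond to tracking from script. The snippet below is a minimal sketch assuming Vuforia Engine 10.x, where targets (including Model Targets) carry an ObserverBehaviour that raises OnTargetStatusChanged; older Vuforia releases use a different API (TrackableBehaviour / ITrackableEventHandler), so adjust accordingly. The class name is illustrative.

```csharp
using UnityEngine;
using Vuforia;

// Illustrative example: attach to the ModelTarget GameObject to log when
// tracking of the physical object starts and stops.
public class ModelTargetStatusLogger : MonoBehaviour
{
    ObserverBehaviour observer;

    void OnEnable()
    {
        observer = GetComponent<ObserverBehaviour>();
        if (observer != null)
            observer.OnTargetStatusChanged += OnStatusChanged;
    }

    void OnDisable()
    {
        if (observer != null)
            observer.OnTargetStatusChanged -= OnStatusChanged;
    }

    void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus targetStatus)
    {
        // TRACKED and EXTENDED_TRACKED both mean the object is being followed.
        bool tracked = targetStatus.Status == Status.TRACKED ||
                       targetStatus.Status == Status.EXTENDED_TRACKED;
        Debug.Log($"{behaviour.TargetName}: {(tracked ? "tracked" : "not tracked")}");
    }
}
```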

Adding Content to the ModelTarget

Content, such as a rendered 3D model or annotations, can be displayed once the application is tracking the object.

Add any asset or GameObject to the Hierarchy Window and designate it a child of the ModelTarget.
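
The same parenting can be done at runtime. The sketch below is a hypothetical helper (the class and field names are illustrative, not part of Vuforia): it instantiates a content prefab as a child of the ModelTarget so the content inherits the target's tracked pose, and the default event handler on the target then shows or hides that child as tracking is gained or lost.

```csharp
using UnityEngine;

// Hypothetical helper: assign a content prefab and the ModelTarget's transform
// in the Inspector; the content is spawned as a child so it follows the target.
public class AttachContentToModelTarget : MonoBehaviour
{
    [SerializeField] GameObject contentPrefab;  // e.g. an annotation or rendered model
    [SerializeField] Transform modelTarget;     // the ModelTarget in the scene

    void Start()
    {
        // Parent the new content under the ModelTarget and zero its local
        // transform so it sits at the target's origin (offset as needed).
        GameObject content = Instantiate(contentPrefab, modelTarget);
        content.transform.localPosition = Vector3.zero;
        content.transform.localRotation = Quaternion.identity;
    }
}
```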

Other Resources

Students at MSD can learn more about Digital Reproduction on the Knowledge Base and can access training and equipment loans using SimplyBookMe.

See:

  • Model Targets Supported Objects & CAD Model Best Practices (Vuforia Library)
  • Model Targets | VuforiaLibrary
  • Download Tools | Vuforia Developer Portal
  • Introduction to Model Targets in Unity | VuforiaLibrary
