Unity Next Steps: Interaction


Last updated 5 years ago


This is a work-in-progress article based on materials available on TutsPlus.

Interaction Scripts

So far, we've developed a basic AR application that recognizes and tracks our target image and displays the designated 3D graphics. However, for a complete AR application, we also need to be able to interact with the objects, augmenting the reality.

For this purpose, we need to be able to detect where we clicked (or touched, in the case of a mobile device). We'll do this by casting a ray from the camera into the scene; the tutorial calls this script a "ray-tracer", though strictly speaking it performs a raycast.

First, create a folder named "scripts" under Assets to keep everything organized. We are going to store our script files in this folder. Then create a C# Script file in this folder and name it "rayTracer". The name matters: the class name in the code below must match the file name exactly. If you prefer a different name for your script file, change the class name in the provided code accordingly.

Creating a script file
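When you create the file, Unity pre-fills it from a template, with the class name matching the file name. For a file named "rayTracer", the generated skeleton looks approximately like this (template details vary slightly between Unity versions):

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Skeleton Unity generates for a new C# script named "rayTracer".
// The class name and the file name must stay in sync.
public class rayTracer : MonoBehaviour {

    // Start is called before the first frame update
    void Start () {
    }

    // Update is called once per frame
    void Update () {
    }
}
```

We will replace this skeleton entirely with the code in the next section.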

Ray-Tracer Script

Copy and paste the following code into the C# Script file you have just created and named "rayTracer".

using UnityEngine;
using System.Collections;
using System.Collections.Generic;

public class rayTracer : MonoBehaviour {

    // Objects hit during the current frame and during the previous frame;
    // comparing the two tells us when a touch has left an object.
    private List<GameObject> touchList = new List<GameObject>();
    private GameObject[] touchPrev;
    private RaycastHit hit;

    void Update () {

        #if UNITY_EDITOR

        // In the editor there is no touch screen, so use the mouse instead.
        if (Input.GetMouseButton(0) || Input.GetMouseButtonDown(0) || Input.GetMouseButtonUp(0)) {

            touchPrev = new GameObject[touchList.Count];
            touchList.CopyTo(touchPrev);
            touchList.Clear();

            // Cast a ray from the camera through the mouse position.
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            //Debug.DrawRay(ray.origin, ray.direction*10000, Color.green, 10, false);

            if (Physics.Raycast(ray, out hit)) {

                GameObject recipient = hit.transform.gameObject;
                touchList.Add(recipient);

                // Forward the event to any script on the hit object that
                // implements the matching method.
                if (Input.GetMouseButtonDown(0)) {
                    recipient.SendMessage("touchBegan", hit.point, SendMessageOptions.DontRequireReceiver);
                }
                if (Input.GetMouseButtonUp(0)) {
                    recipient.SendMessage("touchEnded", hit.point, SendMessageOptions.DontRequireReceiver);
                }
                if (Input.GetMouseButton(0)) {
                    recipient.SendMessage("touchStay", hit.point, SendMessageOptions.DontRequireReceiver);
                }
            }

            // Objects hit last frame but not this frame have been exited.
            foreach (GameObject g in touchPrev) {
                if (!touchList.Contains(g)) {
                    g.SendMessage("touchExit", hit.point, SendMessageOptions.DontRequireReceiver);
                }
            }
        }

        #endif

        // On a device, handle every active touch.
        if (Input.touchCount > 0) {

            touchPrev = new GameObject[touchList.Count];
            touchList.CopyTo(touchPrev);
            touchList.Clear();

            foreach (Touch touch in Input.touches) {

                // Cast a ray from the camera through the touch position.
                Ray ray = Camera.main.ScreenPointToRay(touch.position);

                if (Physics.Raycast(ray, out hit)) {

                    GameObject recipient = hit.transform.gameObject;
                    touchList.Add(recipient);

                    if (touch.phase == TouchPhase.Began) {
                        recipient.SendMessage("touchBegan", hit.point, SendMessageOptions.DontRequireReceiver);
                    }
                    if (touch.phase == TouchPhase.Ended) {
                        recipient.SendMessage("touchEnded", hit.point, SendMessageOptions.DontRequireReceiver);
                    }
                    if (touch.phase == TouchPhase.Stationary || touch.phase == TouchPhase.Moved) {
                        recipient.SendMessage("touchStay", hit.point, SendMessageOptions.DontRequireReceiver);
                    }
                    if (touch.phase == TouchPhase.Canceled) {
                        recipient.SendMessage("touchExit", hit.point, SendMessageOptions.DontRequireReceiver);
                    }
                }
            }

            // Objects hit last frame but not this frame have been exited.
            foreach (GameObject g in touchPrev) {
                if (!touchList.Contains(g)) {
                    g.SendMessage("touchExit", hit.point, SendMessageOptions.DontRequireReceiver);
                }
            }
        }
    }
}

This script detects mouse clicks while you are working in the Unity editor, and touch input once the application is deployed to a mobile device with a touch screen.

Once you've created your rayTracer script, you need to activate it by attaching it to one of the objects in the scene. I selected the ARCamera object and added the rayTracer script as a component using the Add Component button under the Inspector tab.

Object Material

Now we are going to assign a material to our Cube object and change the color of the material upon interaction with the cube.

Under Assets, create a material and name it as you wish.

Now assign this material to the cube by dragging and dropping it onto the cube object.

Interaction Script

Create a new C# Script under the scripts folder and name it "interaction".

Copy the following C# code into your "interaction" script file and then add this script file as a component, just as we did with the "rayTracer" script file. This time, however, attach it to the cube object itself; that is what limits the interaction to the cube.

using UnityEngine;
using System.Collections;

public class interaction : MonoBehaviour {

    // Instance fields rather than statics, so each object carrying this
    // script keeps its own material and colours.
    private Color defaultColor;
    private Color selectedColor;
    private Material mat;

    void Start(){

        mat = GetComponent<Renderer>().material;

        // Switch the Standard shader to its transparent rendering mode
        // so the alpha channel of the colours takes effect.
        mat.SetFloat("_Mode", 2);
        mat.SetInt("_SrcBlend", (int)UnityEngine.Rendering.BlendMode.SrcAlpha);
        mat.SetInt("_DstBlend", (int)UnityEngine.Rendering.BlendMode.OneMinusSrcAlpha);
        mat.SetInt("_ZWrite", 0);
        mat.DisableKeyword("_ALPHATEST_ON");
        mat.EnableKeyword("_ALPHABLEND_ON");
        mat.DisableKeyword("_ALPHAPREMULTIPLY_ON");
        mat.renderQueue = 3000;

        defaultColor = new Color32(255, 255, 255, 255);  // white
        selectedColor = new Color32(255, 0, 0, 255);     // red

        mat.color = defaultColor;
    }

    // The four handlers below are called by rayTracer via SendMessage.

    void touchBegan(){
        mat.color = selectedColor;
        //Add your own functionality here
    }

    void touchEnded(){
        mat.color = defaultColor;
        //Add your own functionality here
    }

    void touchStay(){
        mat.color = selectedColor;
        //Add your own functionality here
    }

    void touchExit(){
        mat.color = defaultColor;
        //Add your own functionality here
    }
}

In this "interaction" script, we are referring to the material of the cube object as "mat".

We declare two Color fields named defaultColor and selectedColor. defaultColor is white, as its RGBA values (255, 255, 255, 255) indicate, and selectedColor is red.

We initialize the cube object's material color as defaultColor by the following line:

mat.color = defaultColor;

We have four different functions for four different states:

  • touchBegan() is called at the instant your touch (or click) lands on the object.

  • touchEnded() is called when you release your finger.

  • touchStay() is called on every subsequent frame while the touch remains on the object; it follows touchBegan(). If you assign different colors in these two functions, you will barely see the touchBegan() color, since it applies only for the very first frame before touchStay() takes over.

  • touchExit() is called when you drag your finger off the cube object's surface instead of releasing it; releasing calls touchEnded(), as explained above.
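Because rayTracer dispatches these events with SendMessage and the DontRequireReceiver option, any script on the hit object that defines one of these method names will receive the call. As a quick way to watch the sequence of states, you could attach a small logging script (a hypothetical helper, not part of the tutorial) to the cube alongside "interaction":

```csharp
using UnityEngine;

// Hypothetical helper: logs each touch state to the Console so the
// order of events is visible while testing in the editor.
public class touchLogger : MonoBehaviour {

    void touchBegan(Vector3 point) { Debug.Log("touchBegan at " + point); }
    void touchEnded(Vector3 point) { Debug.Log("touchEnded at " + point); }
    void touchStay(Vector3 point)  { Debug.Log("touchStay at " + point); }
    void touchExit(Vector3 point)  { Debug.Log("touchExit at " + point); }
}
```

Note that these handlers accept the hit.point Vector3 that rayTracer sends; a receiving method can equally omit the parameter, as the handlers in "interaction" do.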

In our code, touching the cube assigns selectedColor to mat.color, the color of the cube object's material.

By assigning selectedColor within touchStay(), we ensure the cube stays at selectedColor for as long as a finger remains on it. When we release the finger or drag it off the cube, touchEnded() or touchExit() restores defaultColor, according to the action taken.

Now run the project and click on the cube once the target image is recognized and the cube has appeared. It should turn red, then white again when you release the click or move it off the cube's surface.

You can experiment with different colors for the four different actions to comprehend them thoroughly.
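As a sketch of such an experiment, here is a hypothetical variation of the four handlers in "interaction", with a distinct, arbitrarily chosen colour per state:

```csharp
// Hypothetical variation of the four handlers in "interaction":
// a distinct colour per state makes each event visible.
void touchBegan(){ mat.color = new Color32(255,   0,   0, 255); } // red
void touchStay() { mat.color = new Color32(255, 165,   0, 255); } // orange
void touchEnded(){ mat.color = new Color32(  0, 128, 255, 255); } // blue
void touchExit() { mat.color = new Color32(255, 255, 255, 255); } // white
```

With these values, the red of touchBegan() flashes for only a single frame before touchStay()'s orange takes over, which illustrates the ordering described above.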

Extension

In the C# script above, you can add your own functionality to each of the four events, at the points denoted by the comment //Add your own functionality here.

Conclusion

In this tutorial, we've gone through an introduction to the Vuforia SDK for Unity along with its developer portal, and we've seen how to generate a target image and an appropriate license key.

On top of that, we wrote custom script files so that we can interact with the augmented graphics. This tutorial is just an introduction to get you started using Vuforia with Unity and creating your own AR applications.
