Robotic Dog


Last updated 4 months ago


Hosted at the NExT Lab in collaboration with the Robotics Lab, the UniTree Go2 Quadruped Robot is a highly agile robotic platform designed for tasks such as surveillance, navigation, and advanced-mobility research.


Get Started


The Kit

The UniTree Go2 Robot Dog package comes in a suitcase containing all relevant equipment. Please check that all items are present when you borrow the kit, and again when you return it:

  • UniTree Go2 Quadruped Robot

  • Remote Control

  • Battery

  • iPad

Available expansion equipment/attachments:

  • UniTree Go2 Charging Station

  • UniTree D1 Servo Arm for Go2

  • Ouster OS2

  • Long-range LiDAR Kit

  • Emlid Reach RS2 + L1/L2/L5 RTK GNSS receiver


Simulation

You can use the Unitree Go app with a personal account to practise the movements and functionality of the Robot Dog in a virtual environment. We recommend this as a way to familiarise yourself with the general controls before operating the physical robot.


Technical Specifications

General

  • Robot Type: Quadruped, consumer-grade interactive robot
  • Degrees of Freedom: 12 (precision joint motors)
  • Material: Aluminium alloy and high-strength engineering plastics

Speed

  • Maximum Running Speed: 5 m/s (laboratory conditions)
  • Auto-follow Speeds: Slow: 1.5 m/s; Fast: 3.0 m/s

Remote Control

  • Distance: Over 30 metres
  • Modes: Companion remote, handheld remote, app-based
  • App Features: Image transmission, OTA updates, custom modes, real-time controls

Navigation

  • Intelligent Avoidance: 360° × 90° sensing via 4D LiDAR L1, 0.05 m minimum detection range
  • Side-follow System: ISS 2.0 with 50% improved positioning accuracy
  • Obstacle Avoidance: Forward only

Voice Interaction

  • Voice Control: Supports real-time commands and high-accuracy voice recognition

Build & Mobility

  • Maximum Climbing Angle: 40° (30° for AIR version)
  • Maximum Stair Step Height: 16 cm
  • Terrain Suitability: Flat surfaces; not for wet, dusty, or soft terrain
  • Stability Mechanism: Composite force and position control for precision

Battery

  • Endurance: 1–2 hours depending on load and terrain
  • Battery Placement: Inserted on the side; 3 or more bars recommended for operation

Customisation

  • Custom Modes: Various motions (e.g., dance, moonwalk, stretch) via app or remote

Accessories

  • Auto-Retractable Strap: Adjustable hand strap (PRO version only)
  • Consumable Parts: Spare foot-end pads included for rough-terrain use

Limitations

  • Environmental Conditions: Operates best in 5°C–35°C; not waterproof or dust-proof
  • Friction Requirements: Unsuitable for icy, wet, or low-friction ground
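
Several of these limits matter when deciding whether the robot can be deployed in a given space. As a quick illustration, the figures from the table above can be encoded in a small pre-deployment check. This helper is ours, not part of any Unitree software; only the numeric limits come from the specification.

```python
# Pre-check an environment against the Go2's published mobility limits.
# The numbers below come from the specification table above; the function
# itself is an illustrative sketch, not part of any Unitree API.

GO2_LIMITS = {
    "max_climb_deg": 40.0,        # maximum climbing angle (30.0 for AIR version)
    "max_step_m": 0.16,           # maximum stair step height
    "temp_range_c": (5.0, 35.0),  # recommended operating temperature range
}

def environment_ok(slope_deg, step_m, temp_c, limits=GO2_LIMITS):
    """Return a list of reasons the environment is unsuitable (empty list = OK)."""
    problems = []
    if slope_deg > limits["max_climb_deg"]:
        problems.append(f"slope {slope_deg} deg exceeds {limits['max_climb_deg']} deg limit")
    if step_m > limits["max_step_m"]:
        problems.append(f"step {step_m} m exceeds {limits['max_step_m']} m limit")
    lo, hi = limits["temp_range_c"]
    if not lo <= temp_c <= hi:
        problems.append(f"temperature {temp_c} C outside {lo}-{hi} C range")
    return problems
```

For example, `environment_ok(20.0, 0.10, 22.0)` returns an empty list (deployable), while a 45° slope, 20 cm steps, or sub-zero temperatures each add a reason to the returned list.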


Use Cases

The Robot Dog is a programmable robot whose base function is to manoeuvre and navigate environments.

Basic manual operation is carried out via remote control, using either the app or the physical remote; voice commands are also supported.

Further functions can be achieved by programming the robot and pairing it with the available attachments and equipment. As these functions are developed through research and teaching, they will be made available for use.
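
Programmed control ultimately comes down to sending body-velocity commands to the robot. The sketch below shows one common safety pattern: clamping the commanded planar velocity to a conservative limit drawn from the specification table before it is sent. The `send_velocity` call named in the final comment is a hypothetical placeholder for whatever transport the SDK in use provides (Unitree publishes an official SDK for the Go2); the clamping helper itself is illustrative, not part of any Unitree API.

```python
import math

# Speed figures taken from the specification table above.
MAX_LAB_SPEED = 5.0  # m/s, maximum running speed under laboratory conditions
SAFE_SPEED = 1.5     # m/s, a conservative limit matching the slow auto-follow speed

def clamp_velocity(vx, vy, limit=SAFE_SPEED):
    """Scale a planar velocity command so its magnitude never exceeds `limit`."""
    speed = math.hypot(vx, vy)
    if speed <= limit:
        return vx, vy
    scale = limit / speed
    return vx * scale, vy * scale

# A control loop would pass the clamped values on to the SDK transport, e.g.
#   send_velocity(*clamp_velocity(vx, vy))   # send_velocity is hypothetical
```

Clamping by magnitude (rather than per-axis) preserves the commanded direction of travel while capping the speed.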

Navigation and exploration

Autonomous navigation: Equipped with 4D LIDAR and obstacle avoidance for autonomous movement and terrain mapping.

Terrain exploration: Suitable for navigating and exploring controlled environments.

Surveillance and inspection

Remote monitoring: Real-time image transmission allows for surveillance or inspection in hard-to-reach or unsafe areas.

Inspection tasks: Useful for inspecting industrial or confined areas autonomously.

Logistics and Assistance

Lightweight object transportation: Can carry or move lightweight items, especially when paired with the D1 Servo Arm.

Routine assistance: Assists in controlled tasks such as fetching or organising objects in structured environments.

Training LMS ⧉

Complete the relevant Robot Dog training modules to access Equipment Loans.

Equipment Loan ⧉

Borrow the kit from the Media Hub once you have completed the induction training.

Consultations ⧉

Consult our technicians about your projects or for technical guidance.
