NExT Lab

Outputting Content

Guides for outputting images and video from Unreal Engine.


Last updated 4 years ago


This is a Work in Progress article

Render Capture

Using a Cinematic Camera, Unreal Engine can render out images with physical camera parameters as well as post-processing image effects.

Camera Creation

  1. Using [Place] mode, under the Cinematic tab, drag and drop a Cine Camera Actor into the Level.

  2. In the World Outliner, [RMB] the Cine Camera Actor to bring up its context menu and choose [Pilot]. This allows you to move the lens of the camera using the viewport.

  3. Pilot the Camera as you would navigate the viewport to place it in the desired location.

  4. To stop piloting the Camera, [Eject] (1) from it using the buttons available in the viewport. Alternatively, [RMB] the camera in the World Outliner and choose [Eject].

  5. To toggle between the Cinematic Camera view and the standard viewport view, use the [Toggle] (2) button.

  6. The Camera can be locked in place so that it won't be accidentally moved, via [RMB > Transform > Lock Actor Movement].

Adjust Camera

With the Cinematic Camera Actor selected, the Details panel will reveal a host of different functions to adjust how the Level looks when using this Camera:

  • Current Camera Settings for replicating physical cameras

  • Colour Grading

  • Film for filmic settings

  • Lens for further camera settings, lens effects such as chromatic aberration, and added layers such as dirt and lens flares

  • Rendering Features for changing how lighting and rendering function, such as screen space reflections and ambient occlusion
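The Current Camera Settings mirror a real camera body, so the framing follows the standard pinhole relationship between focal length and sensor (Filmback) width. A minimal sketch of that relationship; the values used here are illustrative full-frame numbers, not Unreal's exact defaults:

```python
import math

def horizontal_fov_degrees(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal field of view of a pinhole camera model."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A 35 mm lens on a 36 mm-wide full-frame sensor gives roughly a 54-degree view:
print(round(horizontal_fov_degrees(35.0, 36.0), 1))  # → 54.4
```

Shortening the focal length widens the view, which is why swapping the lens setting changes how much of the Level fits in frame.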

Camera Capture

  1. When the Level is ready to be rendered, [Build Lighting] at its maximum Production quality. This can be accessed through [Build Menu > Lighting Quality].

  2. Press [G] on your keyboard to enable Game View; this hides all icons that represent aspects of your Actors, such as Lights.

  3. In the World Outliner, [RMB] the Cine Camera Actor to bring up its context menu, and choose [Pilot] to pilot the Camera again.

  4. In the [Viewport Options] (1), select [High Resolution Screenshot] (2).

  5. Adjust the Screenshot Size Multiplier (3) according to the image size desired. The capture depends on the power of your machine's graphics card, and very high multipliers may fail.

  6. Press the Capture button (4) to capture an image from the Camera.

  7. A prompt in the bottom right of Unreal Engine will provide a link to the file. Alternatively, it can be found in the project folder at ...(ProjectFolder)/Saved/Screenshots/
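The multiplier scales both image dimensions, so the pixel count (and the GPU memory needed for the capture) grows with its square; this is why very high multipliers can fail. A quick sketch of the arithmetic, assuming a 1920x1080 viewport:

```python
def screenshot_size(viewport_w: int, viewport_h: int, multiplier: int) -> tuple:
    """Final dimensions of a High Resolution Screenshot for a given size multiplier."""
    return (viewport_w * multiplier, viewport_h * multiplier)

# A multiplier of 4 on a 1920x1080 viewport produces an 8K image,
# sixteen times the pixel count of the original viewport.
print(screenshot_size(1920, 1080, 4))  # → (7680, 4320)
```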

Video Animation

This section is a Work in Progress.

Sequencer Basics

Camera and Rail Setup

  1. Drag a Camera Rig Rail asset into the Level.

  2. Drag a Cine Camera Actor into the Level.

  3. Attach the Cine Camera Actor to the Camera Rig Rail.

  4. In the Details panel of the Cine Camera Actor, set its XYZ location to (0, 0, 60).

  5. With the Camera Rig Rail selected, adjust the movement spline to define the camera's path.
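The movement spline the rail follows is a smooth curve passing through its control points. Unreal's exact spline implementation isn't reproduced here, but a Catmull-Rom segment, a common choice for camera paths, sketches the idea; the control points below are made up:

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate one Catmull-Rom segment at t in [0, 1]; the curve passes through p1 and p2."""
    return tuple(
        0.5 * (2 * b + (c - a) * t
               + (2 * a - 5 * b + 4 * c - d) * t ** 2
               + (3 * (b - c) + d - a) * t ** 3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

# Four control points of a flat path at height 60; sample the middle segment halfway:
pts = [(0, 0, 60), (100, 0, 60), (100, 100, 60), (0, 100, 60)]
print(catmull_rom(*pts, 0.5))  # → (112.5, 50.0, 60.0)
```

Note how the curve bulges past the straight line between the two inner points while the height stays constant, which is the smoothing you see when dragging the rail's spline handles.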

Level Sequencer

  1. From [Cinematics], add a Level Sequence.

  2. Select the Camera Rig Rail and the Cine Camera Actor in the Level, then click [Add] to add both as tracks.

  3. Add a keyframe using the keyframe button.

  4. On the Camera Rig Rail track, click [+ Track] to add a Current Position on Rail track, with a keyframe for its initial value.

  5. The Current Position value runs from 0 to 1, representing the start and end of the rail.

  6. Move the timeline marker, then add a keyframe with the changed Current Position value.

  7. On the Cine Camera Actor, set the Actor to Track and add a Z offset.
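Between two keyframes, the Sequencer interpolates the Current Position on Rail value from 0 (start of the spline) to 1 (end). A linear-interpolation sketch of that behaviour; the Sequencer also offers cubic easing, and the frame numbers here are invented for illustration:

```python
def rail_position(frame: int, key_a: tuple, key_b: tuple) -> float:
    """Linearly interpolate 'Current Position on Rail' between two (frame, value) keyframes."""
    (fa, va), (fb, vb) = key_a, key_b
    t = (frame - fa) / (fb - fa)
    return va + t * (vb - va)

# Keyframes: position 0.0 at frame 0 and 1.0 at frame 120 (a 5-second move at 24 fps).
print(rail_position(30, (0, 0.0), (120, 1.0)))  # → 0.25
```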

Outputting Video

  1. In the Sequencer, click [Render Movie].

  2. Select Video Sequence as the output format.