
Known Issues with Live FX <> Unreal



Live FX cannot currently be used with AR or XR workflows. There is no known way to get an object with a clean alpha channel out of Unreal Engine, via Texture Share or otherwise. You can likely develop your own pipeline using third-party plug-ins to make this work, but we do not currently have a documented way forward.

Live FX cannot be used with Unreal Engine and Green Screen to record the composite or the source capture without serious consequences in post-production. If you want to use Live FX and Unreal with Green Screen, plan on using Take Recorder in Unreal Engine to record the tracking data, so that you can re-render and get the desired results directly from Unreal Engine.

Live FX with Unreal Engine for ICVFX captured in camera can be a viable solution, provided you test thoroughly in advance and can assume you will never have to go back and re-render anything from the UE project.

There are several known issues that we hope to address shortly, but you should be aware of them before starting down the Live FX > Unreal Engine road.

  1. Multi-node setups are not easy to set up, or at least not well defined. It may be possible to set up a Multi-User Session and have one machine render the outer frustum and another render the inner frustum, but this workflow has not been fully tested. For now, only the inner frustum can be done with real-time rendering.

  2. You should test the Live Link auto-record and auto scene-take functions before using them in production; they currently get off by at least one take after your first recording. For example, if you set the Slate to 1a Take 1 in Live FX, it will update the UE Scene to 1a Take 1. But when you press record and then stop the recording in Live FX, Live FX increments the take once and UE increments it again, so Live FX would be on 1a Take 2 while UE would be on 1a Take 3. One workaround is to use the Auto Record function but not the auto-update scene-take function, and to be VERY diligent about setting the correct Scene and Take in both Live FX and Unreal Engine.

  3. In Live FX, if the timecode source is SDI and there is no timecode coming through that SDI signal, it may lock up the Unreal Engine Live Link.

  4. You cannot record the Full Shot option from the Record settings and maintain a decent frame rate. If you want to record the composite, you need to find another way of doing so; one recommendation is to feed your dual-head output to an external recorder and record the composite onto that.

  5. You cannot currently record the Source Capture and expect to be able to re-render or do anything meaningful with the tracking data. Because of the way Live FX handles tracking data and the way Unreal Engine works, when you play back your source-captured tracking data from Live FX, a different part of your Unreal sequence may play back, and you may get different results (due to performance or otherwise). A simple example: if you use a cloud system, what you recorded on the day may land on a drastically different part of the cloud animation when you go back to the source capture. You would also need to have Unreal Engine playing live.

  6. Currently, there is no pipeline to import the USD recorded from Live FX into Unreal Engine.

  7. The only way to control custom elements in your scene directly from Live FX is a short-term hack. If you need to fully control UE from outside of UE, you are better off building your own OSC interface using something like Open Stage Control; a minimal scripted sketch of the idea follows below.
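Any OSC sender will do for this kind of external control. The snippet below is a minimal sketch only, not an official Live FX or Unreal workflow: it assumes the OSC plugin is enabled in your UE project, that an OSC server is listening on 127.0.0.1:8000, and that a Blueprint binds the hypothetical address /stage/sun/intensity. It uses the python-osc package.

```python
# Minimal sketch: send an OSC message to an Unreal Engine OSC server.
# Host, port, and the /stage/sun/intensity address are assumptions for
# illustration; bind the address in a Blueprint on the Unreal side.
from pythonosc.udp_client import SimpleUDPClient

UE_HOST = "127.0.0.1"  # machine running Unreal Engine
UE_PORT = 8000         # port your OSC server listens on (assumption)

client = SimpleUDPClient(UE_HOST, UE_PORT)

# Drive a custom parameter (e.g. a light's intensity) while the scene runs.
client.send_message("/stage/sun/intensity", 0.75)
```

Open Stage Control can send the same messages from faders and buttons without any code, which is usually the more practical option on set.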
