
Video-IO Settings



The iron rule of thumb is that all YUV (or YCbCr) signals shall be legal (video) range, and conversely, all RGB signals shall be full range. In a 10-bit signal, for example, legal range puts reference black at code 64 and reference white at code 940, whereas full range uses the entire 0–1023 scale. So selecting a YUV output signal but then choosing an RGB matrix cannot possibly work. Selecting a YUV matrix gets you halfway there, but, as you will find out, it will still clip certain parts of the signal at the top and bottom (the green bars and green patch in a test image, for example).

To elaborate a bit more: Live FX internally works in full-range RGB (as any respectable finishing tool / live compositor would). So when you set your output to YUV, Live FX has to apply a matrix to go from RGB to YUV, and that matrix also contains the conversion from full to video range. This is what you select in the second dropdown (below the "Format" one in the upper-right corner of the menu). And obviously, if you choose to output a YUV signal, you should choose the corresponding YUV matrix, in video range, along with it. For a Rec2020 workflow, for example, that would be Rec2020 Video.
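To make that concrete, here is a minimal sketch (Python, purely illustrative; the coefficients and scaling are the standard Rec.2020 / 10-bit conventions, not Live FX's actual code) of what a full-range RGB to video-range YCbCr matrix does. Note that the legal-range scaling is baked into the same step as the matrix itself:

```python
# Minimal sketch: full-range R'G'B' (0.0..1.0) to 10-bit video-range Y'CbCr.
# Illustrative only -- not Live FX's internal implementation.

KR, KG, KB = 0.2627, 0.6780, 0.0593   # Rec.2020 luma coefficients

def rgb_full_to_ycbcr_video_10bit(r, g, b):
    """Convert normalized full-range R'G'B' to 10-bit video-range code values."""
    y  = KR * r + KG * g + KB * b        # luma, 0.0..1.0
    cb = (b - y) / (2 * (1 - KB))        # chroma differences, -0.5..+0.5
    cr = (r - y) / (2 * (1 - KR))
    # Video ("legal") range scaling for 10 bit:
    # luma occupies codes 64..940, chroma 64..960 centred on 512.
    return (round(64 + 876 * y),
            round(512 + 896 * cb),
            round(512 + 896 * cr))

print(rgb_full_to_ycbcr_video_10bit(1.0, 1.0, 1.0))  # full-range white -> (940, 512, 512)
print(rgb_full_to_ycbcr_video_10bit(0.0, 0.0, 0.0))  # black            -> (64, 512, 512)
```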

The alternative is to select RGB as the output and choose the RGB full matrix along with it. While the RGB signal is technically cleaner (no chroma subsampling, no matrix conversions), it is unlikely to make a significant difference for IBL (Image Based Lighting), so feel free to choose whichever you think works best.

Now for the lower half of the Video-IO dialog: there you can override the RGB-to-YUV matrix for every output channel. As long as that dropdown is left at its default, it refers to whatever you set as the matrix in the upper-right corner. But you can override that setting by selecting a different matrix for any of the output channels.

For input channels it works similarly. Of course, Live FX can't "define" what arrives on any given input channel, but you can help it "interpret" the incoming signal by setting the corresponding dropdown for that channel to the correct matrix. That being said, whatever matrix you set for an input channel, it will not prevent any clipping: that would already have occurred on the sender's side, when the SDI signal was created.
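For reference, interpreting the incoming signal is simply the inverse of the sketch above (again illustrative, assuming the same Rec.2020 coefficients and 10-bit video-range scaling). If the sender already clipped at code 940, decoding just yields 1.0; a different matrix choice on the input only changes how the codes are interpreted, it cannot bring back data that never made it into the SDI stream:

```python
# Minimal sketch: interpret 10-bit video-range Y'CbCr as full-range R'G'B'.
# Illustrative only -- not Live FX's internal implementation.

KR, KG, KB = 0.2627, 0.6780, 0.0593   # Rec.2020 luma coefficients, as above

def ycbcr_video_10bit_to_rgb_full(y_code, cb_code, cr_code):
    """Interpret 10-bit video-range code values as normalized full-range R'G'B'."""
    y  = (y_code - 64) / 876.0          # undo luma offset/scale (64..940 -> 0..1)
    cb = (cb_code - 512) / 896.0        # undo chroma offset/scale
    cr = (cr_code - 512) / 896.0
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return r, g, b

print(ycbcr_video_10bit_to_rgb_full(940, 512, 512))  # video-range white -> (1.0, 1.0, 1.0)
```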

The unwritten rule of video signal distribution:

YUV / YCbCr signals are to be encoded, decoded and interpreted as video range. RGB signals are to be encoded, decoded and interpreted as full range.

As long as you don't deviate from this rule, you will live a happy life.

Video-IO menu: the Format and default YUV matrix are chosen in the upper-right corner of the menu. In the lower half, the matrix can be overridden for any of the input/output channels.