Q: What is the latest version of Pixotope?
The current version of Pixotope is 23.3.1. It is built on Unreal Engine 5.3.1.
Q: What is Pixotope?
Pixotope is a next-generation virtual production system, radically advancing performance, usability and business model for cross-reality content production.
Pixotope harnesses the full power of Unreal Engine to create anything from a single-camera project to a live multi-camera virtual studio production, all at cinematic quality. It also offers an integrated tracking system: Pixotope Tracking provides a robust solution for camera and talent tracking.
Pixotope is an open software-based solution for rapidly creating virtual studios, augmented reality (AR), LED based (XR) and next-generation on-air graphics. It utilizes powerful commodity hardware and is specifically designed to connect with partner technologies and external data sources.
Pixotope is built for on-air use enabling rapid design and deployment of virtual, augmented, or mixed reality content. It enables users to configure, create, and control, any kind of virtual production from a single user interface in live broadcast environments.
A highly attractive subscription model ensures that Pixotope can easily be deployed across an organization and scaled to the specific requirements of each project.
Pixotope solves the complexities of creating content in a world of merging realities and changing business models.
Get a more in-depth Overview of Pixotope
Q: What are the main features of Pixotope?
- VS, AR, & XR production-specific toolsets
- Support for all major real-time camera tracking protocols
- Integrated tracking options with automated procedures for color & spatial calibration
- Support for broadcast industry-standard video formats, including UHD/4K and HDR
- HDR 32 bit linear color processing and rendering environment
- Easy-to-use internal chroma keyer that can be applied to any media input
- GPU-based internal video processing system for color correction, mask adjustments, and image effects
- Runs on commodity hardware, including professional and high-end game GPUs
- Non-destructive internal compositing, ensuring that video is not affected by graphic system anti-aliasing
- Integrated Nvidia DLSS for image supersampling and upscaling
- Internal or External compositing pipelines
- Seamless WYSIWYG workflow from editor to live broadcast, with live SDI and tracking within the editor
- Easy-to-use, web-based, customer-customizable control panels
- Support for many AJA & Blackmagic Design video cards
- Support for NDI & SRT video I/O
- Support for file-based input
- Multi-camera support
- LTC timecode support from embedded VITC/LTC or external analog timecode sources
- Timecode locked tracking for many tracking solutions
- External low latency and high accuracy API server for data integration and remote control
- Body Pose Estimation for automatic presenter placement in green/blue screen environments
- Support for industry-standard content generation workflows using FBX, openEXR, etc.
- User-controllable video and tracking delay capability
- Real-time photorealistic CGI characters and environments using Unreal Engine
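The internal chroma keyer listed above ultimately computes a per-pixel matte from the camera image. As a rough illustration only (this is a toy green-difference key, not Pixotope's actual keying algorithm; the function name and formula are assumptions):

```python
def green_difference_alpha(r, g, b):
    """Toy green-difference key (illustrative, NOT Pixotope's keyer).

    Returns foreground opacity: 0.0 where the pixel is pure key green
    (fully replaced by the virtual set), 1.0 where green does not dominate.
    Inputs are linear RGB values in 0..1.
    """
    # How strongly green dominates over the other channels, clamped to 0..1
    matte = max(0.0, min(1.0, g - max(r, b)))
    return 1.0 - matte

# A pure green-screen pixel becomes fully transparent:
print(green_difference_alpha(0.0, 1.0, 0.0))  # 0.0
# A neutral grey pixel (e.g. talent clothing) stays opaque:
print(green_difference_alpha(0.5, 0.5, 0.5))  # 1.0
```

A production keyer adds spill suppression, softness controls, and mask inputs on top of this basic idea.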
Features unique to Unreal Engine-based systems
- Photoreal Rendering in real-time: Take advantage of features like raytracing, Lumen & Nanite to achieve Hollywood-quality visuals out of the box
- Blueprints: create content without coding
- VFX & particle systems: simulate any kind of natural or unnatural phenomena e.g. rain, smoke, fire, hair, cloth, explosions, cars, rigid bodies
- Film-quality post-process effects: simulate the properties of a real camera
- Extensive animation toolset: create cinema-quality animated characters
- Sequencer: state-of-the-art non-linear editor for animation and cinematics
- Built for VR, AR, and XR: multi-platform usage of content, including mobile phones
- Terrain & foliage: create large, open-world environments with forests and grass
- Advanced AI: give AI-controlled characters increased spatial awareness of the world around them
- Unreal Audio Engine: enhance your project’s audio with revolutionary features including real-time synthesis, dynamic DSP effects, and physical audio propagation modeling
- Native DMX controls to blur the line between real and virtual worlds
- Marketplace Ecosystem: the Unreal Engine Marketplace has thousands of high-quality assets and plugins to accelerate production and bring new functionality to your work
Learn more about the Components of Pixotope
Q: What does a standard Pixotope license package consist of?
- 1x Live License will provide:
- Access to Director
- Video Input/Output and Camera Tracking input
- Use of Pixotope Engine and WYSIWYG Editor of Live environments
- 2x Artist Licenses will provide:
- Limited access to Director: manage project files and launch the Editor
- Use of Pixotope Editor in the Artist environment
- 1x Control license providing:
- Access to Director
Q: What support options do you offer?
Pixotope Subscriptions come with a Professional Service Level Objective (SLO). Support is provided via a ticketing system, which can be accessed:
- Online Cloud account → Support (Left Panel) → Create New Ticket (Top-right corner)
- Directly from within the Director panel (Help → Report Issue)
The Professional SLO offers the following features:
- Online ticket portal
- Documentation Help Centre
- Example Projects
- Software Updates
- Remote Desktop Support (TeamViewer)
- 24/7 Support with a 1 hour initial response time
- 24 hour weekend support for Urgent Tickets with a 1 hour response time
Q: What are the machine requirements?
Please refer to the System requirements for the latest hardware recommendations.
Q: Where do I manage my subscription and licenses?
You can view your subscription and licenses by logging in to your account at login.pixotope.com.
Learn more about Manage users and licenses
Q: Does Pixotope support floating licenses?
Licenses are assigned to individual users. The admin can create as many users, and assign as many available licenses to them, as they like. This allows licenses to be moved seamlessly between users, or a shared "group" user containing multiple licenses to be set up. Moving licenses around is not supported in offline or air-gapped modes, as these effectively lock the license to your machine.
Pixotope does not currently support an onsite floating license server.
Learn more about License modes
Q: I have 5 artists in my studio. Do we need a separate login for each user?
No, the customer admin can assign 5 licenses to 1 user account and the artists can share the login credentials for this account.
Learn more about Manage users and licenses
Q: How do I download Pixotope?
As a Pixotope user, you will receive a username and password from your customer admin that allows you to log in to login.pixotope.com and download the Pixotope installer, demo content, and updates.
Q: Do I need an internet connection to use Pixotope?
An internet connection is required for online license modes. Offline and air-gapped license modes are also available; these effectively lock the license to a specific machine.
Learn more about License modes
Q: What do I do if the machine my offline license was on dies?
Please contact us via email@example.com if this happens.
Q: Do you supply any demo levels with Pixotope?
Yes. A simple demo project is included in the Pixotope installer. More advanced examples that show typical AR, VS, & XR workflows can be downloaded from the Pixotope Cloud at login.pixotope.com.
See what Example content is available.
Q: How do I set up Pixotope?
Q: What Video I/O cards does Pixotope support?
We currently support the following AJA cards:
- Kona 4
- Kona 5
- Corvid 44
- Corvid 44 12G (from Pixotope 1.4.0 and up)
- Corvid 88
- AJA Io 4k Plus
We also support Blackmagic Design video cards:
- DeckLink 8K pro
- DeckLink Duo 2
- DeckLink Quad 2
- DeckLink SDI 4K
We do not currently support any other manufacturer's video cards.
Q: Does Pixotope support single link UHD?
Yes, Pixotope supports single link 12G UHD video using the AJA Kona 5, Corvid 44 12G and Io 4K Plus. It also supports 12G UHD on the Blackmagic Design DeckLink 8K Pro, or 6G UHD on the DeckLink SDI 4K. We do not support dual or quad link UHD.
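As a back-of-the-envelope check of why 2160p60 fits on a single 12G link: 10-bit 4:2:2 UHD at 60 fps carries roughly 9.95 Gb/s of active picture, under the 11.88 Gb/s nominal 12G-SDI rate. The helper below is illustrative and ignores blanking and ancillary data:

```python
def active_video_rate_gbps(width, height, fps, bits_per_sample, samples_per_pixel):
    """Back-of-envelope active-picture data rate (ignores blanking/ANC)."""
    bits_per_pixel = bits_per_sample * samples_per_pixel
    return width * height * fps * bits_per_pixel / 1e9

# UHD 2160p60, 10-bit 4:2:2 (2 samples per pixel: Y plus alternating Cb/Cr)
rate = active_video_rate_gbps(3840, 2160, 60, 10, 2)
print(f"{rate:.2f} Gb/s")  # ~9.95 Gb/s -> fits one 12G-SDI link (11.88 Gb/s)
```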
Q: Do you support the Nvidia 4090 GPUs?
After initial testing of the RTX 4090 GPUs, we found they give a significant performance improvement over the RTX 3090 series. We recommend RTX 4090 GPUs for UHD and high-end productions.
Q: Do you support AMD Ryzen CPUs?
Our tests indicate that high-performance AMD Ryzen CPUs are a valid alternative to Intel-based systems. In certain configurations they can outperform Intel CPUs, and with a PCIe 4 compatible motherboard can increase GPU performance.
*Ryzen 5 and 7 require a motherboard that supports x16 + x8 PCIe connectors to run the GPU and video I/O card at maximum bandwidth.
Q: Do you support AMD GPUs?
Pixotope does not support AMD graphics cards. Please use NVIDIA graphics cards. Please refer to the System requirements for the latest recommendations.
Q: Do you have recommendations for how to set up the hardware and software environment?
Yes, we have a comprehensive guide here which describes everything you need to know about setting up the hardware and software, and has guidelines for diagnosing common issues.
Q: Which tracking systems are supported?
- FreeD D1 *
- Stype HF 1.1
- SMT TP11
- MoSys F4 1.7 StarTracker
- MoSys F4 1.7 Mechanical *
- Ross UX v2
- Ncam SDK
- Ncam Lite *
- Spidercam Frame B
- Telemetrics 0rad8
- Manual *
* (with local lens file)
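For reference, the FreeD D1 position message at the top of the list is a simple 29-byte packet. The sketch below follows the commonly published FreeD layout (angles scaled by 1/32768, millimetre positions by 1/64, checksum of 0x40 minus the byte sum); treat the field scaling and checksum convention as assumptions to verify against your tracking vendor's documentation:

```python
def _s24(b):
    """Decode a 3-byte big-endian signed integer."""
    v = int.from_bytes(b, "big", signed=False)
    return v - (1 << 24) if v & 0x800000 else v

def parse_freed_d1(pkt):
    """Parse a FreeD 'Type D1' camera-position packet (29 bytes).

    Sketch based on the commonly published FreeD layout; field scaling
    may need adjusting for a specific vendor's implementation.
    """
    if len(pkt) != 29 or pkt[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    if (0x40 - sum(pkt[:28])) & 0xFF != pkt[28]:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": pkt[1],
        "pan":  _s24(pkt[2:5])  / 32768.0,   # degrees
        "tilt": _s24(pkt[5:8])  / 32768.0,
        "roll": _s24(pkt[8:11]) / 32768.0,
        "x": _s24(pkt[11:14]) / 64.0,        # millimetres
        "y": _s24(pkt[14:17]) / 64.0,
        "z": _s24(pkt[17:20]) / 64.0,
        "zoom":  _s24(pkt[20:23]),           # raw counts, mapped via lens file
        "focus": _s24(pkt[23:26]),
    }
```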
Q: Can I easily position the tracked camera in my scene?
Yes, the tracked camera can be relocated in the Scene by using the Camera Root actor. This is calibrated in real-world units.
Q: Can Pixotope handle offsets for a camera mount?
Yes, Pixotope supports a simple Camera Mount template system for setting the location of locked-off, PTZ, or jib-mounted cameras so that tracking relates to the camera's exact position in the environment.
Q: Does Pixotope support video delay?
Yes, video input, output, tracking, and API commands can all be delayed individually. Go to SETUP → Calibrate → Syncing.
Learn more about syncing in Calibrate syncing
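When dialling in these delays it often helps to convert between milliseconds and whole video frames. A minimal illustrative helper (Pixotope's own UI units may differ):

```python
def delay_frames(delay_ms, frame_rate):
    """Convert a delay in milliseconds to whole video frames (rounded).

    Illustrative helper for reasoning about video/tracking delay values.
    """
    return round(delay_ms * frame_rate / 1000.0)

print(delay_frames(80, 50))      # 4 frames at 50p
print(delay_frames(100, 59.94))  # 6 frames at 59.94p
```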
Q: Does Pixotope support interlaced video input and output signals?
Learn more about how to work with interlaced video in the Video IO section
Q: Do you support HDR?
Learn more about HDR and Color Management
Q: Do you support audio input and output over SDI?
Pixotope supports embedded audio output from the Scene with all AJA and Blackmagic Design cards. Please note that the audio output will be embedded in ALL audio pairs of the SDI output.
AES/EBU audio output, separated from the SDI video output, is currently supported only by the AJA Kona 4 and Kona 5 cards with the 4K firmware. Corvid series cards do not support AES/EBU audio.
Pixotope also supports audio output over any analog audio interface installed or connected to the render machine. A good solution for monitoring or outputting balanced analog audio is to use a USB D/I box.
Pixotope does NOT support passing audio embedded in the SDI input through to the SDI output. Audio output is currently limited to audio from the Scene, generated in Unreal Engine.
Q: Can Pixotope machines sync projects and assets between each other?
Yes, Asset Hub is a feature included in Pixotope that automatically synchronizes any content of a specified folder between all connected machines.
Learn more about Asset Hub here: File syncing using Asset Hub
You may also use a source control system such as Perforce, Git, or SVN.
Q: Can Pixotope show debug outputs of alpha masks, spill masks or other masks?
Yes. Masks can be viewed from the Utilities panel in the Editor and the alpha mask can be output to SDI from the Keyer panel in Director.
Learn more about the Utilities panel in the editor here: Utilities panel
Learn more about the Keyer panel in Director here: Pixotope Keyer
Q: Can you do lens calibration in Pixotope?
We recommend that the tracking vendor calibrates the lenses for your setup. For systems that don't provide lens tracking data, we have a lens file format and we can supply template lenses in many cases. Lens files can also be manually tuned.
Q: Can I have multiple main camera inputs active at the same time on one render machine?
No, you can only have 1 active main camera per render machine at a time. You can take additional camera feeds as an untracked Media Input.
In XR Edition, you can configure multiple tracked camera inputs and switch between them prior to the render/engine using the Camera Input Switching feature. This feature is included at no additional cost.
Q: Do you support LTC timecode input?
Yes, using embedded digital LTC timecode in VITC or LTC in the SDI signal, or external analog audio input via the LTC port of the AJA card. LTC is also supported as embedded digital LTC by the Blackmagic Design video cards.
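Once received, LTC timecode is conventionally written as HH:MM:SS:FF. A minimal sketch of converting non-drop-frame timecode to an absolute frame count (drop-frame rates such as 29.97 need extra compensation that this deliberately omits):

```python
def timecode_to_frames(tc, fps):
    """Convert non-drop-frame 'HH:MM:SS:FF' timecode to a frame count.

    Illustrative only: drop-frame timecode (29.97/59.94) requires
    frame-dropping compensation that this sketch omits.
    """
    h, m, s, f = (int(part) for part in tc.split(":"))
    return (h * 3600 + m * 60 + s) * fps + f

print(timecode_to_frames("01:00:00:00", 25))  # 90000
```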
Q: How many keyed inputs do you support per engine?
Pixotope supports keying on any input to the engine.
Q: Do you support third-party Unreal Engine plugins?
Third-party Unreal plugins that are compiled for the version of Unreal Engine that Pixotope is built on (5.3.1 for the current release) should run with no modifications necessary.
We also provide an SDK for users who wish to re-compile or write their own custom plugins in C++. Please contact your sales representative for more details.
Keep in mind that some plugins, such as those that attempt to control the video I/O or certain material functions, may conflict with Pixotope.
Q: Which cameras are supported?
Pixotope supports all cameras that can produce a standard broadcast video output through SDI video.
Q: Can I use Windows 11?
Yes, Windows 11 is supported.
Q: Does Pixotope support NDI?
Yes, Pixotope supports NDI® 5.
NDI® (Network Device Interface) is a high-performance standard that allows anyone to use real-time, ultra-low-latency video on existing IP video networks.
Find more information on http://ndi.tv/
Q: What is the networking setup needed for Pixotope?
Pixotope requires a main network adapter and IP address to be chosen. This is the address that will be used to communicate with the Datahub service. To change this IP address or select a different network adapter, we suggest changing the IP address first and then restarting Director, selecting the new desired network adapter / IP address.
We also recommend an additional network adapter with a static IP on an isolated network exclusively for tracking data.
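When assigning the static IP for the isolated tracking network, it can help to sanity-check that the tracking adapter's address actually sits on the intended subnet. A small illustrative check using Python's standard library (the subnet shown is an example, not a Pixotope requirement):

```python
import ipaddress

def on_tracking_subnet(adapter_ip, tracking_network):
    """Check that an adapter's static IP sits on the isolated tracking subnet.

    The subnet used below is an illustrative example only.
    """
    return ipaddress.ip_address(adapter_ip) in ipaddress.ip_network(tracking_network)

print(on_tracking_subnet("192.168.10.21", "192.168.10.0/24"))  # True
print(on_tracking_subnet("10.0.0.5", "192.168.10.0/24"))       # False
```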