Prism | Unreal Engine
Quote from User on 19. October 2020, 6:28
Hello,
Is there any way to integrate Prism into Unreal Engine?
Is making a plugin the way to go? If so, is there a plugin template or something to get started?
Thanks!
Quote from RichardF on 19. October 2020, 8:27
Hey,
Unreal is a very important topic and it will be supported by Prism in the future. I've started looking into it already, but it's too early to give any guidelines on how to integrate Prism into Unreal.
The way to go is creating a Prism app plugin for it, like for any other DCC. You can do that in the Plugins tab of the Prism Settings window. You can take a look at the existing app plugins, but because Unreal is quite different from all other currently supported DCCs, there is no perfect template.
You'd need to develop some new ideas for how Prism can be integrated into Unreal. I'd be happy to discuss what that could look like.
Cheers,
Richard
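To make the app-plugin route above a bit more concrete, here is a minimal sketch of the shape such a plugin could take. The class and method names are illustrative assumptions, not Prism's actual API; the real starting point is the template generated from the Plugins tab mentioned above.

```python
# Illustrative sketch of a Prism app plugin's integration layer.
# Class and method names are assumptions for illustration only; the actual
# template comes from Prism Settings > Plugins.

class Prism_Unreal_Functions:
    """Hypothetical bridge between the Prism core and an Unreal session."""

    def __init__(self, core, plugin):
        self.core = core        # the Prism core object
        self.plugin = plugin    # this plugin's descriptor

    def instanceStartup(self, origin):
        # Called when Prism attaches to a running app session; this is
        # where menus or a State Manager hook would be registered.
        return True

    def sceneOpen(self, origin, filepath):
        # Called after a scene/project is opened so Prism can sync context.
        pass
```

Because Unreal has no conventional "scene file" workflow, most of the work would be deciding what these callbacks should even mean there.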
Quote from User on 19. October 2020, 12:28
Hi!
For starters, I think you could implement the State Manager functionality so you can import published assets, and then the render management functions so you can render out image sequences of your published work, convert them to mp4, and so on. Most data doesn't come back out once it goes into Unreal.
So maybe initially treat Unreal as a lookdev/scene assembly/render DCC, instead of the entire beast of an app that it is. Does that make sense? What were you thinking?
Thanks
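The "render an image sequence, then convert to mp4" step above can be sketched for the conversion part. Assuming ffmpeg is available on PATH, a minimal command builder could look like this; the sequence pattern, frame rate, and paths are all illustrative:

```python
# Sketch: convert a rendered image sequence to an H.264 mp4 with ffmpeg.
# Paths and settings are illustrative, not Prism's actual conversion preset.
import subprocess

def ffmpeg_mp4_cmd(seq_pattern, fps, out_path):
    """Build an ffmpeg command turning an image sequence into an mp4."""
    return [
        "ffmpeg", "-y",
        "-framerate", str(fps),
        "-i", seq_pattern,        # e.g. "render/sh010_beauty.%04d.png"
        "-c:v", "libx264",
        "-pix_fmt", "yuv420p",    # widest player compatibility
        out_path,
    ]

cmd = ffmpeg_mp4_cmd("render/sh010_beauty.%04d.png", 24, "review/sh010.mp4")
# subprocess.run(cmd, check=True)  # run only where ffmpeg is installed
```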
Quote from RichardF on 23. October 2020, 8:52
I agree that importing and rendering are the most important tasks in the beginning.
I was thinking about bringing in whole sequences at once and placing them correctly in the Sequencer. For bringing in shots I'd like to utilize USD, which would bring in all required assets for a shot automatically. I'm not sure yet if the State Manager would be the right tool for that.
When doing the layout in Unreal it would be great to export the cameras to Houdini and Maya, but that doesn't have the highest priority in the beginning.
Another question is where the Unreal project would live in a Prism project. For now I'm only considering having one Unreal project per Prism project, but it would live outside of the existing asset and shot folders. Maybe it would get its own tab in the Project Browser.
Quote from User on 25. October 2020, 2:38
Just for reference, here are the relevant new features in 4.26:
source: https://forums.unrealengine.com/unreal-engine/announcements-and-releases/1814986-unreal-engine-4-26-preview
Cinematic and Virtual Production Updates:
- Movie Render Queue Improvements (Beta). We’ve continued work on the new Movie Render Queue, giving you new options to control its operation and to customize the images it produces.
- Render passes. You can now choose to export media using only selected render passes, including ObjectId, Motion Vectors, Z-Depth, World Position, World Normals, Ambient Occlusion, and Reflections. This opens up a wide range of possibilities for compositing images generated by Unreal Engine in external applications.
- Scripting and interop. We’ve added scripting hooks for Python and Blueprint, which you can use to integrate your rendering with distributed compute systems and render farms such as AWS Thinkbox Deadline.
- It’s now also possible to use Python scripts to render from the command line without human intervention, allowing you to integrate movie rendering into larger-scale pipelines.
- Runtime rendering. You can now use the Render Manager to generate still or sequence images at runtime, not just in the Unreal Editor.
- Pro codecs. The Movie Render Queue now supports pro codecs, including Apple Pro Res and Avid DNxHR.
- Multi-channel EXRs. You can now export media in multi-channel EXR format, packing multiple channels such as base color, ambient occlusion, and reflections into a single image file.
- EXR compression. When you export to EXR format, you can now compress the resulting files using your choice of compression formats.
- Final Cut Pro EDLs. Create Edit Decision Lists (EDLs) in an industry standard format.
- Sequencer Improvements:
- Nonlinear Animation. This collection of features improves Sequencer tools to create, modify, join, and blend animation assets in order to quickly author new animation cinematics for virtual productions and games. This will reduce the need to use external tools for animation authoring and blending. These improvements include blending the root motion of skeletal animation sequences; joint matching for better blending and placement; skeleton animation preview for precise placement; and Control Rig FK / IK integration.
- Quality of Life Improvements. There are many quality of life improvements in Sequencer this release, including UX/UI, pipeline, evaluation, and Take Recorder.
- Pro codecs. Sequencer can now play back media using pro codecs, including Apple Pro Res and Avid DNxHR.
- Camera Cuts. Take Recorder can now record camera cuts.
- Editor scripting. You can now use Blueprint and Python to access and control selections for sections, tracks, folders, and objects.
- High Quality Media Export (HQME) Workflow Improvements. Several workflow improvements were made to HQME, including: support for Final Cut Pro XML EDLs; Open Color IO integration; runtime support for use in users' projects; and support for the Deadline render farm plug-in on the Epic Marketplace.
- OCIO support in Editor (Beta). You can now guarantee a consistent color space for all the work you do in Unreal Engine by applying an Open Color IO (OCIO) profile to the Unreal Editor viewport and to the media you export through the new Movie Render Queue.
- 3D Text Improvements. We’ve improved the Text 3D Actor to offer additional bevel options, a 3D outline mode, and joined cursive letters for Arabic text.
- DMX Improvements. This release features DMX system quality of life enhancements, including UX/UI updates, performance improvements, and architecture updates for DMX.
- DMX Monitor. The DMX Monitor update provides a solution for visualizing the incoming universe and the packet data received on that universe.
- Fixture Type Panel. The Fixture Type Panel update redesigns the UI to break-up the Fixture Type Entities and Functions into separate sections with their specific properties.
- Fixture Patch Panel. The Fixture Patch Panel update will provide users a visual representation of the patches and their assigned channel ranges and conflicts. Additionally, this improvement allows users to drag and drop functionality to change starting addresses rather than inputting values or dragging sliders.
- DMX Button. This update will add a DMX Button to the toolbar for displaying a new window with the Monitor and Output Console.
- DMX Attribute Mapping. Standardized naming convention that globally exposes a set of user defined fixture properties for easy lookup and usage. This system helps to reduce the complexity that comes with seemingly infinite number of externally defined attribute names as seen through imported, externally created, GDTF files.
- Output Console. This revamp includes UX/UI updates to the Output Console.
- Controllers Panel. The Controllers Panel will now include other popular communication modes, such as Multicast and Unicast.
- DMX Matrix Support. This addition will help integrate matrix fixtures.
- DMX Pixel Mapping. This update enables the ability to translate a pixel buffer (render target) to a DMX stream. Pixel mapping will allow the use of live render target texture data to drive DMX fixtures or low resolution LED panels and devices.
- DMX Sequencer Integration, Recording, and Playback. Sequencer is a powerful feature that easily enables animation and event triggering. This custom DMX integration into sequencer allows developers to use curves and sub-sequencing to program and control DMX without the need for Blueprints or code. Recording incoming DMX for editing and playback. Allow users to listen for incoming DMX and record the data as new keyframes in a level sequencer. Level sequence can then be replayed, edited, and shared.
- DMX Enabled Fixtures Blueprints (Beta). This update provides an improved set of DMX based fixtures and VFX Blueprints that users can use within existing projects or use as foundations for creating new ones.
- Remote Control API (Beta). With the Remote Control API, you can create a web app to control your Unreal scene remotely. In this update, the Remote Control API is a fully compliant REST API with GET, PUT, POST, and DELETE access.
- WebSocket connections using the API can be persistent in order to receive live data without closing the connection.
- The web server can now be run in a packaged app.
- Added the Remote Control Panel to show exposed controls in the scene.
- Added an API for the Remote Control Panel to improve queries in the scene.
- Live Link XR Plugin. Using the OpenXR framework, Live Link now has support for XR devices, providing a lightweight and accessible tracking system in Unreal Engine.
- nDisplay Improvements. Several features have been added to improve the experience of using nDisplay, including:
- Deterministic rendering for real-time features such as Chaos to improve visual coherency when rendering across cluster render nodes. (Beta)
- You can now use the JSON file format for the nDisplay configuration file.
- You can now use the binary format for cluster events to improve data throughput and latency.
- Improvements to Nvidia’s SwapSync API for synchronization to avoid special cases of tearing.
- You can now leverage Nvidia’s NVLink for multi-GPU systems to use one GPU to render a viewport and to copy the frame to another GPU to display.
- When scaling nDisplay to a large LED Volume, performance can be improved by leveraging the multi-GPU system. For Virtual Production and in-camera VFX scenarios, this means the inner frustum can now be rendered on a second GPU. (Beta)
- Added integration and support for DomeProjection technology in Unreal Engine for projector warping and soft edge blending on large dome surfaces. (Experimental)
- Inter-Process GPU Texture Sharing (Beta). Efficiently send and receive GPU texture data of any kind, resolution, and format between Unreal Engine and other processes while bypassing the CPU. Supports synchronization mechanisms and thread barriers so that coherency is kept between shared applications. This feature is available through nDisplay as well as standalone.
- In-Camera VFX Improvements. We have made improvements to Color Correction Volumes so that more volumes can be in a scene.
- Timecode Improvements (Beta). We have added several improvements when using timecode, including:
- You can now export media with an embedded timecode.
- When recording timecode, missing frame errors are logged with a timestamp.
- When evaluating the last frame data of the recording, the engine’s delta times and the action’s delta times are now smoothed to remove jitter.
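As a rough idea of what the Remote Control API (Beta) listed above could look like in practice: the endpoint path, default port 30010, and payload field names below follow the 4.26 beta documentation as far as I can tell, so treat them as assumptions and verify them against your engine version. The actor path is made up.

```python
# Sketch: set a property on a scene object through the Remote Control REST
# API. Port, endpoint, and payload keys are assumptions from the 4.26 beta
# docs; the object path is illustrative.
import json

def set_property_request(object_path, prop, value):
    """Build the PUT body for /remote/object/property."""
    return {
        "objectPath": object_path,
        "propertyName": prop,
        "propertyValue": {prop: value},
        "access": "WRITE_TRANSACTION_ACCESS",  # records an undoable transaction
    }

body = set_property_request(
    "/Game/Maps/Stage.Stage:PersistentLevel.PointLight_1.LightComponent",
    "Intensity", 5000.0)

# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:30010/remote/object/property",
#     data=json.dumps(body).encode(), method="PUT",
#     headers={"Content-Type": "application/json"})
# urllib.request.urlopen(req)  # only works while the editor is running
```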
Quote from RichardF on 26. October 2020, 5:54
It's really impressive how many new features they're adding in every release. The Sequencer scripting and command-line rendering additions are especially important to me.
With all these additions in 4.26 it wouldn't make much sense to support 4.25. But I'm also curious about their plans for 5.0.
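A sketch of what the unattended command-line rendering mentioned above could look like. The flag names follow the 4.26 Movie Render Queue docs as far as I can tell; treat them as assumptions and check them against your engine build. All paths are illustrative.

```python
# Sketch: launch an unattended Movie Render Queue render from the command
# line. Flag names are assumptions from the 4.26 MRQ docs; paths are made up.
import subprocess

def mrq_render_cmd(editor_cmd, uproject, level, sequence, preset):
    """Build the command line for a headless-ish MRQ render."""
    return [
        editor_cmd, uproject, level,
        "-game",
        "-LevelSequence=%s" % sequence,      # the shot to render
        "-MoviePipelineConfig=%s" % preset,  # a saved MRQ preset asset
        "-windowed", "-resx=1280", "-resy=720",
        "-log",
    ]

cmd = mrq_render_cmd("UE4Editor-Cmd.exe", "D:/proj/Stage.uproject",
                     "/Game/Maps/Stage", "/Game/Shots/sh010_seq",
                     "/Game/Cinematics/Presets/FinalRender")
# subprocess.run(cmd, check=True)  # requires an installed engine build
```

A pipeline tool like Prism could build this command per shot and hand it to a farm.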
Quote from User on 26. October 2020, 23:14
Yeah, I know what you mean.
4.26 is at Preview 4 now, and I think for VFX work it's easier to jump to the latest release than stick with the old ones. Also, in 4.26 you can now stream Alembic files, which means you don't need to import and convert them to .uasset (I believe). I think this is a big deal for VFX pipelines.
I'm looking into USD at the moment. Have you thought much about Prism and USD?
Quote from RichardF on 27. October 2020, 5:51
I thought Alembic streaming was added a few versions ago already. But either way, it's definitely my preferred way to handle Alembics.
I'm looking into USD at the moment and working on some first prototypes (not Unreal related), but it's too early to announce any details about that yet.
Quote from User on 27. October 2020, 6:35
Oh, I think you now don't need to do the whole geometry cache import and can just reference the Alembic file?
https://m.youtube.com/watch?v=OchRhhRaE5s&t=1333
Yeah, I'm struggling to export a USD file from any version of Unreal with any useful data 😛
Quote from RichardF on 27. October 2020, 8:14
Ah, I guess you're right then. What I read was about streaming geometry in older versions, not specifically Alembics. I'll take a look at that presentation for more details.
So far I've only imported some Maya USDs into Unreal, but didn't try to export anything. If I find a way, I'll let you know.
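For the importing side mentioned above, here is a hedged sketch of bringing a USD (or Alembic) file in via editor Python. `unreal.AssetImportTask` is the generic scripted-import path; whether the 4.26 USD importer honors it would need testing, and all paths are illustrative.

```python
# Sketch: scripted import of a USD/Alembic file into an Unreal project.
# Requires running inside the Unreal Editor with the Python Editor Script
# Plugin enabled; destination path is illustrative.

def build_import_settings(filename, destination="/Game/Shots/sh010"):
    """Pure helper so the settings can be inspected outside the editor."""
    return {"filename": filename, "destination_path": destination,
            "automated": True, "save": True}

def run_import(settings):
    import unreal  # only available inside the Unreal Editor
    task = unreal.AssetImportTask()
    task.filename = settings["filename"]
    task.destination_path = settings["destination_path"]
    task.automated = settings["automated"]  # suppress import dialogs
    task.save = settings["save"]
    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
    return task.imported_object_paths
```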