[Anat3] Rewrote MRQ section (incl. latests devs)
brifsttar committed Jan 30, 2024
1 parent c8493f0 commit 930a1b0
16 changes: 9 additions & 7 deletions _posts/2024-01-01-anatomy-3.md
So I used a random rest stop from around my childhood home as reference, and jus

Then I imported that in Unreal, did a bit of back and forth between the tools to fine tune some things (mostly lane markings). In Unreal, I added some props and foliage and that was it!

Shout-out to [Marketplace](https://www.unrealengine.com/marketplace) creator [Der Sky](https://www.unrealengine.com/marketplace/en-US/profile/Der+Sky), whose "[European Collection: French Highway](https://www.unrealengine.com/marketplace/en-US/product/european-collection-french-highway)" product is absolutely perfect for this use case. Go check their products: if you need road props and signs, they most likely have what you want.

# Scenario

One little caveat is that this doesn't handle cars already being on the overtaki

Since this scenario is fully autonomous with no user input, and the target hardware is a single computer with three monitors, we can render the scenario into a video instead of having it run in real time. This has multiple benefits: improved visual quality, a perfect framerate, and so on.

This part turned out to be quite complex, and I ended up doing a lot of back and forth between the legacy renderer, and the newer [Movie Render Queue](https://docs.unrealengine.com/5.3/en-US/render-cinematics-in-unreal-engine/) (MRQ). In the end, I chose MRQ, but the process wasn't as smooth as I hoped it would be.

Overall, Movie Render Queue is a real improvement over the legacy pipeline, and using it is pretty straightforward. One thing that was really appealing was the [nDisplay Movie Render Queue pipeline](https://dev.epicgames.com/community/learning/tutorials/9VX5/unreal-engine-export-ndisplay-renders-using-mrq), which allows rendering video for all viewports of any [nDisplay](https://docs.unrealengine.com/5.3/en-US/ndisplay-overview-for-unreal-engine/) cluster.

However, since the MRQ nDisplay pipeline is a relatively new addition to Unreal, it has some limitations that I discovered along the way. First, [command line rendering](https://docs.unrealengine.com/5.3/en-US/using-command-line-rendering-with-move-render-queue-in-unreal-engine/) doesn't seem to support it, which is a shame because I had already taught our [Discord bot](/whats-new-2023-05/#bots) the new `!render` command. Second, the [encoder](https://docs.unrealengine.com/5.3/en-US/cinematic-rendering-export-formats-in-unreal-engine/#commandlineencoder) also doesn't seem to support nDisplay rendering, meaning I had to encode the output myself (which is rather easy using [FFmpeg](https://ffmpeg.org/)). But since I couldn't use the command line in the first place, I couldn't automatically trigger the encoder afterwards either, so nightly renders were impossible.
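Encoding the rendered frames yourself is indeed easy to script. Here is a minimal sketch, assuming MRQ wrote out a PNG image sequence plus a WAV audio track; the directory layout, the `frame.%04d.png` naming pattern, and the codec choices are illustrative assumptions, not our actual output settings:

```python
import subprocess
from pathlib import Path

def build_ffmpeg_cmd(frames_dir, audio_path, out_path, fps=60):
    """Build an FFmpeg command that muxes an image sequence with an audio track.

    The 'frame.%04d.png' pattern is an assumption about how the frames
    were named by the renderer; adjust it to the actual output settings.
    """
    return [
        "ffmpeg",
        "-framerate", str(fps),                       # frame rate of the image sequence
        "-i", str(Path(frames_dir) / "frame.%04d.png"),
        "-i", str(audio_path),                        # audio rendered alongside the frames
        "-c:v", "libx264",
        "-pix_fmt", "yuv420p",                        # widest player compatibility
        "-c:a", "aac",
        "-shortest",                                  # stop at the shorter of video/audio
        str(out_path),
    ]

cmd = build_ffmpeg_cmd("renders/highway", "renders/highway/audio.wav", "highway.mp4")
# subprocess.run(cmd, check=True)  # uncomment to actually encode
```

Building the argument list separately from running it also makes the step easy to trigger from another tool (a bot, a nightly job) once command-line rendering becomes available.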

And last but not least, Movie Render Queue doesn't support [spatialized sounds](https://forums.unrealengine.com/t/cant-render-out-sound-from-ue5-using-render-queue-or-movie-scene-capture-reliably/577782), which is obviously an issue for us, as all our sounds are spatialized. As a workaround, I de-spatialized all ego-vehicle sounds and said goodbye to all other vehicles' sounds (not that big of a deal for a highway scenario).

So the rendering required more work than expected, but the end result was really good. It seems Epic has a lot more work planned for the Movie Render Queue, including a very interesting [Movie Render Graph](https://portal.productboard.com/epicgames/1-unreal-engine-public-roadmap/c/1365-movie-render-graph).

And as usual: clicking buttons to render, package or build isn't my thing, so our [Discord bot](/whats-new-2023-05/#bots) learned the new `!render` command to render a [Sequence](https://docs.unrealengine.com/5.3/en-US/unreal-engine-sequencer-movie-tool-overview/). I could have gone a bit further and had it take the scenario CSV file as an argument, so that researchers could just type `!render my_scenario.csv` and get a video without ever touching Unreal. But for now, it's definitely not worth implementing, so maybe for the next project!

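Under the hood, a command like `!render` ultimately just assembles a Movie Render Queue command-line invocation for the editor. A hypothetical sketch of that step, using the MRQ flags from Epic's command-line rendering documentation; the executable name, project path, and asset paths are made-up placeholders, not our actual setup:

```python
# Hypothetical sketch of what a '!render <sequence>' bot handler could build.
# Paths and preset names below are placeholders, not our real project layout.
UE_CMD = "UnrealEditor-Cmd.exe"
PROJECT = "C:/Projects/Sim/Sim.uproject"

def build_render_cmd(sequence_path: str, preset_path: str) -> list:
    """Build an MRQ command-line invocation for rendering a Level Sequence."""
    return [
        UE_CMD,
        PROJECT,
        "-game",
        f"-LevelSequence={sequence_path}",       # sequence asset to render
        f"-MoviePipelineConfig={preset_path}",   # MRQ preset asset with output settings
        "-windowed", "-resx=1280", "-resy=720",
        "-notexturestreaming",                   # fully load textures for best quality
    ]

cmd = build_render_cmd("/Game/Sequences/Highway.Highway",
                       "/Game/Cinematics/Presets/Nightly.Nightly")
```

The hypothetical `!render my_scenario.csv` extension would only add one step in front: generate (or look up) the Level Sequence matching the CSV scenario, then pass its asset path to the same builder.
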
# Conclusion

This project was a lot of fun to make, and I learned quite a lot in both RoadRunner and Unreal. I feel like a broken record, but those two tools really empower us to do amazing things. I've already put my new MRQ skills to use: for documentation purposes, I decided to render movies of all our current (and future) scenarios. See you for the next one!

[vd]: /virtual-driver/
