Virtual Production Studio Tour Deep Tech Dive (Unreal Engine, Aximmetry)

Nov 1, 2023
14,776 views

Step into the world of virtual production where green screen meets real-time rendering in our latest educational video. This isn't a showcase of polished final products; it's a hands-on, solo deep dive into the nuts and bolts of creating virtual environments on a green screen, specifically tailored for educators, tech enthusiasts, and digital creators.
In this detailed presentation, I take you through:
The Green Screen Foundation: I break down the setup and techniques behind our powerful green screen capabilities, explaining how we craft immersive virtual worlds from a blank canvas.
The Gear That Gets Us There: Discover the specialized hardware that works in unison with our green screen to capture and composite scenes in intricate detail.
Software In Sync: Learn how we harness the power of Unreal Engine for real-time rendering, allowing for the instant creation of educational content and live-action composites.
Workflow in Action: I walk you through our step-by-step process, from calibration to live filming, demonstrating how real-time technology brings efficiency and flexibility to content creation.
Practical Use-Cases: Watch as I put our studio to the test, filming educational content live, and revealing the techniques that make our virtual production process both dynamic and engaging.
This solo journey is an in-depth exploration aimed at demystifying the complex interplay between green screen technology and real-time graphics. Whether you're a creator looking to implement virtual production techniques or an educator seeking to innovate your content creation, this video will provide valuable insights and practical knowledge.
#VirtualProduction #GreenScreenTech #UnrealEngine #SoloDeepDive #EducationalFilmmaking #RealTimeRendering

Comments
  • Pretty slick setup! The prompter in the thumbnail caught my eye. 😎

    @FluidPrompter • 28 days ago
  • Thank you. Some helpful stuff here. Cheers 👍🏽

    @thebuzzmeade • 5 months ago
  • Great behind-the-scenes views. I'm almost at that point and looking forward to starting to get bogged down in production problems!

    @Justin_Allen • 6 months ago
    • Thanks! There are always problems. The trick is to keep pushing forward with your ideas and vision; there's always a way!

      @cookseyyy • 6 months ago
  • How do you select which pass to record on each HyperDeck? For example, green screen on one and the final composite on another?

    @sybexstudio6318 • 2 months ago
  • Thank you for sharing! I'm trying to figure out whether the Vive Rovers are capable of reading lens information via USB in real time. Did you try working with photo lenses? Is that possible?

    @brocastteam • 5 months ago
    • We're using a different system for the lens information (Vanishing Point Viper). Photo lenses are difficult to work with as they don't have hard stops. We get around this with photo prime lenses by just faking the DOF.

      @cookseyyy • 5 months ago
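
      On "faking the DOF" above: a minimal sketch, assuming the standard thin-lens circle-of-confusion approximation, of the blur curve you could drive a virtual camera's fake DOF with when the physical lens reports no focus data. The function name and values are illustrative, not part of any tool mentioned here.

        def coc_mm(focal_mm: float, f_number: float,
                   focus_m: float, subject_m: float) -> float:
            """Blur-disk (circle of confusion) diameter on the sensor, in mm,
            for a subject at subject_m when the lens is focused at focus_m."""
            aperture_mm = focal_mm / f_number              # entrance-pupil diameter
            s1, s2 = focus_m * 1000.0, subject_m * 1000.0  # distances in mm
            return aperture_mm * abs(s2 - s1) / s2 * focal_mm / (s1 - focal_mm)

        # Example: 50 mm f/1.8 focused at 2 m, subject at 3 m.
        print(coc_mm(50.0, 1.8, 2.0, 3.0))  # ~0.24 mm blur disk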
  • I spent some time going back through your video and wanted to ask about your decision to use Aximmetry. You only mentioned you're using it for compositing... do you use it for anything else? I chose to bring in Ultimatte 12 4K systems instead of Aximmetry, and I went back and forth for a while on that decision.

    @Justin_Allen • 5 months ago
    • I think it's great for compositing. The controls aren't as intuitive as some others', but I've found it to be more than capable in lots of situations.

      @cookseyyy • 3 months ago
    • @cookseyyy Thanks. I keep bouncing between Aximmetry and Offworld Live's compositing tool. Have you evaluated OWL's tool, by chance?

      @Justin_Allen • 3 months ago
  • Hey, great content! Wondering how you get focus/zoom data into Aximmetry?

    @Serjakovs • 4 months ago
    • We're currently using the Vanishing Point Viper system, which has encoders that read the position of the focus and zoom rings on the lens. However, we might be switching to the ReTracker system in the near future.

      @cookseyyy • 3 months ago
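
      A minimal sketch of the encoder idea described above: raw counts from a ring encoder mapped to focus distance through a per-lens calibration table. The counts and distances are illustrative, not the Viper's actual protocol; assumes numpy.

        import numpy as np

        # Calibration pairs measured once per lens: encoder count -> focus distance (m).
        encoder_counts = np.array([0, 1024, 2048, 3072, 4095])
        focus_meters   = np.array([0.45, 0.8, 1.5, 4.0, 1000.0])  # 1000 m ~ infinity

        def focus_distance(raw_count: int) -> float:
            """Interpolate a focus distance for a raw encoder reading."""
            return float(np.interp(raw_count, encoder_counts, focus_meters))

        print(focus_distance(1536))  # 1.15, halfway between the 0.8 m and 1.5 m points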
  • Do you need genlock for the sync? The C70 has timecode in/out, depending on your preference.

    @derherrdirector • 3 months ago
    • It depends on the tracking system. The Vive Mars system needs genlock to stay in sync, so the C70s aren't technically in sync. However, this only matters if you plan to move the cameras independently. We typically only move our A camera, an URSA 12K, which does have sync, so we just use that.

      @cookseyyy • 3 months ago
  • Hi Matthew, what a nice in-depth tech dive, and a nicely improved workflow. One question: at 10:26 you change the world origin with an app. Pretty handy. Would you like to share the app? Or is it a native Unreal or Aximmetry app? Cheers

    @endorfineproducties9499 • 4 months ago
    • Yep, I'm using TouchOSC to create some custom control panels that send OSC data to Aximmetry. I'm then using a camera mover component to change the transform values of the origin over time. You can obviously set them directly, but I find it helpful to have a controller to do it on the fly. I also sometimes use a game controller for the same thing.

      @cookseyyy • 3 months ago
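
      A minimal sketch of the TouchOSC idea above; any OSC sender works, and this one uses the python-osc package. The IP, port, and address paths are placeholders: match them to whatever you map on the Aximmetry side.

        from pythonosc.udp_client import SimpleUDPClient

        # Point at the machine running Aximmetry and its listening OSC port.
        client = SimpleUDPClient("192.168.1.50", 8000)

        # Nudge the world origin 10 cm along X and yaw it 5 degrees.
        client.send_message("/origin/pos/x", 0.1)
        client.send_message("/origin/rot/yaw", 5.0)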
  • How do you set up the Vive tracker system for recording VR environments?

    @ScorpioKing3000 • 6 months ago
    • There are lots of helpful resources on YouTube, depending on the exact system you're using. In our case we're using the Vive Mars system in combination with Aximmetry, but that's by no means the only way.

      @cookseyyy • 6 months ago
  • How do you get the timecode from the cams and the Mars synced up together in Unreal in real time? We're using DeckLink cards in combination with Tentacles to send timecode into the cams and the Mars, but the cams are never in sync because the Blackmagic Media Bundle doesn't get timecode. 😔

    @marcusunivers • 3 months ago
    • It's not the timecode that matters so much as genlock. Genlock ensures that every frame is in sync with the tracking data coming in; otherwise they arrive at slightly different rates and drift over time, even if you set everything up correctly. You do need timecode, however, if you want to do post comp work: your tracking data and video will then carry the same timestamps, making them much easier to synchronize when you're compositing.

      @cookseyyy • 3 months ago
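
      A quick worked example of that drift: two free-running devices that both claim the same frame rate never tick at exactly the same speed, and even a tiny mismatch walks them a whole frame apart. The rates here are illustrative.

        camera_hz = 25.000   # camera sensor clock
        tracker_hz = 25.001  # tracker's own free-running clock

        frames_of_drift_per_second = abs(tracker_hz - camera_hz)
        seconds_until_one_frame_off = 1.0 / frames_of_drift_per_second
        print(f"a full frame of offset after {seconds_until_one_frame_off:.0f} s")
        # -> 1000 s (about 17 minutes); genlock removes this by slaving both clocks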
    • @cookseyyy Ah, I understand. So the timecode is good for later syncing in post. 👍 But to sync the scene with the camera after I've recorded it on the cam with timecode: how do I get the same timecode from something like a Tentacle Sync into the scene recording in Aximmetry, or even Unreal Engine directly? I can plug the same timecode from the Tentacle Sync into my PC's audio jack (they're in sync with each other, so the cams and the PC get the same audio timecode). But then I need to get the audio into Aximmetry, convert the left audio channel into a timecode, and feed it somehow into the scene recording inside Aximmetry to sync it up later in post. How are you handling timecode data in Aximmetry or Unreal? Are you providing each timecode separately, or is there a way to use the left channel of the SDI cam signal as timecode in Aximmetry? 😅

      @marcusunivers • 3 months ago
    • @marcusunivers Unreal does accept timecode input. In the Content Browser, find the Blackmagic timecode provider. After setting it up, go to Project Settings and change the timecode settings accordingly.

      @maybelleyo • 2 months ago
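
      For the post side of this, the timecode math itself is simple. A minimal non-drop-frame sketch, assuming a 25 fps project; Aximmetry and Unreal do this internally, this just shows the arithmetic used to line tracking samples up with frames.

        FPS = 25

        def frames_to_tc(total_frames: int) -> str:
            """Absolute frame count -> HH:MM:SS:FF (non-drop-frame)."""
            ff = total_frames % FPS
            ss = (total_frames // FPS) % 60
            mm = (total_frames // (FPS * 60)) % 60
            hh = (total_frames // (FPS * 3600)) % 24
            return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

        def tc_to_frames(tc: str) -> int:
            """HH:MM:SS:FF -> absolute frame count."""
            hh, mm, ss, ff = (int(p) for p in tc.split(":"))
            return (hh * 3600 + mm * 60 + ss) * FPS + ff

        print(frames_to_tc(90125))          # 01:00:05:00
        print(tc_to_frames("01:00:05:00"))  # 90125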
  • Is there a live-input HDRI application that can feed log 360 images into a real-time rendering application to get GI-like lighting? I.e., when the lighting changes on set, it procedurally relights?

    @mrrafsk • 3 months ago
    • I know a company called Antilatency is working on something like this, but they're doing it in relation to your lighting setup too: you have a virtual HDRI capture of your environment, and it then sends that data to your lighting to match. Your idea is cool, though, and would work really well for AR graphics.

      @cookseyyy • 3 months ago
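
      A minimal sketch of the live-HDRI idea: reduce an equirectangular capture to a crude ambient term that could drive scene lighting. The file name is a placeholder and a real pipeline would ingest frames live; assumes numpy and an imageio build that can read EXR.

        import numpy as np
        import imageio.v3 as iio

        img = iio.imread("latlong_capture.exr").astype(np.float64)  # H x W x 3, linear

        # Weight each row by cos(latitude): the equirectangular mapping stretches
        # the poles, so unweighted averaging would over-count them.
        h, w = img.shape[0], img.shape[1]
        lat = (np.arange(h) + 0.5) / h * np.pi - np.pi / 2
        weights = np.cos(lat)

        ambient_rgb = (img * weights[:, None, None]).sum(axis=(0, 1)) / (weights.sum() * w)
        print("average incident light (linear RGB):", ambient_rgb)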
    • @cookseyyy Thanks for the lead. I can't see anything on their website, but I'll dig deeper. I found some papers from university groups doing this, but no commercial application. A challenge when doing stage performances (incl. fake holograms) is syncing the on-set lighting to the CG. I see some people have developed DMX-to-Unreal bridges, but I can't find real-time HDRIs.

      @mrrafsk • 3 months ago
  • Does it work with Mac?

    @JairoAndresAmaya • 4 months ago
  • Hello! I'm building a professional studio like this, but I need help. Can we connect?

    @JairoAndresAmaya • 4 months ago
    • Of course. Feel free to DM me. Where's your studio based?

      @cookseyyy • 3 months ago
    • Santa Ana, CA. I'd like to connect with you, @cookseyyy.

      @JairoAndresAmaya • 3 months ago